Wednesday, July 13, 2022

What Happens When Big Brother Meets Big Tech


Author and law professor Maurice Stucke warns that as fundamental privacy rights vanish, your personal data can and will be used against you.

University of Tennessee law professor Maurice Stucke, author of “Breaking Away: How to Regain Control Over Our Data, Privacy, and Autonomy,” has been a sharp critic of tech firms that have grown into giant “data-opolies” profiting from surveillance and manipulation. In a conversation with the Institute for New Economic Thinking, he warns that legislative inaction and wider government complicity in this surveillance are eroding fundamental privacy rights along with the ability of federal agencies to regulate Big Tech.

Lynn Parramore: Concern over privacy is increasing right now, with people worrying about different aspects of the concept. Can you say a bit about what privacy means in a legal context? With the digital revolution, privacy obviously means something different from what it did 50 years ago.

MS: Yes, privacy is not a single unitary concept. There are different strands. There’s bodily privacy and decisional privacy – the right to make important decisions about one’s life, like whether to have a child or not, without governmental interference. Within the bucket of decisional privacy would also be marriage, contraception, and things of that nature. There’s intellectual privacy (such as what one reads, watches, or thinks) and associational privacy (such as the freedom to choose with whom one associates). Informational privacy is another strand, where you can control your personal information, including the purpose for which it is used.

There used to be the idea that data protection and privacy are fundamental human rights.

Numerous supporters of privacy rights have argued that the U.S. Constitution should protect an individual’s right to control his or her personal information. One of the earlier Supreme Court cases involving informational privacy tested that belief. New York passed a law requiring doctors to disclose to the government their patients’ names, ages, and addresses when certain drugs were prescribed. All of this information was collected in a database in New York. A group of patients and their prescribing doctors challenged the law, contending that it invaded their constitutionally protected privacy interests. The case was decided in 1977 -- before the Internet and cloud computing. The Supreme Court, however, did not perceive any threat to privacy implicit in the accumulation of vast amounts of personal information in computerized data banks or other massive government files. The Court instead noted how the mainframe computer storing the data was isolated, not connected to anything else. Today, the data are not collected and maintained on some isolated mainframe. A torrent of data is being collected about us that we may not even have thought about. When you purchase gas at the local station, for example, you may not think of the privacy implications of that transaction. But there are powerful entities that collect vast reservoirs of first-party data from customers, as well as sources reselling that data, like the data brokers.

Congress, unlike the Supreme Court, recognized in the 1970s that the privacy of an individual is directly affected by the government’s collection, use, and dissemination of personal information and that the government’s increasing use of computers and sophisticated information technology has greatly magnified the harm to individual privacy. The preamble of the Privacy Act of 1974, enacted by Congress, states that privacy is a fundamental right protected by the Constitution. It was a landmark law in seeking to provide individuals greater control over their personal data in government files.

But the Supreme Court, on two occasions when it had the opportunity, declined to hold that the Constitution protects informational privacy as a personal and fundamental right. A majority of the justices just punted. They said that even if one assumed that such a right existed, it did not prevent the government from collecting the information it sought in both cases. Justices Scalia and Thomas were blunter in their concurring opinion: they simply argued that there is no constitutional right to informational privacy.

LP: What are some of the ways we are most vulnerable to government intrusion into our personal data right now?

MS: Well, the state can tap into the surveillance economy.

There are significant concerns about virtual assistants like Alexa. There was a case in Arkansas where someone was murdered in a hot tub, and the defendant had an Alexa device in his house. The government sought from Amazon any audio recordings and transcripts created through interactions with the defendant’s Alexa device, to see whether the data collected by Alexa contained any incriminating evidence. Alexa records what information you ask it to find; that’s the data it’s supposed to store. But there have also been concerns that Alexa may record more than intended, such as communications between family members.

Geolocation data is another big concern. Consider the Supreme Court’s decision in Carpenter v. United States [the 2018 decision requiring a warrant for police to access cell site location data from a cell phone company]. The Court said there’s a privacy interest in one’s geolocation data under the Constitution’s Fourth Amendment. Our movements, the Court noted, provide an intimate window into our lives, revealing not only where we go, but through them our “familial, political, professional, religious, and sexual associations.” So, how then did the U.S. Department of Homeland Security obtain millions of Americans’ location data without any warrant? The Trump administration simply tapped into the surveillance economy. It purchased access to a commercially-available database that maps our movements every day through our phones and the apps on our phones. Unless you turn off your phone or leave it at home, your phone is tracking you and potentially letting the authorities know where you’re going, how long you stayed there, when you came home, etc.

LP: So our geolocation data can actually be purchased by government officials, bypassing the need for a search warrant?

MS: Exactly. Now the government can just buy it, and that’s even scarier. The current Supreme Court does not appear to view the right to privacy as a personal and fundamental right protected by the Constitution. This is where Europe differs -- its Charter of Fundamental Rights specifically recognizes privacy and data protection as fundamental human rights. Some U.S. states, including California, recognize privacy as a fundamental right as well, but not all. That’s one of the concerns with the Court’s overturning Roe v. Wade -- it strips away privacy rights that are inferred from multiple constitutional provisions. The Dobbs v. Jackson Women’s Health Organization decision really shows how a simple change in the composition of the Court can enable it to eliminate or chisel away what had been viewed as a fundamental privacy right. If the Court says you don’t have these rights, that these rights aren’t in the Constitution, you would then have to get a constitutional amendment to change it. What are the chances of getting a constitutional amendment? I remember, growing up, the challenges of getting the states to ratify the Equal Rights Amendment. No one even talks about amending the Constitution anymore.

So now you have the states and the federal government tapping into the surveillance economy. The government can be complicit in, and even benefit from, the private surveillance economy, because it is now easier to prosecute these cases without getting a warrant.

LP: So far, we’ve been talking about what we do online, but you’ve pointed out that it doesn’t stop there because the line between the online world and the offline world is blurry.

MS: That’s right. For example, Baltimore has a very high per capita murder rate and clears only 32 percent of its homicide cases. Even though Baltimore installed over 800 surveillance cameras and a network of license plate readers, the high crime rate persisted. So a private company offered three small airplanes equipped with surveillance cameras that could cover over 90 percent of the city at any moment. The pilot program tracked over 500,000 Baltimore residents during the daytime. Ultimately the U.S. Court of Appeals for the Fourth Circuit struck the program down as an unreasonable search and seizure. [Leaders of a Beautiful Struggle v. Baltimore Police Dep't] But nothing stops this private company from going to other communities to institute the same surveillance program. Other courts might take the same view as the dissenting judges in the Baltimore case, namely, that people should not expect any privacy in their public movements, even if they are tracked for weeks or months. As a result, you might have extensive aerial surveillance in addition to all the other surveillance tools already being employed.

The thing about the Baltimore case (and this is what the seven dissenting judges focused on) is that the company obscured individuals’ faces; people appeared only as “mere pixelated dots.” This was by choice. So what’s the invasion of privacy if the police can only see dots moving across the city? The majority opinion noted how the police could employ their other existing surveillance tools, such as on-the-ground surveillance cameras and license-plate readers, to identify those dots. Moreover, if you see a dot going into a house around 6 pm and emerging in the morning, you can assume that the person lives at that house. So the fact that the aerial surveillance depicts you as a dot is a red herring. The police can simply cross-reference the dot with all the other technology they are already using -- the license plate readers, the street surveillance cameras, the facial recognition software -- to identify who that dot is. That’s a key takeaway. You might think, oh, I can protect my privacy in one avenue, but then you have to think about all the other data that’s being collected about you.

Thus, we should take little comfort when Google says that it will delete entries from a person’s location history if it detects a visit to an abortion clinic. One need only piece together the other data currently being collected about individuals to determine whether they obtained an abortion.
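To make that cross-referencing concrete, here is a minimal sketch -- with entirely hypothetical data, names, and lookup tables, not any actual police or Google system -- of the overnight-dwell inference described above: take an “anonymous” location track, guess a home location from where it sits at night, and join it against any other dataset keyed by place.

# Illustrative sketch only: hypothetical data and names. It shows how an "anonymous"
# dot's track can be re-identified by inferring a home location from where the dot
# sits overnight, then joining that location against another dataset keyed by place.
from collections import Counter
from datetime import datetime

def infer_home_cell(track):
    """Return the grid cell where the track dwells late at night (~10 pm to 6 am)."""
    cells = Counter()
    for ts, lat, lon in track:
        hour = datetime.fromisoformat(ts).hour
        if hour >= 22 or hour < 6:
            cells[(round(lat, 4), round(lon, 4))] += 1  # ~10 m grid cells
    return cells.most_common(1)[0][0] if cells else None

# Hypothetical cross-reference table: grid cell -> street address (it could equally be
# license-plate-reader hits or utility records tied to that address).
address_by_cell = {(39.2904, -76.6122): "123 Example St, Baltimore"}

track = [("2022-05-01T23:10:00", 39.29041, -76.61224),
         ("2022-05-02T01:45:00", 39.29039, -76.61221),
         ("2022-05-02T13:00:00", 39.28650, -76.60990)]
print(address_by_cell.get(infer_home_cell(track), "unknown"))  # -> 123 Example St, Baltimore

The point is not the particular code but the ease of the join: a dataset that looks harmless on its own becomes identifying the moment it is combined with another.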

LP: You’ve noted that the Supreme Court is doing away with any notion that there’s a fundamental right to privacy. Is this a sign of creeping authoritarianism?

MS: It could be. You could end up with either an authoritarian state model or a commercial surveillance economy that the government co-opts for its purposes.

We are also running into another problem when you consider the Supreme Court’s recent West Virginia v. EPA decision. The Court cut back on the ability of federal administrative agencies to regulate absent a clear congressional mandate. That potentially affects a lot of areas, including privacy. For example, federal agencies under the Biden administration could regulate apps and Big Tech firms and tell them not to disclose health information to law enforcement. But those regulations could be challenged, with the Court’s EPA decision supplying a new weapon. The Court might very well strike down such regulations on the basis that privacy protection implicates major social and economic policy decisions, and that decisions of such magnitude and consequence rest primarily with Congress, not with the FTC or any other agency. And because Congress has been incapable of providing a comprehensive privacy framework, you are out of luck unless your state offers some privacy protections.

LP: What would you like to see Biden doing regarding data protection? You’ve noted the importance of behavioral advertising to this discussion – advertising which allows advertisers and publishers to display highly-personalized ads and messages to users based on web-browsing behavior.

MS: Behavioral advertising is why these companies are tracking us all across the web. We need to address the fundamental problem of behavioral advertising and the collection of all this data. One issue is to what extent the FTC can use its authority under the Federal Trade Commission Act of 1914 to promote privacy and give individuals greater control over their data after the Court’s recent EPA decision.

The second issue is how to create a robust framework that actually protects our privacy. If you just say, well, companies can’t collect certain kinds of data, it’s not going to be effective. Facebook, for example, can make so many inferences about an individual just from their “likes.” It can discern their age, gender, sexual orientation, ethnicity, religious and political views, personality traits, intelligence, happiness, and use of addictive substances -- so much information from something seemingly innocuous. The more things people “like,” the more accurate the inferences become and the more personal they can get, such that Facebook can know more about an individual than that individual’s closest friends do. If you prohibit behavioral advertising, you will hopefully lessen the company’s incentive to collect that data in order to profile you and manipulate your behavior.

Robust privacy protection means giving individuals greater control over what’s being collected, whether or not that data can be collected, and for what limited purposes it can be used.

LP: Can we really put the behavioral advertising genie back in the bottle now that it has become so pervasive, so key to the Big Tech business model?

MS: Absolutely. A business model can be changed. Most people are opposed to behavioral advertising, and that opposition has bipartisan support. Senator Josh Hawley [R-MO], for example, offered the Do Not Track Act, which centered on data collected for behavioral advertising.

There’s also bipartisan support for antitrust legislation to rein in these data-opolies. The House did a great report on the risks that these data-opolies pose to our economy and democracy, and there were several bipartisan bills on updating our antitrust laws for the digital economy. All the bills had bipartisan support and made it through committee. Unfortunately, they’re still being held up from a floor vote. There was even a recent John Oliver show about two of the proposed bills, and the legislation still hasn’t gotten through. This is the fault of Republican and Democratic leadership, including Schumer and Pelosi. Big Tech has spent millions of dollars lobbying against these measures, and they’ve come up with these bogus commercials and bogus claims about how this legislation is going to harm our privacy.

In Europe, they’re getting this legislation through without these problems, but in the U.S., you’ve also got the Supreme Court and many lower courts chipping away at the right to privacy and the ability of the agencies to regulate in this area. The agencies can move faster than Congress in implementing privacy protections. But the status quo benefits these powerful companies because when there’s a legal void, these companies will exploit it to maximize profits at our expense.

Behavioral advertising is not about giving us more relevant ads. The data is not being used solely to profile us or predict our behavior. It’s being used to manipulate us. That is what the Facebook Files [an investigative series from the Wall Street Journal based on leaked documents] brought to the fore. Facebook already tells advertisers how it can target individuals who have just gone through a breakup, for example, with advertising for certain products. They can maximize advertising profits not just by predicting what people might want but by manipulating them into emotional states in which they are more likely to make certain purchases. The Facebook Files showed that Facebook’s targeting actually causes teenage girls to develop eating disorders. It’s depressing when you think about it.

LP: People are increasingly thinking about how to protect themselves as individuals. What steps might be effective?

MS: There are some small steps. You can support a search engine that doesn’t track you, like DuckDuckGo. Cancel Amazon Prime. Avoid Facebook. But avoiding the surveillance economy is nearly impossible. If you don’t want to be tracked, don’t bring your phone with you. Of course, Carpenter v. United States is instructive on that point. The Court noted how “nearly three-quarters of smartphone users report being within five feet of their phones most of the time, with 12% admitting that they even use their phones in the shower.” Some people even bring them into the shower! It’s not realistic to force people to forgo their phones if they want their privacy. Realistically there are very few protections, and it’s very, very hard to opt out because even seemingly benign bits of information that you wouldn’t think would incriminate you can be very telling when they are combined with other data.

New York did a study about how much health information is being transmitted to Facebook every day, and it’s staggering. Facebook receives approximately one billion events per day about users from health apps alone -- such as when someone opens the app, clicks, swipes, views certain pages, or places items into a checkout. All of these health-related apps are continually sending the data to Facebook, most likely without the individual’s knowledge. So, you might think you’re going to avoid Facebook, but if you’re on a popular app or using a smartwatch, it may very likely be sending detailed, highly sensitive information about you -- including when you are menstruating or wanting to get pregnant -- to Facebook and the other data-opolies.
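As a purely illustrative sketch of what such an “event” can look like -- the field names, app, and endpoint below are invented for illustration, not Facebook’s actual App Events interface -- an in-app analytics call might transmit something like this:

# Hypothetical example of a usage event a health app's analytics SDK might send to a
# third party; the endpoint, app name, and field names are invented for illustration.
import json
import urllib.request

event = {
    "app_id": "example-cycle-tracker",   # hypothetical app identifier
    "user_key": "device-1234",           # pseudonymous, but linkable to other data
    "event_name": "viewed_page",
    "page": "fertility_calendar",
    "timestamp": "2022-07-13T08:15:00Z",
}

request = urllib.request.Request(
    "https://analytics.example.com/events",   # placeholder endpoint
    data=json.dumps(event).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(request)  # not executed here: the endpoint is a placeholder
print(json.dumps(event, indent=2))

Each such event is trivial on its own; a billion of them a day, tied to a persistent user key, add up to a detailed health profile.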

We’re moving into a situation where our every movement can be tracked. Just look at China. We don’t have to imagine what the counterfactual is: China is actively investing in the surveillance of its citizens. There it’s mostly the government. Here in the U.S., you could say, well, the government is not doing that. But here the government doesn’t have to. These powerful firms are already doing it, and some of the government agencies are complicit in that surveillance economy.

LP: So we’re really not as different from China as we might like to think.

MS: Right. The companies that are surveilling us are largely unaccountable. Google and Facebook have committed numerous privacy violations. As the technology improves, the intrusions will get even creepier. You’re going to have technologies that read a person’s thoughts and decipher their emotions -- and not even just decipher their emotions but predict and manipulate them. To see what’s on the horizon, just look at the influx of patented technology. It’s scary.

After the initial reaction to the Supreme Court’s recent decisions has subsided, we need to consider the broader implications of these rulings. Hopefully people will be concerned with what the majority is doing, even if they don’t agree philosophically or ideologically with the dissenting justices. Will the Court make other personal decisions for me and my family? What is to stop some states and this Court from deciding whom I can marry? What birth control, if any, can I use? To what extent are my rights, including the right to be left alone, protected? We’re seeing a steady erosion happening now. History teaches us that anything is possible. Germany was said to be the land of poets and thinkers -- a nation that would never, ever accede to something like the Nazi Party. Totalitarianism was supposed to be beyond the realm of possibility.

Privacy legislation seems unlikely right now, and things are looking bad on so many levels. The economy has tanked. Inflation is eating away at our paychecks and savings. Gun violence. Global warming. Greater mistrust across political lines. Greater tribalism and rancor. No wonder most Americans believe that the country is heading in the wrong direction. It seems like we’re incapable of building or achieving anything. One wonders whether we are approaching the decline of civilization. But the thing about human events is that remarkable change can come from unexpected places. Consider the Berlin Wall. It was for decades a fact of life: people thought their children and grandchildren would have to live in a city and country divided by this physical and ideological wall. Then all of a sudden, the wall was gone. It wasn’t politicians who negotiated that outcome. It was the thousands of Germans who had had enough of the Stasi, the surveillance state, and the repression of their freedoms. Meaningful privacy change requires people to say, I’m not going to tolerate what these companies are doing. I’m not going to tolerate the government engaging in surveillance.

I don’t want to seem defeatist. Just look at the California Consumer Privacy Act of 2018 and the California Privacy Rights Act of 2020. There, a real estate developer spearheaded a revolution in privacy legislation. California was the last place one would expect this to occur -- the home of Google, Apple, and Facebook. But the developer pushed privacy reform through by threatening a ballot initiative. And when the 2018 legislation proved insufficient, that same developer got an initiative on the ballot to amend and strengthen the law, and a majority of Californians voted in favor of it. The 2020 statute is complex, over 50 pages long. There was a lot of lobbying by Big Tech against it, but the people got it done.

We don’t have to accept the status quo. We can change things in small part through our behavior. If you don’t like Google, then don’t use it. If you switch to DuckDuckGo, it’s not going to be that great at first, but as more people switch to it, it’s going to get better through network effects. If you don’t like Facebook, then delete your account – but recognize that it’s not sufficient. You’ve still got to support privacy legislation. Congress can get it done. They were able to pass other legislation, like requiring federal judges to disclose conflicts, rather quickly. There’s no reason they shouldn’t be able to do this except for the lobbying and all the money that’s being thrown around. If the people push it, it can be done.

People can have an awakening that things are not all right. Young people could have an awakening about just how precarious our rights are and not take them for granted. Maybe they will see that our democracy is not on cruise control and is not just operating on its own. It takes everybody getting involved at a local level and saying, I’m not going to take this any longer. Change can occur, but only if we demand it.

