It’s the tech bro equivalent of a 1950s B-movie: Evil data scientists betray the simple trust of an unwitting, socially benevolent company to snatch private customer data and turn it against them and the world! Mark Zuckerberg’s apology for what has happened at Facebook more or less follows that script.
Don’t get me wrong — I’m sure Zuckerberg is very sorry that his share price has plummeted and that the #DeleteFacebook movement is gaining momentum, putting billions of dollars in shareholder value at risk.
But this story is not about Facebook’s rules being broken, or the apps that seek to discern a more algorithmically predictable “you” through inane clickbait quizzes about which philosopher or dog breed you like best. It’s not, fundamentally, about Cambridge Analytica, or even about Facebook. If you delete Facebook because you’re worried about privacy, you’ll have to delete almost every other app and platform too, because almost everything else on the internet is operating the same way.
This story, at root, is about what you don’t know you’re sharing simply by being online. It’s about how companies take that data and sell it and use it for purposes that you have no say in.
Much commentary in the US has suggested that there is no way out of the dystopia that we’ve constructed for ourselves, short of deleting Facebook and turning away from Google. But that can be hard to do, so interconnected have Facebook and other platforms become with all sorts of internet services people depend on.
But Europe is about to point to a better way of balancing the interests of technological innovation and privacy concerns. It’s been undercovered by the US media, but the era of the data robber barons will be massively disrupted on May 25, when the European Union’s General Data Protection Regulation, which enshrines data protection as a fundamental human right, goes into effect.
US companies are in the sights of EU regulators
Great, you say, but we’re not Europe. And US federal regulation is the product of the naive view that we read and actively assent to the privacy policies on the websites we visit. (Policies that, among other things, don’t cover or explain the algorithms scraping our data, because those are protected as trade secrets.) Carnegie Mellon’s Lorrie Faith Cranor has calculated that actually reading those texts would take us some 180 to 300 hours per year.
But global US companies, especially those using cloud-based marketing and ad technology that tracks European users’ browsing habits, face significant regulatory exposure.
So how, exactly, will the new European rules, known as GDPR for short, work? Let’s scroll back to what happens when you log on to your friendly social media site. You tick a box agreeing to share all the data generated by your virtual self from the moment you show up online, and in return, you get access to the good stuff.
From here on in, everything you do is recorded and can be packaged and sold to whoever might find it useful. In the case of Facebook, what you share with whom on the site, or which apps and quizzes you use, aren’t the only issues; they form just a small part of what you are giving up.
The point is that Facebook sees it all, owns it all, and can sell it all. Your data is the cryptocurrency of the digital economy; you have no idea how it is being used to sell you stuff, intentionally, by a tech company — or used against you by being stolen.
If my location at a concert in Seattle on a specific date and time is being sold for money to some travel website, as is the fact that I live in New York, I should know that; but I don’t.
The argument a tech company lawyer might offer, namely that consent has been given through a pop-up form filled with paragraph after paragraph of 6-point legalese, is, frankly, bullshit. And before you say, actually, yes, it really does authorize consent — well, the EU simply disagrees. The EU is saying, We are not letting companies get away with this charade anymore.
Europe sees private data collection and US governmental snooping as interconnected
European officials have been more concerned about the state of data privacy and regulation than their American counterparts for quite some time, but the immediate origins of the new data protection law can be traced to the fury over the extent of US surveillance in the years after 9/11. On January 28, 2014 — European Data Protection Day (yes, it’s been celebrated in Europe since 2006) — European Commission vice president and justice commissioner Viviane Reding said in a speech that European citizens were “in shock” over the extent of the National Security Agency snooping.
She argued that “data collection by companies and surveillance by governments are connected, not separate.”
Reding also said it would make no sense for the EU to assert fundamental rights only for EU nationals, or for a particular geographic region, but not for anyone else. Given the open nature of the internet, there had to be one data protection act to rule them all. It was a warning to US companies that they would not evade the reach of European law simply by being located in the US.
To put it another way, as long as one US cookie sat on a European national’s computer monitoring online behavior, the company that put it there was going to be held responsible.
Despite intense lobbying by US tech, the European Parliament and Council were not dissuaded from passing what Reding has called the “gold standard” in data privacy protection. The law was approved in April 2016, with a two-year grace period before it would be enforced. It applies to any company transacting with an EU national for goods or services — regardless of whether there is a payment involved.
It covers any kind of marketing tech or advertising technology. As a result, these regulations apply to most global US corporations you can think of and, potentially, to thousands of US businesses doing business in Europe that you can’t.
Protections the new law provides
Consider some of what the new data protection law guarantees to EU citizens — again, as of May 25:
A right to know what’s being done with your data, and a right to access it: EU consumers will have the right to obtain from, say, Facebook, confirmation that their data is being collected and held, where, and for what purpose. They also have the right to request and receive a copy of that data, free of charge, in an electronic format.
Firm guidelines for consent: For your information to be collected, you must give a clear “yes” answer to a clearly presented request. What’s more, it must be as easy for you to withdraw consent as it is to give it.
Privacy by design: Privacy is not something a company adds on as an afterthought; any online service that needs personal data in order to function must collect the minimum amount of data necessary for that purpose. No hoovering up everything just because it might be useful in a future context or sellable to a third party.
A right to be “forgotten”: You will be able to request that a company erase your personal data, stop disseminating it, and have third parties stop processing your data too. There are public interest exemptions to this right, such as archiving historical and scientific data. When it comes to balancing the right to be forgotten against freedom of expression or freedom of information, the law says that “it is necessary to interpret notions relating to that freedom, such as journalism, broadly.” What this vague exhortation will mean in practice, observers think, is litigation. There is no shared legal definition for freedom of information in Europe, and the limits of journalism are likely to be tested by anyone with deep pockets and something to hide.
Data portability: You have the right to take your data from one company and give it to another in an easy-to-read format. This is a huge deal in the areas of education, health care, and insurance. Want to change doctors? Just give the new one permission to access your data set. Your old doctor no longer has your records.
Breach notification: Companies must notify you of any security breaches within 72 hours. No more, “Oh, by the way, we were hacked six months ago.”
Data protection officers: The new law requires that companies be able to demonstrate compliance with its requirements through internal documentation if the data collected is relatively modest, and through a data protection officer if the data is voluminous.
Penalties: Failure to comply with the law will be assessed in fines that can top out at 4 percent of a company’s global annual revenue for the preceding year, or €20 million (roughly $24.5 million), whichever is greater. A 4 percent fine on Facebook’s global revenue would be a staggering punishment.
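To put that penalty formula in perspective, here is a quick back-of-the-envelope calculation in Python. The revenue figure is Facebook’s publicly reported 2017 total of roughly $40.6 billion, used purely as an illustration, and the dollar value of the €20 million floor follows the rough conversion cited above.

```python
# GDPR penalty ceiling: the greater of 4 percent of global annual revenue
# or 20 million euros (roughly $24.5 million at the rate cited above).

def max_gdpr_fine_usd(global_annual_revenue_usd: float) -> float:
    """Return the maximum possible GDPR fine, in US dollars."""
    return max(0.04 * global_annual_revenue_usd, 24_500_000)

# Facebook's reported 2017 revenue, about $40.6 billion (illustration only).
print(f"${max_gdpr_fine_usd(40_600_000_000):,.0f}")  # prints $1,624,000,000
```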
Reaction in the US to this new law has either been denial of this reality — it’s going to be another Y2K moment, all hype, no consequence — or quiet panic. The research company Forrester predicts that 80 percent of companies affected by GDPR will not be compliant come May 25, and that of these, 50 percent will be intentionally noncompliant, meaning that they’ve weighed the costs and risks and decided to roll the dice. Another research giant, Gartner, predicts we will see a multimillion-dollar fine for regulatory noncompliance before 2020.
The challenge for companies is technical: Compliance requires a system that enables active consent, maximal transparency, and enhanced security, all of which require thinking about how to use “distributed ledgers” and blockchain-based technologies to validate transactions.
In truth, the technical solutions aren’t that hard to pull off.
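As one illustration of the kind of system the previous paragraph describes, here is a minimal Python sketch of a tamper-evident consent log: each entry records which user consented to which purpose and when, and is chained to the previous entry by a hash, the same basic idea a distributed ledger uses to validate records. The class and field names are hypothetical, not drawn from any particular GDPR compliance product.

```python
import hashlib
import json
import time

def _digest(record: dict) -> str:
    """Hash a record deterministically so any later change is detectable."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

class ConsentLog:
    """A hash-chained, append-only log of consent grants and withdrawals."""

    def __init__(self):
        self.entries = []

    def record(self, user_id: str, purpose: str, granted: bool) -> dict:
        entry = {
            "user_id": user_id,
            "purpose": purpose,      # e.g. "ad personalization"
            "granted": granted,      # True = consent given, False = withdrawn
            "timestamp": time.time(),
            "prev_hash": self.entries[-1]["hash"] if self.entries else None,
        }
        entry["hash"] = _digest({k: v for k, v in entry.items() if k != "hash"})
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Check that no entry has been altered or reordered after the fact."""
        prev = None
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            if entry["prev_hash"] != prev or entry["hash"] != _digest(body):
                return False
            prev = entry["hash"]
        return True

log = ConsentLog()
log.record("user-42", "ad personalization", granted=True)
log.record("user-42", "ad personalization", granted=False)  # withdrawal is just another entry
print(log.verify())  # True
```

In a sketch like this, withdrawing consent is as easy as granting it, the full history stays auditable, and only the minimum fields needed for the record are stored.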
In 2010, Zuckerberg told TechCrunch’s Michael Arrington that the “social norm” regarding privacy had “evolved over time,” and that Facebook’s role was to reflect what those norms were. In short, people were comfortable sharing their personal information, so why worry about privacy?
But norms are shifting once more. Looking back, we can frame the development of digital behavior into three phases: First, there was a naiveté phase, where consumers didn’t really understand the technology and what it meant. Then there was the careless phase, where people saw data rights or privacy as either unimportant or an acceptable price of entry to all the good, free stuff. Now it is clear we are entering the demand phase, which sees the emergence of a more savvy, engaged, and alarmed digital consumer — and related movements to create and enforce consumer rights.
Europe is far ahead of the United States in shifting into that final phase. But the arrival of the new data protection rules on May 25 may serve as a wake-up call: Global US companies can’t have one set of gold-standard privacy rules for European customers and another set of fool’s gold rules for US customers and expect Americans not to cry foul.
Facebook has done Americans and the world a favor in showing how people’s data is really used. Now Europe can show us what successful, ethical privacy regulation looks like.
Trevor Butterworth is vice president of research for CynjaTech, a company specializing in data protection and privacy. He is the author of the white paper “GDPR: Threat or Opportunity.” He is also the executive director of Sense About Science USA.
The Big Idea is Vox’s home for smart discussion of the most important issues and ideas in politics, science, and culture — typically by outside contributors. If you have an idea for a piece, pitch us at [email protected].