Facebook’s very bad year, explained

Facebook started 2018 talking about bringing people together by showing users more “meaningful posts” from their friends and family. It’s ending 2018 explaining why it was sharing information about those friends and families with dozens of companies without users’ consent.

Over the past year, the social network has found itself at the center of a growing storm over a wide array of issues, ranging from data privacy to Russian meddling to fake news. The company and CEO Mark Zuckerberg have issued multiple apologies for its missteps, and yet the scandals keep coming.

Just this week, the New York Times reported that Facebook had let companies such as Spotify and Netflix read users’ private messages, and Washington, DC, Attorney General Karl Racine sued Facebook for letting the political consulting firm Cambridge Analytica access data from some 87 million users.

“We have great products here that people love,” Zuckerberg said in a call discussing Facebook’s quarterly earnings in January. That’s becoming progressively less the case.

It’s not entirely clear yet what Facebook’s complete 2018 story will be, but there is at the very least a pattern: Facebook does the bad thing, hides the bad thing, and then, when the bad thing becomes public, says it’s sorry and offers up explanations, only to keep doing that bad thing or repeat the cycle with a different bad thing.

All of which leaves it unclear whether Facebook can, or is willing to, fix itself.

The start of the year was relatively smooth

Facebook had a fairly normal start to the year.

Its first big announcement of 2018 was that it would show people more posts from their friends and families in their News Feed in response to criticism that it was overprioritizing content from businesses, media, and brands. In a post, Zuckerberg said he wanted Facebook to be “good for people’s well-being.”

The company also said it would do a better job of making sure news came from “trusted sources,” prioritizing local news, and putting out posts on social media and democracy. It said it was getting ready for new privacy rules out of Europe because it “takes data protection and people’s privacy very seriously.”

In February, special counsel Robert Mueller indicted 13 Russian nationals and three Russian entities, focusing primarily on a Russian troll farm called the Internet Research Agency, for their political propaganda efforts in the US, including on social media platforms such as Facebook. The focus was on the Russian actors, though, not the platforms they used.

The same month, Wired published a long piece about Facebook’s “hellish” two years, but the tone was that it might turn around. Facebook had “evolved” and come to realize some of its responsibilities. Facebook, perhaps, was getting better.

Except it wasn’t.

Then Cambridge Analytica hit

On March 16, Facebook made a sudden announcement that it was suspending a relatively obscure political consultancy, Strategic Communication Laboratories, and its data analytics firm, Cambridge Analytica, from its platform. On March 17, we found out why: The New York Times and the Guardian published a pair of blockbuster stories outlining how Cambridge Analytica had harvested private information from more than 50 million users without their permission.

The now-defunct firm had worked with multiple political campaigns, including Donald Trump’s 2016 presidential bid, and claimed to be able to build “psychographic” personality profiles of voters. Cambridge Analytica got the data from a researcher who made a personality quiz app on Facebook that collected information on users and their friends.

After the Cambridge Analytica scandal broke, lawmakers, regulators, and users all over the world were, understandably, outraged. The Federal Trade Commission said it would launch an investigation into whether Facebook’s handling of data violated a 2011 consent order it had with the company.

Facebook said it was sorry and promised to do better. It literally took out full-page newspaper ads apologizing.

In April, Facebook admitted that 87 million users had been affected by the Cambridge Analytica scandal, and Zuckerberg went to Washington. He testified before both the Senate and the House of Representatives, fielding a wide range of questions from lawmakers about Facebook’s efforts to combat Russian disinformation and fake news, user privacy, and potential monopolistic practices.

What became clear in the hearings was that US lawmakers seemed confused about what Facebook does, what its problems are, and how to fix them. In other words, don’t hold your breath if you’re anticipating big tech regulation from the US.

As part of its apology tour, Facebook banned the Russian trolls from the IRA, said it would make ads and pages more transparent, and put out a splashy video saying it would do better.

Zuckerberg runs into troubles in Europe

In May, it was European lawmakers’ turn to take a crack at Zuckerberg, who appeared before the European Parliament. The good news: European politicians seemed to have a much better grasp of the ins and outs of Facebook and approached Zuckerberg with tough, skeptical questions. The bad news: Zuckerberg got about 10 minutes, at the end of the hearing, to respond.

May was also the month that the General Data Protection Regulation (GDPR), a new privacy and data collection law, went into effect in Europe. Facebook made a show of complying.

The same month, the Guardian mentioned a lawsuit brought against Facebook in the US by an app developer named Six4Three that alleged its data policies favored some companies over others. The story was largely missed, but one British lawmaker paid attention, and in December, he obtained and released more than 200 pages of documentation from the suit. Among the revelations: Zuckerberg and his team discussed how to make money off user data, and Facebook discussed “whitelist” agreements with multiple companies to help them access user information.

Facebook could also face a fine of up to $1.6 billion in Europe over a later data breach.

The scandals just keep coming

Month after month, through the summer and into the fall, revelations about Facebook kept coming.

In June, the Times reported that Facebook gave some 60 device makers access to user data, including Huawei, a Chinese telecommunications company that US intelligence has expressed concerns about for years.

Also over the summer, Facebook revealed bugs in features that let users decide whom they shared content with and whom they blocked. It made announcements about flagging and deleting suspicious activity ahead of the 2018 midterms and removing accounts originating in Russia and Iran. It also said the Department of Justice, the FBI, and the Securities and Exchange Commission were looking into its affairs as part of the Cambridge Analytica probe.

But it became a sort of one-step-forward, two-steps-back scenario.

On August 6, Facebook banned right-wing conspiracy theorist Alex Jones; on August 13, the federal government filed charges saying that Facebook had violated the Fair Housing Act by allowing ads to discriminate against certain groups. (Facebook later said it was removing the ads.) And in September, the American Civil Liberties Union alleged that Facebook let employers target job ads only to men.

In April, Facebook said it would voluntarily implement the Honest Ads Act, legislation that would require more transparency about who’s buying political ads on its platform. Yet as the midterms approached, people were still able to buy political ads and place them under anyone’s name, including Vice President Mike Pence and the Islamic State.

The concerns about data privacy and security go well beyond Cambridge Analytica

Over the course of the year, it’s become increasingly clear that Facebook’s security and privacy issues go far beyond Cambridge Analytica — and that the company is never going to come out and say what its problems are, or fix them.

In September, Facebook released a “security update” saying a breach had exposed the data of 50 million users. It eventually revealed that hackers had stolen the “access tokens” of about 30 million users, which could be used to take over their accounts.

Then in December, Facebook said it had exposed up to 6.8 million people’s private photos in another leak. That breach had also happened in September, but Facebook waited about six weeks to disclose it.

It’s not clear if Facebook actually can — or wants to — get better

What’s become increasingly clear throughout the year is that Facebook might not want to fix itself, or might not be able to. While it constantly apologizes in public, in private it’s still acting shady.

In November, the Times detailed how Facebook, including Zuckerberg and chief operating officer Sheryl Sandberg, had sought to downplay and deny recent scandals around it, including Cambridge Analytica and Russian meddling. The report also outlined how Facebook hired the Republican consulting firm Definers to conduct and spread opposition research about its detractors, including highlighting their ties to liberal billionaire George Soros, a maneuver many called anti-Semitic. Sandberg also came under fire for reportedly asking whether Soros had shorted Facebook’s stock.

The Wall Street Journal reported in November that Zuckerberg had told Facebook executives earlier in the year that his company was at war.

And that war has continued: Third-party reports on Russian interference prepared for the Senate Intelligence Committee, released on Monday, said that Facebook and fellow tech giants Twitter and Google had done the “bare minimum” to provide the committee with data and information. And on Wednesday, the Times reported that Facebook had allowed companies such as Spotify and Netflix to access users’ private messages and had provided access to other user data for some 150 companies between 2010 and, you guessed it, 2018. In response to the Times report, Facebook said none of the features or partnerships gave access to people’s information without their permission and tried to explain the message access.

Facebook keeps saying that it’s not selling user data, but it’s making money off it by letting outside parties take a peek.

This is hurting Facebook

Facebook has paid a price. Employee morale is down, calls to quit Facebook are growing louder, and the founders of two of its most popular products — WhatsApp and Instagram — have resigned. Facebook’s stock price has declined by more than 20 percent this year, and Zuckerberg has lost an estimated $15 billion in net worth.

Despite all that, Facebook’s leadership appears to believe its behavior is the best course of action. It’s a business, and it seems to have calculated that providing dubious access to user data, engaging in and allowing shady political activity, and hiding errors until the last minute is more lucrative than the alternative.

Eventually, Facebook may be forced to really reckon with what’s happened — because of law enforcement actions, fines, regulation, user revolt, or something else. Thus far, through all the scandal, it’s charging ahead. There will surely be another apology soon.

Source: vox.com
