How Trump changed Facebook


There was a time when Facebook’s relationship with politicians was relatively uncontroversial.

But after the 2016 US elections, everything changed.

Early in the campaign, then-presidential candidate Donald Trump tested the limits of Facebook’s rules against hateful speech, even as the company became a vehicle for political exploitation by foreign actors.

Facebook’s first test: dealing with a 2015 Facebook post from Trump calling for a “total and complete shutdown” of Muslims entering the US. While some inside the company saw a strong argument that Trump’s comments violated Facebook’s rules against religious hate speech, the company decided to keep the post up. Until then, most Facebook employees had not grappled with the possibility that a candidate for the nation’s highest office could use their platform to stoke such division.

“What do you do when the leading candidate for president posts an attack … on [one of the] biggest religion[s] in the world?” former Facebook employee and Democratic lobbyist Crystal Patterson told us.

And it wasn’t just national politicians Facebook had to worry about, but foreign adversaries, too. Despite CEO Mark Zuckerberg’s initial post-election comments dismissing the “pretty crazy idea” that fake news on the platform could have influenced the elections, it soon became clear that propaganda from Russian Facebook accounts had reached millions of American voters — causing an unprecedented backlash and forcing the company to reckon with its culpability in influencing global politics.

Over time, Zuckerberg would acknowledge Facebook’s role as what he called “the Fifth Estate” — an entity as powerful as the government and media in shaping the public agenda — while at the same time trying to minimize the company’s role dictating the acceptable terms of political speech.

To offload the burden of political responsibility going forward, Facebook announced the Oversight Board in 2018, a Supreme Court-like body set up to weigh in on controversial content decisions — including how to deal with Trump’s account. But the board is new, and we’re still learning how much power it has over Facebook. How much responsibility does Facebook still have to set the terms of its own platform? And can the board go far enough to change the social media platform’s underlying engine: its recommendation algorithms?

We explore these questions about Facebook’s role in moderating political speech in our fourth episode of Land of the Giants, Vox Media Podcast Network’s award-winning narrative podcast series about the most influential tech companies of our time. This season, Recode and The Verge have teamed up over the course of seven episodes to tell the story of Facebook’s journey to becoming Meta, featuring interviews with current and former executives.

Listen to the fourth episode of Land of the Giants: The Facebook/Meta Disruption, and catch up on earlier episodes on Apple Podcasts, Google Podcasts, Spotify, or wherever you get your podcasts.


Source: vox.com

