Amy Klobuchar takes aim at 12 vaccine misinformation influencers

As the Covid-19 vaccine rollout continues across the US, some lawmakers are concerned that ongoing misinformation and disinformation campaigns are exacerbating vaccine hesitancy. Now, two senators are turning their attention to the vaccine misinformation superspreaders who push the bulk of conspiracy theories and lies on social media — and asking the social media giants to take more aggressive action.

“For too long, social media platforms have failed to adequately protect Americans by not taking sufficient action to prevent the spread of vaccine disinformation online,” wrote Sens. Amy Klobuchar (D-MN) and Ben Ray Luján (D-NM) in a Friday letter to Facebook CEO Mark Zuckerberg and Twitter CEO Jack Dorsey, which was viewed by Recode. “Despite your policies intended to prevent vaccine disinformation, many of these accounts continue to post content that reaches millions of users, repeatedly violating your policies with impunity.”

In particular, the senators urged the companies to take action against 12 anti-vaccine influencers — 11 individuals and one couple — who spread anti-vaccine content on the internet. These accounts include Robert F. Kennedy Jr., who has pushed distrust in vaccines, and Joseph Mercola, an online alternative medicine proponent who was recently flagged by the Food and Drug Administration for promoting fake Covid-19 cures, including through his still-active Twitter account.

These 12 entities were identified in a report published last month by the Center for Countering Digital Hate, a nonprofit focused on online hate and misinformation. To find them, researchers identified 10 private and 20 public anti-vaccine Facebook groups, ranging in size from 2,500 to 235,000 members, then analyzed the links posted in those groups and traced them back to their sources.

They found that up to 73 percent of that content, including posts sharing it elsewhere on Facebook, came from websites affiliated with these 12 superspreaders, who have built reputations in the anti-vaccine online world through multiple accounts across various social media services. More broadly, up to 65 percent of the anti-vaccine content the researchers identified on both Facebook and Twitter appeared to come from these entities. At the time of the report’s publication in March, nine of the superspreaders were still active on Facebook, Instagram, and Twitter.

In their Friday letter, the senators asked for more details on the platforms’ approach to content moderation, and for explanations of why the content shared by these 12 superspreaders does or does not violate Facebook’s and Twitter’s rules. The senators also sought more information on the companies’ investment in content moderation for communities of color, rural communities, and non-English-speaking communities, pointing out that some of the content posted by the 12 superspreaders “targets Black and Latino communities with tailored anti-vaccine messages.”

In response to the pandemic and the vaccine rollout, Facebook and Twitter have altered their approach to content moderation and health misinformation. Facebook, which also owns Instagram, has banned anti-vaccine misinformation and misinformation about Covid-19 that could lead to “imminent physical harm,” and the company says it has removed more than 12 million pieces of content that cross this threshold. Facebook has also conducted research into vaccine-hesitant comments on its service.

“Working with leading health organizations, we’ve updated our policies to take action against accounts that break our Covid-19 and vaccine rules — including by reducing their distribution or removing them from our platform — and have already taken action against some of the groups in this report,” Facebook spokesperson Dani Lever told Recode. She added that the company had connected 2 billion people to resources from health authorities.

Twitter has taken a two-pronged approach of removing the most harmful vaccine misinformation and labeling other misleading tweets.

Generally, these approaches have focused on individual pieces of content rather than the broader behavior of particular influencers across the internet. That gives vaccine misinformation superspreaders leeway to spread distrust without outright sharing false claims about vaccines. Instead, they can promote “health freedom” to discourage vaccination, present vaccine news in a misleading light, link from social media to misleading claims on their own websites, or simply raise questions in order to sow doubt.

Update, April 19, 2021, 2:10 pm ET: This piece was updated to include a comment from a Facebook spokesperson.

Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.
