The Anarchists Combatting the Far Right on YouTube

Last year, a few members of the Metropolitan Anarchist Coordinating Council, a New York City-based group, began delving into the deeper recesses of YouTube. They quickly discovered videos by self-proclaimed Fascists, would-be race warriors, and ethno-state evangelists that lurked among old World Series broadcasts, bike-repair tutorials, and footage of cute animals. Many of these jarring videos came from groups such as the Traditionalist Worker Party, the Rise Above Movement, and American Renaissance, an online publication. Their sentiments were mainly nativist, their tone mostly grim, and their pronouncements almost uniformly apocalyptic. They presented white Americans as besieged by sinister forces. They predicted, and sometimes appeared to look forward to, racial strife and violence.

The members of the coördinating council became familiar with the videos while keeping track of the far right. Last spring, the council formed a working group to organize counter-demonstrations during far-right rallies; a few months later, they decided that the same group should also counter far-right recruitment attempts on the Internet. The result is a free browser plug-in called No Platform for Fascism, which assists anyone who wants to register a complaint about these videos with YouTube. Once downloaded and installed, the plug-in places an icon of red and black flags on the user's browser toolbar. Clicking the icon brings up a list of far-right videos identified as violating YouTube's community guidelines, which prohibit the promotion of violence and the incitement of hatred based on attributes such as race, ethnicity, or religion. Users can then send YouTube pre-filled electronic complaint forms describing how the plug-in's creators believe each video has run afoul of the company's policies.
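
How a reporting plug-in along these lines might be put together is easy to sketch: a curated list of flagged videos, a popup that displays them, and a pre-written complaint attached to each. The minimal TypeScript sketch below, written as a WebExtension popup script, is purely illustrative; the video entry, complaint text, and element ID ("video-list") are placeholders invented for the example, not the actual contents of No Platform for Fascism.

```typescript
// popup.ts -- illustrative sketch only, not the actual No Platform for Fascism code.
// Assumes a WebExtension popup page containing an element with id "video-list",
// and the @types/chrome typings for the `chrome` global.

// One entry per video the (hypothetical) maintainers have flagged as violating
// YouTube's community guidelines.
interface FlaggedVideo {
  videoId: string;   // YouTube video ID
  title: string;     // human-readable title shown in the popup
  complaint: string; // pre-written explanation of the alleged violation
}

// In the real plug-in this list is curated by the working group; the entry
// below is invented for illustration.
const FLAGGED_VIDEOS: FlaggedVideo[] = [
  {
    videoId: "PLACEHOLDER_ID",
    title: "Example flagged video",
    complaint:
      "Promotes violence and incites hatred on the basis of race, in violation of the community guidelines.",
  },
];

// Render one button per flagged video. Clicking a button copies the complaint
// text to the clipboard (this may require the "clipboardWrite" permission in
// the manifest) and opens the video in a new tab so the user can file a report.
function renderList(): void {
  const container = document.getElementById("video-list");
  if (!container) return;

  for (const video of FLAGGED_VIDEOS) {
    const button = document.createElement("button");
    button.textContent = `Report: ${video.title}`;
    button.addEventListener("click", async () => {
      await navigator.clipboard.writeText(video.complaint);
      // chrome.tabs.create is part of the standard browser-extension API.
      chrome.tabs.create({
        url: `https://www.youtube.com/watch?v=${video.videoId}`,
      });
    });
    container.appendChild(button);
  }
}

document.addEventListener("DOMContentLoaded", renderList);
```

The real plug-in pre-fills YouTube's complaint form for the user; this sketch takes a simpler, hypothetical route, copying the complaint text to the clipboard and opening the video so the user can paste it into YouTube's own report dialog.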

The coders and researchers in the No Platform working group would not speak publicly; they wanted to avoid being targeted for retaliation. But other members of the council described the plug-in as designed to broaden awareness of anarchist thought, thwart the spread of far-right beliefs, and insure that white-supremacist and racist ideas are contested wherever they appear. The campaign is also a digital proxy for the physical confrontations that have taken place as protesters have battled members of far-right groups in cities such as Berkeley and Charlottesville. “It’s a way of showing people that Antifa or anti-Fascism doesn’t just have to be these dramatic street confrontations,” Matthew Whitley, a member of the coördinating council, said. “It’s also just about continuing to follow where they have a presence and making it so there’s no particular space or territory in which they feel safe and privileged to organize.”

When the plug-in was first released, in December, the coördinating council identified ten videos that it found to be especially problematic. Emblematic of those was “About Vanguard America,” produced by a group of that name whose members took part in the violence-marked Unite the Right rally in Charlottesville, Virginia, last year. (James Alex Fields, Jr., who killed a woman named Heather Heyer when he drove his car into a crowd of people protesting against that rally, had been photographed earlier in the day wearing the Vanguard’s unofficial uniform and standing in a phalanx of men holding homemade shields, including some emblazoned with the group’s symbol.) The Vanguard’s six-minute video features a man speaking as written phrases flash upon the screen. Those include “The Lying Press,” “Imagine a Muslim Free America,” “America is a White Nation,” “Stop the Blacks,” “Fascism is the Next Stop for America,” and “Blood and Soil,” which was a slogan in Nazi Germany with connotations of racial purity.

Since December, the working group has identified forty-eight videos as objectionable, members said, and the plug-in has been used to lodge nearly ten thousand complaints. Some of those videos appear to have been removed by the people who posted them. YouTube has removed thirteen videos, including “About Vanguard America,” working-group members said. The company has marked eleven others, including “Rise of the Sleeping Giant: White European Nationalism,” by European American Vanguard (which asserts that the “liberal left” wants to carry out “white genocide”), and “Atomwaffen Division Tribute,” by the National Socialist Heroes (which calls for “race war now”), as “inappropriate or offensive to some audiences.”

The working group has about half a dozen core members, Whitley said, who keep tabs on far-right organizations and look for new videos to add to the plug-in. Sometimes, he said, those members are joined by friends during YouTube viewing parties. “Of course, it’s kind of gruelling to have just a few people spend hours and hours watching Holocaust denial and neo-Nazi imagery,” he said. “They’ve tried to temper that by having these events where they can do a little bit of work and also share food and watch entertaining cat videos.”

Members of the coördinating council have not been the only ones tracking how the far right uses YouTube. Last summer, Zack Exley, a fellow at Harvard’s Shorenstein Center on Media, Politics and Public Policy, published a paper saying that the right dominates the genre of “political rants” on YouTube, and that the platform “has a large number of right-wing channels that collectively have millions of viewers who are exposed to theories too extreme even for talk radio.” Then in March, Zeynep Tufekci, who studies social movements, privacy, and surveillance, wrote in a Times Op-Ed titled “YouTube, the Great Radicalizer” that its algorithms and autoplay features seem to reward incendiary content and promote fringe ideas by recommending ever more extreme videos, even to those who are not seeking them.

YouTube, which says its mission is to “give everyone a voice and show them the world,” has always seen itself as a platform for free expression, which means that rooting out hate speech may present a complicated task for the company. A “trusted flagger” program allows groups such as the Anti-Defamation League and the Institute for Strategic Dialogue to report videos that violate community guidelines. Thousands of YouTube content moderators review complaints and remove videos when there is a violation. Last summer, YouTube announced that videos that contain inflammatory religious or supremacist content but do not clearly violate the company’s policies would be accompanied by a warning––the “inappropriate or offensive” label that has been applied to some far-right videos––and would not be recommended by YouTube, would be ineligible for comments, and would not be allowed to earn money from ads. In December, YouTube’s C.E.O., Susan Wojcicki, suggested that the company might begin using technology that it had developed to locate violent extremist videos to also identify videos containing hate speech.

I asked YouTube about what role complaints made through the No Platform plug-in may have played in the removal of videos, and how the company addresses objections to far-right videos. A YouTube spokeswoman, who declined to be quoted by name “for security reasons,” wrote that complaints from the plug-in are reviewed like any other complaints and that the company works to quickly review and remove videos that violate its policies. “We know there’s more to do here, and we’re committed to getting better,” the spokeswoman said. “We’re making progress in our fight to prevent the abuse of our services, including hiring more people, investing in machine learning technology, and working with more experts.”

On a recent afternoon, several coördinating-council members showed up on Eighth Avenue in Manhattan outside the New York City office of Google, which owns YouTube. The idea, they said, was to spread word about their campaign, both among the public and among Google employees, who they thought might include people sympathetic to their efforts. For about an hour, they gave out copies of flyers, including one that read “YouTube Hosts Nazis” and another with the statement “YouTube, we are watching!” Many of those emerging from the building ignored the leaflets, but some accepted them. A few people walking down Eighth Avenue paused to exchange opinions. One woman pushing a stroller said that she was relieved that the group was opposing, not promoting, the far right. A man who was leaving the Google building gazed curiously at the coördinating-council members. “Are you Antifa?” he asked. Informed that they were, he said that he did not agree with them but wished them well. Some coördinating-council members warned that far-right groups, finding it difficult to organize rallies and appearances in the real world, would become more likely to depend on social-media platforms like YouTube to spread their views. “It’s increasingly difficult for them to hold public events, but we know many people are radicalized by their presence on the Internet,” Sarah Olle, who was among those distributing flyers, said. “We want to head that off before it happens, and getting those videos taken down will be like stamping out the last few embers of a fire.”

Source: newyorker.com
