Hours before the House antitrust subcommittee hearing featuring testimony from the CEOs of Facebook, Google, Amazon, and Apple, a blog post from TikTok chief executive Kevin Mayer proclaimed that all platforms should “disclose their algorithms, moderation policies, and data flows to regulators” and challenged the app’s competitors to follow suit. This is quite a call to arms — and one that was obviously carefully timed.
In the post, Mayer made a broad call for competition between social media companies and argued that TikTok could be a positive force for the United States, one that would protect its user data, with or without new regulation. Along those lines, Mayer also promised that TikTok would be more upfront about its algorithms and content moderation. Ultimately, he said, TikTok would be a model for how other social media companies could be more transparent, a commitment that echoed recent calls for TikTok to set this very example.
The post comes as TikTok faces concerns over potential security risks related to its Beijing-based parent company, ByteDance. Earlier this month, Secretary of State Mike Pompeo even threatened to “ban” TikTok, though it’s unlikely the Trump administration could actually do this on its own; Joe Biden’s presidential campaign also recently instructed its staff to delete the TikTok app from their phones. Meanwhile, assessing the true risk of the app remains difficult. As Shira Ovide wrote at the New York Times earlier this month, “politicians, like American tech bosses, engage in fear-mongering about Chinese tech so often that it’s hard to know when to believe them.”
TikTok, for its part, says that no foreign government plays a role in its moderation.
“Our content and moderation policies are led by our US-based team in California and aren’t influenced by any foreign government,” a TikTok spokesperson told Recode. “At our virtual Transparency and Accountability Center, guests can see firsthand how we moderate and recommend content.”
Other US social media companies, for their part, stand to benefit from action against TikTok. Facebook is currently preparing to fully launch a music-based video product called Reels within Instagram, and is even reportedly recruiting TikTok stars to promote the competing service. In his recent blog post, Mayer accused Facebook of making attacks “disguised as patriotism and designed to put an end to our very presence in the US.” Meanwhile, influencers who’ve gained massive audiences on the app are being wooed away by rivals, like the Los Angeles-based music app Triller.
But now, in an apparent effort to allay concerns over its platform, TikTok is on a quest to prove that it is transparent about how it handles content. With Mayer’s request for all social media companies to disclose their algorithms, TikTok clearly wants to position itself as more transparent than Facebook and other rivals. It’s less clear, however, that these efforts will address the many other concerns about the app.
“TikTok has become the latest target, but we are not the enemy,” Mayer wrote in the post. “The bigger move is to use this moment to drive deeper conversations around algorithms, transparency, and content moderation, and to develop stricter rules of the road.”
As evidence, Mayer pointed to the TikTok Transparency Center, which was announced back in March. The center in Los Angeles will purportedly provide some experts with “the actual code that drives our algorithms,” Mayer said, and also let them observe content moderation in real time. Mayer argues that this new initiative puts TikTok “a step ahead of the industry.” That announcement was followed by a blog post in June that explained some of the basics of the company’s For You algorithm, which powers one of the most popular parts of the TikTok app. The company is also opening another transparency center in Washington, DC, and hiring for positions meant to interface with the federal government.
It’s unclear if TikTok’s recent efforts will be enough to quell apprehensions about the platform. Several experts told Recode that they questioned whether TikTok’s pledge to disclose how its algorithms work will actually reveal much meaningful information, such as what type of content the company’s system chooses to amplify.
“Revealing the code is helpful and certainly more than other platforms have shared in the past,” said Kelley Cotter, a postdoctoral scholar at Arizona State University who studies public understanding of algorithms. “Revealing the code will not, in itself, tell us if the algorithm has an influence.”
Others wondered whether TikTok can reveal details about its algorithms without exposing the personal data of its users. According to Nicolas Kayser-Bril, a journalist at AlgorithmWatch, the machine learning algorithms used by social media platforms depend not only on the code that runs them but also on the training data that shapes how they behave. “In the case of TikTok’s algorithms, the training data probably contains highly personal information from users, which should not be revealed as such, even to researchers,” said Kayser-Bril.
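To see why code alone reveals so little, consider the toy sketch below. It is not TikTok’s actual system, and every video ID and watch history in it is hypothetical; it simply shows that the exact same recommendation code produces different results depending entirely on the user interaction data it is fed, and that this data is also where the sensitive personal information lives.

```python
# Minimal sketch (not TikTok's actual system): a toy co-occurrence recommender.
# The same code yields different recommendations for different training data,
# and the training data is the users' watch histories themselves.
# All video IDs and watch histories below are hypothetical.
from collections import Counter
from itertools import combinations

def train(watch_histories):
    """Count how often pairs of videos are watched by the same user."""
    co_counts = Counter()
    for history in watch_histories:
        for a, b in combinations(sorted(set(history)), 2):
            co_counts[(a, b)] += 1
    return co_counts

def recommend(co_counts, watched, k=2):
    """Score unwatched videos by how often they co-occur with the user's history."""
    scores = Counter()
    for (a, b), n in co_counts.items():
        if a in watched and b not in watched:
            scores[b] += n
        elif b in watched and a not in watched:
            scores[a] += n
    return [video for video, _ in scores.most_common(k)]

# Two different (hypothetical) datasets of per-user watch histories.
dataset_a = [["dance1", "dance2"], ["dance1", "dance2", "cooking1"]]
dataset_b = [["news1", "politics1"], ["news1", "politics1", "dance1"]]

user_history = ["dance1"]
print(recommend(train(dataset_a), user_history))  # e.g. ['dance2', 'cooking1']
print(recommend(train(dataset_b), user_history))  # e.g. ['news1', 'politics1']
```

Reading the `recommend` function tells you nothing about which videos will actually be amplified; that depends on the datasets, which is Kayser-Bril’s point about why auditing the code without the data only goes so far.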
Alongside its push for greater transparency around its algorithms and content moderation, TikTok is also working hard to distance itself from ByteDance and the Chinese government, a case it made in a recent statement to Vox.
However, not everyone is convinced that this new push toward transparency is enough to assuage worries that TikTok might be used as a tool for foreign influence. After all, regardless of how much we know about how TikTok recommends content, the company is also collecting massive amounts of data about millions of users in an effort, some say, to make its AI even more powerful for a variety of purposes.
“From a national security perspective, there’s concern around using that data for espionage purposes, blackmail,” Kiersten Todt, a scholar at the University of Pittsburgh Institute for Cyber Law, Policy, and Security, told Recode. “Artificial intelligence is only as good as the data that goes into it. So if the Chinese government has the most data of any other country in the world, then what it can produce from an AI perspective could potentially give it a tremendous advantage.”
But regardless of the true security risks of TikTok, fear over the app may have prompted a new standard for what it means for a social media company to be transparent with its users. If TikTok lives up to its transparency promises, other social media companies may very well feel pressure to follow suit.
Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.