G/O Media will make more AI-generated stories despite critics

Why G/O Media thinks we should have more stories written by bots.

G/O Media CEO Jim Spanfeller at a 2022 conference.
Piaras Ó Mídheach/Sportsfile for Collision via Getty Images

Peter Kafka covers media and technology, and their intersection, at Vox. Many of his stories can be found in his Kafka on Media newsletter, and he also hosts the Recode Media podcast.

In early July, managers at G/O Media, the digital publisher that owns sites like Gizmodo, the Onion, and Jezebel, published four stories that had been almost entirely generated by AI engines. The stories — which included multiple errors and which ran without input from G/O’s editors or writers — infuriated G/O staff and generated scorn in media circles.

They should get used to it.

G/O executives, who say that AI-produced stories are part of a larger experiment with the technology, plan on creating more of them soon, according to an internal memo. And G/O managers told me they — and everyone else in media — should be learning how to make machine-generated content.

“It is absolutely a thing we want to do more of,” says Merrill Brown, G/O’s editorial director.

G/O’s continued embrace of AI-written stories puts the company at odds with most conventional publishers, who generally say they’re interested in using AI to help them produce content but aren’t — for now — interested in making stuff that is almost 100 percent machine-made.

But it’s easy to see a future where publishers looking at replacing humans increasingly rely on this tech. Or, if you’d like a less dystopian projection, a future where publishers use robots to churn out low-cost, low-value stuff while human journalists are reserved for more interesting work.

In a note sent to top editors at his company last Friday, Brown said that editors of Jalopnik, a car-focused site, and the pop-culture site A.V. Club are planning to create “content summaries or lists that will be produced by A.I.” Brown’s memo also notes that the Associated Press recently announced a partnership with OpenAI, the buzzy AI company that created ChatGPT.

A different internal G/O note, produced earlier this month, calls for “2-3 quality stories” made by AI to run on Jalopnik and the A.V. Club on July 21. Brown told Vox that document, which was published after the first set of machine-generated stories ran — and which notes that AI engines “alone (currently) are not factually reliable/consistent” and will need human assistance — “has nothing whatsoever to do with publishing or editorial deadlines.”

But Brown and G/O Media CEO Jim Spanfeller both argue that AI will be transformative for the media industry — like the internet was in the last couple decades, or maybe more so — and that ignoring it would be a terrible mistake.

“I think it would be irresponsible to not be testing it,” Spanfeller told me.

Spanfeller and Brown say their AI-written stories aren’t the only way they want to use the tech. Like many publishers, they bring up the idea that reporters could use AI to do research for a story; Spanfeller also says he wants to use AI to automate some tasks humans currently perform on the business side of his company, like preparing basic marketing plans for advertisers.

But G/O employees, who tell me they don’t want to talk on the record for fear they’ll be disciplined by managers, say they’ve received no information from their managers about any use of AI — except a heads-up that the AI-written stories were going to appear on the site on July 5, which was sent the same day the stories ran.

G/O journalists tell me they’re upset about the execution of the stories — a bot-written item about how to watch all the Star Wars movies in chronological order had errors, for instance — but even more so by the fact that they exist at all.

“It’s a disaster for employee morale,” a G/O journalist told Vox.

Brown now says the next round of stories will receive input from the top editors at each publication. “We won’t do another editorial project that I can possibly imagine, without an [editor-in-chief] overseeing and reviewing it,” he told me.

Spanfeller and Brown also say they won’t use AI to replace G/O’s staff. “Our goal is to hire more journalists,” Spanfeller said. (Spanfeller notes that, like other media companies — including Vox Media, which owns this site — G/O has laid off employees because of this “crappy economic market,” but calls it a “de minimis amount of reduction.”)

That argument doesn’t persuade G/O staff, who say they assume G/O will inevitably use the tech to replace them.

“This is a not-so-veiled attempt to replace real journalism with machine-generated content,” another G/O journalist told me. “G/O’s MO is to make staff do more and more and publish more and more. It has never ceased to be that. This is a company that values quantity over quality.”

Other newsrooms that have tried out AI-generated stories have since pulled back. CNET, which generated headlines when it admitted that dozens of stories it published were machine-made (and full of errors), has since said it won’t use made-from-scratch AI stories. BuzzFeed, which briefly saw its stock shoot up when it announced its enthusiasm for AI earlier this year — and months later shut down its entire BuzzFeed News operation — published an embarrassing series of “travel guides” that were almost entirely produced by AI. But a PR rep now says the company won’t make more of those.

And while both Insider and Axios have said they are exploring using generative AI to help journalists do their work, executives at both publications say they won’t use stories written entirely by bots. At the moment, at least.

“Definitely looking at every aspect of AI augmenting our work but don’t see any upside in wholly AI-generated content right now,” Axios editor-in-chief Jim VandeHei wrote in an email to Vox. “Seems like all danger, no upside until A LOT more is known.”

But there’s definitely at least one upside to machine-made content: It costs next to nothing. And it’s worth noting that there are many, many outlets publishing stories, written by actual humans, that promise to tell you, as the Gizmodo AI story did, how to watch Star Wars movies in order. Among them: Space.com, Rotten Tomatoes, Reader’s Digest, PC Magazine, the Wrap, and Vanity Fair.

And for at least a few days, Google ranked Gizmodo’s machine-made output among the top results for “star wars movies” queries. That’s something Brown noted when he told me that he’s learned that AI content “will, at least for the moment, be well-received by search engines.”

Which points out both the appeal and the limitations of this kind of stuff: There’s some audience for it. And Google — for now — will steer people to sites that make it, which translates to page views and at least the potential for ad revenue.

But churning out the exact same content that dozens of other people — or an unlimited number of robots — can produce doesn’t build long-term value for your publication. And whatever financial return you earn will keep shrinking as more people and bots make the same thing, creating more competition and pushing ad prices down. (Unless, of course, Google decides that it’s better off not sending people away from its results page at all — like it now does for “What time is the Super Bowl” results.)

It’s also worth noting that the Gizmodo machine-made stories have since fallen way down on the Google rankings (perhaps because of the scrutiny those search results generated).

Years ago, I worked for Spanfeller when he was the publisher of Forbes.com, where he also produced a lot of content that wasn’t created by his employees, like republished stories from news wires, consultancies, and other outside sources. Spanfeller estimates that his staff produced around 200 stories each day but that Forbes.com published around 5,000 items.

And back then, Spanfeller said, the staff-produced stories generated 85 to 90 percent of the site’s page views. The other stuff wasn’t valueless. Just not that valuable.

Spanfeller says he thinks that could play out again with AI stories, imagining a scenario where “there’s value to the site, there’s value to the end user for AI-generated content — whatever that means.”

But he says the stuff the humans on his staff do will be much more valuable than the work the robots do. “I don’t think this is an existential moment for journalism.”

Source: vox.com
