“Black Mirror,” the anthology series best known for dreaming up dystopian uses for near-future technology, took aim at its own network in the timeliest episode of its most recent season. Settling on her couch after a difficult stretch at work, a woman named Joan (Annie Murphy) logs on to Streamberry, a barely veiled stand-in for Netflix, and stumbles upon a TV show based on the events of her day: “Joan Is Awful,” starring Salma Hayek. The program proceeds to ruin her life, but it’s nothing personal; Streamberry, which runs on cutting-edge algorithms, made “Joan Is Awful” with no human input. Not a single writer or actor is involved in the production: the scripts are churned out by artificial intelligence, and the performances are elaborate deepfakes. The “Black Mirror” episode, which débuted in the midst of the ongoing Writers Guild of America strike, struck an immediate chord—unsurprisingly so, given that concerns about A.I. have become a flash point in the union’s negotiations with the studios. One member of the Screen Actors Guild, which has joined the writers on the picket line, called the episode “a documentary of the future.” But Joan’s travails left me wondering whether Streamberry was too rosy a portrait of where Hollywood is headed. Even in this bleakly automated vision of entertainment as Hell, there’s still some semblance of risk and innovation.
To survey the film and television industry today is to witness multiple existential crises. Many of them point to a larger trend: of Hollywood divesting from its own future, making dodgy decisions in the short term that whittle down its chances of long-term survival. Corporations are no strangers to fiscal myopia, but the ways in which the studios are currently squeezing out profits—nickel-and-diming much of their labor force to the edge of financial precarity while branding their output with the hallmarks of creative bankruptcy—indicate a shocking new carelessness. Signs of this slow suicide are all around: the narrowing pipelines for rising talent, the overreliance on nostalgia projects, and a general negligence in cultivating enthusiasm for the industry’s products. Writers and actors have walked out to demand fairer wages and a more equitable system, but they’ve also argued, quite persuasively, that they’re the ones trying to insure the industry’s sustainability. Meanwhile, studio executives—themselves subject to C-suite musical chairs—seem uninterested in steering Hollywood away from the iceberg. This is perhaps because the landscape is shifting (and facets of it are shrinking) so rapidly that they themselves have little idea of what the future of Hollywood might look like.
The apocalyptic vibes are of fairly recent vintage. The 2007-08 W.G.A. strike, for example, didn’t, and couldn’t, anticipate the ways in which the Internet, and then the tech giants, would upend the television industry. Even back then, the writers took issue with the compensation structure for Web-hosted content, but the union was mostly bargaining with companies that were firmly rooted in Hollywood and its traditions. The streaming wars, in which writers and actors rightly see themselves as collateral damage, have introduced players like Apple and Amazon, for whom content is only a tiny portion of their broader business strategies—a value-add for iPhone users or Prime subscribers. Together with Netflix, the move-fast, break-things, maybe-fix-later crowd has brought with it the Silicon Valley playbook of burning up investor or reserve cash now in the hopes of profit tomorrow, and in the process has forced some of Hollywood’s most storied studios, most notably Disney and Warner Bros., into billions in debt to stay competitive.
Some of the first Cassandras to draw the public’s attention to this slo-mo self-sabotage were the striking writers. W.G.A. members have expressed alarm not only that their profession has become devalued and unstable through low pay but also that the paths that allowed newcomers to eventually become showrunners, which have existed for the past half century, have been eroded by the studios. On the podcast “The Town,” Mike Schur, the creator of “The Good Place” and the co-creator of “Parks and Recreation” and “Brooklyn Nine-Nine,” identified some of the skills beyond writing scripts—such as editing, sound mixing, and color correction—that he learned about from his mentor Greg Daniels on his first episodic writing job, on “The Office.” Schur’s apprenticeship took place not just in the writers’ room but on set—a location from which TV writers are increasingly being shut out. Schur notes that approximately eleven members of “The Office” ’s writing staff went on to become first-time showrunners, including Mindy Kaling and B. J. Novak, in an example of the system working as it should. Today’s mini rooms make it so that fewer writers are hired and that their stint on a show is often over by the time the cameras start rolling, rendering it more challenging for neophytes to build the kind of résumé that enables them to advance in the industry. The dismantling of this ladder is all the more counterintuitive given that the scarcity of experienced showrunners during the content boom has been a known problem for years.
The movies may be in grimmer shape. The industry’s pursuit of I.P. at the expense of originality has all but trained younger audiences not to expect novelty or surprise at the multiplex, assuming that they’re going to the theatre at all. Hollywood has never been known for overestimating the audience’s intelligence, but it’s hard not to wonder how the industry is supposed to inculcate a love of cinema in children—that is, future moviegoers—when the splashiest films on offer are explicitly buckets of regurgitation. Early summer gave us the live-action “Little Mermaid,” the latest in Disney’s cannibalization of its archives. The animated film was one of the first movies that I remember seeing, and it really did feel magical. Even then, Ariel’s coming of age was criticized in some circles for making its heroine a boy-crazy shopaholic, but the fact that millennials continue to lovingly mock it years later attests to its endurance as a classic. The 1989 film swelled with passion and longing, transported us to unseen worlds, and left us with indelible characters and glorious earworms. It convinced girls that it was O.K. to want more (even if it was more thingamabobs). There’s a measure of progress, to be sure, in recasting Ariel as a Black mermaid played by Halle Bailey, but the mixed reviews all but confirm that it’s a dead-eyed knockoff of the original. It’s true that early impressions can forge a childhood attachment to almost anything, but the way to make kids grow into adults who love movies is to offer them, well, great films and relevant themes, instead of a parade of listless remakes with stories that were meant to speak to a generation three decades ago.
Perhaps we should have seen the breakdown of the star-making machine as a preview of things to come. Screen celebrity is nearly as old as the movie industry itself, but the machinery seems to have stopped minting household names sometime in the past twenty years. The movie stars of yesteryear are still the movie stars of today. A study by the National Research Group, a market-research firm that specializes in entertainment, life style, and technology, found that, of the twenty actors most likely to pull audiences to a theatre, only one was under the age of forty (Chris Hemsworth), and the average age of that group was fifty-eight. (The sexagenarian Tom Cruise topped the list.) In lieu of finding and launching the next Denzel Washington or Julia Roberts, studios have poured millions into digitally de-aging graying A-listers. Hollywood stardom is becoming something unthinkable in any other era of its existence: a gerontocracy.
I.P. is again to blame. Franchises killed the movie star. Spider-Man can be played by Tobey Maguire or Andrew Garfield or Tom Holland, Batman by Michael Keaton or Christian Bale or Ben Affleck or Robert Pattinson or Michael Keaton again. An industry famous for worshipping youth is more clueless than ever about what to do with its young people. (What’s “a Tom Holland movie”? Who can say?) And, for those emerging actors who used to see background work as an entry point to a notoriously gate-kept industry, it turns out that the studios might prefer to digitally scan their likenesses instead, possibly locking them out of opportunities for more days on set.
After “Top Gun: Maverick” broke box-office records, sequels may have seemed like the key to luring audiences back to theatres, which have been languishing since the pandemic. But this summer’s long-in-the-tooth franchises—“Mission: Impossible,” “Transformers,” “Indiana Jones,” and “The Fast and the Furious”—have delivered middling, if not disappointing, results. “Barbie,” meanwhile, saw the director Greta Gerwig infuse the half-century-old blond blank slate with her own idiosyncratic anxieties to produce a Zeitgeist-capturing film with an unmistakable authorial imprimatur. But Hollywood is ignoring the obvious takeaway, which is that viewers appreciate novelty. Instead, Mattel has announced that it will follow up “Barbie” by raiding its toy closet for more I.P., and has put dozens of projects based on its products into development.
Trends in television are no less dispiriting, with networks soliciting “visual Muzak,” as some in the industry have put it. The TV writer Lila Byock told my colleague Michael Schulman this spring that the streamers are most eager for “second-screen content”: shows to have on in the background while the viewer presumably scrolls through their phone. In a recent interview, the actor and director Justine Bateman said that network notes now request that shows be less engaging so that distracted audiences won’t lose track of the plot and turn them off.
Even the strike’s prolongation suggests an inane shortsightedness. Film and television are already losing the competition for eyeballs to video games and the Internet. The Bloomberg journalist Lucas Shaw has reported that “people spend more time (and money) on video games than they do on movies, and they spend more time watching YouTube than any other TV network.” The lack of new scripted programming on the broadcast networks as a result of the strikes is forecast to hasten their coming obsolescence. The momentum from the successes of “Barbie” and “Oppenheimer” in bringing audiences back to theatres has been squandered by the postponement of many of this year’s releases to 2024. And the longer the studios stretch out the strike, the more likely it is that consumers will form new leisure habits on TikTok or Animal Crossing.
The disruption that Netflix and the streaming wars have unleashed on the entertainment industry in just the past decade has been so unpredictable that it seems foolish to predict only doom, although that’s certainly where the arrows are pointing. But even Hollywood’s boosters have to admit that, since the start of the streaming era, movies and television feel less special, labor conditions (for writers, actors, and below-the-line crew members) have deteriorated, and the industry’s turbulent mergers and layoffs call into question which legendary institutions will still be standing in another ten or twenty years. I won’t purport to know how to fix Hollywood, but the answer doesn’t seem to lie in highlighting the industry’s creative torpor and timidity while driving away the people with the institutional knowledge to transform an idea into hours of spectacle, comfort, provocation, or maybe even art made by hundreds or thousands of people. Perhaps Hollywood isn’t willing to bet on its future, but it can at least stop actively working against it. ♦