In the fall of 1963, an enterprising young anthropologist named Richard Lee journeyed to the Dobe region of the northwest Kalahari Desert, in southern Africa. He was there to live among a community known as the Ju/’hoansi, which was made up of approximately four hundred and sixty individuals, split among fourteen independent camps. This area of the Kalahari was semi-arid and suffered from drought every two or three years, leading Lee to describe it as “a marginal environment for human habitation.” The demanding conditions made the territory of the Ju/’hoansi less desirable to farmers and herders, allowing the community to live in relative isolation well into the twentieth century.
As Lee would later explain, the Ju/’hoansi were not completely cut off from the world. When he arrived, for example, the Ju/’hoansi were trading with nearby Tswana cattle herders and encountered Europeans on colonial patrols. But the lack of extensive contact with the local economy meant that the Ju/’hoansi still relied primarily on hunting and gathering for their sustenance. It was commonly believed at the time that acquiring food without the stability and abundance of agriculture was perilous and gruelling. Lee wanted to find out whether this was true.
Nearly sixty years later, Tim Cook, the C.E.O. of Apple, faced a vexing labor issue of a different type. After months of planning and postponements, by the spring of 2022, Cook was ready to demand that Apple employees spend at least a few days each week at their desks in the company’s massive Cupertino, California, headquarters. The protest came swiftly. An employee group called AppleTogether wrote an open letter expressing its displeasure with Cook’s plan: “Our vision of the future of work is growing further and further apart from that of the executive team.” The letter details, among other concerns, the time lost to commuting, the difficulty of achieving “deep thought” in distracting open offices, and the infantilizing nature of rigid work schedules: “Stop treating us like school kids who need to be told when to be where and what homework to do.”
As I’ve studied and written about the awakening of knowledge workers spurred by the disruptions of the coronavirus pandemic, I’ve come to see how Lee’s mid-century study of the Ju/’hoansi and Apple’s current struggle over remote work inform each other in unexpected ways. The insights from the movement within anthropology that Lee’s work set in motion, if applied with care, can help us understand the frustrations that grip not just the protesting Apple employees but the millions of other knowledge workers who are feeling exhausted by their jobs.
After fifteen months of field research, extending from the fall of 1963 into the early winter of 1965, Lee was ready to present his results to the world. Working with his longtime collaborator Irven DeVore, he organized a splashy conference in Chicago the following spring. It was called “Man the Hunter,” and it promised to provide anthropology with its “first intensive survey of a single, crucial stage of human development—man’s once universal hunting way of life.” The clamor around the event was such that the eminent French anthropologist Claude Lévi-Strauss travelled to America to attend.
Lee stole the show with a paper that described the results of his time spent among the Ju/’hoansi. It opens by repeating the common assumption that hunter-gatherer life is “generally a precarious and arduous struggle for existence,” then methodically presents data to undermine that idea. The community that Lee studied turned out to be well fed, consuming more than two thousand calories a day, even during a historic drought in Botswana. As Lee summarizes:
The Dobe-area Bushmen live well today on wild plants and meat, in spite of the fact that they are confined to the least productive portion of the range in which Bushman peoples were formerly found. It is likely that an even more substantial subsistence base would have been characteristic of these hunters and gatherers in the past.
Equally striking was the observation that the Ju/’hoansi appeared to work less than the farmers around them. According to Lee’s data, the adults he studied spent, on average, around twenty hours a week acquiring food, with an additional twenty hours or so dedicated to other chores, providing abundant leisure time. Lee concluded that Hobbes may have had it wrong: “[L]ife in the state of nature is not necessarily nasty, brutish, and short.”
The influence of “Man the Hunter” was profound. Mark Dyble, a lecturer in evolutionary anthropology at University College London, told me that the main conclusions of Lee’s work became “the dominant paradigm for a long time.” The South African anthropologist James Suzman, in his recent book, “Work: A History of How We Spend Our Time,” describes the gathering as “one of the most talked-about conferences in the history of modern anthropology.” It’s now common to hear someone quip, often with a dash of contrarian zeal, that our prehistoric ancestors had easier lives than we do.
Not surprisingly, reality turned out to be more complex than the portrayals of prehistoric easy living promoted by Lee’s biggest boosters. Critics pointed out that the effort expended to obtain food by hunter-gatherers varies significantly depending on the environment, and that the definition of “work” used by subsequent interpreters of Lee may have been biased toward activity away from home, leading to an undercount of other types of domestic labor. It’s also important to recognize the academic vogues of the time. Some of the most effective promoters of the vision of a relaxed Paleolithic—such as Marshall Sahlins, who wrote a classic 1966 paper titled “The Original Affluent Society”—were proud supporters of a theoretical position called substantivism, which argues that the principles of neoclassical economics are not fundamental to human nature. The implications of Lee’s careful calorie counts and detailed work diaries, gathered in the desert camps of the Kalahari, turned out to provide excellent fodder for the political radicalism of the nineteen-sixties.
Despite these important caveats, in the years since Lee’s splashy conference more studies have been conducted and more nuance has been introduced into this literature, reinforcing the idea that the study of extant hunter-gatherer communities can yield useful knowledge about the working lives of early humans. In the past couple of years alone, for example, three major books have drawn on anthropological research on modern hunter-gatherers to make arguments about our prehistoric ancestors. These include, in addition to James Suzman’s “Work,” the social historian Jan Lucassen’s “The Story of Work” and the Times best-seller “The Dawn of Everything,” co-authored by David Wengrow and the late David Graeber.
To connect these developments in anthropology to the protesting employees at Apple, we must first take a close look at what their grievances are really about. At the moment, the employees who signed the letter are hoping to preserve the ability to work from home, but I believe that a more general principle is at stake. Knowledge workers were already exhausted by their jobs before the pandemic arrived: too much e-mail, too many meetings, too much to do—all being relentlessly delivered through ubiquitous glowing screens. We used to believe that these depredations were somehow fundamental to office work in the twenty-first century, but the pandemic called this assumption into question. If an activity as entrenched as coming to an office every day could be overturned essentially overnight, what other aspects of our professional lives could be reimagined?
This reasoning better explains the energy that propels groups such as AppleTogether to resist a return to pre-pandemic work life. The battle for telecommuting is a proxy for a deeper unrest. If employees lose remote work, the last highly visible, virus-prompted workplace experiment, the window for future transformation might slam shut. The tragedy of this moment, however, is that this reform movement lacks good ideas about what else to demand. Shifting more work to teleconferencing eliminates commutes and provides schedule flexibility, but, as so many office refugees learned, remote work alone doesn’t really alleviate most of what made their jobs frantic and exhausting. We need new ideas about how to reshape work, and anthropology may have something to offer.
Drawing from the subfield initiated by Lee’s research, we can reconstruct something like a deep history of the human relationship to work. We can then identify the points where modern office life clearly differs from the type of work styles that likely dominated for the vast majority of our species’ time on earth. It stands to reason that humans are well adapted to the efforts that occupied our time for the first several hundred thousand years of our existence. Therefore, we might find discomfort or stress at the points where our modern jobs most diverge from our Paleolithic experience.
We must, of course, take care with any such investigation. The human brain is malleable, making it hard to distinguish between activities for which humans might be fundamentally wired and those for which our minds are simply adjusting to meet the demands of the moment. “Anthropologists are wary of assuming that a particular cognitive ability represents an adaptation to a particular task,” Dyble explained. But, even with these caveats in mind, we can approach this exercise as an interesting thought experiment. We need new ideas about how to reform knowledge work, and comparative anthropology seems to be as good a place as any to seek some original thinking. It didn’t take much reading into this literature before I identified several places where modern work significantly differed from our ancient past. Intriguingly, these friction points, once isolated, are remarkably effective at both highlighting why modern work has become so alienating and exhausting, and surfacing ideas about how to fix it.
Around the same time that Lee travelled to the Kalahari, another young anthropologist, named James Woodburn, found his way to Lake Eyasi, on the Serengeti Plateau of East Africa’s Rift Valley. He was there to observe the Hadza people, who, like the Ju/’hoansi, still largely depended on hunting and gathering as their primary means of obtaining food. Woodburn returned to Lake Eyasi frequently for many decades, using his observations as the foundation for his pioneering research on social organization.
Drawing from this field work, Woodburn argued that hunter-gatherer communities like the Hadza often relied on what he called an “immediate-return” economy. As Woodburn elaborates, in such a system, “People obtain a direct and immediate return from their labour. They go out hunting or gathering and eat the food obtained the same day or casually over the days that follow.”
If we now jump forward to our current moment, and consider the daily lot of our protesting Apple employees, we discover a rhythm of activity far different from our immediate-return past. In modern office life, our efforts rarely generate an immediate reward. When we answer an e-mail or attend a meeting, we’re typically advancing, in fits and starts, long-term projects that may be weeks or months away from completion. The modern knowledge worker also tends to juggle many different objectives at the same time, moving rapidly back and forth between them throughout the day.
A mind adapted over hundreds of thousands of years for the pursuit of singular goals, tackled one at a time, often with clear feedback about each activity’s success or failure, might struggle when faced instead with an in-box overflowing with messages connected to dozens of unrelated projects. We spent most of our history in the immediate-return economy of the hunter-gatherer. We shouldn’t be surprised to find ourselves exhausted by the ambiguously rewarded hyper-parallelism that defines so much of contemporary knowledge work.
Another point where work in hunter-gatherer societies differs from our modern efforts is the degree to which the intensity of work varies over time. A 2019 paper in Nature Human Behaviour, on which Dyble is a lead author, describes a research study that set out to gather the same style of time measurements made by Lee so many years earlier. Dyble’s team observed the Agta of the northern Philippines, a community well suited for the comparison of different models of food acquisition, as some of them still largely depended on hunting and gathering, and others had shifted toward rice farming. All of them had the same culture and environment, allowing a cleaner comparison between the two strategies. Dyble’s team diverged from the work-diary approach used by Lee, in which the researcher attempts to capture all the activities of their subjects’ day (which turns out to be quite hard), and instead deployed the more modern experience-sampling method, in which, at randomly generated intervals, the researchers record what their subjects are doing at that exact moment. The goal was to calculate, for both the farmers and foragers, the relative proportion of samples dedicated to leisure versus work activities.
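To make the method concrete, here is a minimal sketch, in Python, of how experience-sampling records of this kind might be tallied into a leisure share for each group. The subjects, activities, and numbers are hypothetical illustrations, not the study’s actual data or code.

```python
from collections import Counter

# Hypothetical experience-sampling records: at randomly chosen moments, an
# observer notes what each subject is doing at that instant. These entries
# are invented for illustration only.
samples = [
    ("forager", "leisure"), ("forager", "work"), ("forager", "leisure"),
    ("farmer", "work"), ("farmer", "work"), ("farmer", "leisure"),
]

def leisure_share(records, group):
    """Fraction of a group's sampled moments recorded as leisure."""
    counts = Counter(activity for g, activity in records if g == group)
    total = sum(counts.values())
    return counts["leisure"] / total if total else 0.0

for group in ("forager", "farmer"):
    print(f"{group}: {leisure_share(samples, group):.0%} of samples at leisure")
```

The appeal of the approach is that each observation is a snapshot rather than a reconstruction of an entire day, so the proportions fall out of simple counting.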
“The group engaged entirely in foraging spent forty to fifty per cent of daylight hours at leisure,” Dyble told me, when I asked him to summarize his team’s results, “versus more like thirty per cent for those who engage entirely in farming.” His data validates Lee’s claim that hunter-gatherers enjoy more leisure time than agriculturalists, though perhaps not to the same extreme. Missing from these high-level numbers, however, is an equally important observation: how this leisure time was distributed throughout the day. As Dyble explained, while the farmers engaged in “monotonous, continuous work,” the pace of the foragers’ schedules was more varied, with breaks interspersed throughout their daily efforts. “Hunting trips required a long hike through the forest, so you’d be out all day, but you’d have breaks,” Dyble told me. “With something like fishing, there are spikes, ups and downs . . . only a small per cent of their time is spent actually fishing.”
Once again, when we compare the work experience of hunter-gatherers with that of our contemporary Apple employees, we find a wedge of insight. Modern knowledge workers adopt the factory model, in which you work for set hours each day at a continually high level of intensity, without significant breaks. The Agta forager, by contrast, would think nothing of stopping for a long midday nap if the sun were hot and the game proved hard to track. When was the last time an Apple employee found herself with two or three unscheduled hours on her calendar during the afternoon to just kick back? To make matters worse for our current moment, laptops and smartphones have pushed work beyond these long days to also colonize the evenings and weekends once dedicated to rest. In the hunter-gatherer context, work intensity fluctuated based on the circumstances of the moment. Today, we’ve replaced this rhythm with a more exhausting culture of always being on.
The final point of difference I observed concerns the nature of the work occupying our time then and now. “[H]ow do you become a successful hunter-gatherer?” Lucassen asks in his magisterial “The Story of Work.” “You must learn it, and the apprenticeship is long.” Drawing from multiple anthropological sources, Lucassen presents a common “schema” for training competent hunters. Young children are given toy hunting weapons to familiarize them with their tools. Next, between the ages of five and seven, they join hunting trips to observe the adults’ techniques. (In general, Lucassen notes, observation is prioritized over teaching.) By the age of twelve or thirteen, children can hunt on their own with their peers and are introduced to more complex strategies. Finally, by late adolescence, they’re ready to learn the details of pursuing larger game. An entire childhood is dedicated to perfecting this useful ability.
Gathering, of course, is just as complex as hunting. Dyble told me that the Agta possess detailed knowledge of dozens of different plant species, including their spatial distribution in the forests and fields surrounding their villages. Other everyday skills, absolutely critical for survival in times long past, require similarly demanding levels of training. I recently watched some of the early seasons of the History Channel survival program “Alone,” some of which were filmed in notoriously damp British Columbia. It struck me how often the contestants, many of whom were world-class survivalists, struggled to get fires started, even with the help of modern ferro rods, which spew showers of sparks. For hundreds of thousands of years, one can assume, humans consistently conjured flames, under all types of demanding conditions, and without the benefit of modern tools. The frustrated survivalists on “Alone” regularly practiced the art of fire-starting. Early humans mastered it.
Given the necessity of complex skills for survival throughout most of our history, it’s not surprising that we enjoy the feeling of doing practical, difficult tasks. “The satisfactions of manifesting oneself concretely in the world through manual competence have been known to make a man quiet and easy,” Matthew Crawford writes in his 2009 ode to skilled handiwork, “Shop Class as Soulcraft.” There’s scientific support for these satisfactions as well. Self-determination theory, a well-cited psychological framework for understanding human motivation, identifies a feeling of “competence” as one of the three critical ingredients for generating high-quality motivation and engagement.
Returning to the context of our protesting Apple employees, we find our instinct for skilled effort once again impeded by modern obstacles. To be sure, knowledge work does often require high levels of education and skill, but in recent years we’ve increasingly drowned the application of such talents in a deluge of distraction. We can blame this, in part, on the rise of low-friction digital communication tools like e-mail and chat. Office collaboration now takes place largely through a frenzy of back-and-forth, ad-hoc messaging, punctuated by meetings. The satisfactions of skilled labor are unavoidably diluted when you can only dedicate partial attention to your efforts. Our ancestors were adapted to do hard things well. The modern office, by contrast, encourages a fragmented mediocrity.
If we want to make office life more sustainable and humane, addressing these mismatches between our nature and our current reality is a good place to start. Consider overload. As argued, our brains struggle when asked to rapidly switch between many different ongoing projects. Knowledge work can be adjusted to better avoid this issue. I’ve previously advocated, for example, for more reliance on pull systems for task assignment in office environments. In such a scheme, you work on only one major objective at a time. When you reach a clear stopping point, you then—and only then—pull in the next thing to tackle from a centrally managed collection. This supports singular focus and the sequential completion of objectives, a rhythm that may be more familiar to our ancient brains.
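For illustration, here is a minimal sketch of such a pull system in Python. The task names and the in-memory backlog are hypothetical; in practice the same discipline would live inside whatever tool a team already uses to track its work.

```python
from collections import deque

class PullSystem:
    """A toy model of pull-based task assignment: one active objective at a
    time, drawn from a centrally managed backlog only after the current
    objective reaches a clear stopping point."""

    def __init__(self, backlog):
        self.backlog = deque(backlog)  # shared collection of pending work
        self.active = None             # the single objective in progress

    def pull_next(self):
        # Refuse to hand out new work until the current objective is done.
        if self.active is not None:
            raise RuntimeError(f"Finish '{self.active}' before pulling more work.")
        if self.backlog:
            self.active = self.backlog.popleft()
        return self.active  # None means the backlog is empty

    def complete_active(self):
        # Mark the current objective finished, freeing the worker to pull again.
        finished, self.active = self.active, None
        return finished

# Tasks are handled strictly one at a time, in sequence.
system = PullSystem(["draft campaign brief", "review budget", "prepare launch deck"])
while (task := system.pull_next()) is not None:
    print("working on:", task)
    system.complete_active()
```

The guard in pull_next is the whole point: new obligations can’t reach you until the objective in front of you is finished.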
Another mismatch is the unnatural way in which modern knowledge workers toil at a continually high level of intensity. As noted, throughout our species’ history as hunter-gatherers the pace of our labors was likely more varied. Correcting this issue would require us to address the deeper issue of performativity in office environments. Work that takes place on computer screens tends to be vaguer and more ambiguous than physical efforts. Without the ability to point to a pile of completed widgets to demonstrate our productivity, we’re often forced to fall back on visible busyness as a proxy for worthy labor—leading to our unnatural constant intensity. Remote work might provide some relief here, as it removes you from the physical gaze of your manager, but escaping the office doesn’t eliminate the digital variant of this surveillance that refracts through modern communication tools. Your manager might not literally see whether you’re working, but she can see how quickly you’re responding to her e-mail messages.
There are practical solutions to these issues, too. Last summer, for example, I wrote a column about a philosophy called the Results-Only Work Environment, or ROWE, for short. Devised by a pair of human-resources employees at the Best Buy corporate headquarters in the early two-thousands, ROWE gives you full autonomy over when and how you accomplish your work. In this scheme, you’re measured only by your results, not your visible activity. ROWE is exactly the type of management philosophy that would enable more natural and autonomous variations in the intensity of your work over time.
The final mismatch I identified concerned the way in which modern knowledge work subverts our instinct for skilled effort. It’s not that knowledge workers lack ability but, instead, that the relentless, mind-warping distraction that defines the modern office makes it difficult to apply these abilities in a satisfying manner. The talented marketing executive wants to focus her energy on writing a brilliant campaign, and would find great fulfillment in doing so, but finds herself instead thwarted by the constant ping of her in-box and the demands of her calendar.
One simple but surprisingly effective improvement is to declare that e-mail and chat tools are only for broadcasting information or asking questions that can be answered with a single reply. Any interaction that instead requires multiple back-and-forth exchanges must be deferred to an actual, real-time conversation. Moving every such conversation to its own pre-scheduled meeting, of course, would be absurd: you would soon find your calendar overwhelmed by these discussions. But you might instead institute “office hours.” At a fixed time, every day, you make yourself available for unscheduled conversation—your door is open, your phone turned on, and you’ve logged into a public Zoom meeting. Now when someone sends you a message about a complicated issue, you can politely respond, “This sounds important, grab me at one of my upcoming office hours and we can get into it.” When you eliminate the need to service many different ongoing, back-and-forth interactions, the number of times you feel obligated to check your in-box plummets, improving your ability to find long, uninterrupted stretches to focus on hard things.
The problem with these types of solutions, of course, is that they’re difficult to implement. The ROWE philosophy, for example, can be transformative when properly executed, but requires an extensive commitment to maintain, and, even then, some people are never able to adjust to the accountability that comes with this increased flexibility. Similarly, office hours really can significantly reduce the quantity of urgent communication in people’s in-boxes, but the small number of companies I’ve identified that successfully deployed this collaboration strategy depended on unequivocal support from top executives. As with ROWE, these are not fixes that can be casually implemented by individuals. One can hope, however, that understanding the deeper issues motivating these suggestions can help generate the will needed to make progress at the organizational level. When considered in isolation, for example, a pull system for task assignment might come across as fiddly and eccentric. When it’s instead presented against the backdrop of an understanding of a human brain adapted for the sequential completion of goals, it might come to seem essential.
It’s safe to assume that, when Richard Lee journeyed to the Kalahari Desert, the travails of office workers were not on his mind. He sought instead to better understand the hunting and gathering activities that had dominated the human experience for so much of our history. My leap from this sober-minded anthropology, with its dense journals and conferences attended by Claude Lévi-Strauss, to pragmatic suggestions about pull systems and office hours might seem an ambitious thought experiment at best, and reckless at worst. In some ways, this is true. But, in other ways, a look back to the deep history of human work seems well suited to the goal of better understanding structural issues currently afflicting the knowledge sector. Those frustrated Apple employees aren’t just arguing about their commutes; they’re at the vanguard of a movement that’s leveraging the disruptions of the pandemic to question so many more of the arbitrary assumptions that have come to define the modern workplace. Why do we follow a factory-style work schedule, or feel forced to perform busyness, or spend more time in meetings talking about projects rather than actually completing them?
If we hope to replace this mishmash of conventions with something more fulfilling and sustainable, it makes sense to start by asking fundamental questions about what “work” meant throughout most of human history. Once we realize the degree to which our minds are adapted to immediate-return efforts, performed at a varied pace and demanding real skill, our frustrations with long days filled with frantic e-mailing and schedule-devouring meetings suddenly make sense. We’re built to work. But not this way. In the conclusion of his paper on his time spent among the Ju/’hoansi, Lee argued that through most of our species’ history, in most of the environments in which we have lived, hunting and gathering was a “well-adapted way of life.” Perhaps the time has come to demand something similar from the types of work that take up so much of our time today. ♦