In 1979, two M.I.T. computer-science alumni and a Harvard Business School graduate launched a new piece of computer software for the Apple II machine, an early home computer. Called VisiCalc, short for “visible calculator,” it was a spreadsheet, with an unassuming interface of monochrome numerals and characters. But it was a dramatic upgrade from the paper-based charts traditionally used to project business revenue or manage a budget. VisiCalc could perform calculations and update figures across columns and rows in real time, based on formulas that the user programmed in. No more writing out numbers painstakingly by hand.
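The mechanism that made VisiCalc feel novel, figures that refresh automatically when an underlying number changes, can be sketched in a few lines of modern Python. The cell names, figures, and tax rate below are hypothetical, chosen only to illustrate formula-driven recalculation in the abstract; this is not VisiCalc's actual design or code.

```python
# A minimal, illustrative sketch of spreadsheet-style recalculation (hypothetical
# cells and numbers, not VisiCalc's implementation): cells hold either raw values
# or formulas over other cells, and revising one input refreshes every dependent figure.

cells = {
    "A1": 1200.0,                      # projected revenue
    "A2": 850.0,                       # projected costs
    "A3": lambda c: c["A1"] - c["A2"], # profit, derived from the cells above
    "A4": lambda c: c["A3"] * 0.21,    # estimated tax on that profit (assumed rate)
}

class Resolver:
    """Lets formulas look up other cells by name, e.g. c["A1"]."""
    def __getitem__(self, name):
        return value(name)

def value(name):
    """Resolve a cell: return a number directly, or evaluate its formula."""
    cell = cells[name]
    return cell(Resolver()) if callable(cell) else cell

print(value("A3"))   # 350.0
cells["A2"] = 900.0  # revise a single input...
print(value("A3"))   # ...and the dependent figure reflects it: 300.0
```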
VisiCalc sold more than seven hundred thousand copies in its first six years, and almost single-handedly demonstrated the utility of the Apple II, which retailed for more than a thousand dollars at the time (the equivalent of more than five thousand dollars in 2023). Prior to the early seventies, computers were centralized machines—occupying entire rooms—that academics and hobbyists shared or rented time on, using them communally. They were more boiler-room infrastructure than life-style accessory, attended to by experts away from the public eye. With the VisiCalc software, suddenly, purchasing a very costly and sophisticated machine for the use of a single employee made sense. The computer began moving into daily life, into its stalwart position on top of our desks.
As Laine Nooney, a professor of media and information industries at New York University, writes in their recent book “The Apple II Age: How the Computer Became Personal” (University of Chicago Press), VisiCalc kicked off the process of “ ‘computerizing’ business.” By now, of course, everything else has been computerized as well. Computers today are not just personal devices but intimate appendages, almost extensions of our own bodies and brains. We use our smartphones in practically every aspect of life: working, socializing, dating, shopping, reading. Nooney’s book tracks the pivotal years of the shift toward personal computing, epitomized by the Apple II and sped along by consumer software—not just VisiCalc’s spreadsheets but adventure games and greeting-card-design tools—that made the computer a useful and convenient partner in daily life outside of the office, too.
Before there was the personal computer (which gave us the shorthand PC), the term of art for a domestic computing machine was “microcomputer.” The Altair 8800, which débuted in 1975, was the first “microcomputer kit” to sell more than a few hundred units, according to Nooney. Made by a New Mexico company called MITS, which also sold kits for building rockets and calculators, the Altair emerged from a decentralized community of American computer hobbyists who were accustomed to building their own machines out of components from radios and televisions. “The Altair did not invent the idea of a computer one could personally own. Rather, it tapped into an ambient desire for ownership and individualized use,” Nooney writes. The goal was “to create a technological world fashioned to one’s own desires.” Steve Wozniak and Steve Jobs, the co-founders of Apple, created the next wave of popular microcomputers with their first Apple computer, in 1976, and then the Apple II, in 1977. The latter was the company’s first commercial breakthrough; it went on to sell more than five million units. Wozniak’s technical innovations, such as designing circuits that were able to display different colors on a monitor, were matched by Jobs’s talent for creating a salable consumer product. He insisted that the Apple II be housed in a plastic casing, making it more elegant and approachable than hobbyists’ severe industrial boxes.
The development of the personal computer was iterative and contingent; it was not a matter of destiny but of experimentation in many different directions at once. The Apple II beat out its competitors, including the Commodore PET 2001 and the Tandy Corporation’s TRS-80 Model I, in part because of its open-endedness. Coming from the hobbyist community, Wozniak was used to designing computer hardware for expandability and modification. With the Apple II, purchasing a product off the shelf wasn’t conceived as an end point but as the start of a user’s process of customizing her own machine.
The Apple II looked a bit like a typewriter, with a keyboard extending off a sloped front. Accessories like monitors and drives were stacked on top like children’s building blocks. The user chose her own operating system and display monitor, and whichever appendages she desired, such as a modem or game controllers. To add RAM, she had to open the housing and plug in a microchip card. “Installing the memory expansion card is easy,” the Apple II manual cheerily promised, above a photo of the exposed guts of the computer. As the demands of software and equipment evolved, Apple II owners found that their machines had the flexibility to keep up.
Nooney’s book tells the story of how computers became irrevocably personal, but what’s most striking, revisiting the history of the Apple II, is how much less personalizable our machines have become. Computers today, small enough to fit in the palms of our hands, require much less work on the part of the user. Apple’s iPhones all look more or less the same. Their cases are sealed; when they break or glitch, or when an upgrade is required, we tend to replace them outright and discard the old one. We control their superficial traits—choosing between rose-gold or alpine-green case covers—but make few decisions about how they function. Customizable computer towers, like the Mac Pro, are the domain of professionals and experts—a video editor who needs extra horsepower, for example. The rest of us just flip open our laptops and expect everything to run on its own.
Whatever customization we do engage in has moved to the realm of the digital. We can load apps on our iPhones at the press of a button, but only those that Apple allows into its App Store, which has rigid rules around monetary transactions and content. Some new platforms, such as Mastodon and Urbit, allow users to run their own customizable iterations of social-media software, but doing so requires its own forms of expertise. Otherwise, the likes of Facebook, Instagram, and TikTok dictate our digital experiences in ways we can’t change. Nooney recounts how, over the seventies and eighties, investment began pouring into technology and software companies from institutions and venture capitalists. Hobbyists and independent, small-scale firms that sold software in Ziploc bags were gradually crowded out by formalized, well-funded corporations whose products were advertised in glossy magazines and stocked by department stores.
Computers today are unavoidable fixtures of our lives, but instead of co-creators—modifying, hacking, and programming—we are mere consumers. Our lack of agency is a boon for Silicon Valley companies, which profit from herding us frictionlessly through their gated infrastructure. Through digital data surveillance, we even become bulk products for the companies to sell in turn. Customization leads to diversity; diversity is less scalable, less easily packaged, and thus less profitable. Our computers may be personal, but they are not solely devoted to serving our needs. ♦