When Social Networking Was Social
Michael D. Gordin
Joy Lisi Rankin, A People’s History of Computing in the United States. Cambridge, MA: Harvard University Press, 2018. 336 pages. Figures, maps, notes, bibliography, and index. $29.95.

Two transformative technologies emerged out of World War II, and for decades historians—following the world at large—focused on the wrong one.1 Nuclear fission not only powered the bombs that destroyed Hiroshima and Nagasaki, it promised to either save humanity through cheap atomic energy and medically relevant radioisotopes, or to destroy us all if the U.S.-Soviet Cold War turned hot. The historiography mushroomed like the clouds these weapons released. At first, the work was narrowly technical (how was the bomb made and used?) or political (how was international and/or domestic control to be negotiated?), but with the growth of social and cultural history it came to encompass civil defense, reactor meltdowns, films, protest movements, and much more. We are left with a rich body of scholarship exploring almost every aspect of a technology that was supposed to alchemize everything but ended up leaving intact much of how things were done before scientists split the atom. We never got safe energy too cheap to meter, but we also never got the fiery apocalypse, so we can call it even.

What about that other amazing technology of the war? News about it spread with less fanfare, ironically because it had done a great deal to secure victory for the Allies and disclosure would diminish its future strategic significance. Unlike the atomic bomb, whose cultural power relied upon its being dropped on cities or telegenically detonated before cameras, the digital calculating machine was a placid newborn. Still, these new “computers”—until the end of the war, the term referred to the workers, usually women, who operated calculating machines rather than to the devices—at first made quite a din. The bombes at Bletchley Park and the ENIAC (Electronic Numerical Integrator and Computer, first put to work on 10 December 1945) were massive things, and massively expensive. They crunched numbers, enabling the resolution of thorny numerical calculations crucial for the national security state. Fast forward a couple of decades, and you may well be reading these words not on paper but on a descendant of those room-sized behemoths that is orders of magnitude more powerful. Possibly it fits in your pocket.

The historiography chronicling this metamorphosis was rather slower to get off the ground than its nuclear counterpart, partly because of the unwillingness of private businesses to risk leaking trade secrets. Plus, I suspect, historians were waiting to see whether these doohickeys were really going to shake things up or not. The early scholarship focused on the hardware: how did the machines actually work, and how did people figure that out? Unless you are an electrical engineer, these studies made for dry reading, while the flashier emergence of “cybernetics” (1948), morphing into “artificial intelligence” (1956), seemed at first too weird to tackle. The narrative began to pick up with the 1960s, as the United States witnessed the boom in minicomputers, then microcomputers, then “personal computers” in the 1970s. The historical consensus was slow to congeal, but it has proved tenacious.2 I expect most teenagers with a smartphone can rattle off a variant of the story.

As Joy Lisi Rankin relates in her forcefully revisionist A People’s History of Computing in the United States, that standard picture—she calls it the “Silicon Valley mythology”—while not entirely inaccurate, is more a wildly successful branding exercise than a true history of how the Atomic Age turned into the Information Age. Her introduction provides an especially clear articulation of the incomplete picture painted by this mythology and what is wrong with it:

This compelling myth tells us that, once upon a time, modern computers were big (and maybe even bad) mainframes. International Business Machines, much more familiar as IBM, dominated the era when computers were the remote and room-size machines of the military-industrial complex. Then, around 1975, along came the California hobbyists who created personal computers and liberated us from the monolithic mainframes. They were young...
