Collaboration

Walter Isaacson’s ‘The Innovators’: How a Gaggle of Geeks Invented the Future

Fixating on the likes of Steve Jobs or Bill Gates misses the crucial truth of the digital revolution, argues Walter Isaacson in a new book mapping the landscape where we all live now.

10.07.14 5:50 AM ET

Walter Isaacson began his career as a reporter, and he still has the journalist’s knack of corralling a lot of information into a coherent, cogent narrative. His joint history of the development of the Internet and the personal computer features a parade of lively personality sketches, each one neatly slotted into a timeline that briskly carries us along from Charles Babbage’s Analytical Engine in 1834 to the incorporation of Google in 1998. Although Isaacson is best known for his biographies of Steve Jobs, Benjamin Franklin, and other notable individuals, his central point here is that groups, not lone geniuses, drove the digital revolution.

“Most of the innovations of the digital age were done collaboratively,” he writes. “The tale of their teamwork is important because we don’t often focus on how central that skill is to innovation.” Again and again, he shows inventions crucial to the development of the computer and the Internet—transistors, microchips, web browsers—depending equally on three sorts of people: the visionary who comes up with the idea; the engineer who figures out how to make it actually work; and the businessperson who sees it as a product and finds a market for it.

Just how important that last figure is can be seen in a famous quote from the president of DEC, an early leader in the building of smaller (i.e., refrigerator-sized) computers for office use. When it was suggested in 1974 that computers small enough to put on a desktop would be attractive to general consumers, he replied, “I can’t see any reason that anyone would want a computer of his own.” Steve Jobs could, and he could see that Steve Wozniak’s design integrating a keyboard, screen, and computer would expand the niche market for computer kits built by hobbyists into a huge consumer market for the personal computer.

There are many such telling anecdotes in The Innovators, and they often spotlight another key issue in the history of the digital age: the question of whether intellectual property should be freely shared in the public domain to encourage widespread adoption, or protected by patent to provide financial incentives for innovation and investment. Isaacson’s answer is that the open-source and proprietary approaches complement each other, which sounds weaselly until you look at the case of Microsoft BASIC, the software that established Bill Gates’s fledgling company. Gates was infuriated when the scruffy hackers at the Homebrew Computer Club started making and passing around copies of his software without paying for it, but that wholesale piracy made Microsoft BASIC the industry standard and forced computer manufacturers to license it.

Gates made Microsoft’s fortune on the 1980 deal with IBM for software to run on its new PC. The license was non-exclusive, which meant Microsoft could sell the software to other computer companies—again making it the industry standard. Gates was following in the footsteps of Robert Noyce, who insisted that Intel retain the rights to the microprocessor the company created for a Japanese calculator. Noyce knew that this programmable chip could be sold to manufacturers of an almost infinite variety of products, from coffeemakers to medical devices.

Noyce and his partner Gordon Moore also pioneered a new corporate culture at Intel. They avoided the trappings of hierarchy: everyone worked in the same cubicles; there were no executive parking spaces. They created an open, unstructured workplace where employees could freely exchange ideas; decisions were made at meetings among equals, not handed down from above, and junior staff were encouraged to resolve conflicts among themselves. Hard-charging director of engineering Andy Grove ensured accountability and discipline, but Intel’s informal style would come to define the digital age.

For all the counter-culture trappings of digital companies, Isaacson makes it clear throughout that government funding for basic research was essential to the development of both computers and the Internet, which were products of what he terms (adding an extra element to President Eisenhower’s famous formulation) “the military-industrial-academic complex.” ARPANET, precursor to the Internet, was a project of the Defense Department’s Advanced Research Projects Agency (ARPA), which wanted to build a data network connecting the university research centers it funded so as to avoid redundant, expensive duplication of computers.

It was government policy that in 1993 allowed commercial services like America Online access to the Internet, previously restricted to educational and research institutions. That was the digital age’s Big Bang, Isaacson writes: “Computers and communications networks and repositories of digital information were woven together and put at the fingertips of every individual.” That same year saw the explosive growth of the World Wide Web, thanks to creator Tim Berners-Lee’s insistence on open-sourcing all the code he wrote to enable access to computer-stored information anywhere in the world. Berners-Lee could have gotten rich by patenting Uniform Resource Locators (URLs), Hypertext Transfer Protocol (HTTP), and Hypertext Markup Language (HTML). He chose instead to let the Web evolve freely.

Isaacson brings his story to a close with Google, the search engine that made finding information on the Web a less daunting experience and, in his view, “became the culmination of a sixty-year process to create a world in which humans, computers, and networks were intimately linked.” Smartphones and social networks await their own historian; they play no part in Isaacson’s tale, which conveniently ends before the digital age’s messier, seamier side came into public view. Microsoft’s anti-competitive practices? Google tracking your search history and sharing your data with third parties? You won’t read about those here. The author is so focused on the bright digital future, when “the interplay between technology and the arts will eventually result in completely new forms of expression,” that present-day economic and privacy concerns seem to have escaped his notice.

Nonetheless, his careful, well-organized book, written in lucid prose accessible to even the most science-challenged, is well worth reading for its capable survey of the myriad strands that intertwined to form the brave new, ultra-connected world we live in today.