
How Vannevar Bush Engineered the 20th Century

In the summer of 1945, J. Robert Oppenheimer and other key members of the Manhattan Project gathered in New Mexico to witness the first atomic bomb test. Among the observers was Vannevar Bush, who had overseen the Manhattan Project and served as the sole liaison to U.S. President Franklin D. Roosevelt on progress toward the bomb.

Remarkably, given his intense wartime responsibilities, Bush continued to develop his own ideas about computing and information. Just days before the Trinity test, he had published in
The Atlantic Monthly a futuristic account of networks of information knitted together via “associative trails”—which we would now call hypertext or hyperlinks. To this day, Bush’s article—titled “As We May Think”—and his subsequent elaborations of networked information appliances are credited with shaping what would become the personal computer and the World Wide Web. And during his lifetime, Bush was celebrated as one of the nation’s leading prophets of technological change and the most influential proponent of government funding of science and engineering.

Vannevar Bush’s influential 1945 essay “As We May Think” shaped the subsequent development of the personal computer and the World Wide Web. The Atlantic Monthly

And yet, in this year’s Oscar-winning Oppenheimer, Bush is only a minor character. Played by Matthew Modine, he testifies before a secret government panel that will decide whether Oppenheimer, scientific director of the Manhattan Project, should be stripped of his security clearance and barred from participating in future government decisions on sensitive technological issues.

“Try me, if you want to try him,” Bush defiantly tells the panel. Alas, tragedy unfolds when the panel punishes Oppenheimer for his opposition to testing the nation’s
first hydrogen bomb. No more is said about Bush, even though he also opposed the first H-bomb test, on the grounds that the test, held on 1 November 1952, would help the Soviet Union build its own superweapon and accelerate a nuclear arms race. Bush was spared sanction and continued to serve in government, while Oppenheimer became a pariah.

Today, though, Oppenheimer is lionized while Bush is little known outside a small circle of historians, computer scientists, and policy thinkers. And yet, Bush’s legacy is without a doubt the more significant one for engineers and scientists, entrepreneurs, and public policymakers. He died at the age of 84 on 28 June 1974, and the 50th anniversary of his death seems like a good time to reflect on all that Vannevar Bush did to harness technological innovation as the chief source of economic, political, and military power for the United States and other leading nations.

Vannevar Bush and the Funding of Science & Engineering

Beginning in 1940, with the ear of the president and of leading scientific and engineering organizations, Vannevar Bush promoted the importance of supporting research of all kinds, in universities, the military, and industry alike. Bush’s vision was shaped by World War II and America’s need to rapidly mobilize scientists and engineers for war fighting and defense, and it deepened during the long Cold War.

Bush’s pivotal contribution was his creation of the “research contract,” whereby public funds were awarded to civilian scientists and engineers on the basis of best effort rather than guaranteed results, which had been the norm before World War II. This freedom to try new things and take risks transformed relations among government, business, and academia. By the end of the war, Bush’s research organization was spending US $3 million a week (about $52 million in today’s dollars) on some 6,000 researchers, most of them university professors and corporate engineers.

On its 3 April 1944 cover, Time called Vannevar Bush the “General of Physics” for his role in accelerating wartime R&D. Ernest Hamlin Baker/TIME

Celebrated as the “general of physics” on the cover of Time magazine in 1944, Bush became the first research chief of the newly unified U.S. military establishment in 1947, soon to be renamed the Department of Defense. Three years later, his campaign for a national science foundation to nourish and sustain civilian R&D finally succeeded. Bush had launched that campaign in 1945 with a report entitled Science, the Endless Frontier, in which he argued that the nation’s future prosperity and the American spirit of “frontier” exploration depended on advances in science and engineering.

Bush’s influence went well beyond the politics of research and the mobilization of technology for national security. He was also a business innovator. In the 1920s, he cofounded Raytheon, which competed with the behemoth RCA in the design and manufacture of vacuum tubes. As a professor and later dean of engineering at the Massachusetts Institute of Technology, he crafted incentives for professors to consult part time for companies, setting in motion in the 1920s and 1930s practices now considered essential to science-based industry.

Bush’s beliefs influenced Frederick Terman, one of his doctoral students, to join Stanford University, where Terman played a decisive role in the birth of Silicon Valley. Another of Bush’s graduate students, Claude Shannon, joined Bell Labs and founded information theory. And as a friend and trusted adviser to Georges Doriot, Bush helped launch one of the first venture capital firms, American Research and Development Corp.

Vannevar Bush’s Contributions to Computing

In the 1920s, Bush began designing analog computing machines, known as differential analyzers. This version was at Aberdeen Proving Ground, in Maryland. MIT Museum

Bush was also a major figure in the early history of modern computing. In the 1930s, he gained prestige as the designer of a room-size analog computing machine known as the “differential analyzer,” then considered the most powerful calculating machine on the planet. It was visually impressive enough that UCLA’s differential analyzer earned a memorable cameo in the 1951 sci-fi movie When Worlds Collide.

In the 1940s, despite his busy schedule with the Manhattan Project, Bush set aside time to envision, though never build, a desktop “memory extender,” or memex, to assist professionals in managing information and making decisions. And, as mentioned, he published that pivotal Atlantic article.

For engineers, Bush carries a special significance because of his passionate, lifelong argument that all engineers, especially electrical engineers, deserve the same professional status as doctors, lawyers, and judges. Before World War II, engineers were viewed chiefly as workers for hire who did what their employers told them. Bush eloquently insisted that engineers possessed professional rights and obligations, and that they should deliver their expert judgments independently and, when feasible, with the public interest in mind.

Vannevar Bush considered engineering not just a job but a calling. John Lent/AP

From the distance of half a century, Bush’s record as a futurist looks mixed. He failed to envision the enormous expansion of both digital processing power and storage, loudly proclaiming instead that miniaturized analog images stored on microfilm would long provide ample storage. (To be fair, many old microfilm and microfiche archives remain readable, unlike, say, digital video discs and old floppies.)

And yet Bush’s ideas about the future of information have proved prescient. He believed, for example, that human consciousness could be enhanced through computational aids and that the automation of routine cognitive tasks could liberate human minds to concentrate on, and solve, more difficult problems.

In this regard, Bush prefigures later computing pioneers like Douglas Engelbart (inventor of the mouse) and Larry Page (cofounder of Google), who promoted the concept of human “augmentation” through innovative digital means, such as hypertext and search, to enhance the speed, accuracy, and depth of purposeful thought. Indeed, today’s debate over the harm to humans from generative AI could benefit from Bush’s own calm assessment of the creative, intellectual, and artistic benefits to be gained from “the revolution in machines to reduce mental drudgery.” The subject of human enhancement through digital systems was “almost constantly” on his mind, he wrote in his 1970 memoir, Pieces of the Action, published four years before his death. Bush cautioned against hysteria in the face of digitally mediated cognitive enhancements. And he insisted that our technological systems should maintain the proverbial “human in the loop,” in order to honor and safeguard our values in the tricky management of digital information systems.

The fate of human culture and values was not Bush’s only worry. In his later life, he fretted about the spread of nuclear weapons and the risk of their use. Fittingly, as an architect of the Manhattan Project and, in the 1950s, an opponent of testing the first H-bomb, he saw nuclear weapons as an existential threat to all life on the planet.

Bush identified no ultimate solutions to these problems. Having done so much to enhance and solidify the role of scientists and engineers in the advancement of society, he nevertheless foresaw an uncertain world, where scientific and technological outcomes would also continue to challenge us.

