The Idea Factory: Bell Labs and the Great Age of American Innovation by Jon Gertner
Shannon’s self-proclaimed “ultimate machine,” for instance, seemed a jesting commentary on the subject of the meaning of his tinkerings. It was a wooden box with a single switch. A user hit the switch to turn it on, the box top opened and a mechanical hand reached out and turned the switch off, then the hand retreated into the box and the top closed.
“Theseus Mouse is cleverer than Theseus the Greek,” the magazine’s writer noted, “who could not trust his memory but had to unwind a ball of string to guide him out of the labyrinth.” In large part, this observation matched Shannon’s intuition that machines would someday be smarter than men in some respects. Shannon often rhapsodized about the human brain and the inimitable processing power of its billions of neurons. But he believed without question that machines had the potential to do calculations and perform logical operations and store numbers with a speed, efficiency, and accuracy that would soon dwarf our own. It was only a matter of time. Often, he said, he was rooting for the machines.
“It is hoped that a satisfactory solution of this problem will act as a wedge in attacking other problems of a similar nature and of greater significance.” If you could get a computer to play chess, in other words, you could conceivably get it to route phone calls, or translate a language, or make strategic decisions in military situations. You could build “machines capable of orchestrating a melody,” he suggested. And you might be able to construct “machines capable of logical deduction.”
Would you say that these computing machines are capable of “thinking”? Shannon: Depends on how you define “thinking”—memory, decisions, but must all be programmed into the machine.
With Shannon’s startling ideas on information, it was one of the rare moments in history, an academic would later point out, “where somebody founded a field, stated all the major results, and proved most of them all pretty much at once.” Eventually, mathematicians would debate not whether Shannon was ahead of his contemporaries. They would debate whether he was twenty, or thirty, or fifty years ahead.
Of course, these two philosophies—that individuals as well as groups were necessary for innovation—weren’t mutually exclusive. It was the individual from whom all ideas originated, and the group (or the multiple groups) to which the ideas, and eventually the innovation responsibilities, were transferred. The handoffs often proceeded in logical progression: from the scientist pursuing with his colleagues a basic understanding in the research department, to the applied scientist working with a team in the development side, to a group of engineers working on production at Western Electric. What’s more, in the right environment, a group or wise colleague could provoke an individual toward an insight, too. In the midst of Shannon’s career, some lawyers in the patent department at Bell Labs decided to study whether there was an organizing principle that could explain why certain individuals at the Labs were more productive than others. They discerned only one common thread: Workers with the most patents often shared lunch or breakfast with a Bell Labs electrical engineer named Harry Nyquist. It wasn’t the case that Nyquist gave them specific ideas. Rather, as one scientist recalled, “he drew people out, got them thinking.” More than anything, Nyquist asked good questions.
“With all the needed emphasis on leadership, organization and teamwork, the individual has remained supreme—of paramount importance. It is in the mind of a single person that creative ideas and concepts are born.” There was an essential truth to this, too—John Bardeen suddenly suggesting to the solid-state group that they should consider working on the hard-to-penetrate surface states on semiconductors, for instance. Or Shockley, mad with envy, sitting in his Chicago hotel room and laying the groundwork for the junction transistor. Or Bill Pfann, who took a nap after lunch and awoke, as if from an edifying dream, with a new method for purifying germanium.
But in some respects his solitude was interesting, too, for it had become a matter of some consideration at the Labs whether the key to invention was a matter of individual genius or collaboration. To those trying to routinize the process of innovation—the lifelong goal of Mervin Kelly, the Labs’ leader—there was evidence both for and against the primacy of the group. So many of the wartime and postwar breakthroughs—the Manhattan Project, radar, the transistor—were clearly group efforts, a compilation of the ideas and inventions of individuals bound together with common purposes and complementary talents.
Shannon suggested it was most useful to calculate a message’s information content and rate in a unit that he proposed engineers call “bits”—a word that had never before appeared in print with this meaning. Shannon had borrowed it from his Bell Labs math colleague John Tukey as an abbreviation of “binary digits.”
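Shannon’s unit can be made concrete with a small sketch (an illustration, not from the book): his entropy formula, H = −Σ p·log₂p, gives a message’s average information content in bits per symbol, where p is the relative frequency of each symbol.

```python
from collections import Counter
from math import log2

def entropy_bits_per_symbol(message: str) -> float:
    """Shannon entropy: average information content, in bits per symbol."""
    counts = Counter(message)
    total = len(message)
    # H = -sum over symbols of p * log2(p), with p = count / total
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A message drawn from two equally likely symbols carries 1 bit per symbol.
print(entropy_bits_per_symbol("01010101"))  # → 1.0
```

A message that uses one symbol more often than another carries less than one bit per symbol, which is why, in Shannon’s framework, predictable messages can be compressed.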
With each passing decade, modern technology has tended to push everyday written and spoken exchanges ever deeper into the realm of ciphers, symbols, and electronically enhanced puzzles of representation.
“There is this close connection,” Shannon later said of the link between sending an encoded message and an uncoded one. “They are very similar things, in one case trying to conceal information, and in the other case trying to transmit it.” In the secrecy paper, he referred briefly to something he called “information theory.”
As word spread, Shannon’s slender and highly mathematical paper, about twenty-five pages in all, would ultimately become known as the most influential master’s thesis in history. In time, it would influence the design of computers that were just coming into existence as well as those that wouldn’t be built for at least another generation.
“Bell Labs is no ‘house of magic,’” Kelly warned, echoing the headline of a recent magazine story about the Labs that he had found repellent. “There is nothing magical about science. Our research people are following a straight plan as a part of a system and there is no magic about it.”
If an idea begat a discovery, and if a discovery begat an invention, then an innovation defined the lengthy and wholesale transformation of an idea into a technological product (or process) meant for widespread practical use. Almost by definition, a single person, or even a single group, could not alone create an innovation. The task was too variegated and involved.
If an idea is the most elemental unit of human progress, what comes after that?
“It appears that Transistors might have important uses in electronic computer circuits,” Jay Forrester, the associate director of MIT’s electrical engineering department, wrote to Bown in July 1948. “In view of this fact, we would like to obtain some sample transistors when they become available in order to investigate their possible applications to high-speed digital computing apparatus.”
The solid-state group that Shockley led had been built upon the principles of an open exchange of ideas, and Shockley had apparently ignored those principles.
There was in this strategy “a modicum of self interest,” according to Bown. “Who is in a better position than the originator to recognize and profit from further advances?”
For one thing, AT&T maintained its monopoly at the government’s pleasure, and with the understanding that its scientific work was in the public’s interest. An audacious move to capitalize on the transistor, should it turn out to be hugely valuable, could well invite government regulators to reexamine the company’s civic-mindedness and antitrust status.
Their work together was further buoyed by the exchange of ideas within the larger solid-state group, which would gather sometimes once a day—and at least once a week—in meetings often led by Shockley, to exchange thoughts and review experiments. “I cannot overemphasize the rapport of this group,” Brattain said. “We would meet together to discuss important steps almost on the spur of the moment of an afternoon. We would discuss things freely. I think many of us had ideas in these discussion groups, one person’s remarks suggesting an idea to another.” The group would carry their discussions into lunch in the cafeteria as well. Or they would get in their cars and drive a few miles south along Diamond Hill Road, a narrow, sinuous county highway, to visit a small hamburger joint called Snuffy’s. The Bell Labs cafeteria didn’t serve beer. Snuffy’s did.
Thousands of men and women under Kelly were working intently on military applications, and the public was increasingly aware of their contributions. Press coverage of various aspects of what was known as “the physicists’ war” was glowing. Still, for many members of the technical staff, the wartime work required a difficult philosophical transformation. The ideas of scientists thrive on publication and broad dissemination; but the ideas of engineers, especially during wartime, thrive only if secrecy is maintained.
WE USUALLY IMAGINE that invention occurs in a flash, with a eureka moment that leads a lone inventor toward a startling epiphany. In truth, large leaps forward in technology rarely have a precise point of origin. At the start, forces that precede an invention merely begin to align, often imperceptibly, as a group of people and ideas converge, until over the course of months or years (or decades) they gain clarity and momentum and the help of additional ideas and actors. Luck seems to matter, and so does timing, for it tends to be the case that the right answers, the right people, the right place—perhaps all three—require a serendipitous encounter with the right problem. And then—sometimes—a leap. Only in retrospect do such leaps look obvious.
AT&T’s size and dominating nature raised the question of whether it was actually an “industrial dictatorship” obscured by a scrim of civic-mindedness. “The [Bell] System,” Danielian pointed out, “constitutes the largest aggregation of capital that has ever been controlled by a single private company at any time in the history of business.”
At one point during the first few days the freshmen were asked to sell the rights to their future patents, whatever these might be; their research, wherever it took them, was to benefit Bell Labs and phone subscribers. None of the young men refused. And in exchange for their signatures, each was given a crisp one-dollar bill.
Mainly, they were looking for good problems.
The industrial lab showed that the group—especially the interdisciplinary group—was better than the lone scientist or small team.
“It is an instrument capable of avoiding many of the mistakes of a blind cut-and-try experimentation. It is likewise an instrument which can bring to bear an aggregate of creative force on any particular problem which is infinitely greater than any force which can be conceived of as residing in the intellectual capacity of an individual.”
Scientific research was a leap into the unknown, in other words. “Of its output,” Arnold would later say of his group, “inventions are a valuable part, but invention is not to be scheduled nor coerced.” The point of this kind of experimentation was to provide a free environment for “the operation of genius.” His point was that genius would undoubtedly improve the company’s operations just as ordinary engineering could. But genius was not predictable. You had to give it room to assert itself.
Vail also saw it as necessary to merge the idea of technological leadership with a broad civic vision.
“Eventually it came to be assumed within the Bell System that there would never be a time when technological innovation would no longer be needed.”
Bell Labs was admittedly imperfect. Like any elite organization, it suffered at times from personality clashes, institutional arrogance, and—especially in its later years—strategic missteps.
The history of technology tends to remain stuffed in attic trunks and the minds of aging scientists.
“an institute of creative technology.”
At the peak of its reputation in the late 1960s, Bell Labs employed about fifteen thousand people, including some twelve hundred PhDs. Its ranks included the world’s most brilliant (and eccentric) men and women.
Bell Labs helped maintain and improve that system, he said, by creating an organization that could be divided into three groups. The first group was research, where scientists and engineers provided “the reservoir of completely new knowledge, principles, materials, methods and art.” The second group was in systems engineering, a discipline started by the Labs, where engineers kept one eye on the reservoir of new knowledge and another on the existing phone system and analyzed how to integrate the two. In other words, the systems engineers considered whether new applications were possible, plausible, necessary, and economical. That’s when the third group came in. These were the engineers who developed and designed new devices, switches, and transmission systems. In Kelly’s sketch, ideas usually moved from (1) discovery, to (2) development, to (3) manufacture.
Researchers and engineers would find themselves discussing their respective problems in the halls, over lunch, or they might be paired together on a project, either at their own request or by managers. Or a staffer with a question would casually seek out an expert, “whether he be a mathematician, a metallurgist, an organic chemist, an electromagnetic propagation physicist, or an electron device specialist.” At the Labs this was sometimes known as going to “the guy who wrote the book.” And it was often literally true. The guy who wrote the definitive book on a subject—Shockley on semiconductors, John Tukey on statistics, Claude Shannon on information, and so forth—was often just down the hall.
Some young employees would quake when they were told to go ask Shannon or Shockley a question. Still, Labs policy stated that they could not be turned away.
Kelly never mentioned the word “innovation” in his speech. It would be a few more years before the executives at Bell Labs—especially Jack Morton, the head of transistor development—began using the word regularly. What he went on to describe in London, though, was a systematized approach to innovation, the fruit of three decades of consideration at the Labs. To Kelly, inventing the future wasn’t just a matter of inventing things for the future; it also entailed inventing ways to invent those things.
“There is a certain logic in the reasoning that methods which have produced much new knowledge are likely to be the best to produce more new knowledge,” the science historians Ernest Braun and Stuart Macdonald wrote some years after Kelly’s 1950 speech. “Though there is also something paradoxical in the thought that there should be established methods of creating the revolutionary.”
To innovate, Kelly would agree, an institute of creative technology required the best people, Shockleys and Shannons, for instance—and it needed a lot of them, so many, as the people at the Labs used to say (borrowing a catchphrase from nuclear physics), that departments could have a “critical mass” to foster explosive ideas.
What’s more, the institute of creative technology should take it upon itself to further the education and abilities of its promising but less accomplished employees, not for reasons of altruism but because industrial science and engineering had become so complex that it now required men and women who were trained beyond the level that America’s graduate schools could provide. In 1948, Bell Labs began conducting a series of unaccredited but highly challenging graduate-level courses for employees.
Bell Labs employed thousands of full-time technical assistants who could put the most dedicated graduate students to shame. Such assistants sometimes had only a high school diploma but were dexterous enough, mentally and physically, that PhDs would often speak of them with the same respect they gave their most acclaimed colleagues.
“You get paid for the seven and a half hours a day you put in here,” Kelly often told new Bell Labs employees in his speech to them on their first day, “but you get your raises and promotions on what you do in the other sixteen and a half hours.”
“Twice he submitted his resignation to the president of AT&T, stating that important work at Bell Laboratories was not being adequately funded,” a colleague would recall. “In each case, he got the funds.”
He enjoyed the same level of security clearance as the head of the CIA.
The tightening alignment between a handful of the largest American corporations and the armed forces—“the huge industrial and military machinery of defense,” as President Dwight D. Eisenhower would call it when he left office a decade later—had already become an enormous business for AT&T, which entrusted its Bell Laboratories and manufacturing divisions at Western Electric to design and manufacture a vast array of secret equipment for the Army, Navy, and Air Force. Most of the industrial work orders related to radar and communications equipment; these were considered vital for national defense.
A countervailing belief, however, little noted at the time but discussed privately among military leaders and AT&T executives—and eventually with Attorney General Clark and President Truman—was that a company that the U.S. government depended upon to help build up its military during the cold war was arguably worth far more intact than apart.
In a private letter, Leroy Wilson, the president of AT&T, pointed out the contradiction. “We are concerned by the fact that the anti-trust suit brought by the Department of Justice last January seeks to terminate the very same Western-Electric–Bell Laboratories–Bell System relationship which gives our organization [its] unique qualifications.”
“Essentially a defensive weapon,” the Bell Laboratories Record explained, “the Nike system will provide defended areas with a far greater degree of anti-aircraft protection than was ever before possible with the more limited ranges and altitudes of conventional anti-aircraft guns.” Nike “systems,” essentially clusters of missiles poised for flight, were sited on the outskirts of major U.S. cities and near strategic locations, including Bell Labs’ Murray Hill offices.
The new missiles, outfitted with several antennas, were guided by a complex control system, both in the air and on the ground, that involved radio detection and guidance and required, according to one assessment, approximately 1.5 million parts. Though nuclear arms and communications were often perceived as distinct phenomena—one was military, the other was civilian; one was deadly, the other benign—it was becoming increasingly difficult to separate the atomic age from the information age.
He wanted to limit the Labs’ military contracts so that they would not get in the way of its communications business, yet he harbored no apparent qualms about such endeavors. All were either strategically or financially important to the phone company; all were potentially useful in keeping at bay the antitrust regulators, who still sought to break up the Bell System. The military work could easily be construed as part of the implicit pact between the phone company and the government that allowed it a monopoly.
The year 1953, Fortune magazine proclaimed, would be “the year of the transistor,” when the “pea-sized time bomb,” fashioned from a sliver of purified germanium, finally went into volume production and thus began to erode the electronics industry’s dependence on the vacuum tube.
The paradox, of course, was that a parent corporation so dull, so cautious, so predictable was also in custody of a lab so innovative. “Few companies are more conservative,” Time magazine said about AT&T, “none are more creative.”
THE PUBLICITY AROUND inventions like the solar cell tended to distort public perceptions about the actual work being done inside the Labs. Kelly would often point out that the Labs workforce—including PhDs, lab technicians, and clerical staff—by the early 1950s totaled around nine thousand. Only 20 percent of those nine thousand worked in basic and applied research, however. Another 20 percent worked on military matters. Meanwhile, the rest of the Labs’ scientists and engineers—the majority—toiled on the never-ending job of planning and developing the system.
As yet there wasn’t much in the way of technology out in Palo Alto. Mostly it was apricot orchards and undeveloped land, but it had been Shockley’s hometown for most of his childhood. Also, there was Stanford University,
Shockley managed to woo only one person from Bell Labs. Mostly, he located and hired some promising young scientists from other companies—most notably Gordon Moore, Robert Noyce, Jean Hoerni, and Eugene Kleiner, all four of whom would do much to put Silicon Valley on the map.
Some years later, it would be revealed in congressional testimony that just prior to the 1956 consent agreement, Herbert Brownell, President Eisenhower’s attorney general, had quietly given to AT&T’s general counsel “a little friendly tip” to analyze its operations and identify “practices” that it could compromise on without harming its business.
The government had hinted that in exchange for a fig leaf of compromise from AT&T, it was inclined to drop the suit and allow the company to maintain its monopoly. AT&T offered two fig leaves. The first was its agreement not to enter the computer or consumer electronics markets. The second concession, at least on its face, seemed far more dramatic: The phone company agreed to license its present and future U.S. patents to all American applicants, “with no limit as to time or the use to which they may be put.” In other words, eighty-six hundred or so of AT&T’s U.S. patents “issued prior to January 24, 1956 are in almost all cases to be licensed royalty-free to all applicants.” (All future patents, meanwhile, would be licensed for a small fee.)
The patent giveaway was in fact deceptive. So what if entrepreneurs all over the country now had essentially free access to transistors, microwave long-distance systems, underwater repeaters, solar cells, coaxial cables, and thousands of other devices and industrial processes? The Bell System remained a monopoly. Competitors trying to gain a foothold in the telephone equipment business still had no way in.
“The preeminent discovery of the twentieth century is the power of organized scientific research,” Bello began. “The industrial enterprise that has carried out this mobilization most brilliantly in the U.S.—and indeed the world—is Bell Telephone Laboratories, Inc.”
Mathews argued that Shannon’s theorem “was the mathematical basis for breaking up the Bell System.” If that was so, then perhaps Shockley’s work would be the technical basis for a breakup. The patents, after all, were now there for the taking. And depending on how it played out, one might attach a corollary to Kelly’s loose formula for innovation—namely, that in any company’s greatest achievements one might, with the clarity of hindsight, locate the beginnings of its own demise.
He seemed to have a knack for involving himself—or being right nearby—in situations that made historical ripples. Pierce was apparently the last person at Bell Labs that Bill Shockley spoke with—as recorded in a cryptic entry in Shockley’s diary, for noon on December 31, 1947—before he began working around the clock on his idea for the junction transistor. Pierce had suggested that his friend Arthur C. Clarke compose a history of overseas communication, and he demonstrated for Clarke, during a visit to Murray Hill, a computer rendition, created by a group of scientists working in acoustics at Bell Labs, of “A Bicycle Built for Two.” This rendition eventually found its way into the film 2001: A Space Odyssey. Pierce had also been the supervisor who came by when the mathematicians were playing their guessing game on the blackboard, “Convergence in Webster,” and scratched out the infamous sentence, You Are All Fired.
Pierce had been correct in some respects about the traveling wave tube’s potential. But as he came to understand, inventions don’t necessarily evolve into the innovations one might at first foresee. Humans all suffered from a terrible habit of shoving new ideas into old paradigms. “Everyone faces the future with their eyes firmly on the past,” Pierce said, “and they don’t see what’s going to happen next.”
Pierce let Wells know that one of his science fiction concepts—an atomic bomb—was coming true: America was building one. He had deduced this from the way most of the country’s good physicists were disappearing and being directed to secret laboratories around the country. Pierce told Wells that he and his fellow engineers joked that promising scientists had been “body snatched.”
When someone asked him for his reaction to the Sputnik launch, Pierce said, “It’s like a writer of detective stories going home and finding a body in his living room.”
The passive satellite was certainly a big ball of wax, but the active satellite, in Pierce’s view, was too big a ball of wax. Indeed, Pierce soon found out that the military, through its Advanced Research Projects Agency (ARPA), was thinking about building an extremely expensive active satellite called Advent. “The tendency of ARPA has been to project elaborate and complicated schemes,” Pierce wrote derisively in a memo to Jack Morton after visiting with ARPA’s directors.
In his view, it wasn’t so much that technologies were changing society; rather, a new web of instantaneous information exchanges, made possible largely by the technologies of Bell Labs, was changing society. Pierce was also coming to the realization that other advances—data transmission, home computers, electronic mail, lightwave communications—might soon define the culture far more radically than the Bell System already had. “It is clear that more and more the Bell System will be concerned with sending digital signals, both to enable machines to talk to one another and to enable people to hear distant sounds or to see distant scenes,” he remarked in 1956.
Was there any way of “prying mobile allocations out of the FCC?” he wrote in a 1957 memo. At the time, Pierce was actually thinking ahead by a decade or two, and wondering where Bell Labs should be with mobile phone research by that point.
His mind wasn’t merely photographic, though; it worked in some ways like a switching apparatus: He tied everyone he ever met, and every conversation he ever had, into a complex and interrelated narrative of science and technology and society that he constantly updated, with apparent ease.
To Baker, chemistry was the discipline that made a global communications network feasible. He would often cite examples. By substituting the lead sheathing on telephone cables with a synthetic plastic created by Bell chemists, the Bell System saved “more than the total research budget of Bell Labs for the decade in which the innovation was worked out.”
As Baker later told the New Yorker, the Young Turks “came to Bell with an interest in attacking the hard, fundamental questions of science—something that not many people thought could be done in a place like this.” In those days, Baker explained, it was assumed that such studies were done at the world’s great universities. But Shockley and Pierce used Bell Labs’ resources to create “a new kind of science—one that was ‘deep’ but at the same time closely coupled with human affairs.” In Baker’s view, the Young Turks succeeded for the first time in bridging the gap between the best science of the academy and the important applications that a modern society needed.
In 1956, Fisk responded to Eisenhower’s request to set up a separate commission to figure out how to gather better information about the Soviet Union by suggesting Baker for the assignment. “There was the presumption that the Soviets had become undecipherable, that we would not have enough warning to respond defensively to their threats,” Baker recalled. The result was the Ad Hoc Task Force for the Application of Communications Analysis for National and International Security, otherwise known as the Baker Committee. The committee’s conclusions would be directed to the then five-year-old National Security Agency, a new unit within the Department of Defense charged with securing the country’s information networks and deciphering foreign intelligence. NSA’s very existence was then considered a national secret. So Baker was organizing a committee that did not officially exist to write a top secret report about how to improve an organization that didn’t officially exist either.
His group was not lacking in brainpower. Baker pulled in John Pierce and John Tukey from Bell Labs—“the country’s keenest thinkers,” both of whom now had top secret clearances—along with several other scientists, including the future Nobel physicist Luis Alvarez.
The goal, as stated in Baker’s 1957 description of his committee, was “to search for new concepts of interconversion of information and intelligence.” In other words, his group would consider all the ways that technology now allowed information to be hidden and transmitted—through encoded signals, and even through chemical and biological patterns—and then recommend how America’s intelligence agencies might respond. More specifically, the Baker Committee sought insights into how the United States might develop the ability to crack any imaginable Soviet code. “Our history sustains the belief that for both the national security and the universal freedom of man,” Baker wrote, in a passage that held its own with the most artful cold war rhetoric, “the application of all science to deciphering foreign political and military communications (basically revealing attitudes of nations toward each other) is a suitable and worthy intent.”
In his history of the NSA, James Bamford described the Baker Report as recommending “a Manhattan Project–like effort to push the USA well ahead of the Soviet Union and all other nations” through the application of information-age tools.
Bamford also noted that one of the committee’s enduring legacies was its recommendation that the U.S. intelligence networks establish “a close yet secret alliance with America’s academic and industrial communities.”
The dilemma was whether it remained in Americans’ best interests to have a regulated phone monopoly such as AT&T—a monopoly that had “an end-to-end responsibility” for telephone service—or whether the phone giant should be dismantled in the expectation of more competition, lower costs, and perhaps an even greater rate of innovation.
The history of relations between the Bell System and the U.S. government seemed to follow a pattern, with truces occurring roughly every other decade.
In the wake of the 1956 agreement, AT&T appeared to be indestructible. It now had the U.S. government’s blessing. It was easily the largest company in the world by assets and by workforce. And its Bell Laboratories, as Fortune magazine had declared, was indisputably “the world’s greatest industrial laboratory.”
Ross recalls, “Kelly set up Sandia Labs, which was run by AT&T, managed by us, and whenever I asked, ‘Why do we stay with this damn thing, it’s not our line of business,’ the answer was, ‘It helps us if we get into an antitrust suit.’ And Bell Labs did work on military programs. Why? Not really to make money. It was part of being invaluable.”
Some of the most farsighted thinkers at Bell Labs had long believed that the phone monopoly might not endure. Mervin Kelly, for one, constantly had that possibility on his mind, from the mid-1940s onward. Their reasoning was neither legal nor philosophical. Popular technologies spread quickly through society; inevitably, they are duplicated and improved by outsiders. As that happens, the original innovator becomes less and less crucial to the technology itself.
Morry Tanenbaum puts it somewhat differently. “Technology would have destroyed the monopoly anyway,” he says. Tanenbaum notes that Bell Labs’ most significant research and development efforts—transistors, microwave towers, digital transmission, optical fiber, cellular telephone systems—all fit a pattern. They took years to be developed and deployed, and soon became essential parts of the network. Yet many of the essential patents were given away or licensed for a pittance, and those technologies that weren’t shared were duplicated or improved upon by outsiders anyway. Eventually, the results were always the same: all the innovations returned, ferociously, in the form of competition.
To Drucker, telecommunications was now just a part of the immense field of information and electronic technology. There were many competitors and many competing ideas in this field. And therefore, going forward, no single lab could on its own provide the new technology for the entire electronics and information industry.
All during 1966 and 1967, Shockley urged the National Academy of Sciences, the organization of America’s most distinguished scientists, to focus more deeply on the question of how heredity affects intelligence. In April 1968, at a meeting of the academy, Shockley charged that the country’s leading thinkers were showing a “lack of responsibility and courage” by not examining correlations of race and intelligence. He delivered a paper at the meeting as well. “An objective examination of relevant data,” he declared, “leads me inescapably to the opinion that the major deficit in Negro intellectual performance must be primarily of hereditary origin and thus relatively irremediable by practical improvements in environment.” He sent copies of the speech to Bill Baker, Jim Fisk, Mervin Kelly, and John Pierce.
By his own choice, Shockley then began transforming himself from the most esteemed solid-state physicist in the world into a fringe eugenicist. Indeed, he was starting to think that his work on genetics could become far more important than anything he had so far accomplished in his lifetime. Echoing the honorary language of the Nobel Prize, he told friends it was how he would now make “the greatest contribution to the benefit of man.”
Shannon nonetheless remained interested in the implications of his work. His speeches from that era suggest a man quietly convinced that information—how it moved, how it was stored, how it was processed—would soon define global societies and economies. A few years after he entered academia, in 1959, he lectured to an audience of students and faculty at the University of Pennsylvania. “I think that this present century in a sense will see a great upsurge and development of this whole information business,” Shannon remarked. The future, he predicted, would depend on “the business of collecting information and the business of transmitting it from one point to another, and perhaps most important of all, the business of processing it—using it to replace man at semi-rote operation[s] at a factory … even the replacement of man in the things that we almost think of as creative, things like doing mathematics or translating languages.”
In the mid-1980s, however, an award known as the Kyoto Prize was established in Japan to recognize outstanding contributions in the field of mathematics. Shannon was voted its first recipient. “I don’t know how history is taught here in Japan,” he told the audience when he traveled there in 1985 to give an acceptance speech, “but in the United States in my college days, most of the time was spent on the study of political leaders and wars—Caesars, Napoleons, and Hitlers. I think this is totally wrong. The important people and events of history are the thinkers and innovators, the Darwins, Newtons, Beethovens whose work continues to grow in influence in a positive fashion.”
Betty donated some of the games and juggling toys he had built to the MIT Museum, which in a 2007 exhibit dubbed them Claude Shannon’s Ingenious Machines. His juggling clowns were included, along with games and machines such as THROBAC, the useless but amusing hand-built computer that could calculate in Roman numerals. Theseus, a remnant from the Bell Labs days—the mouse-machine built by Shannon, mostly late at night, which could navigate any maze—was one of the featured museum pieces, too.
One high-ranking Washington insider, a cabinet member to several U.S. presidents, wrote to Baker on his eightieth birthday, “You, of course, have been in a position to observe the shortening of the time horizon (and the shrinkage of curiosity) at Bell Labs, since the demise of Ma Bell and the growth of unfettered competition.” Baker was too much of a gentleman to agree. In a long and convoluted letter back—“equally necessary was a versatility and sharing of knowledge for a coherent policy formation in the aggregate,” he wrote in a moment of reminiscence—he seemed mostly intent on revisiting his old intelligence work and recounting its triumphs. He came across as nostalgic for the cold war.
In short order, the 1996 rules created a mad frenzy for telecom equipment and network infrastructure, resulting in absurd stock valuations for some of the companies involved, as well as fraud and malfeasance. Baker viewed the results with disgust. The country’s telecom system, he told a journalist not long after, was “utterly disordered and needs some system of regulation that is publicly and politically acceptable.” He had scorn for the Federal Communications Commission, too, which “has no centralized philosophy or objectives” and seemed to spend its time squabbling.39 His clear message was that it had been a mistake to break the old system up in favor of a more chaotic marketplace. And in his view, as several more years passed, the situation only grew worse. By 2002, the institution Baker had helped build had become unrecognizable to him. “There isn’t any institution,” he dismissively told an interviewer when asked about his former employer. “Bell Labs does not exist as an institution.”
One way to think about the fate of Bell Labs is to think of the institution as something akin to a vast inheritance. While staggering as a combined sum, it somehow becomes more modest once it is split, and then split again, in various ways over time among various descendants. On January 1, 1984, the Bell System breakup officially went into effect. AT&T and Western Electric—now one combined company—were severed from the local phone companies, such as New England Telephone and Southern Bell Corporation. Most Bell Labs employees stayed within AT&T. Yet a significant number (about 10 percent) went to a new research institution called Bellcore, which was established to serve the research and development needs of the new “Baby Bells.”
In sum, it had become difficult, and perhaps unnecessary, for a company to capture the value of a big breakthrough. So why do it? To put it darkly, the future was a matter of short-term thinking rather than long-term thinking. In business, progress would not be won through a stupendous leap or advance; it would be won through a continuous series of short sprints, all run within a narrow track. “In American and European industry,” Odlyzko concluded, “the prospects for a return to unfettered research in the near future are slim. The trend is towards concentration on narrow market segments.”
The Internet, meanwhile, was already becoming a powerful force for communications. When Odlyzko wrote his paper, a small company called Netscape had just gone public, with a valuation that astounded the business world. And yet Netscape’s innovative product—a viewing browser for the World Wide Web—was largely the beneficiary of scientific and engineering advances that had been steadily accruing through academic, military, and government-funded work (on switching and networks, especially) over the past few decades.
The old Murray Hill complex, meanwhile, became Lucent’s global headquarters. By a number of measures—patents and awards, for instance—the company still retained a first-rate industrial laboratory with a skilled staff. And from the start, the prospects for Lucent and Lucent’s Bell Labs were considered promising. The company would design and build the next generation of wireless and wireline equipment. But things went even better than expected, and Lucent’s first few years proved to be the kind of fairy tale that the business press and financial investors adore. As wireless phone services boomed, and as the Internet exploded in popularity, so, too, did the need for telecommunications equipment in the United States and abroad. A host of companies embarked on an extraordinary buildout of the country’s telecommunications and data infrastructure; Lucent, in turn, began reaping enormous profits. Just two years after it split from AT&T, Lucent’s stock valuation—$98.5 billion—was higher than that of its onetime parent. The next few years became known variously as the telecom boom and the dotcom (a nickname for new web-based companies) boom. The assumption, as one financial columnist described it, was “that the explosive proliferation of dotcoms would send endlessly expanding amounts of data, voice and video streaming across larger and larger networks.” At its peak, Lucent was valued at $270 billion. Bell Labs, in turn, enjoyed ample funding. It seemed that another golden age of communications research was on the wing.
Things fell apart quickly. By 2000, it was understood that the predicted demand for telecommunications switching and transmission equipment was a fantasy. To compound Lucent’s problems, it was soon discovered that the company’s profits had been inflated by a practice of helping outside companies finance purchases of its equipment. The subsequent fallout was devastating. Lucent’s revenue plunged. Its stock price, which had peaked at about $84 a share, fell below $2. The company slashed tens of thousands of jobs, including thousands within Bell Labs. Some researchers and engineers were cast off when the company, desperate to alleviate its losses, divided the Bell Labs’ inheritance into even more parts—to smaller companies that took the name Agere and Avaya, for instance. Others were unceremoniously laid off. In the New Jersey suburbs, workers found they were embarrassed to wear Lucent shirts or hats to the store. In previous years, as the company’s stock price climbed, they would receive slaps on the back. Now they were greeted with angry reprisals of “What happened?” or “I lost a lot of money.” In the end, the company reduced its workforce from a high of 150,000 to about 40,000. And in its omnibus efforts to cut costs and energy consumption, every other light inside the vast buildings at Murray Hill was turned off. The acres of lawns in front of the buildings were mowed less frequently. Meanwhile, the remaining employees—at the company whose engineers perfected the telephone—were asked to limit their calls at work.
The scientific press nevertheless mourned the pragmatic turn the Labs had taken. When Nature, the esteemed British science magazine, discovered that only four researchers were now working in basic physics at the Labs, it ran an article entitled “Bell Labs Bottoms Out.” Meanwhile, grown men who had worked at Bell Labs during its golden age would sometimes confess to driving by the Murray Hill complex and experiencing an emotion close to bereavement. A few would weep.
“The history of modernization is in essence a history of scientific and technological progress,” Wen Jiabao, the premier of China, said recently. “Scientific discovery and technological inventions have brought about new civilizations, modern industries, and the rise and fall of nations.”
But the Silicon Valley process that Kleiner helped develop was a different innovation model from that of Bell Labs. It was not a factory of ideas; it was a geography of ideas. It was not one concentrated and powerful machine; it was the meshing of many interlocking small parts grouped physically near enough to one another so as to make an equally powerful machine. The Valley model, in fact, was soon so productive that it became a topic of study for sociologists and business professors. They soon bestowed upon the area the title of an “innovation hub.”
“Bell Labs functioned in a world not ours,” he noted. The links between government and business were different in that era; the monopoly was deemed acceptable as well as vital. And the compensation scale for its researchers and managers could never suffice in the modern economy. In Pierce’s era, the top officer at Bell Labs made about twelve times the salary of the lowest-paid worker; in the late 1990s, it was more typical at large American firms for the CEO to make one hundred times the salary of the lowest-paid worker.
Back in the 1940s and 1950s, moreover, smart and talented graduate students could never be wooed away from the Labs by the prospect of making millions. It wasn’t even thinkable. You were in it for the adventure. “I don’t think I was ever motivated by the notion of winning prizes, although I have a couple of dozen of them in the other room,” Claude Shannon said late in life. “I was motivated more by curiosity. I was never motivated by the desire for money, financial gain. I wasn’t trying to do something big so that I could get a bigger salary.”
Bell Labs managers knew they could support projects—the undersea cable, for example, or cellular telephony—that might require decades of work. The funding stream also assured the managers that they could consistently support educational programs to improve the staff’s expertise and capabilities. And as Morry Tanenbaum, the inventor of the silicon transistor, points out, Bell Labs’ sense of mission—to plan the future of communications—also had an incalculable value that endured for sixty years. The mission was broad but also directed. Bell Labs’ researchers, Tanenbaum notes, had a “circumscribed freedom” that proved to be liberating and practical at the same time.
There was no way around the conclusion. Pierce and his friends were making ideas and things that would either disappear in an instant, or would be absorbed into the ongoing project of civilization. He feared that any memories of the makers would perish, too. “I am afraid that there will be little tangible left in a later age,” Pierce wrote of his world at Bell Labs, “to remind our heirs that we were men, rather than cogs in a machine.”