Highlighted Selections from:

Hybrid zone: computers and science at Argonne National Laboratory, 1946-1992


Yood, Charles N. Hybrid zone: computers and science at Argonne National Laboratory, 1946-1992. Boston, Massachusetts: Docent Press, 2013. Print.

p.3: This dissertation uses the Applied Mathematics Division (AMD) of Argonne National Laboratory (ANL) as a window to explore the emergence of computer and computational science as independent scientific disciplines. The evolution of the computing activities at Argonne reflects broader issues concerning technology, identity, professionalization, and the social organization of science. -- Highlighted may 22, 2014

p.3: While Argonne’s development of digital computer technology is a significant part of this story, I focus on the AMD’s efforts to integrate computers – and their attendant personnel – into the scientific process. In particular, the pursuit of "computational science" required that applied mathematicians be incorporated in all stages of science and engineering practice - from problem formulation to the definition of what constituted a solution. Arguments for such a collaborative structure drew on Cold War rhetoric, debates within the mathematical profession, and issues surrounding the increasing quantification of the sciences. Simultaneously, applied mathematicians sought to define a new research agenda that balanced their duties to provide mathematical expertise to other scientists with their desires to conduct their own research. -- Highlighted may 22, 2014

p.3: Despite the intentions of AMD directors, the interdisciplinary collaboration that computers were supposed to foster failed to materialize as envisioned. The emergence of an independent computer science, technological innovations, and the development of computer expertise by other scientists effectively limited the extent of collaboration. -- Highlighted may 22, 2014

p.3: Beginning in the mid-1970s, though, the development of supercomputers, together with a new federal emphasis on high-speed computer networks, created new opportunities for mathematicians, computer scientists, and scientists to work together. Impetus for collaboration was fueled by a number of different national concerns, including the Japanese Fifth Generation program, the need to support the domestic supercomputing industry, and pressures to make supercomputers readily accessible to American scientists. The federal government responded by creating the High Performance Computing program in the late 1980s, followed by the Grand Challenge Program of the 1990s, in an effort to foster computational science – considered a third methodology, alongside theory and experiment, for doing science. Along with enabling computational scientists to tackle problems with broad implications for science, economics, and national security, another result was a significant reorientation of computer science research. -- Highlighted may 22, 2014

p.12: Argonne’s main activities have expanded beyond the development of nuclear reactors to include everything from studies of the atomic nucleus to global climate change. It currently employs 2,900 people, including almost a thousand scientists (600 with doctoral degrees), and its operating budget of almost half a billion dollars supports over 200 separate research projects. The lab also proudly proclaims that since 1990 it has worked with more than 600 companies as well as numerous federal agencies and other organizations. -- Highlighted may 22, 2014

p.12: Argonne’s website uses 1990 not merely because it is a nice round number. That was the year that Alan Schriesheim, Argonne’s director, submitted a funding proposal from the Mathematics and Computer Science (MCS) Division to the Department of Energy; Schriesheim’s letter of support stated:

High-performance computing has become firmly established as the third mode of scientific research. Specifically, it has led to the development of computational science as a new methodology of scientific inquiry that complements and broadens the traditional methodologies of laboratory experimentation and theoretical analysis.

This short passage is more than a proclamation of computational science as a new way to do science; it also hints at the historically significant convergence of technology, human organizations, and funding structures that made this new methodology and discipline – computational science – possible. -- Highlighted may 22, 2014

p.14: In this dissertation, I argue that computers are a unique scientific technology in that they have spurred the creation of entirely new scientific disciplines and new methodologies for scientific investigation. -- Highlighted may 22, 2014

p.14: On the one hand, I tell a story about the history and evolution of the Applied Mathematics Division at Argonne. But this narrative simultaneously provides a framework for exploring some of the ways in which computers have provided the material basis on which different professional identities within the sciences have been constructed, and also how computers have provided a material basis for interdisciplinary collaboration. -- Highlighted may 22, 2014

p.15: Whether or not computer science was a science was hotly contested, and not just by those outside the field. As late as 1989, the computer science profession was still issuing reports with titles such as “Computing as a Discipline” in which its authors presented “a new intellectual framework for the discipline of computing.” -- Highlighted may 22, 2014

p.16: Understanding the history of computational science, I propose, requires recognizing that, by its very nature, computing was a social activity. -- Highlighted may 22, 2014

p.16: Tracing the professional trajectory of computer science within the context of changes in both technology and disciplinary identity is but one of my goals. My more ambitious project is to argue that computational science constitutes a “third mode” of scientific inquiry that emerged in the mid to late 1980s. In particular, I argue that computational science is the methodological extension of what scholars refer to as “Big Science.” -- Highlighted may 22, 2014

p.17: And computational scientists, who came from almost every scientific discipline, made their use of simulations the centerpiece in claims for methodological distinction. Scientists, however, had been conducting simulations on high-performance computers since the early 1960s, and yet it would take another twenty-five years before “computational science” exploded onto the scientific scene. What finally crystallized computational science, I argue, was the coalescence in the mid-1980s of a shared vision among scientists, computer scientists, programmers, funding agencies and, above all, the federal government. -- Highlighted may 22, 2014

p.19: While the Grand Challenge subjects might seem open-ended, their methodology was not: computational science would provide the answer. HPCC funding to address Grand Challenge problems, however, was contingent upon a computational science group demonstrating a strong interdisciplinary effort that included academia, government labs, and above all, industrial partners. Thus, the HPCC did much more than endorse computational science as a third methodology; it also institutionalized the socio-scientific environment and funding structures in which it would be developed. Where previously these relationships had often been negotiated and then renegotiated by participants, the HPCC provided the guidelines for not only what kinds of problems would be addressed, but also who would participate and in what manner:

Collaborative groups will include scientists and engineers concerned with Grand Challenge areas, software and systems engineers, and algorithm designers. These groups will be supported by shared computational and experimental facilities, including professional software engineering support teams, linked together by the National Research and Education Network. Groups may also create a central administrative base, which can be located anywhere on the network.

-- Highlighted may 22, 2014

p.21: While the intellectual lineage of theoretical science is traced back to the pre-Socratics, and the origin of our experimental method is said to be found in the small-scale experiments of Galileo, Boyle, and Newton, computational science is an entirely different kind of animal. This “third branch” of science is the methodological product of a particular way of conducting science that is high-tech, collaborative, interdisciplinary, and very expensive. -- Highlighted may 22, 2014

p.24: Edwards is justifiably critical of these histories, stating:

The tropes and plotlines of both genres impose requirements that lead authors to ignore or downplay phenomena outside the laboratory and the mind of the scientist or engineer... There is little place in such accounts for the influence of ideologies, intersections with popular culture, or political power. Stories based on the tropes of progress and revolution are rarely compatible with these more contingent forms of history.

-- Highlighted may 22, 2014

p.24: In response to these weaknesses, Edwards seeks to reintroduce contingency into the history of computing. Change is tied to choice, both political and social. The result is a new take on the history of computing in which the technology is “[rendered]…as a product of complex interactions among scientists and engineers, funding agencies, government policies, ideologies, and cultural frames.” -- Highlighted may 22, 2014

p.25: Established in 1946, Argonne National Laboratory provides an ideal context in which to explore these topics. As part of the national laboratory system created by the Atomic Energy Commission, it was directed to carry out large-scale, multi-disciplinary research projects related to atomic energy. Historian Peter Westwick has documented the centrality of these labs to the landscape of postwar American science as evidenced by both their level of funding and breadth of research. -- Highlighted may 22, 2014

p.25: In 1958 alone, the six AEC multipurpose labs spent some $50 million on basic research in the physical sciences - half as much as all academic institutions combined. -- Highlighted may 22, 2014

p.28: While the advancement of digital computer technology is a significant part of this story in that it allowed more work to be done on computers, more important were the AMD’s efforts to integrate computers – and their attendant personnel – into the scientific process. Early on, the directors of the AMD envisioned a new role for applied mathematicians vis-à-vis scientists and engineers in the development of mathematical models suitable for digital computers. In particular, it was believed that "computational science" required that applied mathematicians be incorporated more directly in all stages of scientific and engineering practices - from problem formulation to the definition of what constituted a solution. -- Highlighted may 22, 2014

p.28: Arguments in favor of such a collaborative structure drew on Cold War rhetoric, debates within the mathematical profession, and issues surrounding the increasing quantification of the sciences. Here, historian Michael Mahoney’s conception of a disciplinary agenda provides a useful analytic tool for analyzing the activities of the AMD. Mahoney defines a disciplinary agenda as: 1) what practitioners of a discipline agree ought to be done; 2) a consensus concerning the problems of the field; 3) their order of importance or priority; 4) the means of solving them; and most importantly, 5) what constitutes solutions. -- Highlighted may 22, 2014

p.30: I argue that computational science produced its own ideology, its own way of conceptualizing how science should be done, and it was this ideology that found purchase in federal funding agencies. To its sponsors, computational science promised to connect and coordinate technologies, people, and disciplines into a well-oiled computational machine that could solve problems deemed significant to those sponsors. As a result, federal funding agencies like the DoE were willing to reorient computer science research in order to support the needs of computational scientists. Computer scientists, in turn, were forced to choose between aligning their research with the goals of the computational science machine and facing a severe decline in funding. Some of the implications of this choice are addressed in the conclusion. -- Highlighted may 22, 2014

p.43: When Eckert became the director of the Scientific Computing Bureau at Columbia in 1934, IBM donated punched-card equipment to assist the bureau in its investigations of harmonic analysis and the integration of differential equations. Renamed the Thomas J. Watson Astronomical Computing Bureau to reflect IBM’s contributions, the bureau became a testing ground for new applications of punched-card equipment to scientific and engineering calculations. Eckert himself became one of the chief architects of IBM’s first sequence-controlled calculator, an electronic machine capable of performing a calculation with up to fifty arithmetic steps, and later he organized the design and construction of IBM’s Selective Sequence Electronic Calculator (SSEC), built in 1948. -- Highlighted may 22, 2014

p.43: At Argonne, several members of the computing groups had extensive experience applying punched-card equipment to scientific problems. Foremost among these was Donald (Moll) Flanders, a mathematician who had worked at Los Alamos during the war organizing a computing section with hand calculators (generally Friedens) and tabulating equipment. -- Highlighted may 22, 2014

p.44: After the war, Flanders was hired as a mathematician by the Met Lab to organize a computing group at the laboratory. With the Argonne facilities in transition, for several years he and the staff ran computations on the tabulating department machines at the Museum of Science and Industry in Chicago, where Argonne’s administrative offices were located. However, this was a stop-gap measure, and eventually Argonne was able to acquire two IBM 602s, one IBM 604 calculating machine, and even two IBM Card Programmed Calculators (CPCs), or “combo” units as they were called. -- Highlighted may 22, 2014

p.45: The ENIAC, the first general-purpose, programmable electronic digital computer, was commissioned in June 1943 by Army Ordnance, which wanted to increase computing power at the Ballistics Research Laboratory (BRL). As discussed earlier, it was designed and built by an engineering team at the University of Pennsylvania’s Moore School of Electrical Engineering. -- Highlighted may 22, 2014

p.45: It was a huge device: with its forty panels, 1,500 electromechanical relays, and 17,000 eight-inch vacuum tubes, it filled a large room. More than twenty different units handled addition, subtraction, multiplication, division, and square roots, while input/output was handled by IBM punched-card equipment. Although programming was labor-intensive, requiring the setting of hundreds of switches and the plugging of numerous cables, by working at electronic speeds the ENIAC was hundreds of times faster than any other calculating machine. Two ten-digit decimal numbers could be multiplied in less than 1/300 of a second. -- Highlighted may 22, 2014

p.46: From a mathematical standpoint, Flanders had already begun to develop ideas for how an electronic computer might be organized in terms of its logical structure. During the war he had produced several long papers on binary arithmetic, even before the ENIAC project began, but because they were all agency reports, they remained unpublished. When it appeared that, for a variety of technical reasons, binary rather than decimal representation of numbers would become the basis for computer arithmetic, Flanders saw the computer project as a way to develop his earlier work further while still remaining within the AEC mission. -- Highlighted may 22, 2014

p.47: Von Neumann envisioned a system composed of five different units: a central arithmetic unit to handle the four basic arithmetic operations and some higher-order functions such as roots, logarithms, and trigonometric functions; a control unit to ensure the proper sequence of operations and to make the different units operate together; a memory system to store instructions and data; and separate devices to handle input and output tasks. -- Highlighted may 22, 2014
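
Von Neumann’s five-unit scheme is easiest to see in miniature. The sketch below is a toy stored-program machine, not anything from his actual report: the opcodes, addresses, and accumulator design are invented for illustration, and the input device is elided (only memory, control, arithmetic, and output appear).

```python
# Toy stored-program machine in the spirit of von Neumann's five-unit scheme.
# Everything here is illustrative: the opcodes and memory layout are invented.
memory = [
    ("LOAD", 5),   # instructions and data share one memory...
    ("ADD", 6),
    ("STORE", 7),
    ("PRINT", 7),
    ("HALT", 0),
    2, 3, 0,       # ...so the program's data sits right after its code
]

acc = 0  # central arithmetic unit (a single accumulator)
pc = 0   # control unit's program counter

while True:                      # control unit: fetch, decode, execute
    op, addr = memory[pc]
    pc += 1
    if op == "LOAD":
        acc = memory[addr]       # memory -> arithmetic unit
    elif op == "ADD":
        acc += memory[addr]      # the arithmetic unit does the work
    elif op == "STORE":
        memory[addr] = acc       # arithmetic unit -> memory
    elif op == "PRINT":
        print(memory[addr])      # output device (input is elided here)
    elif op == "HALT":
        break                    # prints 5
```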

p.48: Framing his report using idealized components with biological analogies had the advantage of distinguishing the logical structure of the computer from its engineering design. This approach allowed designers to separate the different functions of the computer from the technological limitations of components available at any given time. A second advantage, and one that was not recognized until later, was that by discussing his system in neurological rather than conventional technological terms, von Neumann circumvented military security, which would normally have kept his report secret. -- Highlighted may 22, 2014

p.78: “Interacting with forces pushing us in the same direction, the advent of the high-speed computer has opened the way for an unprecedented mathematization, not only of fundamental scientific research in the physical and biological sciences but also in the management of our industrial and social systems. This is about to assign to mathematics an entirely new part in our civilization with far-reaching implications on what should be taught, how it should be taught, and to whom.” F.J. Weyl, quoted in [Curtiss, 1957]. -- Highlighted may 22, 2014

p.79: The significance of the AMD was not due to its activities as a computing service bureau; applied mathematics “laboratories” had existed in both business and academia for years. Instead, its importance lay in how clearly the AMD’s evolving organization reflected competing visions for how computers could aid in the production of science. Thus, it provides an ideal case study in which to examine how disciplinary agendas, technical change, and institutional structures contributed to the emergence of computational science as a third methodology alongside theory and experiment for investigating natural phenomena. -- Highlighted may 22, 2014

p.80: As computers quickly became indispensable to many scientific investigations, an increasingly difficult question arose: what role should computer specialists and mathematicians play in the construction of new scientific knowledge? This question is more than academic; particular configurations of computer, computer specialist, and scientist had significant implications for how scientific computing, and later computational science, was done. -- Highlighted may 22, 2014

p.81: For AMD directors, computers could be applied to the investigation of almost any natural phenomenon. Doing so required above all the creation of new mathematical tools and techniques useful to scientists and suitable for machine computation. For producers of computational tools, computers held a particular meaning that transcended the hardware. Applied mathematicians, especially, saw in the machine a world of possibilities: fundamental research in what would be called the science of computing, the potential for interdisciplinary collaboration, and maybe even heightened professional prestige. -- Highlighted may 22, 2014

p.81: Consumers, on the other hand, had a different understanding of computers that also went beyond the machine. To them, computers were simply a tool. Consumers did not envision computers as a means to professional empowerment or as a basis for interdisciplinary collaboration. Instead, computers, and the people who tended them, were to act in a support capacity, producing computational solutions as needed. Scientists tended to want the same thing from the providers of computational services that they expected from their maintenance staff: problems solved on demand. -- Highlighted may 22, 2014

p.82: However, for the first two directors of the new division, Donald “Moll” Flanders and William Miller, the service component ascribed to the new division was secondary to what they considered a larger project. To them, it seemed that the computer held considerable promise for energizing old disciplines like numerical analysis, creating new areas of expertise, and possibly destabilizing disciplinary boundaries. The key to such a future lay in the increasing mathematization of research and development in almost every scientific field. -- Highlighted may 22, 2014

p.83: In short, this vision called for an elevated role for mathematicians in the conduct of science. And indeed, the AMD’s early organization and attendant policies can be seen as an attempt to create an interdisciplinary research space in which computation was the centerpiece and mathematical experts played a more fundamental role in the advancement of science. -- Highlighted may 22, 2014

p.83: In its most optimistic conceptualization, it was hoped that the Applied Mathematics Division would provide an organizational framework in which to pursue cutting-edge research in both the science of computing and the application of computers to science. -- Highlighted may 22, 2014

p.84: Computing was very much a social activity. The AMD needed programmers to write the software, key punchers to prepare the punched-cards for the computers, operators to run the machines, and a maintenance staff to keep them running. These different tasks are evident in the AMD’s 1958 organizational chart which lists sections for Mathematical Consultation and Research, Programming Research and Development, Applied Programming, Computer Engineering, and Digital Machine Operations. -- Highlighted may 22, 2014

p.85: Despite the limitations of 1950s computers, it was possible to imagine a future in which the investigation of physical and biological phenomena might be approached almost entirely through computation. No field would be left untouched; not even the life sciences, which had traditionally eschewed most mathematics outside of statistics. -- Highlighted may 22, 2014

p.85: Selling computational services to scientists at Argonne was only half the battle. To a certain extent, computer specialists within the AMD believed in the need for an accompanying reorientation of scientific and engineering practices, such that experts in mathematics played a more central role in the analysis and solution of problems. -- Highlighted may 22, 2014

p.86: Significantly, this early conception of computational science was, above all, collaborative and interdisciplinary; computers would be a technological bridge-builder between disciplines. -- Highlighted may 22, 2014

p.87: But, as the research here shows, over the next decade this planned interdisciplinary research space became contested terrain. On the one hand, producers of computational tools sought their own disciplinary identity. On the other hand, computer users demanded improved computational service. Further complicating the issue was the rapid pace of innovation in technology which eroded the AMD’s monopoly on computers. By 1970, the notion that interdisciplinary collaboration would be fostered by the AMD was supplanted by a new vision that saw computing as a scientific enterprise in its own right and independent of other scientists. -- Highlighted may 22, 2014

p.94: The American entry into World War I in 1917 provided a strong impetus for the development of applied mathematics as many mathematicians supported the war effort. Ballistics was an area of special concern as the type, range, and behavior of military ordnance advanced rapidly, generating a need for new firing tables. Forest Moulton, an astronomer from the University of Chicago with a penchant for using numerical techniques in his calculations, organized a research team for the Office of the Army Chief of Ordnance to apply these tools to ballistics problems. Likewise, Oswald Veblen gathered another thirty mathematicians at the Army’s Aberdeen Proving Ground to work on military problems. This successful coupling of mathematics and applications during the war encouraged companies like GE, RCA, and Bell Telephone Laboratories to hire mathematicians in the 1920s to assist their engineers as they struggled to improve the speed and reliability of electric and communication networks. -- Highlighted may 22, 2014

p.95: It wasn’t until World War II that the professional aspirations of applied mathematicians began to look more promising. By 1939, military and scientific leaders in the United States recognized that the development and deployment of ever more complex weapons systems required considerable mathematical expertise. For mathematicians, the war seemed to offer possibilities for greater post-war influence in the development and conduct of science in America. -- Highlighted may 22, 2014

p.96: In 1942, in response to the tensions within the NDRC, Bush, who now headed the OSRD itself, created a separate entity, the Applied Mathematics Panel (AMP), to coordinate the services of mathematicians and to serve as a clearinghouse for mathematical information useful to the war effort. Headed by the mathematician Warren Weaver, the AMP employed almost three hundred people over the next two and a half years, including such eminent mathematicians as John von Neumann, Richard Courant, Jerzy Neyman, Garrett Birkhoff, and Oswald Veblen. The panel supported work in applied mathematics, especially in the development of statistics, numerical analysis and computation, the theory of shock waves, and operations research. More importantly, the AMP actively promoted the institutionalization of applied mathematics by supporting programs at Brown, Berkeley, Columbia, and NYU. -- Highlighted may 22, 2014

p.97: In 1943, Time Magazine reviewed a recent book by the popular science writer George Gray in which he argued that the contributions of mathematicians to ballistics, aerodynamics, optics, acoustics, and electronics were made in spite of a tendency of powerful mathematicians to denigrate applications as unworthy of their attention. This failure of top-flight mathematicians to sully their hands in applied work was, according to Gray, hampering war efforts. -- Highlighted may 22, 2014

p.97: Such public criticisms stimulated a fierce rejoinder from the mathematical community, but they also rekindled an earlier debate between the American Mathematical Society (AMS) and the OSRD as to what kind of mathematician was suited to the applied nature of war work. In 1942, leaders within the AMS had drafted a memo entitled “Mathematics in War” in which they suggested that the mathematical talents of the nation could best be mobilized by appointing a suitably qualified mathematician to evaluate defense programs and then select competent colleagues to assist in the work. To the AMS, such a qualified mathematician would be, above all, one engaged in “pure” work. Noting that mathematics had been instrumental in the discovery of natural laws and the mastery of nature, the AMS report went further, stating that these gains had been achieved “only through the skillful application of pure mathematics developed without reference to the immediate needs of physics or engineering.” -- Highlighted may 22, 2014

p.100: As Stan Ulam, one of the scientists at Los Alamos who ran preliminary simulations of the hydrogen bomb on the ENIAC recounted: “One could hardly exaggerate the psychological importance of this work and the influence of these results on Teller himself and on the people in the Los Alamos laboratory in general ... I well remember the spirit of exploration and of belief in the possibility of getting trustworthy answers in the future. This partly because of the existence of computing machines which could perform much more detailed analysis and modeling of physical problems.” -- Highlighted may 22, 2014

p.100: Ulam, in his excited recounting, touched on a (perhaps the) critical paradox of scientific computing and later computational science as it would develop: high-speed computing, from its inception, was driven as much by being a solution to a problem as it was a solution in search of a problem. -- Highlighted may 22, 2014

p.103: Flanders’ administrative and mathematical skills gained enough recognition that he was recruited by Hans Bethe, through Richard Courant, to organize a computing section at Los Alamos to assist scientists working on the Manhattan Project. While in New Mexico he gained enormous experience both in terms of how a computational operation might be organized and in how scientists and applied mathematicians could work together. As if anticipating the future, he also spent his spare time developing methods by which binary logic might be used by digital computing devices to perform arithmetic. After the war, Flanders brought these insights to his work at Argonne and sought to apply them through the creation in 1956 of a mathematical laboratory organized around digital computation. -- Highlighted may 22, 2014

p.106: After WWII, the Department of Defense quickly became the largest single supporter of American scientists, to the tune of some $5.5 billion a year by 1960. In addition, by the mid-1960s the Atomic Energy Commission had spent another $4 billion since the war on research at the national lab system, with comparable sums invested in facilities and equipment. Other agencies, such as the National Science Foundation, the National Aeronautics and Space Administration, and the National Institutes of Health, also contributed hundreds of millions of dollars to American scientists during this period. -- Highlighted may 22, 2014

p.106: The unexpected launch of Sputnik, the first man-made satellite, by the Soviet Union on October 4, 1957 reinforced the perception that Americans were falling behind their communist counterparts in science and technology. In this environment of ideological and technological competition, scientists, engineers, and even mathematicians were considered crucial to the very survival of the Western world. -- Highlighted may 22, 2014

p.107: the extent to which groups of mathematicians were also organized as a ready reserve for military purposes. The creation of the Institute for Numerical Analysis, part of the National Bureau of Standards’ National Applied Mathematics Laboratory (NAML), is one such example. Supported primarily by the Office of Naval Research, the mathematical research of the group was justified, in part, by its potential contribution to national defense. Written into the Prospectus of the organization was the statement that the NAML:

“Should undertake to maintain a reservoir of personnel trained in applied mathematics which can be drawn on in case of a national emergency, and should at the same time develop disciplines and tools to facilitate the conversion of the nation’s peace-time scientific manpower to emergency uses.”

It was against this backdrop that discussions about the role of the National Laboratory system took place. The idea of maintaining a reservoir of scientific talent for national emergencies was a key consideration in defining the role the National Laboratories were to play on the national scientific scene.

-- Highlighted may 22, 2014

p.107: Argonne’s Policy Advisory Board was no exception, justifying the lab’s existence in terms of its contributions to national security:

It is the importance of the existence of such going laboratories as a reserve force to be used in any appropriate way in the event of a grave national emergency. The existence of such staffs and facilities would be of obvious usefulness. It seems, furthermore, that the existence of close relationships with the scientific workers in the other laboratories of the region might be just as important in providing a point for rapid and efficient mobilization of scientific manpower and other facilities.

-- Highlighted may 22, 2014

p.117: As Alex Roland has shown in his study of DARPA’s Strategic Computing Program of the 1980s, entire technological trajectories can be shaped by the goals and methods of particularly strong individuals. In the same way that Robert Kahn and Robert Cooper gave vision and voice to Strategic Computing, Bill Miller was the personality that indelibly shaped the AMD’s first decade. -- Highlighted may 22, 2014

p.119: Because Miller is so central to the organization of the AMD and the way it interacted with scientists at Argonne, it is worthwhile to spend some time analyzing his vision. The centerpiece of his report is the assertion that applied mathematicians are crucial to the continued quantification of science and technology and that the main role for computers is to accelerate this trend. -- Highlighted may 22, 2014

p.119: Released on May 5, 1961, Mathematics and Computer Research at Argonne National Laboratory is a manifesto for the direct integration of mathematical expertise into the conduct of science. Miller argued that, to be used most effectively, research in applied mathematics and computer sciences should be carried out in close contact with the quantitative sciences. Unfortunately, neither industry nor academia appeared interested in this kind of work. Mathematicians within university math departments, he asserted, “are interested principally in the mathematical problems which have implications on the structure and foundations of mathematics itself,” while industry, in general, “is interested in exploiting the practical applications without encouraging research on the truly mathematical problems which arise in applications.” If this situation were allowed to continue, Miller warned, it would be a tragedy both for the quantitative sciences, which rely on applied mathematics, and for the field of mathematics, which he felt would lose one of its prime motivating forces. -- Highlighted may 22, 2014

p.120: Rather than perpetuate this arrangement, Miller proposed an alternative vision whereby applied mathematicians and computers would operate in the “hybrid area between pure mathematics and the quantitative sciences” to the mutual benefit of both fields. Miller’s deliberate use of the term “hybrid” was a rhetorical device that points to his multilayered conception of computational science. At the macro level, he defined a new kind of scientific space that blended old and new styles of scientific research and was closely aligned with the emerging characteristics of post-war Big Science. -- Highlighted may 22, 2014

p.120: The hybrid area was inherently team-based, unified by mathematics, made possible by high-speed computers, and costly in terms of both equipment and manpower. In addition, the hybrid area was an intellectual crossroads where experiment met theory and physical or biological phenomena were translated into a mathematical language suitable for machine computation. The potential benefits of creating such a space, Miller argued, were enormous. -- Highlighted may 22, 2014

p.121: Miller also envisioned a larger role in science for mathematicians and computers. If a collaborative arrangement could be implemented correctly, it might “provide one of the substantial unifying forces for bringing together many of the diverse areas of research in the nuclear sciences.” In a sense, the hybrid area would encourage the blending of different research traditions. Furthermore, because the hybrid area shared a common language of mathematics and the common tool of the computer, new points of contact between different disciplinary agendas might be made manifest. -- Highlighted may 22, 2014

p.122: Miller also proposed that future progress in scientific computing, and especially the mathematics of machine computation, required a new kind of expert to work in his proposed intellectual and disciplinary space. -- Highlighted may 22, 2014

p.130: Miller had spelled out an ambitious, wide-ranging plan in the 1961 report. In the early days of mainframe computing, his ideas were fresh and seemed as likely to succeed as any other plan. True, Miller was calling for the creation of a new kind of scientist (to populate an already crowded and competitive scientific landscape) and the immediate elevation of that researcher to a position of some prominence within the scientific community. Yet, if his plan proved feasible, the potential gains for science and engineering would be tremendous. If all went well, mathematical expertise would suddenly be highly desired by researchers in all fields. Miller’s “hybrids,” working in the area between pure mathematics and the applied sciences would energize interdisciplinary collaboration, further the mathematization of the sciences, and help make computational science a reality. -- Highlighted may 22, 2014

p.131: Further undermining Miller’s vision was the concurrent emergence of computer science as a recognized scientific discipline, which provided applied mathematicians and computer scientists with a professional umbrella under which to work and, not surprisingly, less inducement to subsume their research interests in the name of service to others. But even before professional considerations entered the picture, applied mathematicians were not being called upon by other scientists to help in problem formulation. The culprit, again, was the continued advancement of computer technology. As more software tools were developed to handle mathematical problems, previously difficult computational procedures became routine. -- Highlighted may 22, 2014

p.131: And herein lay one of the peculiarities of computer science and applied mathematics – namely, their inherent invisibility. A major objective in the implementation of any piece of software is to obscure the inner workings of the algorithms and code so that the user only has to focus on the output of the system. For example, in the course of writing a FORTRAN program, a scientist might include a small subroutine that performs a particular mathematical function. The scientist does not question how the subroutine works or whether it will return the correct answer – that had already been worked out by mathematicians and programmers in the process of creating the software routine. However, once instantiated in software, the creativity and intellectual contribution of the mathematicians and programmers become invisible to the user. This peculiarity tended to work against computer specialists. In terms of garnering support from computer users for the research activities of the AMD, the invisibility of the work involved in creating computational tools obscured the extent to which fundamental mathematical research was required to produce them. As scientists and engineers relied more on prepared subroutines and applied programmers to handle their computational work, it seemed less and less imperative to support the research activities of mathematicians and computer scientists. -- Highlighted may 22, 2014
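
The dynamic described here is easy to reproduce today. The fragment below is a deliberately anachronistic Python stand-in for the FORTRAN scenario (NumPy in place of a hand-built subroutine library, and an arbitrary example system): the scientist’s entire working “program” is three lines, while the numerical analysis that makes the answer trustworthy stays hidden inside the library call.

```python
# Modern stand-in for the FORTRAN scenario above (illustrative only).
# The scientist writes three working lines; the pivoted factorization and
# error control that make the answer trustworthy stay invisible inside
# the library routine, exactly the invisibility the passage describes.
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])   # a small, arbitrary linear system Ax = b
b = np.array([1.0, 2.0])

x = np.linalg.solve(A, b)    # decades of numerical-analysis research hide here
print(x)                     # the user sees only the output of the system
```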

p.134: By 1961 Miller was advocating that computing operations at Argonne be centralized, but he did not go so far as to advocate a “closed shop.” In a “closed shop” operation, the requestor of computer services never gets a chance to interact directly with the machine and must work entirely through programmers. On the other hand, giving scientists free rein with computers in a completely “open shop” would lead to chaos in programming and scheduling, while at the same time eliminating the interdisciplinary collaboration that he hoped would emerge from the computing operation. -- Highlighted may 22, 2014

p.134: Miller instead chose a middle ground. First, the AMD increased its educational services, teaching scientists how to use computers and do some of their own programming. To assist scientists in learning how to program, the AMD also built a FORTRAN preprocessor that ran on the lab’s IBM 1401. The preprocessor, amusingly dubbed “DDT”, checked a scientist’s FORTRAN statements to detect and list errors (or bugs!). In this way, scientists could do their own programming and have it debugged prior to submitting it to the 704’s processor queue. Second, in order to forestall efforts to take programmers out of the AMD, Miller reorganized the division: programmers were now assigned on a long-term basis to work on problems arising in specific divisions. -- Highlighted may 22, 2014
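
What DDT actually checked is not spelled out in this account, so the sketch below is only a guess at the workflow, not a reconstruction of the tool: scan each statement, list whatever errors can be caught cheaply, and let the scientist fix them before the job ever reaches the main machine’s queue.

```python
# Toy pre-submission checker in the spirit of DDT (illustrative only; the
# real preprocessor's checks are not documented here). The point is the
# workflow: catch cheap errors before the job reaches the 704's queue.
def check_statements(statements):
    errors = []
    for lineno, stmt in enumerate(statements, start=1):
        if stmt.count("(") != stmt.count(")"):
            errors.append((lineno, "unbalanced parentheses"))
        if "=" in stmt and not stmt.split("=", 1)[1].strip():
            errors.append((lineno, "assignment has no right-hand side"))
    return errors

program = ["X = SQRT(Y",   # missing closing parenthesis
           "Z ="]          # nothing on the right-hand side
for lineno, msg in check_statements(program):
    print(f"statement {lineno}: {msg}")   # errors listed before submission
```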

p.135: Behind these initiatives to keep programmers centralized within the AMD was Miller’s conviction that such an arrangement held the most promise for diffusing computational tools and techniques to the entire laboratory. First, by being members of the AMD, these programmers would be under a management which was prepared to look after their professional well-being by keeping them abreast of new developments in their field. Second, as members of the AMD these programmers would be in a better position to obtain mathematical assistance from the Research and Consulting section if they encountered a problem that was beyond their scope. Finally, Miller argued that new mathematical, numerical, and programming techniques would filter down to the programmers in a centrally managed group better than to programmers assigned to groups whose primary interests were not mathematical. -- Highlighted may 22, 2014

p.135: In general, scientists and engineers at Argonne were interested in the computer services of the AMD and were much less interested in promoting the professionalization of programmers or supporting the open-ended research activities of mathematicians. -- Highlighted may 22, 2014

p.136: Every year, members of the AMD, usually Margaret Butler, would go from division to division selling the services of the AMD. This approach was especially difficult in the early days because it required the division desiring computational service to estimate their usage for the entire year, even if they didn’t exactly know how they might use the computers. It was Butler’s job to suggest types of computational problems compatible with each Division’s activities that members of the AMD could address. In a very real sense, then, from the beginning computational science was a solution in search of a problem. -- Highlighted may 22, 2014

p.138: I argue that these initiatives in the early 1960s reflect broader attempts to define computer science as a distinct discipline. Part of the growing pains of any new scientific discipline is deciding what it includes and what it excludes. Subtle changes to the wording appear in the preface to the AMD’s 1961-62 Annual Report which point to the evolving nature of computing and what I contend was a slow bifurcation of computing into research and service components. In contrast to the 1959 Report, which listed the Division’s objectives as “providing mathematical assistance to other scientists in the laboratory” by “conducting research in numerical analysis and other branches of mathematics”, the 1962 Report expands these activities to “conducting research in applied mathematics, theory and practice of computation, and design of computers and information processing machines.” -- Highlighted may 22, 2014

p.141: At the AMD, these organizational maneuvers had surprising and possibly counterintuitive effects. Since World War II, computers had been seen by computer specialists as an integrative technology – one that could help scientists to transcend disciplinary boundaries and facilitate multifarious approaches to doing science. However, as a distinct professional identity began to emerge around computing, and computing technology continued to change, new barriers to collaboration were created. -- Highlighted may 22, 2014

p.151: In almost every way, the reorganization of the AMD along these lines reflects the coalescence of a distinct disciplinary identity among practitioners of the computer sciences, an attempt to define the areas encompassed by this discipline, and its relationship to other researchers at the laboratory. At the AMD, this disciplinary agenda emerged organically, buffeted by budgetary concerns, pressure from other disciplines, the technology of computing, and its own internal dynamic. But clearly the AMD was also responding to changes in the professional status of computer science itself. -- Highlighted may 22, 2014

p.153: In slightly over ten years, the entire notion that computational science would be a collaborative venture, built around the work of mathematicians, programmers, and scientists, had been turned on its ear. The hybrid area proposed by Miller was not populated by applied mathematicians, but rather by programmers. -- Highlighted may 22, 2014

p.158: “Time-sharing” was a technique by which users had the illusion that they had an entire computer with attendant software at their disposal. This included any languages, data, or subroutines that a scientist needed to complete their work. What made this concept feasible was the difference between the speed at which humans work and think and the speed at which computers could fetch and execute hundreds of simple instructions. In the few milliseconds between keystrokes, or the minutes while a user was thinking, the computer’s processor could handle all the chores required by one user and still handle those of another. To each person using a time-sharing system, it would appear that the computer was theirs alone. In terms of implementation, the difficulty lay in having the computer’s processor keep track of different jobs and different instructions simultaneously – a task which threatened to overwhelm the computer’s capabilities. Yet the desire to develop time-sharing systems was widely held in the early to mid-1960s, finally culminating in the funding in 1963 of Project MAC (“Man and Computer” or “Machine-Aided Cognition”) at MIT by the Defense Advanced Research Projects Agency. -- Highlighted may 22, 2014
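
The mechanism is simple enough to sketch. The loop below is a generic round-robin scheduler, not a model of Project MAC or any particular 1960s system: the processor gives each user a small slice of work in turn, and because human think time dwarfs a slice, every user experiences an apparently private machine.

```python
# Generic round-robin sketch of time-sharing (illustrative; not a model of
# any specific system). One processor interleaves many users' jobs in small
# slices; the think time between keystrokes hides the sharing.
from collections import deque

def time_share(jobs, slice_steps=1):
    ready = deque(jobs.items())            # (user, steps of work remaining)
    while ready:
        user, remaining = ready.popleft()  # processor turns to the next user
        done = min(slice_steps, remaining)
        print(f"{user}: ran {done} step(s)")
        if remaining > done:
            ready.append((user, remaining - done))  # back of the queue

time_share({"alice": 2, "bob": 3, "carol": 1})
```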

p.159: What Givens probably didn’t expect was the way that the promise of time-sharing systems changed the way that scientists perceived the computing activity. As one scientist reported to Argonne’s Computer Needs Committee: “…time sharing represents the beginning of a new era in the impact of the computer on the scientific community. It essentially is the ‘Henry Ford’ of computing and brings the potential of interactive sophisticated computing to the ‘common man’ who heretofore felt that it was too difficult or inconvenient to really concern himself with opinions on the architecture of computing services.” -- Highlighted may 22, 2014

p.166: Diplomatically, Givens was making the argument that the real issue was not one of communication, but rather a contest between realms of expertise: “A great deal is now beginning to be known about computing and programming, although documentation is recognized as drastically behind the state of the art. In such a situation experts are likely to be abrupt and impatient with those who do not acknowledge the existence of highly specialized knowledge. The problem has no easy solution.” -- Highlighted may 22, 2014

p.166: Computer users, Givens suggests, failed to recognize or validate the computer sciences as a legitimate discipline with its own technical language, methods, and research agenda. Instead, they were solely concerned with service, which was only a small part of what the division did. -- Highlighted may 22, 2014

p.168: Ironically, the activities of the Applied Programming section were partially responsible for their own problems. “In some sense,” the Committee observed, “these people have worked themselves out of a job by giving training to other divisions and developing computer ‘expertise’, of a sort, in the user’s group. Now, with budgetary problems, the divisions retain their ‘own’ people and cut off the Scientific Applications people of the AMD.” The ability of individual divisions to handle their computing needs in-house further reduced the amount of money available to the AMD, since the AMD supported the Computer Center through charges to users. Shockingly, the Review Committee recommended that two-thirds of the present Scientific Applications staff be moved to the divisions they had historically served, a move that the AMD had long resisted. Left behind in the AMD would be a small core of about ten programmers charged with the mission of “making itself an alter-ego for the ANL user community and acting in that capacity to ‘smooth’ the interface between a centralized facility and the users.” In essence, the Committee was suggesting that the AMD do organizationally what computer scientists had done professionally – shed their service duties. -- Highlighted may 22, 2014

p.171: However, the very proliferation of these minicomputers opened a new and somewhat unexpected research area for computer specialists. The development of ARPANET in the late 1960s represented the first steps in creating a widespread computer network and focused increasing attention on the creation of smaller, local networks that could then be connected to ARPANET. Argonne, which was scheduled to become an ARPANET node in 1974, was presented with an opportunity to kill two birds with one stone. Noting that Argonne, “as a typical laboratory, must be suffering from the standard proliferation of minicomputers as experiment controllers and special purpose computers”, the division’s Review Committee suggested that stepping up support, both in program preparation and in data collection facilities, would be an answer to the “increasing and ubiquitous mini-computer population of the laboratory.” -- Highlighted may 22, 2014

p.173: Miller’s proposal of the hybrid area was an attempt to make applied mathematicians central to an interdisciplinary, collaborative, and computational approach to scientific and engineering research. The existence of the hybrid area was conceived against a backdrop of longer historical trends towards the increased mathematization of all the sciences, and its fruition was seen in the development of electronic digital computers that could crunch numbers at unprecedented speeds. -- Highlighted may 22, 2014

p.175: As will be seen in the next chapter, the development of “supercomputers” in the late 1970s and early 1980s provided new linkages between mathematicians, computer scientists, and computational scientists. By the mid 1980s, these groups saw common cause in terms of attracting the vast sums of money needed to pursue supercomputer research and applications. The creation of several high-performance computing initiatives on the national scale, culminating in the $5 billion Grand Challenges program of the 1990s, succeeded in creating the kind of interdisciplinary collaboration Miller had originally proposed in 1961. -- Highlighted may 22, 2014

p.182: “The past decade has seen the emergence of a new way of doing science and engineering. This new mode of ‘computational science’ is poised to join theory and experiment as a third approach to solving scientific and engineering problems.” Argonne High-Performance Computing Research Center, Argonne National Laboratory, April 14, 1992, p.15. -- Highlighted may 22, 2014

p.185: By resurrecting the experimental tradition of their discipline, computer scientists were able to convince funding agencies that support for their work paid dividends for all of the sciences. -- Highlighted may 22, 2014

p.200: Although interest in mathematical subroutines for scientific computing extends back to the days of von Neumann and Princeton’s program to build a computer at the Institute for Advanced Study, the term “mathematical software” was not coined until 1969, by John Rice, a computer scientist at Purdue. In the early 1960s researchers at the University of Toronto, University of Chicago, Stanford, Bell Laboratories, and Argonne were among the first to critically examine mathematical subroutines. At the time, there was no official outlet in refereed journals, so these scientists presented their work in technical reports and at conferences. In 1966, the computer scientist J.F. Traub organized the Special Interest Committee on Numerical Mathematics (SICNUM), which by midyear attracted almost one thousand members. -- Highlighted may 22, 2014

p.208: In an article in the Communications of the ACM, Ralston provided several explanations for the disparity in funding between CS and other scientific fields; his comments echo what members of the Computer Science and Engineering Board had told the AUA Institute committee. In particular, he singled out the lack of a presence in Washington, D.C. for the computer science community. While the CSEB had seemed like a step towards filling this vacuum, the National Academy of Sciences disbanded the board shortly after its creation. This political impotence was likely to continue, Ralston argued, unless computer scientists began to seek positions on major committees and boards that influenced federal science policy. In addition, Ralston called on the American Federation of Information Processing Societies (AFIPS) to establish a permanent presence in Washington to lobby on behalf of computer scientists. An AFIPS office would bring visibility to the discipline and also provide a unified voice when external advice was sought by the government “on matters such as the relative position of computer science in the constellation of scientific disciplines.” -- Highlighted may 22, 2014

p.215: In a 1978 panel discussion on “Computer Science in a Decade” J. Hartmanis, of Cornell’s Computer Science Department, argued: “... computer science is a brand new species among all the known sciences and that it fundamentally differs from the older science ... in large parts of computer science the classic paradigms from physical sciences or mathematics do not apply and [thus] we have to develop and understand the new paradigms for computer science research. The fundamental difference between, say, physics and computer science is that in physics, we study (to a large extent) a world which exists and the main objective is to explain the existing (and predict new observable) phenomena. Computer science, on the other hand, is primarily interested in what can exist and how to describe and analyze the possible in information processing. It is a science which has to conceptualize, to create the intellectual tools and theories to help us imagine, analyze, and build the feasible…. Computer science is indeed a different intellectual discipline than we have ever encountered before.” -- Highlighted may 22, 2014

p.216: Whether this was a commonly held position among computer scientists is difficult to tell. What is clear is that in their quest for recognition as a distinct discipline, practitioners increasingly sought to frame their work as incorporating the two established research traditions of theory and experiment. The importance of this reorientation should not be underestimated. By drawing on the cultural and scientific cachet of experiment, computer scientists were able to position themselves as legitimate contributors to science, as contenders for funding and for representation on scientific advisory panels, and as equal partners in collaborative research projects. -- Highlighted may 22, 2014

p.218: The growing demand for skilled computer scientists by industry was leading to a drain of quality people away from academia and experiments. In many cases, top people were going to places like Xerox PARC, General Motors, and Bell Laboratories because these companies had state-of-the-art experimental computing facilities. The result was that over two hundred faculty positions in computer science went unfilled in 1979, and this, in turn, jeopardized the future of computing in the U.S. -- Highlighted may 22, 2014

p.221: In summary, the Feldman Report was significant for several reasons. In calling for increased funding and disciplinary recognition, the authors subsumed theoretical research under the rubric of experimental science. This had several benefits: experiments could be shown to have direct applications while theory was amorphous; and experiments were expensive and thus required a tremendous infusion of money for the entire discipline. -- Highlighted may 22, 2014

p.221: Finally, the Report was crucial because it called on universities, industry, and the federal government to work together to support cutting-edge computer research. -- Highlighted may 22, 2014

p.221: The decision to emphasize experimental computer science began to pay dividends quite quickly. Within a year the president of the ACM, Peter Denning, reported that changes could be seen across the board. The government seemed more willing to shift resources into computer science research; the NSF had created an Industry/University Cooperative program, an experimental research center program, and had dedicated more money for researchers to use experimental machines. In addition, the NSF and the Advanced Research Projects Agency (ARPA) were cooperating to fund research, while several companies had also either made cash grants to universities or provided them with advanced equipment at deep discounts. -- Highlighted may 22, 2014

p.228: American computer scientists applauded the experimental nature of the Fifth Generation Project and even suggested that the program’s organization and funding structure was enlightened. The project was arranged as a consortium of eight firms (Fujitsu, Hitachi, Nippon Electric Corporation, Mitsubishi, Matsushita, Oki, Sharp, and Toshiba) and two national laboratories (the government-owned Nippon Telegraph and Telephone’s Musashino Laboratories and MITI’s own Electrotechnical Laboratory). These participants provided hand-picked researchers who were then relocated to a state-of-the-art facility in Tokyo called the Institute for New Generation Computer Technology (ICOT). ICOT itself was funded entirely by the government through MITI as a way to encourage the firms to contribute their top researchers without also having to assume the risk entailed in such a lofty project. Each week the researchers from ICOT would return to their companies to keep them abreast of developments, and contracts were let to participating firms based on their interests and strengths. -- Highlighted may 22, 2014

p.230: What was needed, according to the authors, was a substantial, well-organized answer to the Japanese initiative. Given the limited resources available to any single body, any true response must be cooperative and include industry, academia, and governmental research organizations. -- Highlighted may 22, 2014

p.230: In summarizing their conclusions, the authors laid out several different courses of action. The first (and unacceptable) one was to maintain the status quo. Their second suggestion was to form an industrial consortium to meet the Japanese challenge, though this might entail rewriting federal antitrust laws prohibiting monopolies. A third suggestion was to set up a national laboratory, similar to Los Alamos, for the development and study of computer technologies. A “National Center for Knowledge Technology” such as this would serve as a clearinghouse for new innovations as well as “an expression and institutional embodiment of national will” similar to NASA’s Kennedy Space Center. A final option suggested by Feigenbaum and McCorduck was that the United States “can prepare to become the first great agrarian postindustrial society.” -- Highlighted may 22, 2014

p.231: The first initiative can be seen as a direct answer to the Japanese work in artificial intelligence. In 1983, the Defense Advanced Research Projects Agency (DARPA) launched a ten-year, $1 billion program to develop machine intelligence. Although plans for such a program had been in the making for years, DARPA had a hard time selling it to Congress until the announcement of the Fifth Generation Project. Robert Cooper, one of the chief architects of the program, admitted “We trundled out the Japanese as the arch-enemies” and used them “unabashedly” in private conversations with legislators. This approach found a receptive audience within the halls of Congress, which formally approved the Strategic Computing Initiative (SCI) in the Defense Appropriations Act of 1984. -- Highlighted may 22, 2014

p.235: There is some question as to when the term “computational science” was coined. Some claim that it was Wilson himself who coined it in 1986 in a paper entitled “Basic Issues for Computational Science.” However, attaching “computational” to the name of a science was not new; in 1966 William Miller (previously head of the AMD at Argonne, by then at Stanford) started the Journal of Computational Physics, which is still published today. While the origin of the term is debatable, what is not questioned is that the mid-1980s witnessed a concerted effort by computational scientists to distinguish themselves as a different kind of researcher with different kinds of needs. In a paper published in 1987, Wilson attempted to define the science and to outline the kinds of problems it faced. Significantly, Wilson made it clear that this was a new science and thus was experiencing growing pains. In his opinion, computational science dated back to the 1930s and the use of electro-mechanical computers to do science. The experimental and theoretical sciences, in contrast, were hundreds (or thousands) of years older. But, he said, the newness of this methodological approach to science did not detract from its significance. -- Highlighted may 22, 2014

p.239: Wilson made this coherent statement about computational science in 1987, but it was largely an effort to pull together ideas and statements that had already been in circulation for years. What is important about the emergence of computational science as a distinct discipline is that its spokesmen brought with them a certain cultural and scientific cachet. Computer science had produced no Nobel Laureates; computational science had. Thus, it is here, at the leading edge of computer technology, that computer scientists and research scientists finally found common cause. -- Highlighted may 22, 2014

p.242: Throughout the 1960s and 1970s, the United States was the undisputed leader in high-performance computing. Foreign access to this technology was tightly controlled by the federal government and according to one Control Data executive, it was used “as the carrot or the stick in the U.S. government’s effort to reward or punish other governments in the realm of foreign policy.” -- Highlighted may 22, 2014

p.242: All of these elements were manifest at the 1983 “Frontiers of Supercomputing” conference co-sponsored by Los Alamos National Laboratory and the National Security Agency and attended by 165 representatives of academia, industry, and government. Tellingly, the keynote address was by Admiral B.R. Inman of the Office of Naval Research; the second presentation was by New Mexico Senator Jeff Bingaman. Both linked supercomputing directly to national security and long-term economic competitiveness. -- Highlighted may 22, 2014

p.279: Networking also began to attract Congressional attention. Specifically, Senator Albert Gore, Jr., of Tennessee had become enamored of the possibility of creating an “information superhighway” based on computer networks. Under his leadership, in 1986 Congress directed the Office of Science and Technology Policy (OSTP) to examine the state of computer networks in the United States. In early 1987, President Ronald Reagan’s science advisor, William Graham, also became interested in networks. -- Highlighted may 22, 2014

p.281: Congress was especially interested; in August 1988 Senator Gore held a series of hearings on the FCCSET report, after which he asked the OSTP to formulate a plan by which the recommendations might be implemented. However, what computer scientists considered precedent-setting was not the report itself, but the divergent groups that came together to create it: “It is the first time in the history of the field that senior people from all the major federal agencies having an interest in computing, and the senior people at the OSTP, got together and produced an executive summary on what the problem is and what can be done. That’s never happened before.” -- Highlighted may 22, 2014

p.283: Significantly, the manner in which this research effort was structured was a direct reflection of the new cultural and scientific power wielded by computational science; indeed, stimulating computational science research was the program’s primary goal. Taking a page from SCI, applications would again drive the technology. The architects of SCI firmly believed that if the end goal could be defined in enough detail, it would guide the development of the technologies and techniques needed to attain it. In the case of the HPC program, however, and the Software Technology and Algorithms component in particular, the end goal was much more open-ended. -- Highlighted may 22, 2014

p.283: Rather than a quest for machine intelligence, this component was framed in the context of solving “Grand Challenges.” Coined by Ken Wilson in 1987 and placed at the core of the HPC program, Grand Challenges were “fundamental problems in science and engineering, with broad economic and scientific impact that could be advanced by applying high performance computing resources.” -- Highlighted may 22, 2014

p.288: The following year, supporters of HPC believed they had much to celebrate; the program’s chief Congressional sponsor, Al Gore, was now the Vice-President of the United States. Surely in this changed environment HPC could expect to see some money flow its way. In this, they were not mistaken. Under the leadership of Clinton and Gore, HPC was expanded to include Communications (networking) as a central component of the program. Reintroduced as an integrated program that would help create the Internet and put a computer in every classroom, the High Performance Computing and Communications (HPCC) Initiative was passed in 1993. -- Highlighted may 22, 2014

p.288: Funded at almost $5 billion over five years, a program this large unsurprisingly meant different things to different people. HPCC supported particular lines of technological development (parallel computers and high-speed networks) and endorsed a particular organization of labor to solve Grand Challenge problems. Collaboration was mandatory; researchers in academia and government laboratories were expected to work closely with their counterparts in industry to explore new computer technologies and then apply them to problems deemed important by the state. By supporting such work, the federal government absorbed much of the risk involved in developing new computer architectures while also speeding up the process by which experimental machines became available to scientists, engineers, and industrial users. -- Highlighted may 22, 2014

p.298: by the beginning of the 1990s, computational science, both as a discipline and as a methodological mode of inquiry, had succeeded in shaping national priorities and science policy. -- Highlighted may 22, 2014

p.300: Despite the rhetoric of equality that infused HPCC, it was clear that some teammates were more equal than others. This point did not escape Rick Stevens, the Director of Argonne’s MCS Division and the person responsible for implementing HPCC programs at the lab. “The fundamental challenge in the ‘Grand Challenge’,” he mused, “is their interdisciplinary nature.” Stevens’ observation suggests that if we look behind the smooth face of collaboration that proponents of HPCC presented, we can see something more about the social organization of computational science. In particular, I argue that computational science, as methodology, ideology, and discipline significantly altered the directions of computer science research. My assertions are preliminary and are based on evidence from the Math and Computer Science Division at Argonne, which is a different environment from that of academic computer scientists. Nonetheless, I believe my conclusions are applicable to academia as well. -- Highlighted may 22, 2014

p.301: computational scientists promised to solve important problems for their sponsors, and the close alignment of their agenda with national needs translated into power within collaborative projects. Computational scientists would be the quarterbacks directing research on Grand Challenge problems. -- Highlighted may 22, 2014

p.328: In particular, I argue that as the “computational science ideology” found purchase in federal funding agencies, it reoriented computer science research towards its own ends. -- Highlighted may 22, 2014

p.331: As Shapin and Schaffer note, “scientific activity, the scientist’s role, and the scientific community have always been dependent; they exist, are valued, and supported insofar as the state or its various agencies see point in them.” I suggest that the rapidity with which computational science became established as a discipline had much to do with its clear focus on applications that the state valued. -- Highlighted may 22, 2014

p.332: I think that some of the blame was attributed (perhaps not consciously) to computer scientists and their focus on the machine-as-object-of-study. To computational scientists, in contrast, the computer was a tool, and they put front and center the potential contributions its use could make to the state. Computational science, however, was not just about applying computer technology - it was a methodology unto itself. Moreover, it was a methodology enabled by the kinds of large-scale, interdisciplinary collaboration that characterized big science. As a methodology, it appealed to state sponsors because it promised to link and coordinate disparate elements - people, technologies, disciplines - into a scientific problem-solving machine. -- Highlighted may 22, 2014