
THE JOURNAL OF PHILOSOPHY


VOLUME LXXXV, NO. 10, OCTOBER 1988

ON THE STABILITY OF THE LABORATORY SCIENCES*

Is our knowledge of nature unstable, constantly yielding to refutation or revolution? You might think so, to judge by recent philosophy. In this summary I propose a framework in which to understand the manifest fact that science since the seventeenth century has, by and large, been cumulative. The framework does not question the chief insights of Karl Popper or Thomas Kuhn, but places them in a larger perspective. My theme is the stability of "mature" laboratory sciences. It concerns only those sciences in which we investigate nature by the use of apparatus in controlled environments, and are able to create phenomena that never or at best seldom occur in a pure state before people have physically excluded all "irrelevant" factors. By laboratory science I do not mean just the experimental side of a science. My topic is the stabilizing relationship between theory and experiment.
STABILITY

Many mature sciences are pedagogically stable. We learn geometrical optics when young, the wave theory as teenagers, Maxwell's equations on entering college, some theory of the photon in senior classes, and quantum field theory in graduate school. Newton's rays of light particles are omitted, as are many other byways. Each of these stages is taught as if it were true. The sophisticated teacher may add, "but not really true." Each earlier stage is at best approximately true. Some scientific realists embrace this (and an implied stability), holding that science converges on the truth and that earlier stages are approximately true.
* To be presented in an APA symposium on The Philosophical Significance of Experimentation, December 28, 1988. Patrick Heelan will be co-symposiast, and Peter Galison will comment; see this JOURNAL, this issue, 515-524 and 525-527, respectively, for their contributions.

0022-362X/88/8510/0507$00.80 © 1988 The Journal of Philosophy, Inc.

Paul Feyerabend rejected the approximation idea on the grounds that successive stages are incommensurable. Before Nancy Cartwright, few on either side examined actual practices of approximation. One can argue that her multiplicity of possible approximations, both toward and away from the truth, suffices to show that all such considerations about approximation are jejune.

Here is a telling example due to S. S. Schweber. In 1981, workers at the University of Washington devised the Penning trap, which contains a single electron in a definite space. Everything they did was planned according to, and can be explained by, the prerelativistic (pre-Dirac) theory of the electron. It is not clear that it can be done otherwise. For those purposes, that old account of the electron is better than any other; it is the account which is true to the facts: true to the experiment and its applications.

One suggestive idea about how stability arises relies on the observation that theories and laboratory equipment evolve in such a way that they match each other and are mutually self-vindicating. Such symbiosis is a contingent fact about people, our scientific organizations, and nature. Another contingency is that new types of data can be produced, thought of as resulting from instruments that probe more finely into microstructure, data which cannot be accommodated by established theory. This creates space for a mutual maturing of new theory and experiment, but does not necessarily dislodge an established mature theory, which remains true of the data available in its domain.

'Data', 'theory', 'experiment', 'equipment': these are familiar words, but, to expound this notion of stability, we require a finer taxonomy.
ELEMENTS OF LABORATORY EXPERIMENT

Thanks to a large number of recent studies by philosophers, historians, and ethnographers of experimental science, we have much richer sources of material about the laboratory than were available a decade ago. The welter of colorful examples makes it hard to produce any tidy formal characterization of experiment; hence, our powers of generalization are limited. I shall try to return some degree of abstraction to the philosophy of science by listing some familiar elements in laboratory experimentation.

"Does treating X with Y make a difference?"). When a question is about a theory, I shall speak of the theory in question. (2) Established or working theories or background bodies of knowledge or assumptions about the subject matter. These are of at least three kinds. (2a) Background knowledge and expectations which are not systematized and which play little part in writing up an experiment. (2b) Theory of a general and typically high-level sort about the subject matter, and which, by itself, may have no experimental consequences. Call this systematic theory. (2c) What in physics is commonly called phenomenology, what R. B. Braithwaite called Campbellian hypotheses, and what many others call bridge principles. I shall speak of topical hypotheses. Hypothesis is used in the old-fashioned sense of something more readily revised than theory. It is overly propositional. I intend it to cover whole sets of approximating procedures in the sense of Cartwright, and what Kuhn called the "articulation" of theory in order to create a potential mesh with experience. That still ignores a more tacit dimension, the skills used by the "phenomenologist" to create that articulation in practice. Topical is meant to connote both the usual senses of 'current affairs' or 'local', and also to recall the medical sense of a topical ointment applied to the surface of the skin, i.e., not deep. Here are two extreme examples from physics: in the case of a measurement of local gravitational acceleration (3 c below), the systematic theory is Galilean mechanics, and there is today no self-conscious "phenomenology," no formulated bridge principles or topical hypotheses. In the case of superstring theory, a potential grand unified theory of many dimensions, topical hypotheses connect this structure with something that happens in our three or four dimensional world. (3) There is the materiel of the experiment. Commonly this breaks down into three parts, each associated with a set of instruments. (3a) There is a target that is prepared by some devices. (3b) There is apparatus that is used to interfere with the target in some way. (3c) There is a detector that determines some effects of the interference. I arbitrarily restrict the word 'apparatus' to (b), and will use instrument generically for (a-c). In elementary chemistry one mixes two substances to observe their interaction. They are the target. I would call litmus paper a detector, but not human observation, which I put in (5) below. The apparatus of Atwood's machine for

(3) There is the materiel of the experiment. Commonly this breaks down into three parts, each associated with a set of instruments. (3a) There is a target that is prepared by some devices. (3b) There is apparatus that is used to interfere with the target in some way. (3c) There is a detector that determines some effects of the interference. I arbitrarily restrict the word 'apparatus' to (3b), and will use 'instrument' generically for (3a)-(3c). In elementary chemistry one mixes two substances to observe their interaction. They are the target. I would call litmus paper a detector, but not human observation, which I put in (5) below. The apparatus of Atwood's machine for determining local gravitational acceleration is a tuning fork with a brush on one prong. It is dropped so that the brush sweeps out a curve on the detector, a plate of glass with whitewash on it. There is no target materiel. Or is it the gravitational field?

(4) There are theories, or at least background lore, about the materiel. They help us design instruments, calculate how they will work, debug them, and run them. The phenomenology of each instrument may differ. Seldom (never?) is the phenomenological theory of an instrument the same as the theory in question (1) or the systematic theory (2b). It may overlap with the topical hypotheses (2c). For example, the theory of the tuning fork has nothing much to do with the theory of gravitational acceleration. Nowadays in big science, people who design detectors and people who prepare targets commonly have very different expertise.

(5) Data generators. In the past these were usually people, like George Atwood measuring the length of the successive inflection points of the curve swished out by the brush at the end of the tuning fork. Now we have printouts of automatic readings. A camera that takes micrographs from an electron microscope once was a detector; now it is a data generator.

(6) Data: the physical, material records produced by a data generator.

(7) Data processing: a catch-all name for distinct activities that have appeared at different stages in the history of science. (7a) Data assessment: e.g., the calculation of a probable error, using a formal procedure that is theory-neutral. There are also estimates of systematic error based on theories of the detector, apparatus, and target, and on deductions from topical hypotheses. (7b) Data reduction: indigestible or unintelligible information is transformed by statistical techniques into manageable quantities or displays. (7c) Data analysis, well described by Peter Galison for high-energy physics: the "events" under study, preserved, e.g., as tracks on photographs, were once selected and given preliminary analysis by semi-skilled labor. With the advent of very fast detectors, a tape of events had to be analyzed by computer. The technicians were trained, and later the programs were written, in the light of both topical hypotheses and theories of the instruments. Computer simulation of missing bits of data, and image enhancement, provide further examples of data processing.

(8) Interpretation of the (processed) data: in the simplest cases this is a single stage: a series of trials on Atwood's machine gives us a set of pairs of distances (between swishes) and times; we then compute the gravitational acceleration g according to the Galilean formula g = 2s/t². We need no topical hypotheses. Unfortunately, reasoning like that used for Atwood's machine is the model for too much philosophical discussion of theory and experiment.
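As a concrete illustration of this single-stage interpretation, and of the theory-neutral "data assessment" mentioned under (7a), here is a minimal sketch in Python. The distance-time pairs are invented for illustration, and the probable-error convention (0.6745 times the standard error of the mean) is a stated assumption, not a detail drawn from Atwood's own procedure.

```python
import math

# Hypothetical (distance in metres, time in seconds) pairs, standing in for
# the swish-to-swish measurements read off the whitewashed plate.
trials = [(0.122, 0.158), (0.310, 0.252), (0.495, 0.318), (0.708, 0.380)]

# Single-stage interpretation (8): apply the Galilean formula g = 2s/t^2 to each pair.
g_values = [2.0 * s / t ** 2 for (s, t) in trials]

# Data assessment (7a): mean, standard error of the mean, and the old-fashioned
# "probable error" (0.6745 * standard error), a formal, theory-neutral procedure.
n = len(g_values)
mean_g = sum(g_values) / n
variance = sum((g - mean_g) ** 2 for g in g_values) / (n - 1)
std_error = math.sqrt(variance / n)
probable_error = 0.6745 * std_error

print(f"g = {mean_g:.2f} +/- {probable_error:.2f} m/s^2 (probable error)")
```

No topical hypotheses enter here: the systematic theory supplies the formula directly, and the assessment of error is indifferent to what the numbers are about.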
EXTENDING PIERRE DUHEM'S THESIS

How is an "acceptable" experimental result obtained? Duhem observed that, if an experiment or observation is persistently inconsistent with a systematic theory, you need not abandon the theory, for in principle you can revise the theory of the instrument: in his example, revise astronomy or revise the theory of the telescope. Now, if you did the latter, you would probably rebuild your telescope, creating a substantially different instrument!

Andy Pickering importantly advances Duhem's idea by adding the materiel to the items that can be modified. He regards the systematic theory, the instruments, and the theories of the instruments as three plastic resources which the investigator brings into mutual adjustment. His example has two competing theories in question: free charges come either in units of e, the charge of the electron, or else in units of 1/3 e (free quarks). In the background (2a) these are the only two possibilities. The materiel is an upgraded Millikan oil-drop device. The initial results of the experiment described seemed to show pretty much a continuum of minimum electric charges. The experimenter modified both his theory of the apparatus and the apparatus itself (he revised the phenomenological theory of the condenser plates, and repositioned the plates). The ensuing data were interpreted as refuting the theory of free quarks.

Duhem emphasized my elements (2) and (4). Pickering restored us to experimental practice by adding (3). Robert Ackermann attends to yet other elements, adding not only the instruments (3) but also the data (6) and interpretation (8). Like Duhem and unlike Pickering, he has a passive attitude to instruments, treating them as if they were "off-the-shelf" devices, in the way that a navigator would use a chronometer, or a cell biologist a nuclear magnetic resonance spectrometer. What were once experimental and thoroughly "plastic" instruments become reliable technology. In Ackermann's account, instruments produce data that are literally "given." The data are not theory-laden; they are material artifacts, photographs, or inscriptions, the productions of instruments. Theory enters when they are interpreted. Science is a dialectical affair of fitting data and theory together.

Data that at one time are just "noises" may later be interpreted by a new theory. Thus, after the theory of pulsars was in place, older astrophysical records were shown to be rich in evidence of pulsars. Those records were not theory-laden, but their interpretation is, on Ackermann's view, a matter of theory.

Duhem, Ackermann, and Pickering thus point to different kinds of interplay among some of the elements (1)-(8). In fact, all eight are plastic resources. We can (1) change questions, or more commonly modify or sharpen them in mid-experiment. Data (6) can be abandoned or selected without fraud. Data processing (7) is almost embarrassingly plastic. Those familiar with the logic of statistical inference will be well aware of the hazards of data reduction (7b). Data analysis (7c) is plastic in an entirely different way: its computer programs are highly susceptible to modifications in the theory of target, apparatus, or detector, not to mention material changes made in the way that those systems operate. One aim of developing a taxonomy such as (1)-(8) is to describe a complex pattern of adjustments which concludes with stable science resistant to revision.
MATURITY AND STABILITY

Here is a very liberal adaptation of Ackermann's picture of maturing science. A collection of kinds of instruments evolves, hand in hand with theories that interpret the data it produces. Ackermann calls a collection producing data that come to fall under a systematic theory an instrumentarium. As a matter of brute contingent fact, instrumentaria and systematic theories mature, and data uninterpretable by theories are not generated. There is no drive for revision of the theory, because it has acquired a stable data domain. What we later see as limitations of a theory are not even perceived as data. Thus, geometrical optics takes no cognizance of the fact that all shadows have blurred edges. The fine structure of shadows requires an instrumentarium quite different from lenses and mirrors, together with new systematic theory and topical hypotheses to interpret it.

But is not at most one theory true, the old mature one or the aspiring new one that takes account of the additional data domain? The metaphysical doctrine of the unity of science demands that. But think instead of the theories as different representations of several aspects of "the same reality." Niels Bohr tried to mitigate the shock of such a way of thinking by invoking complementarity, but we should reach further than that. New sense is given to the idea of incommensurability: these theories are incommensurable in Kuhn's intended sense of "no common measure." The measure of the mature theory is its data domain, which it fits within tolerances of error; the new theory tries to measure up to a new data domain.

Philosophers from Susan Stebbing onward have mocked A. S. Eddington's remark that he had two tables before him. Well, there is only one table. But we should think out a semantics of representation, in which many commonsense beliefs about the table are literally true, as were many of the things said by 1920s physics, and as are those held by quantum field theory. There is one table, and many incommensurable truths about it.

So much I take to be a liberal extension of Ackermann's suggestions. Two things are importantly missing. One is the long haul of getting to a mature theory, which I believe is best described by an equally liberal extension of Pickering's account of apparatus and apparatus theory as a plastic resource. The other is the role of those elements in my taxonomy which Ackermann does not mention. The chief practical indeterminacy of science lies not in the possible things one can do with the world, nor in the possible systematic theories one may entertain, but in the manifold of models and approximations which constitute topical hypotheses. Topical hypotheses are underdetermined. A theory does not say how it will be articulated to mesh with the world, nor do processed data say how they will be interpreted by topical hypotheses. There are lots of ways to do it, and every phenomenologist has a battery of such techniques. Kuhn has importantly emphasized that learning a science is not learning the systematic theory but learning how to do the problems at the end of the chapter. A casual inspection of many textbooks will show that, aside from certain mathematical tricks of calculation, these problems are typically training in how to use what I have been calling topical hypotheses. A mature science achieves a canonical set of approximations, the glue that holds it together, which enables us to say that the theory is true of its data domain. The pain in hardworking science is the construction of new topical hypotheses. That is the "puzzle" to which so much "normal science" is addressed.
THESES AND QUERIES

(1) Scientific realism. All that I have said is consistent with scientific realism about entities. My description of mature and successor normal science strongly resembles Duhem's antirealism about theories. I attribute to him the conception that nature, and even "mechanics" or "optics," is too complex to admit of a single unified description. One can at best aim at characterizing an aspect of parts of nature, and this is achieved by complementary mature theories that need not be commensurable. But 'aspect' is no longer merely a metaphor, for it is to be explicated in terms of the structure of instruments, processed data, and topical hypotheses.

(2) Truth. Do we need a new semantics for real science, one based on the locution 'true to the facts': not the facts about some metaphysical world, but the facts about the phenomena created by experimentalists?

(3) Kuhnian revolutions. Not all scientific revolutions are Kuhnian, witness "the" scientific revolution of the seventeenth century. It has been argued that there was a scientific revolution in geology leading up to plate tectonics, but one lacking the stage of crisis which precedes a paradigm shift. Conjecture: revolutions with Kuhnian structure are of two sorts. One occurs when a paradigm is imposed on a preparadigmatic field. The other occurs in mature sciences precisely when a new instrumentarium generates data outside an established data domain. That fits the "function" that Kuhn finds for measurement in the physical sciences, and his own study of black-body radiation.

(4) Laboratory technology and the unity of science. When an instrument becomes an off-the-shelf device for one branch of science, it can often be incorporated, after painful adjustments, into another. X-ray diffraction, designed for crystallography, engenders molecular biology. The instrument theory of one science becomes built into the practice of another. Insofar as topical and instrument theories interact, there is a resultant unification of data domains, and hence an apparent unification, or at least congruity, among the sciences.
IAN HACKING

University of Toronto
