The Epistemology of a Spectrometer
Author(s): Daniel Rothbart and Suzanne W. Slayden
Source: Philosophy of Science, Vol. 61, No. 1 (Mar., 1994), pp. 25-38
Published by: The University of Chicago Press on behalf of the Philosophy of Science Association
Stable URL: http://www.jstor.org/stable/188287
Accessed: 16/10/2011 05:32
THE EPISTEMOLOGY OF A SPECTROMETER*

DANIEL ROTHBART
Department of Philosophy and Religious Studies
George Mason University

AND

SUZANNE W. SLAYDEN
Department of Chemistry
George Mason University
Contrary to the assumptions of empiricist philosophies of science, the theory-laden character of data will not imply the inherent failure (subjectivity, circularity, or rationalization) of instruments to expose nature's secrets. The success of instruments is credited to scientists' capacity to create artificial technological analogs to familiar physical systems. The design of absorption spectrometers illustrates the point: Progress in designing many modern instruments is generated by analogically projecting theoretical insights from known physical systems to unknown terrain. An experimental realism is defended.
1. Introduction. Why should scientists trust the reliability of modern instruments to expose unobservable physical structures? According to empiricists, instruments function to magnify our physiologically limited sensory capacities by "causally" linking the specimen's sensory properties to accessible empirical data; such data in turn are validated by the same empiricist standards used to access ordinary (middle-sized) phenomena. Empiricists have given scant attention to instruments as a separate topic of inquiry on the grounds that the epistemic value of instruments reduces to the epistemology of commonsense experience.

Yet even the dictum that sensory data are theory-laden has the effect of minimizing the philosophical significance of instruments. Many critics of empiricism work within the empiricist distinction between the subjectivity of theory and the apparent objectivity of data. Such a distinction assumes a naive understanding of scientific instrumental design. Once we
overcome this empiricist conception of instrumental design, the theory-laden character of data will not imply the inherent failure (subjectivity, circularity, or rationalization) of instruments to expose nature's secrets. Rather than a warrant for its subjectivity, the theory-laden character of data reveals the instrument's success at exposing real-world structures.

*Received September 1992; revised April 1993.

†We greatly appreciate comments on earlier drafts from Rom Harré, Mary Hesse, Emmett Holman, and especially an anonymous referee for this journal.

†Send reprint requests to Daniel Rothbart, Department of Philosophy and Religious Studies, George Mason University, 4400 University Blvd., Fairfax, VA 22030, USA.

Philosophy of Science, 61 (1994) pp. 25-38
Copyright © 1994 by the Philosophy of Science Association.

In this paper, we argue that the success of instruments partially results
from artificial technological replicas of various physical systems familiar to scientists at a given time. Progress in designing many, though not all, instruments is generated by analogical projections of components from known physical systems to unknown terrain. Instrumentation enables scientists to expand their limited theoretical understanding to previously hidden domains. We argue against both a skepticism and a naive realism of scientific instruments in favor of an experimental realism that interprets instruments as analogs to natural systems.

Toward these goals, we explain how instruments are conceived as analogical replicas of real-world systems (section 2), examine the design of absorption spectrometers (section 3), respond to the skeptic's charge that unobservable structures are inaccessible (section 4), evaluate the provocative data/phenomena distinction by Bogen and Woodward (section 5), and briefly propose an experimental realism based on instrumentation (section 6).

The recent resurgence of interest in the nature of instrumentation is partially evident in the works of Ackermann (1985), Baird and Faust (1990), Bogen and Woodward (1988, 1992), Franklin (1986), Galison (1987), Gooding (1986, 1988), Ramsey (1992), Shapin and Schaffer (1985), and Woolgar (1988). In exploring the nature of scientific instruments, we will examine the works of a few of these authors.
2. Instruments Designed as Replicas of Nature. Intended to avoid the debacles of naive empiricism, Ackermann explains instrumentation within the framework of an evolutionary epistemology. Theories evolve in ways that best adapt to the environmental niches of "facts"; the data domains are the socially sanctioned depiction of such facts about the world. Instruments function as epistemic intermediaries between theories and data. Through instruments the influence of interpretation is broken, presumably, by refining and extending human sensory capacity. But in the end, Ackermann's epistemology is strikingly empiricist. The primary function of instruments is to break the line of influence from theory to fact by grounding the subjectivity of interpretation in the intersubjectivity of fact. Consequently, the authenticity of data domains is not grounded on any theoretical constructs, but stems rather from socially negotiated sensory content (1985, 127-131).
However, Ackermann's rationale for instruments has little bearing on the design of modern instruments. First, data are not always identified with perceptual experiences (Bogen and Woodward 1992, 593). Typically, the experimenter reads graphic displays, digital messages, or coded charts directly from the instrumental readout. The computer-controlled video display and the more common printer/plotter, for example, employ language that is accessible only to the trained technician. A photomultiplier readout device, for instance, transforms the radiant energy of a signal into electrical energy while simultaneously increasing the generated current a millionfold. The current from the device flows to either a chart recorder of numbers or a series of milliammeters. Within spectral analysis the prevalence of visual data, for example, the yellow from a sodium flame, has been replaced in modern spectrometers by discursive readouts.
Second, the empiricist conception of extraordinary phenomena has no place in modern instrumentation. Typically, the phenomenon of interest is a set of physical interactions between the specimen and experimental conditions. All experimental properties that are instrumentally detected are tendencies, or conditional manifestations of the specimen, to react to certain experimental stimuli. The specimen has tendencies manifested only if certain humanly designed experimental conditions are realized (Harré 1986, chap. 15). Although such conditions are teleologically determined, the tendencies are grounded on the specimen's real physical structure, which exists independently of human thought. The phenomenon of interest is generated neither exclusively by external physical structures nor entirely by internal conceptualization.
Third, the reliability of instruments must be credited to their design as artificial analogs to natural systems. The physical sequence of events from specimen structure to data readout constitutes a technological analog to multiple natural systems based on underlying causal models of real-world phenomena. The instrument's designers typically dissect, restructure, and reorganize natural systems for the purpose of projecting powerful theoretical analogs to unexplored terrain. The instrument thus can expose pre-

[...]

chanics) to his cloud chamber for the reproduction of thunderstorms, coronae, and atmospheric electricity.
J. J. Thomson and researchers at the Cavendish laboratories gave the "same" instruments a new theoretical rationale. Rather than imitating cloud formations, Thomson intended to take nature apart by exploring the fundamental character of matter (ibid., 265). For their matter-theoretic purposes, scientists at the Cavendish became indebted to Wilson's artificial clouds for revealing the fundamental electrical nature of matter: "As the knotty clouds blended into the tracks of alpha particles and the 'thread-like' clouds became beta-particle trajectories, the old sense and meaning of the chamber changed" (ibid., 268). For twentieth-century physicists the formation of droplets was replaced by the energies of gamma rays, the scattering of alpha particles, and the discovery of new particles. Wilson and the matter physicists proffered rival theoretical interpretations, derived from distinct physical analogs, of the chamber's causal structure. In this sense, Thomson and Wilson employed different instruments.
3. Absorption Spectrometers. Let us consider basic design principles for absorption spectrometers commonly used for identification, structure elucidation, and quantification of chemical substances. Modern absorption spectrometers were designed from the analogical projection of causal models of the photoelectric effects of light.

Scientists naturally understand modern instruments as information processors. From this perspective many instruments function as complex systems of detecting, transforming, and processing information from an input event, typically an instrument/specimen interface, to some output event, typically a readout of information.
Within instrumental design the reliability of the signal becomes a primary focus of attention. The signal, defined roughly as an information-carrying variable, must be detected, converted by the transducer to a different energy form, processed, and finally amplified for the readout. An analog signal (commonly voltage or current) has a topology-preserving correspondence with a variable of the specimen under study; that is, the signal strength is directly proportional to the value of the measured quantity. A digital signal carries the source variable encoded into high or low signal levels, usually expressed in binary notation.
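The analog/digital contrast just described can be sketched in code. The toy function below is our illustration, not drawn from the paper; the 8-bit resolution and the 0-5 V input range are assumed values. It shows how an analog-to-digital converter encodes a proportional analog signal into discrete binary levels:

```python
# Sketch of the analog/digital signal distinction (illustrative only;
# 8-bit resolution and a 0-5 V range are assumed values).

def quantize(voltage, v_min=0.0, v_max=5.0, bits=8):
    """Convert an analog voltage to a digital code, as an A/D converter does."""
    levels = 2 ** bits                          # 8 bits: 256 discrete levels
    clamped = min(max(voltage, v_min), v_max)   # converters saturate at range limits
    return int((clamped - v_min) / (v_max - v_min) * (levels - 1))

# The analog signal is directly proportional to the measured quantity;
# the digital code preserves that value only up to quantization error.
print(quantize(2.5))                  # mid-scale voltage -> 127
print(format(quantize(2.5), '08b'))   # the same code in binary notation
```

The one-to-one mapping from voltage to code is only approximate: everything within one quantization step collapses to the same binary level, which is why resolution (the `bits` parameter) matters in converter design.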
Instruments interfaced to digital computers, for data acquisition and/or automatic control, incorporate devices for signal conversion (A/DC or D/AC).

Within an absorption spectrometer a beam of electromagnetic radiation
emitted in the spectral region of interest passes through a monochromator, which is a series of optical components such as lenses and mirrors. This radiation then impinges on a sample. The monochromator isolates the radiation from a broad band of wavelengths to a continuous selection of narrow-band wavelengths. These wavelengths can be held constant, or they can be scanned automatically or manually.

Figure 3.1. A single-beam absorption spectrometer. (Reproduced by permission from Charles K. Mann, Thomas J. Vickers, and Wilson M. Gulick, Instrumental Analysis, 1974, New York: Harper & Row.)
Depending on the sample, various wavelengths of radiation are absorbed, reflected, or transmitted. That part of the radiation passing through the sample is detected and converted to an electrical signal, usually by a photomultiplier tube. The electric output is electronically manipulated and sent to the readout device, such as a meter, a computer-controlled video display, or a printer/plotter.

Consider a schematic depiction of a single-beam absorption spectrometer, shown in figure 3.1 (Mann et al. 1974, 312). For such a spectrometer the amplified output of the detector is measured directly in terms of meter deflection. Notice that the sample reading is compared to a reference sample, as indicated in figure 3.1 by sample cell S and reference cell R (ibid., 311).

The interaction of electromagnetic radiation and a specific chemical
sample is unique. The "fingerprint" of this interaction is revealed by the absorption spectrum over the entire electromagnetic energy continuum, and thus the interaction provides vital information about a specimen's molecular structure. Some of the most convincing evidence about atomic and molecular structure has been obtained by spectral analysis.

The success of spectral analysis is based on the following causal principle about atomic or molecular structure: If a specimen absorbs a certain wavelength of light (the wavelength corresponding to a particular energy), then that absorbed energy must be exactly the same as the energy required for some specific internal change in the molecule or atom. Remaining energies in the light spectrum are "ignored" by the substance, and these energies are then reflected or transmitted. (The absorbed light energy causes such changes as atomic and molecular vibrations, rotations, and electron excitation.) As a result of the absorption, a specially designed instrument may detect an energy change that we may "sense" in

[...]

effects, the conception of energy detected within the spectrometer is analogically derived in part from light beams consisting of discrete photons. When a flash of light is observed with a photomultiplier and displayed on an oscilloscope, the observed signal is a set of impulses (Bair 1962, 13). Such photoelectric signals from a flash of light function as the data-constituting analog to the conception of energy detected in absorption spectrometers.
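The wavelength-energy correspondence underlying the causal principle above is the Planck relation E = hc/λ. The short sketch below is our illustration, not from the paper; the 589 nm sodium wavelength is the standard textbook value for the yellow sodium line mentioned in section 2:

```python
# Photon energy corresponding to an absorbed wavelength: E = h*c / wavelength.
# Illustrative sketch; constants are standard values, rounded for clarity.

H = 6.626e-34   # Planck constant, joule-seconds
C = 2.998e8     # speed of light, meters per second

def photon_energy_joules(wavelength_nm):
    """Energy of one photon at the given wavelength (in nanometers)."""
    return H * C / (wavelength_nm * 1e-9)

# A specimen absorbs 589 nm light only if some internal change (vibration,
# rotation, electron excitation) requires exactly this much energy.
print(f"{photon_energy_joules(589.0):.3e} J")   # on the order of 3.4e-19 J
```

This is the sense in which an absorption line carries structural information: the detected energy loss at one wavelength is evidence of one specific internal transition in the molecule or atom.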
4. Overcoming the Skeptic's Noise. This analogical conception of instrumentation does not warrant a skepticism about the capacity of instruments to reveal the specimen's physical dispositions. The images from infrared detectors employed by astrophysicists to reveal newborn stars are not the complete fabrication of the experimenter's symbol system. The line sequences from a spectral analysis are not artifacts of the scientists' conceptualizations. The tracks of alpha particles within a bubble chamber are not fictitious concoctions by self-deluding scientists.
As communication systems, instruments are designed to minimize distortion and vulnerability to noise for the purpose of creating a one-to-one transformation from signal to source states. The reliability of data rests in part on the ability of the experimenter to overcome potentially interfering influences that would result in the signal's random error. Such influences would prevent experimenters from distinguishing the detection of the phenomenon from background noise, because in such a case a one-to-many transformation from signal to source would result.
Reliable channels of communication, based on background theories, can in principle be achieved so that the signal is practically unequivocal, so that the mapping from data structure to specimen structure approaches one-to-one, and so that the signal-to-noise ratio can be maximized. The experimenter can be reasonably confident that such confounding factors are minimized by blocking out the potentially interfering agent. The sources of noise for electrical signals may be the light reflected by objects in a room, energy radiated by electrical lines in walls, and mechanical vibrations transmitted through a floor. Such random energy sources can be significantly reduced by shielding electrical lines or by insulating walls to protect against temperature changes.
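Beyond shielding, random noise can be suppressed statistically. The sketch below is our illustration, not from the paper; the "true" signal level of 1.0 and the noise spread of 0.2 are assumed values. It demonstrates the standard result that averaging N repeated scans of the same specimen shrinks random error by roughly the square root of N, pushing the signal-to-source mapping back toward one-to-one:

```python
# Signal averaging as a noise-reduction strategy: repeated scans of the same
# specimen are averaged, shrinking random error by about sqrt(N).
# Illustrative sketch; signal level 1.0 and noise sigma 0.2 are assumed.
import random
import statistics

random.seed(0)  # fixed seed so the demonstration is repeatable

TRUE_SIGNAL = 1.0
NOISE_SIGMA = 0.2

def one_scan():
    """A single noisy measurement: the true signal plus Gaussian noise."""
    return TRUE_SIGNAL + random.gauss(0.0, NOISE_SIGMA)

def averaged_scan(n):
    """Average of n repeated scans; noise falls roughly as sigma / sqrt(n)."""
    return statistics.mean(one_scan() for _ in range(n))

# Compare the spread of single scans with the spread of 100-scan averages:
singles = [one_scan() for _ in range(1000)]
averages = [averaged_scan(100) for _ in range(1000)]
print(statistics.stdev(singles))   # close to 0.2
print(statistics.stdev(averages))  # close to 0.02, about 10x smaller
```

The square-root improvement holds only for random noise; systematic error (a miscalibrated wavelength scale, a drifting source) survives averaging, which is why the procedural controls discussed below remain necessary.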
Alternatively, the experimenter can sometimes isolate the features of the phenomenon of interest from the external confounding factors. Scientists attempting to detect magnetic monopoles within cosmic-ray experiments often had to distinguish heavy charged particles like possible monopoles from light nuclei. Since both kinds of particles were detected by ordinary photographic emulsions, experimenters switched to a commercial plastic that was sensitive only to the heavy charged particles (Woodward 1989, 411).
5. The Data/Phenomena Dichotomy by Bogen and Woodward. Bogen and Woodward (1988, 1992) argue that the data/theory dichotomy should be replaced by the data/phenomena distinction. Phenomena, unlike data, are candidates for explanation and prediction by general systematic theory. Phenomena have stable recurring features produced regularly by some manageable set of factors. The same phenomena should be detectable in a variety of apparent ways not subject to significant fluctuation. To detect a phenomenon one must identify a relatively stable and invariant pattern of some simplicity and generality with recurrent features (Woodward 1989, 393-398).

In contrast, data register on the measurement or recording device in a form accessible to human perceptual systems. Data serve as evidence for claims about phenomena. Although data depend on causal factors from a variety of physical sources, many factors are idiosyncratic to details of the experiment. The evidential function of data is secured by specific procedural requirements, such as the control of possible confounding effects and systematic error, replicability, overcoming problems of data reduction and statistical analysis, and calibration and empirical investigation of equipment. Data have no theoretical import in themselves, except insofar as data constitute evidence for the existence of phenomena. Data are neither the explananda of theoretical systems nor the subject of systematic predictions (Bogen and Woodward 1988, 315-322).
However, the conclusion that data are not candidates for explanation by systematic theory rests on a misleading portrayal of the data/theory relationship. In particular, theoretical models are essential for reliable data. Reliability requires access to underlying causal mechanisms for the production of data, and such mechanisms are conveyed by background theories. This function of background theories is apparent when new theoretical insights enhance data productivity. Thus, the instrument's success at exposing unknown properties is tied directly to the capacity of scientists to extend theoretically iconic models of natural events to artificial contexts.
Bogen and Woodward recognize the complex causal chain that underlies the sequence from specimen to data. According to Woodward, however, such a causal chain by itself does not constitute explanation, which requires both generality of causal mechanisms responsible for the explanandum-event and a unification of phenomena within a general pattern (Woodward 1989, 400-403). The context of instrumental design exposes the vital contribution of iconic models to the phenomena/data interaction, and shows how theoretical explanation of the detection signal is required for reliable data.

Furthermore, the argument by Bogen and Woodward reflects a type-token confusion. If phenomena have recurring features produced regu-

[...]
Hacking's attempt to cleanse the engineer's practice of the theoretician's abstractions conveys similarities to van Fraassen's antirealism. For both philosophers the criterion for reality is fundamentally nontheoretical. But even on Hacking's own terms of a praxis epistemology, the experimenter's low-level generalizations are intimately grounded on theoretical models of higher generality (Morrison 1990). Again, innovative instrumental designs usually reflect the advanced state of theoretical knowledge for a wide array of domains of inquiry, a point Hacking seriously underestimates. Hacking's argument that a technician can manipulate the apparatus without theoretical background is misleading and epistemologically uninformative; most facets of instrumental design, calibration of measurements, and data analysis rest on acceptance by the scientific community at large of causal models of physical reality. Without this acceptance, the experimenter would lack confidence in the very manipulability of entities during instrument usage, and the technician would serve no epistemic function.
The manipulability of entities for the purpose of interfering with hypothetical processes constitutes a vital component of most contemporary instruments, but Hacking's use of manipulability as a criterion of reality artificially demarcates theory and practice. For example, if manipulability warrants the existence of electrons, scientists can legitimately countenance the specimen's chemical composition precisely because of the inescapable theory-laden character of manipulability.

The antirealist cannot explain the capacity of instruments to span extraordinary epistemic distances. Within either macroscopic or microscopic dimensions, scientists' access to unknown properties is explained by the existential continuity (Harré 1961, 54) from data to the specimen's physical structure. Such a continuity is grounded on the causal sequence of physical events within the instrument. As a result of this causal sequence, the instrument displays the markings of reference for some specimen. An experimental realism of the specimen's structure is warranted because real physical processes are nomically nested within the interpreted data. But this experimental realism does not commit the fallacy of reverse