The Epistemology of a Spectrometer
Author(s): Daniel Rothbart and Suzanne W. Slayden
Source: Philosophy of Science, Vol. 61, No. 1 (Mar., 1994), pp. 25-38
Published by: The University of Chicago Press on behalf of the Philosophy of Science Association
Stable URL: http://www.jstor.org/stable/188287
Accessed: 16/10/2011 05:32
THE EPISTEMOLOGY OF A SPECTROMETER*

DANIEL ROTHBART†‡

Department of Philosophy and Religious Studies
George Mason University

AND

SUZANNE W. SLAYDEN

Department of Chemistry
George Mason University

Contrary to the assumptions of empiricist philosophies of science, the theory-laden character of data will not imply the inherent failure (subjectivity, circularity, or rationalization) of instruments to expose nature's secrets. The success of instruments is credited to scientists' capacity to create artificial technological analogs to familiar physical systems. The design of absorption spectrometers illustrates the point: Progress in designing many modern instruments is generated by analogically projecting theoretical insights from known physical systems to unknown terrain. An experimental realism is defended.

1. Introduction. Why should scientists trust the reliability of modern instruments to expose unobservable physical structures? According to empiricists, instruments function to magnify our physiologically limited sensory capacities by "causally" linking the specimen's sensory properties to accessible empirical data; such data in turn are validated by the same empiricist standards used to access ordinary (middle-sized) phenomena. Empiricists have given scant attention to instruments as a separate topic of inquiry on the grounds that the epistemic value of instruments reduces to the epistemology of commonsense experience.

Yet even the dictum that sensory data are theory-laden has the effect of minimizing the philosophical significance of instruments. Many critics of empiricism work within the empiricist distinction between the subjectivity of theory and the apparent objectivity of data. Such a distinction assumes a naive understanding of scientific instrumental design. Once we overcome this empiricist conception of instrumental design, the theory-laden character of data will not imply the inherent failure (subjectivity, circularity, or rationalization) of instruments to expose nature's secrets. Rather than a warrant for its subjectivity, the theory-laden character of data reveals the instrument's success at exposing real-world structures.

In this paper, we argue that the success of instruments partially results from artificial technological replicas of various physical systems familiar to scientists at a given time. Progress in designing many, though not all, instruments is generated by analogical projections of components from known physical systems to unknown terrain. Instrumentation enables scientists to expand their limited theoretical understanding to previously hidden domains. We argue against both a skepticism and a naive realism of scientific instruments in favor of an experimental realism that interprets instruments as analogs to natural systems.

Toward these goals, we explain how instruments are conceived as analogical replicas of real-world systems (section 2), examine the design of absorption spectrometers (section 3), respond to the skeptic's charge that unobservable structures are inaccessible (section 4), evaluate the provocative data/phenomena distinction by Bogen and Woodward (section 5), and briefly propose an experimental realism based on instrumentation (section 6).

The recent resurgence of interest in the nature of instrumentation is partially evident in the works of Ackermann (1985), Baird and Faust (1990), Bogen and Woodward (1988, 1992), Franklin (1986), Galison (1987), Gooding (1989), Hacking (1983, 1988), Latour (1987), Pickering (1989), Radder (1986, 1988), Ramsey (1992), Shapin and Schaffer (1985), and Woolgar (1988). In exploring the nature of scientific instruments, we will examine the works of a few of these authors.

*Received September 1992; revised April 1993.
†We greatly appreciate comments on earlier drafts from Rom Harré, Mary Hesse, Emmett Holman, and especially an anonymous referee for this journal.
‡Send reprint requests to Daniel Rothbart, Department of Philosophy and Religious Studies, George Mason University, 4400 University Blvd., Fairfax, VA 22030, USA.

Philosophy of Science, 61 (1994) pp. 25-38
Copyright © 1994 by the Philosophy of Science Association.

2. Instruments Designed as Replicas of Nature. Intended to avoid the debacles of naive empiricism, Ackermann explains instrumentation within the framework of an evolutionary epistemology. Theories evolve in ways that best adapt to the environmental niches of "facts"; the data domains are the socially sanctioned depiction of such facts about the world. Instruments function as epistemic intermediaries between theories and data. Through instruments the influence of interpretation is broken, presumably, by refining and extending human sensory capacity. But in the end, Ackermann's epistemology is strikingly empiricist. The primary function of instruments is to break the line of influence from theory to fact by grounding the subjectivity of interpretation in the intersubjectivity of fact. Consequently, the authenticity of data domains is not grounded on any theoretical constructs, but stems rather from socially negotiated sensory content (1985, 127-131).

However, Ackermann's rationale for instruments has little bearing on the design of modern instruments. First, data are not always identified with perceptual experiences (Bogen and Woodward 1992, 593). Typically, the experimenter reads graphic displays, digital messages, or coded charts directly from the instrumental readout. The computer-controlled video display and the more common printer/plotter, for example, employ language that is accessible only to the trained technician. For example, a photomultiplier readout device transforms the radiant energy of a signal into electrical energy while simultaneously increasing the generated current a millionfold. The current from the device flows to either a chart recorder of numbers or a series of milliammeters. Within spectral analysis the prevalence of visual data, for example, the yellow from a sodium flame, has been replaced in modern spectrometers by discursive readouts.

Second, the empiricist conception of extraordinary phenomena has no place in modern instrumentation. Typically, the phenomenon of interest is a set of physical interactions between the specimen and experimental conditions. All experimental properties that are instrumentally detected are tendencies, or conditional manifestations of the specimen, to react to certain experimental stimuli. The specimen has tendencies manifested only if certain humanly designed experimental conditions are realized (Harré 1986, chap. 15). Although such conditions are teleologically determined, the tendencies are grounded on the specimen's real physical structure, which exists independently of human thought. The phenomenon of interest is generated neither exclusively by external physical structures nor entirely by internal conceptualization.

Third, the reliability of instruments must be credited to their design as artificial analogs to natural systems. The physical sequence of events from specimen structure to data readout constitutes a technological analog to multiple natural systems based on underlying causal models of real-world phenomena. The instrument's designers typically dissect, restructure, and reorganize natural systems for the purpose of projecting powerful theoretical analogs to unexplored terrain. The instrument thus can expose previously hidden physical properties by cross-fertilization from known physical symmetries to the unknown structures under investigation. This cross-fertilization motivates scientists to project parameters from known models of natural phenomena to unknown models of causal processes underlying instrument design.

In this context a model must be conceived iconically as a cognitive structure that replicates some phenomenal system. The iconic model consists of a set of abstract parameters ordered according to lawlike generalizations of some theory. The theory in turn consists of a set of such models. The iconic model is not reducible to a mathematical model since the mathematical structure of the iconic model does not exhaust its entire content. Also, the iconic model is not by definition a descriptive model, although any iconic model can be transposed to a linguistic formulation to produce a descriptive model.

So, underlying the design of many modern instruments are source models of real-world systems. Each source model exhibits positive, negative, and neutral analogies, to use M. Hesse's terminology, to the target sequence of physical events within the instrument's operation. Yet, this analogical projection from source to target models is not theory reduction of unknown to known causal structures. The instrument is designed to create artificially a complex maze of causal processes from a combination of diverse physical theories.

The analog system functions as an idealized prototype that is projectible onto the phenomenal system under scrutiny. The analog model determines the range of conceptual possibilities by supplying new horizons of iconic vision for extracting a physical reaction from a specimen structure. In this respect the source analog acquires a normative force by directing engineers to explore a specific realm of possible models. Yet the discovery of fresh analogies, and new prototypes, does not always require a monolithic overhaul of the entire scientific enterprise, as is suggested by a Kuhnian paradigm shift. Newly discovered analogies typically yield a specifiable and localized transformation of some problematic subject.

Nevertheless, a prominent factor in judging a theory's success is its capacity to motivate instrumental progress. A mutual dependence arises between instrumental design and theoretical progress: The instrument's design requires the complex combinations of various theoretical insights, and the theory's fertility is partially measured by successes of instrumental designs. In this respect the internal/external distinction assumed above between the specimen's unknown parameters and the background theoretical models must be qualified.

One major task for any designer is to select the most promising analogical system to function as the generator for the instrument's relevant causal relations. The analogical origins of such designs become hidden under the cloak of repeated experimental successes. For example, C. T. R. Wilson designed the cloud chamber not as a particle detector but as a meteorological reproduction of real atmospheric condensation. As Galison and Assmus (1989) document, meteorology in the 1890s was experiencing a "mimetic" transformation in which the morphological scientists began to use the laboratory to reproduce natural occurrences. The mimeticists produced miniature versions of cyclones, glaciers, and windstorms. Wilson's design of the cloud chamber was explicitly based on J. Aitken's dust chamber, which in turn recreated the effects of fogs threatening England's industrial cities. Wilson transported the basic components of the dust chamber (the pump, reservoir, filter, valves, and expansion mechanics) to his cloud chamber for the reproduction of thunderstorms, coronae, and atmospheric electricity.

J. J. Thompson and researchers at the Cavendish laboratories gave the "same" instruments a new theoretical rationale. Rather than imitating cloud formations, Thompson intended to take nature apart by exploring the fundamental character of matter (ibid., 265). For their matter-theoretic purposes, scientists at the Cavendish became indebted to Wilson's artificial clouds for revealing the fundamental electrical nature of matter: "As the knotty clouds blended into the tracks of alpha particles and the 'thread-like' clouds became beta-particle trajectories, the old sense and meaning of the chamber changed" (ibid., 268). For twentieth-century physicists the formation of droplets was replaced by the energies of gamma rays, the scattering of alpha particles, and the discovery of new particles. Wilson and the matter physicists proffered rival theoretical interpretations, derived from distinct physical analogs, of the chamber's causal structure. Thompson and Wilson employed different instruments.

3. Absorption Spectrometers. Let us consider basic design principles for absorption spectrometers commonly used for identification, structure elucidation, and quantification of chemical substances. Modern absorption spectrometers were designed from the analogical projection of causal models of the photoelectric effects of light.

Scientists naturally understand modern instruments as information processors. From this perspective many instruments function as complex systems of detecting, transforming, and processing information from an input event, typically an instrument/specimen interface, to some output event, typically a readout of information.

Within instrumental design the reliability of the signal becomes a primary focus of attention. The signal must be detected, converted by the transducer to a different energy form, processed, and finally amplified for the readout. The signal is defined roughly as an information-carrying variable. An analog signal (commonly voltage or current) has a topology-preserving correspondence with a variable of the specimen under study; that is, the signal strength is directly proportional to the value of the measured quantity. A digital signal carries the source variable encoded into high or low signal levels usually expressed within binary notation. Instruments, interfaced to digital computers for data acquisition and/or automatic control, incorporate devices for signal conversion (A/DC or D/AC).

Within an absorption spectrometer a beam of electromagnetic radiation emitted in the spectral region of interest passes through a monochromator, which is a series of optical components such as lenses and mirrors. This radiation then impinges on a sample. The monochromator isolates the radiation from a broad band of wavelengths to a continuous selection of narrow band wavelengths. These wavelengths can be held constant, or they can be scanned automatically or manually.

[Figure 3.1 here: a single-beam absorption spectrometer; labeled components include the source, shutter, sample cells, and detector. (Reproduced by permission from Charles K. Mann, Thomas J. Vickers, and Wilson M. Gulick, Instrumental Analysis, 1974, New York: Harper & Row.)]
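The analog/digital distinction drawn above can be made concrete with a minimal sketch of signal quantization; the voltage range, bit depth, and detector reading below are illustrative assumptions, not values from the text:

```python
def quantize(voltage, v_max=10.0, bits=12):
    """Encode an analog voltage in [0, v_max] as a binary-coded digital level,
    roughly as an analog-to-digital converter (A/DC) would."""
    levels = 2 ** bits
    code = int(voltage / v_max * (levels - 1))
    return format(code, f"0{bits}b")  # high/low levels in binary notation

# The analog signal: strength directly proportional to the measured quantity.
intensity = 0.42                 # hypothetical fraction of light reaching the detector
signal_v = 10.0 * intensity      # detector output voltage on a 0-10 V scale
print(quantize(signal_v))        # the 12-bit digital encoding of that voltage
```

The analog signal preserves proportionality with the measured quantity; the digital code trades that continuous correspondence for discrete levels a computer can store and process.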

Depending on the sample, various wavelengths of radiation are absorbed, reflected, or transmitted. That part of the radiation passing through the sample is detected and converted to an electrical signal, usually by a photomultiplier tube. The electric output is electronically manipulated and sent to the readout device, such as a meter, a computer-controlled video display, or a printer/plotter.

Consider a schematic depiction of a single-beam absorption spectrometer, shown in figure 3.1 (Mann et al. 1974, 312). For such a spectrometer the amplified output of the detector is measured directly in terms of meter deflection. Notice that the sample reading is compared to a reference sample, as indicated in figure 3.1 by sample cell S and reference cell R (ibid., 311).

The interaction of electromagnetic radiation and a specific chemical sample is unique. The "fingerprint" of this interaction is revealed by the absorption spectrum over the entire electromagnetic energy continuum, and thus the interaction provides vital information about a specimen's molecular structure. Some of the most convincing evidence about atomic and molecular structure has been obtained by spectral analysis.
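The sample-versus-reference comparison can be sketched numerically. The transmittance and absorbance relations used here (T = I_S/I_R, A = -log10 T) are standard spectrophotometric definitions rather than formulas stated in the text, and the detector readings are hypothetical:

```python
import math

def transmittance_absorbance(sample_reading, reference_reading):
    """Convert paired detector readings (sample cell S, reference cell R)
    into transmittance T = I_S / I_R and absorbance A = -log10(T)."""
    t = sample_reading / reference_reading
    return t, -math.log10(t)

# At one wavelength: reference cell R passes 100 units, sample cell S passes 50.
t, a = transmittance_absorbance(50.0, 100.0)
print(t, a)  # half the light transmitted at this wavelength
```

Repeating this comparison across the scanned wavelengths yields the absorption spectrum, the "fingerprint" discussed above.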

ciple about atomic or molecular structure:If a specimen absorbs a certain

wavelength of light (the wavelength corresponding to a particular en-

ergy), then that absorbed energy must be exactly the same as the energy

requiredfor some specific internal change in the molecule or atom. Re-

maining energies in the light spectrum are "ignored" by the substance,and these energies are then reflected or transmitted. (The absorbed light

energy causes such changes as atomic and molecular vibrations, rotations,and electron excitation.) As a result of the absorption, a specially de-

signed instrumentmay detect an energy change that we may "sense" in

30

Page 8: Epistemology of Spec to Meter

8/2/2019 Epistemology of Spec to Meter

http://slidepdf.com/reader/full/epistemology-of-spec-to-meter 8/15

THE EPISTEMOLOGY OF A SPECTROMETER

some cases as heat, fluorescence, phosphorescence, or color. Thus, the

detected signal can expose the molecular structure of the specimen in

terms of the specific patternsof absorbed and reflected/transmitted lightenergies.
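The correspondence between wavelength and energy invoked in this causal principle is the Planck relation E = hc/λ, which the text assumes but does not state. A quick check against the sodium example mentioned in section 2:

```python
# Photon energy from wavelength via the Planck relation E = h*c / wavelength.
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
EV = 1.602176634e-19 # joules per electronvolt

def photon_energy_ev(wavelength_nm):
    """Energy (in eV) carried by a photon of the given wavelength."""
    return H * C / (wavelength_nm * 1e-9) / EV

# The yellow sodium emission near 589 nm corresponds to roughly 2.1 eV,
# the energy of a specific electronic transition in the sodium atom.
print(round(photon_energy_ev(589), 2))
```

A specimen absorbing at 589 nm must therefore possess an internal change requiring just that energy, which is the sense of "correspondence" in the principle above.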

Which instrument type and radiation source should be chosen for studying a particular specimen? This problem requires extensive knowledge of the range of chemical structures, the different types of spectrometers, the electronic processes, and the measurement of the resultant spectra. The designer's articulation of channel conditions, as well as the experimenter's operation, includes complex modeling from electromagnetism, optics, atomic theory, chemistry, and geometry. Consider how various stages of energy transformation throughout the instrument are represented iconically by various power flow models. For example, electrical energy yields an effort of voltage and a flow of electrical current. No preferred value can be given for a single parameter in isolation from others. Instrumental design and operation must be understood as combinations of conditions and combinations of circumstances ranging across distinct domains of inquiry. A thorough understanding of the spectrometer requires a major segment of the physical sciences in general, a point C. A. Hooker (1987, 116) illustrates within the design of the Wilson cloud chamber for testing particle reactions.

The signal that carries information about the specimen's structure is defined by fixed channel conditions. The channel of communication is a set of conditions that either (1) generates no relevant information, or (2) generates only redundant information (Dretske 1981, 115). The information that the specimen a has property F rather than not-F requires designers to define the fixed channel conditions on the basis of external physical theories. Many newly designed instruments require the technological extension of physical principles familiar to scientists within natural domains.

Thus, the empiricist's dictum that scientific instruments extend the limited sensory capacity distorts the inherent theoretical rationale: Instruments function to expose the specimen's underlying physical structure by technological analogy to natural causal symmetries. Access to unknown properties of the specimen's structure occurs by theoretical extension of already familiar independent causal models. The technology exposes the specimen's unknown attributes by generating a moment of theoretical intersection between the actual and the possible, that is, between familiar theories functioning externally to the experiment and hypothetical models presumably replicating the specimen's structure.

The informational output of the absorption spectrometer centers on the electromagnetically understood energy absorbed by the specimen. Because such spectrometers are designed by analogy to the photoelectric effects, the conception of energy detected within the spectrometer is analogically derived in part from light beams consisting of discrete photons. When a flash of light is observed with a photomultiplier and displayed on an oscilloscope, the observed signal is a set of impulses (Bair 1962, 13). Such photoelectric signals from a flash of light function as the data-constituting analog to the conception of energy detected in absorption spectrometers.

4. Overcoming the Skeptic's Noise. This analogical conception of instrumentation does not warrant a skepticism about the capacity of instruments to reveal the specimen's physical dispositions. The images from infrared detectors employed by astrophysicists to reveal newborn stars are not the complete fabrication of the experimenter's symbol system. The line sequences from a spectral analysis are not artifacts of the scientists' conceptualizations. The tracks of alpha particles within a bubble chamber are not fictitious concoctions by self-deluding scientists.

As communication systems, instruments are designed to minimize distortion and vulnerability to noise for the purpose of creating a one-to-one transformation from signal to source states. The reliability of data rests in part on the ability of the experimenter to overcome potentially interfering influences that would result in the signal's random error. Such influences would prevent experimenters from distinguishing the detection of the phenomenon from background noise because in such a case a one-to-many transformation from signal to source would result.

Reliable channels of communication, based on background theories, can in principle be achieved so that the signal is practically unequivocal, that the mapping from data structure to specimen structure approaches one-to-one, and that the signal-to-noise ratio can be maximized. The experimenter can be reasonably confident that such confounding factors are minimized by blocking out the potentially interfering agent. The source of noise for electrical signals may be the light reflected by objects in a room, energy radiated by electrical lines in walls, and mechanical vibrations transmitted through a floor. Such random energy sources can be significantly reduced by shielding electrical lines or by insulating walls to protect against temperature changes.

Alternatively, the experimenter can sometimes isolate the features of the phenomenon of interest from the external confounding factors. Scientists attempting to detect magnetic monopoles within cosmic ray experiments often had to distinguish heavy charged particles like possible monopoles from light nuclei. Since both kinds of particles were detected by ordinary photographic emulsions, experimenters switched to a commercial plastic that was sensitive only to the heavy charged particles (Woodward 1989, 411).

A compensation technique can be used when the confounding factor operates uniformly. The signal's fluctuation can then be used to convey information about the specimen's attributes, assuming other experimental obstacles are overcome (Franklin 1986).

However, a skeptic might argue that any aspiration for a completely unequivocal system becomes hopeless because the components may deteriorate, the technician may err, and the external influences may be undetected. Dretske (1981, chap. 5) correctly responds that the logical possibility of equivocation of the signal does not by itself warrant the reasonable likelihood of such equivocation. Consider the channel conditions necessary for the current flowing through a voltmeter. The pointer would be equivocal with respect to the measured voltage if the resistance of the leads varied. But electromagnetic theory shows that the leads will have the same resistance over a short period of time. The fact that the electromagnetic theory may be incorrect, that the apparatus may malfunction, and that extraneous factors may interfere with the voltage merely shows that before using the instrument to measure voltage the experimenter must acquire more information about the system's integrity. The skeptical experimenter shows signs of neurosis if the channel conditions are repeatedly checked beyond necessity (ibid., 115-116).

The instrument's designers typically address confounding factors by maximizing the signal-to-noise ratio. Noise can be ignored for those instruments with a high ratio of signal to noise. This strategy is based on the definition of the signal-to-noise ratio (in decibels):

S/N = 10 log (Vs²/Vn²),

where Vs is the signal voltage and Vn is the noise voltage (Strobel and Heineman 1989, 412-415).
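As a quick numerical check of this definition (the voltages below are illustrative, not taken from the text):

```python
import math

def snr_db(v_signal, v_noise):
    """Signal-to-noise ratio in decibels: S/N = 10*log10(Vs^2/Vn^2)."""
    return 10 * math.log10(v_signal**2 / v_noise**2)

# A 1 V signal over 10 mV of noise is a factor of 100 in voltage,
# hence 10 * log10(100^2) = 40 dB.
print(snr_db(1.0, 0.01))
```

Since 10 log(Vs²/Vn²) = 20 log(Vs/Vn), every tenfold gain in signal voltage over noise voltage adds 20 dB, which is why designers speak of "high-ratio" instruments as ones whose noise can simply be ignored.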

According to Shannon's fundamental theorem, when the rate of information transfer is less than channel capacity, the information can be "coded" in such a way that it will reach the receiver with arbitrarily high fidelity. Although the degree of reliability is never absolute, doubt can be reduced to an exceedingly small increment (Massey 1967, 50-52).

Let us apply Shannon's theorem to the equivocation of a noisy channel. Assume that the capacity C of a noisy channel is defined as the maximum rate at which useful information can be transmitted over the channel. Assume also that the entropy H is the measure of the information per symbol at the source of messages. If C for some noisy channel is equal to or larger than H for that channel, then the output of the source can be transmitted over the channel with little error. Although some uncertainty must remain, error can be significantly minimized by devising appropriate coding systems (Weaver 1964, 20-22).
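The comparison of C and H can be illustrated with a small sketch; the four-symbol source and the binary symmetric channel model are assumptions chosen for illustration, not examples drawn from the paper:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits per source symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(error_rate):
    """Capacity of a binary symmetric channel, C = 1 - H(error_rate),
    in bits per channel use."""
    return 1 - entropy([error_rate, 1 - error_rate])

H = entropy([0.5, 0.25, 0.125, 0.125])  # a four-symbol source: 1.75 bits/symbol
C = bsc_capacity(0.01)                  # a nearly noiseless binary channel

# Shannon's theorem: if the source rate does not exceed capacity, suitable
# coding drives the error rate arbitrarily low. With 2 channel uses per
# symbol, the available rate 2*C exceeds H, so reliable coding exists.
print(H, C, 2 * C > H)
```

The residual uncertainty never vanishes, as the text notes, but the gap between 2·C and H is exactly the room a coding system exploits to make equivocation negligible.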


5. The Data/Phenomena Dichotomy by Bogen and Woodward. Bogen and Woodward (1988, 1992) argue that the data/theory dichotomy should be replaced by the data/phenomena distinction. Phenomena, unlike data, are candidates for explanation and prediction by general systematic theory. Phenomena have stable recurring features produced regularly by some manageable set of factors. The same phenomena should be detectable in a variety of apparent ways not subject to significant fluctuation. To detect a phenomenon one must identify a relatively stable and invariant pattern of some simplicity and generality with recurrent features (Woodward 1989, 393-398).

In contrast, data register on the measurement or recording device in a form accessible to human perceptual systems. Data serve as evidence for claims about phenomena. Although data depend on causal factors from a variety of physical sources, many factors are idiosyncratic to details of the experiment. The evidential function of data is secured by specific procedural requirements, such as the control of possible confounding effects and systematic error, replicability, overcoming problems of data reduction and statistical analysis, and calibration and empirical investigation of equipment. Data have no theoretical import in themselves, except insofar as data constitute evidence for the existence of phenomena. Data are neither the explananda of theoretical systems, nor the subject of systematic predictions (Bogen and Woodward 1988, 315-322).

However, the conclusion that data are not candidates for explanation by systematic theory rests on a misleading portrayal of the data/theory relationship. In particular, theoretical models are essential for reliable data. Reliability requires access to underlying causal mechanisms for the production of data, and such mechanisms are conveyed by background theories. This function of background theories is apparent when new theoretical insights enhance data productivity. Thus, the instrument's success at exposing unknown properties is tied directly to the capacity of scientists to extend theoretically iconic models of natural events to artificial contexts.

Bogen and Woodward recognize the complex causal chain that underlies the sequence from specimen to data. According to Woodward, however, such a causal chain by itself does not constitute explanation, which requires both generality of causal mechanisms responsible for the explanandum-event and a unification of phenomena within a general pattern (Woodward 1989, 400-403). The context of instrumental design exposes the vital contribution of iconic models to the phenomena/data interaction, and shows how theoretical explanation of the detection signal is required for reliable data.

Furthermore, the argument by Bogen and Woodward reflects a type-token confusion. If phenomena have recurring features produced regularly by some small set of factors, as Woodward states, then the notion of phenomena is that of an organized type that is instantiated by specific specimens (tokens) under scrutiny. If data assume inherently singular instances of experimental environments, then data are obviously tokens of some pattern (type). But the claim that phenomena and not data are candidates for theoretical explanation is trivialized by the contrast between phenomena as types and data as tokens. Patterned data, such as data structures, are subject to theoretical explanation.

6. Toward an Experimental Realism. For van Fraassen, experimentation in physics requires that scientists fill in theoretical blanks, based primarily on the theory's empirical adequacy, with information ostensibly about electrons, neutrons, and so on. Such information reflects theoretical gaps only; no epistemic access to an unobservable realm is warranted (van Fraassen 1980, 75). Metaphysical commitment to unobservable structures epistemically compares to belief in the influence of spiritual forces on human behavior. The theory's empirical content, its methodological evaluation by empirical adequacy, and its intended scope of application all rest on the principled identification of observable entities. Nevertheless, that which is observable must be a theory-independent question. To avoid vicious methodological circles in science, the observable/unobservable distinction is neither theory-dependent nor theory-relative (ibid., 57-58).

However, there simply are no theory-neutral observables or unobservables within the arena of scientific inquiry. Experimenters readily speak of certain hypothetical entities as unobservable relative to the state of knowledge at a given time, and restricted by the current theoretical understanding for a community of scientists. The claim that some phenomenon is instrumentally observable assumes a wide range of theoretical insights. The discovery in 1981 of the scanning tunneling microscope enabled scientists to detect molecules to a magnification of 10^7. But it would be fruitless to criticize seventeenth-century atomists for proclaiming atoms as the unobservable corpuscles of matter. Atoms became observable only after 1981.

Much of the realism/antirealism debate this century rests on an incorrect demarcation between observable and unobservable realms. The antirealist's proscription against exploring the (in-principle) unobservable realm constitutes an arbitrary constraint on the explanatory power of scientific inquiry. Similarly, the naive realist's aspiration for unveiling the (in-principle) unobservable causal forces also suggests an arbitrary identification of a priori unobservable entities. Again, any theory-neutral observable/unobservable distinction assumes an unwarranted essentialist demarcation between ostensibly distinct realms of nature.


Hacking (1983, 265) dismisses van Fraassen's antirealism for unnecessarily restricting the experimenter's practice within instrumentation. From his practice-oriented epistemology, Hacking defends the reality of electrons, for example, on the grounds that electrons can be instrumentally manipulated as tools for exploring other processes. An entity realism is grounded on the technician's manipulability of real physical events. Entity realism is not grounded on the reality of theoretical constructs per se, since a theoretical realism transcends the "home truths" of low-level generalization familiar to practicing experimenters. Engineering, not theorizing, exposes nature's secrets (ibid., 263).

Hacking's attempt to cleanse the engineer's practice of the theoretician's abstractions conveys similarities to van Fraassen's antirealism. For both philosophers the criterion for reality is fundamentally nontheoretical. But even on Hacking's own terms of a praxis epistemology, the experimenter's low-level generalizations are intimately grounded on theoretical models of higher generality (Morrison 1990). Again, innovative instrumental designs usually reflect the advanced state of theoretical knowledge for a wide array of domains of inquiry, a point Hacking seriously underestimates. Hacking's argument that a technician can manipulate the apparatus without theoretical background is misleading and epistemologically uninformative; most facets of instrumental design, calibrations of measurements, and data analysis rest on acceptance by the scientific community at large of causal models of physical reality. Without this acceptance, the experimenter should lack confidence in the very manipulability of entities during instrument usage, and the technician serves no epistemic function.

The manipulability of entities for the purpose of interfering with hypothetical processes constitutes a vital component of most contemporary instruments, but Hacking's use of manipulability as a criterion of reality artificially demarcates theory and practice. For example, if manipulability warrants existence of electrons, scientists can legitimately countenance the specimen's chemical composition precisely because of the inescapable theory-laden character of manipulability.

The antirealist cannot explain the capacity of instruments to span extraordinary epistemic distances. Within either macroscopic or microscopic dimensions, scientists' access to unknown properties is explained by the existential continuity (Harré 1961, 54) from data to the specimen's physical structure. Such a continuity is grounded on the causal sequence of physical events within the instrument. As a result of this causal sequence, the instrument displays the markings of reference for some specimen. An experimental realism of the specimen's structure is warranted because real physical processes are nomically nested within the interpreted data. But this experimental realism does not commit the fallacy of reverse
