The Epistemology of a Spectrometer
Author(s): Daniel Rothbart and Suzanne W. Slayden
Source: Philosophy of Science, Vol. 61, No. 1 (Mar., 1994), pp. 25-38
Published by: The University of Chicago Press on behalf of the Philosophy of Science Association
Stable URL: .
Accessed: 16/10/2011 05:32

Your use of the JSTOR archive indicates your acceptance of the Terms & Conditions of Use, available at . JSTOR is a not-for-profit service that helps scholars, researchers, and students discover, use, and build upon a wide range of content in a trusted digital archive. We use information technology and tools to increase productivity and facilitate new forms of scholarship. For more information about JSTOR, please contact

The University of Chicago Press and Philosophy of Science Association are collaborating with JSTOR to digitize, preserve and extend access to Philosophy of Science.






DANIEL ROTHBART
Department of Philosophy and Religious Studies
George Mason University

AND

SUZANNE W. SLAYDEN
Department of Chemistry
George Mason University

Contrary to the assumptions of empiricist philosophies of science, the theory-laden character of data will not imply the inherent failure (subjectivity, circularity, or rationalization) of instruments to expose nature's secrets. The success of instruments is credited to scientists' capacity to create artificial technological analogs to familiar physical systems. The design of absorption spectrometers illustrates the point: Progress in designing many modern instruments is generated by analogically projecting theoretical insights from known physical systems to unknown terrain. An experimental realism is defended.

1. Introduction. Why should scientists trust the reliability of modern instruments to expose unobservable physical structures? According to empiricists, instruments function to magnify our physiologically limited sensory capacities by "causally" linking the specimen's sensory properties to accessible empirical data; such data in turn are validated by the same empiricist standards used to assess ordinary (middle-sized) phenomena. Empiricists have given scant attention to instruments as a separate topic of inquiry on the grounds that the epistemic value of instruments reduces to the epistemology of commonsense experience.

Yet even the dictum that sensory data are theory-laden has the effect of minimizing the philosophical significance of instruments. Many critics of empiricism work within the empiricist distinction between the subjectivity of theory and the apparent objectivity of data. Such a distinction assumes a naive understanding of scientific instrumental design. Once we overcome this empiricist conception of instrumental design, the theory-laden character of data will not imply the inherent failure (subjectivity,

*Received September 1992; revised April 1993.
†We greatly appreciate comments on earlier drafts from Rom Harré, Mary Hesse, Emmett Holman, and especially an anonymous referee for this journal.
‡Send reprint requests to Daniel Rothbart, Department of Philosophy and Religious Studies, George Mason University, 4400 University Blvd., Fairfax, VA 22030, USA.

Philosophy of Science, 61 (1994) pp. 25-38
Copyright © 1994 by the Philosophy of Science Association.





circularity, or rationalization) of instruments to expose nature's secrets. Rather than a warrant for its subjectivity, the theory-laden character of data reveals the instrument's success at exposing real-world structures.

In this paper, we argue that the success of instruments partially results from artificial technological replicas of various physical systems familiar to scientists at a given time. Progress in designing many, though not all, instruments is generated by analogical projections of components from known physical systems to unknown terrain. Instrumentation enables scientists to expand their limited theoretical understanding to previously hidden domains. We argue against both a skepticism and naive realism of scientific instruments in favor of an experimental realism that interprets instruments as analogs to natural systems.

Toward these goals, we explain how instruments are conceived as analogical replicas of real-world systems (section 2), examine the design of absorption spectrometers (section 3), respond to the skeptic's charge that unobservable structures are inaccessible (section 4), evaluate the provocative data/phenomena distinction by Bogen and Woodward (section 5), and briefly propose an experimental realism based on instrumentation (section 6).

The recent resurgence of interest in the nature of instrumentation is partially evident in the works of Ackermann (1985), Baird and Faust (1990), Bogen and Woodward (1988, 1992), Franklin (1986), Galison (1987), Gooding (1989), Hacking (1983, 1988), Latour (1987), Pickering (1989), Radder (1986, 1988), Ramsey (1992), Shapin and Schaffer (1985), and Woolgar (1988). In exploring the nature of scientific instruments, we will examine the works of a few of these authors.

2. Instruments Designed as Replicas of Nature. Intended to avoid the debacles of naive empiricism, Ackermann explains instrumentation within the framework of an evolutionary epistemology. Theories evolve in ways that best adapt to the environmental niches of "facts"; the data domains are the socially sanctioned depiction of such facts about the world. Instruments function as epistemic intermediaries between theories and data. Through instruments the influence of interpretation is broken, presumably, by refining and extending human sensory capacity. But in the end, Ackermann's epistemology is strikingly empiricist. The primary function of instruments is to break the line of influence from theory to fact by grounding the subjectivity of interpretation in the intersubjectivity of fact. Consequently, the authenticity of data domains is not grounded on any theoretical constructs, but stems rather from socially negotiated sensory content (1985, 127-131).

However, Ackermann's rationale for instruments has little bearing on the design of modern instruments. First, data are not always identified




with perceptual experiences (Bogen and Woodward 1992, 593). Typically, the experimenter reads graphic displays, digital messages, or coded charts directly from the instrumental readout. The computer-controlled video display and the more common printer/plotter, for example, employ language that is accessible only to the trained technician. For example, a photomultiplier readout device transforms the radiant energy of a signal into electrical energy while simultaneously increasing the generated current a millionfold. The current from the device flows to either a chart recorder of numbers or a series of milliammeters. Within spectral analysis the prevalence of visual data, for example, the yellow from a sodium flame, has been replaced in modern spectrometers by discursive readouts.

Second, the empiricist conception of extraordinary phenomena has no place in modern instrumentation. Typically, the phenomenon of interest is a set of physical interactions between the specimen and experimental conditions. All experimental properties that are instrumentally detected are tendencies, or conditional manifestations of the specimen, to react to certain experimental stimuli. The specimen has tendencies manifested only if certain humanly designed experimental conditions are realized (Harré 1986, chap. 15). Although such conditions are teleologically determined, the tendencies are grounded on the specimen's real physical structure, which exists independently of human thought. The phenomenon of interest is generated neither exclusively by external physical structures nor entirely by internal conceptualization.

Third, the reliability of instruments must be credited to their design as artificial analogs to natural systems. The physical sequence of events from specimen structure to data readout constitutes a technological analog to multiple natural systems based on underlying causal models of real-world phenomena. The instrument's designers typically dissect, restructure, and reorganize natural systems for the purpose of projecting powerful theoretical analogs to unexplored terrain. The instrument thus can expose previously hidden physical properties by cross-fertilization from known physical symmetries to the unknown structures under investigation. This cross-fertilization motivates scientists to project parameters from known models of natural phenomena to unknown models of causal processes underlying instrument design.

In this context a model must be conceived iconically as a cognitive structure that replicates some phenomenal system. The iconic model consists of a set of abstract parameters ordered according to lawlike generalizations of some theory. The theory in turn consists of a set of such models. The iconic model is not reducible to a mathematical model since the mathematical structure of the iconic model does not exhaust its entire content. Also, the iconic model is not by definition a descriptive model,





although any iconic model can be transposed to a linguistic formulation to produce a descriptive model.

So, underlying the design of many modern instruments are source models of real-world systems. Each source model exhibits positive, negative, and neutral analogies, to use M. Hesse's terminology, to the target sequence of physical events within the instrument's operation. Yet, this analogical projection from source to target models is not theory reduction of unknown to known causal structures. The instrument is designed to create artificially a complex maze of causal processes from a combination of diverse physical theories.

The analog system functions as an idealized prototype that is projectible onto the phenomenal system under scrutiny. The analog model determines the range of conceptual possibilities by supplying new horizons of iconic vision for extracting a physical reaction from a specimen structure. In this respect the source analog acquires a normative force by directing engineers to explore a specific realm of possible models. Yet the discovery of fresh analogies, and new prototypes, does not always require a monolithic overhaul of the entire scientific enterprise, as is suggested by a Kuhnian paradigm shift. Newly discovered analogies typically yield a specifiable and localized transformation of some problematic subject.

Nevertheless, a prominent factor in judging a theory's success is its capacity to motivate instrumental progress. A mutual dependence arises between instrumental design and theoretical progress: The instrument's design requires the complex combinations of various theoretical insights, and the theory's fertility is partially measured by successes of instrumental designs. In this respect the internal/external distinction assumed above between the specimen's unknown parameters and the background theoretical models must be qualified.

One major task for any designer is to select the most promising analogical system to function as the generator for the instrument's relevant causal relations. The analogical origins of such designs become hidden under the cloak of repeated experimental successes. For example, C. T. R. Wilson designed the cloud chamber not as a particle detector but as a meteorological reproduction of real atmospheric condensation. As Galison and Assmus (1989) document, meteorology in the 1890s was experiencing a "mimetic" transformation in which the morphological scientists began to use the laboratory to reproduce natural occurrences. The mimeticists produced miniature versions of cyclones, glaciers, and windstorms. Wilson's design of the cloud chamber was explicitly based on J. Aitken's dust chamber, which in turn recreated the effects of fogs threatening England's industrial cities. Wilson transported the basic components of the dust chamber (the pump, reservoir, filter, valves, and expansion mechanics) to his cloud chamber for the reproduction of thunderstorms, coronae, and atmospheric electricity.

J. J. Thomson and researchers at the Cavendish laboratories gave the "same" instruments a new theoretical rationale. Rather than imitating cloud formations, Thomson intended to take nature apart by exploring the fundamental character of matter (ibid., 265). For their matter-theoretic purposes, scientists at the Cavendish became indebted to Wilson's artificial clouds for revealing the fundamental electrical nature of matter, "As the

knotty clouds blended into the tracks of alpha particles and the 'thread-like' clouds became beta-particle trajectories, the old sense and meaning of the chamber changed" (ibid., 268). For twentieth-century physicists the formation of droplets was replaced by the energies of gamma rays, the scattering of alpha particles, and the discovery of new particles. Wilson and the matter physicists proffered rival theoretical interpretations, derived from distinct physical analogs, of the chamber's causal structure. Thomson and Wilson employed different instruments.

3. Absorption Spectrometers. Let us consider basic design principles for absorption spectrometers commonly used for identification, structure elucidation, and quantification of chemical substances. Modern absorption spectrometers were designed from the analogical projection of causal models of the photoelectric effects of light.

Scientists naturally understand modern instruments as information processors. From this perspective many instruments function as complex systems of detecting, transforming, and processing information from an input event, typically an instrument/specimen interface, to some output event, typically a readout of information.

Within instrumental design the reliability of the signal becomes a primary focus of attention. The signal must be detected, converted by the transducer to a different energy form, processed, and finally amplified for the readout. The signal is defined roughly as an information-carrying variable. An analog signal (commonly voltage or current) has a topology-preserving correspondence with a variable of the specimen under study; that is, the signal strength is directly proportional to the value of the measured quantity.
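The proportionality that defines an analog signal, and the quantization performed by an A/D converter, can be sketched in a few lines. This is an illustrative toy, not from the paper; the scale factor, full-scale voltage, bit depth, and function names are all hypothetical.

```python
# Illustrative sketch (not from the paper): an analog signal directly
# proportional to a measured quantity, then quantized to an n-bit
# binary code, as an A/D converter would do. All constants hypothetical.

def analog_signal(measured_quantity, volts_per_unit=0.05):
    """Analog encoding: voltage directly proportional to the quantity."""
    return measured_quantity * volts_per_unit

def to_digital(voltage, full_scale=5.0, bits=8):
    """Quantize a voltage into an n-bit binary code (A/D conversion)."""
    levels = 2 ** bits
    code = min(levels - 1, max(0, round(voltage / full_scale * (levels - 1))))
    return format(code, f"0{bits}b")

v = analog_signal(62.0)   # 62.0 hypothetical units -> 3.1 V
print(v)                  # 3.1
print(to_digital(v))      # '10011110' (158 of 255 levels)
```

The analog value preserves the topology of the measured quantity (twice the quantity, twice the voltage); the digital code trades that continuity for noise-resistant high/low levels.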
A digital signal carries the source variable encoded into high or low signal levels usually expressed within binary notation. Instruments, interfaced to digital computers for data acquisition and/or automatic control, incorporate devices for signal conversion (A/DC or D/AC).

Within an absorption spectrometer a beam of electromagnetic radiation emitted in the spectral region of interest passes through a monochromator, which is a series of optical components such as lenses and mirrors. This radiation then impinges on a sample. The monochromator isolates the


[Figure: schematic showing source, sample cells, and detector with meter scale]
Figure 3.1. A single-beam absorption spectrometer. (Reproduced by permission from Charles K. Mann, Thomas J. Vickers, and Wilson M. Gulick, Instrumental Analysis, 1974, New York: Harper & Row.)

radiation from a broad band of wavelengths to a continuous selection of narrow band wavelengths. These wavelengths can be held constant, or they can be scanned automatically or manually.

Depending on the sample, various wavelengths of radiation are absorbed, reflected, or transmitted. That part of the radiation passing through the sample is detected and converted to an electrical signal, usually by a photomultiplier tube. The electric output is electronically manipulated and sent to the readout device, such as a meter, a computer-controlled video display, or a printer/plotter.

Consider a schematic depiction of a single-beam absorption spectrometer, shown in figure 3.1 (Mann et al. 1974, 312). For such a spectrometer the amplified output of the detector is measured directly in terms of meter deflection. Notice that the sample reading is compared to a reference sample, as indicated in figure 3.1 by sample cell S and reference cell R (ibid., 311).

The interaction of electromagnetic radiation and a specific chemical sample is unique. The "fingerprint" of this interaction is revealed by the absorption spectrum over the entire electromagnetic energy continuum, and thus the interaction provides vital information about a specimen's molecular structure. Some of the most convincing evidence about atomic and molecular structure has been obtained by spectral analysis.

The success of spectral analysis is based on the following causal principle about atomic or molecular structure: If a specimen absorbs a certain wavelength of light (the wavelength corresponding to a particular energy), then that absorbed energy must be exactly the same as the energy required for some specific internal change in the molecule or atom. Remaining energies in the light spectrum are "ignored" by the substance, and these energies are then reflected or transmitted. (The absorbed light energy causes such changes as atomic and molecular vibrations, rotations, and electron excitation.)
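This causal principle can be put in quantitative terms: a wavelength of light carries a definite photon energy (E = hc/λ), and absorption occurs only when that energy matches the energy of some internal change in the specimen. The sketch below is our illustration, not the authors'; the specimen's transition energies and the matching tolerance are hypothetical, chosen only to echo the sodium-flame example mentioned earlier.

```python
# Illustrative sketch of the causal principle: a photon of wavelength λ
# carries energy E = h*c/λ, and is absorbed only if that energy matches
# an internal transition energy of the specimen. Transition energies
# and the tolerance below are hypothetical.

H = 6.62607015e-34   # Planck constant, J·s
C = 2.99792458e8     # speed of light, m/s

def photon_energy_joules(wavelength_nm):
    """Energy of one photon at the given wavelength (nanometres)."""
    return H * C / (wavelength_nm * 1e-9)

def is_absorbed(wavelength_nm, transition_energies_j, tolerance=0.01):
    """True if the photon energy matches some transition energy
    to within the given relative tolerance."""
    e = photon_energy_joules(wavelength_nm)
    return any(abs(e - t) / t < tolerance for t in transition_energies_j)

# Hypothetical specimen with one transition at the energy of 589 nm light
# (the sodium-yellow region mentioned in the text).
transitions = [photon_energy_joules(589.0)]
print(is_absorbed(589.0, transitions))   # True: energy matches, absorbed
print(is_absorbed(400.0, transitions))   # False: energy "ignored"
```

Wavelengths whose energies find no match are reflected or transmitted, which is exactly the pattern the detector records.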
As a result of the absorption, a specially designed instrument may detect an energy change that we may "sense" in




some cases as heat, fluorescence, phosphorescence, or color. Thus, the detected signal can expose the molecular structure of the specimen in terms of the specific patterns of absorbed and reflected/transmitted light energies.

Which instrument type and radiation source should be chosen for studying a particular specimen? This problem requires extensive knowledge of the range of chemical structures, the different types of spectrometers, the electronic processes, and the measurement of the resultant spectra. The designer's articulation of channel conditions, as well as the experimenter's operation, includes complex modeling from electromagnetism, optics, atomic theory, chemistry, and geometry. Consider how various stages of energy transformation throughout the instrument are represented iconically by various power flow models. For example, electrical energy yields an effort of voltage and a flow of electrical current. No preferred value can be given for a single parameter in isolation from others. Instrumental design and operation must be understood as combinations of conditions and combinations of circumstances ranging across distinct domains of inquiry. A thorough understanding of the spectrometer requires a major segment of the physical sciences in general, a point C. A. Hooker (1987, 116) illustrates within the design of the Wilson cloud chamber for testing particle reactions.

The signal that carries information about the specimen's structure is defined by fixed channel conditions. The channel of communication is a set of conditions that either (1) generates no relevant information, or (2) generates only redundant information (Dretske 1981, 115). The information that the specimen a has property F rather than not-F requires designers to define the fixed channel conditions on the basis of external physical theories. Many newly designed instruments require the technological extension of physical principles familiar to scientists within natural domains.

Thus, the empiricist's dictum that scientific instruments extend the limited sensory capacity distorts the inherent theoretical rationale: Instruments function to expose the specimen's underlying physical structure by technological analogy to natural causal symmetries. Access to unknown properties of the specimen's structure occurs by theoretical extension of already familiar independent causal models. The technology exposes the specimen's unknown attributes by generating a moment of theoretical intersection between the actual and the possible, that is, between familiar theories functioning externally to the experiment and hypothetical models presumably replicating the specimen's structure.

The informational output of the absorption spectrometer centers on the electromagnetically understood energy absorbed by the specimen. Because such spectrometers are designed by analogy to the photoelectric





effects, the conception of energy detected within the spectrometer is analogically derived in part from light beams consisting of discrete photons. When a flash of light is observed with a photomultiplier and displayed on an oscilloscope, the observed signal is a set of impulses (Bair 1962, 13). Such photoelectric signals from a flash of light function as the data-constituting analog to the conception of energy detected in absorption spectrometers.

4. Overcoming the Skeptic's Noise. This analogical conception of instrumentation does not warrant a skepticism about the capacity of instruments to reveal the specimen's physical dispositions. The images from infrared detectors employed by astrophysicists to reveal newborn stars are not the complete fabrication of the experimenter's symbol system. The line sequences from a spectral analysis are not artifacts of the scientists' conceptualizations. The tracks of alpha particles within a bubble chamber are not fictitious concoctions by self-deluding scientists.

As communication systems, instruments are designed to minimize distortion and vulnerability to noise for the purpose of creating a one-to-one transformation from signal to source states. The reliability of data rests in part on the ability of the experimenter to overcome potentially interfering influences that would result in the signal's random error. Such influences would prevent experimenters from distinguishing the detection of the phenomenon from background noise because in such a case a one-to-many transformation from signal to source would result.

Reliable channels of communication, based on background theories, can in principle be achieved so that the signal is practically unequivocal, that the mapping from data structure to specimen structure approaches one-to-one, and that the signal-to-noise ratio can be maximized. The experimenter can be reasonably confident that such confounding factors are minimized by blocking out the potentially interfering agent.
The source of noise for electrical signals may be the light reflected by objects in a room, energy radiated by electrical lines in walls, and mechanical vibrations transmitted through a floor. Such random energy sources can be significantly reduced by shielding electrical lines or by insulating walls to protect against temperature changes.

Alternatively, the experimenter can sometimes isolate the features of the phenomenon of interest from the external confounding factors. Scientists attempting to detect magnetic monopoles within cosmic ray experiments often had to distinguish heavy charged particles like possible monopoles from light nuclei. Since both kinds of particles were detected by ordinary photographic emulsions, experimenters switched to a commercial plastic that was sensitive only to the heavy charged particles (Woodward 1989, 411).




A compensation technique can be used when the confounding factor operates uniformly. The signal's fluctuation can then be used to convey information about the specimen's attributes, assuming other experimental obstacles are overcome (Franklin 1986).

However, a skeptic might argue that any aspiration for a completely unequivocal system becomes hopeless because the components may deteriorate, the technician may err, and the external influences may be undetected. Dretske (1981, chap. 5) correctly responds that the logical possibility of equivocation of the signal does not by itself warrant the reasonable likelihood of such equivocation. Consider the channel conditions necessary for the current flowing through a voltmeter. The pointer would be equivocal with respect to the measured voltage if the resistance of the leads varied. But electromagnetic theory shows that the leads will have the same resistance over a short period of time. The fact that the electromagnetic theory may be incorrect, that the apparatus may malfunction, and that extraneous factors may interfere with the voltage merely shows that before using the instrument to measure voltage the experimenter must acquire more information about the system's integrity. The skeptical experimenter shows signs of neurosis if the channel conditions are repeatedly checked beyond necessity (ibid., 115-116).

The instrument's designers typically address confounding factors by maximizing the signal-to-noise ratio. Noise can be ignored for those instruments with a high ratio of signal to noise. This strategy is based on the definition of the signal-to-noise ratio:

S/N = 10 log (Vs²/Vn²),

where Vs is the signal voltage and Vn is the noise voltage (Strobel and Heineman 1989, 412-415).
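As a numerical illustration of this definition (ours, not the authors'), the ratio in decibels can be computed directly; the voltages below are hypothetical:

```python
# Minimal sketch of the signal-to-noise ratio defined in the text:
# S/N in decibels is 10*log10(Vs^2/Vn^2), equivalently 20*log10(Vs/Vn).
import math

def snr_db(signal_voltage, noise_voltage):
    """Signal-to-noise ratio in decibels for the given voltages."""
    return 10 * math.log10(signal_voltage**2 / noise_voltage**2)

print(snr_db(1.0, 0.001))   # ≈ 60 dB: a strong, clean signal
print(snr_db(1.0, 1.0))     # 0 dB: signal indistinguishable from noise
```

Because the scale is logarithmic, each additional 10 dB corresponds to a tenfold increase in signal power relative to noise, which is why designers can often simply ignore noise in instruments with a high ratio.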

According to Shannon's fundamental theorem, when the rate of information transfer is less than channel capacity, the information can be "coded" in such a way that it will reach the receiver with arbitrarily high fidelity. Although the degree of reliability is never absolute, doubt can be reduced to an exceedingly small increment (Massey 1967, 50-52).

Let us apply Shannon's theorem to the equivocation of a noisy channel. Assume that the capacity C of a noisy channel is defined as the maximum rate at which useful information can be transmitted over the channel. Assume also that the entropy H is the measure of the information per symbol at the source of messages. If C for some noisy channel is equal to or larger than H for that channel, then the output of the source can be transmitted over the channel with little error. Although some uncertainty must remain, error can be significantly minimized by devising appropriate coding systems (Weaver 1964, 20-22).
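The C ≥ H condition can be made concrete with the simplest textbook noisy channel, the binary symmetric channel, whose capacity is C = 1 - H2(p) bits per use, where p is the bit-flip probability and H2 the binary entropy function. This example is ours, not the authors'; the source entropy chosen below is hypothetical.

```python
# A hedged illustration of Shannon's theorem for the binary symmetric
# channel (not an example from the paper): each bit is flipped with
# probability p, and the channel capacity is C = 1 - H2(p) bits per use.
import math

def binary_entropy(p):
    """H2(p) = -p*log2(p) - (1-p)*log2(1-p), with H2(0) = H2(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

# If the source entropy H (bits per symbol) is at most the capacity C,
# Shannon's theorem says suitable coding makes the error rate arbitrarily small.
source_entropy = 0.5   # hypothetical source
for p in (0.0, 0.01, 0.25):
    c = bsc_capacity(p)
    print(f"p={p}: C={c:.3f}, reliable transmission possible: {source_entropy <= c}")
```

As p grows, capacity falls; at p = 0.5 the channel is pure noise (C = 0) and no coding scheme can rescue the source, which is the limiting case of the equivocation discussed above.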




5. The Data/Phenomena Dichotomy by Bogen and Woodward. Bogen and Woodward (1988, 1992) argue that the data/theory dichotomy should be replaced by the data/phenomena distinction. Phenomena, unlike data, are candidates for explanation and prediction by general systematic theory. Phenomena have stable recurring features produced regularly by some manageable set of factors. The same phenomena should be detectable in a variety of apparent ways not subject to significant fluctuation. To detect a phenomenon one must identify a relatively stable and invariant pattern of some simplicity and generality with recurrent features (Woodward 1989, 393-398).

In contrast, data register on the measurement or recording device in a form accessible to human perceptual systems. Data serve as evidence for claims about phenomena. Although data depend on causal factors from a variety of physical sources, many factors are idiosyncratic to details of the experiment. The evidential function of data is secured by specific procedural requirements, such as the control of possible confounding effects and systematic error, replicability, overcoming problems of data reduction and statistical analysis, and calibration and empirical investigation of equipment. Data have no theoretical import in themselves, except insofar as data constitute evidence for the existence of phenomena. Data are neither the explananda of theoretical systems, nor the subject of systematic predictions (Bogen and Woodward 1988, 315-322).

However, the conclusion that data are not candidates for explanation by systematic theory rests on a misleading portrayal of the data/theory relationship. In particular, theoretical models are essential for reliable data. Reliability requires access to underlying causal mechanisms for the production of data, and such mechanisms are conveyed by background theories. This function of background theories is apparent when new theoretical insights enhance data productivity. Thus, the instrument's success at exposing unknown properties is tied directly to the capacity of scientists to extend theoretically iconic models of natural events to artificial contexts.

Bogen and Woodward recognize the complex causal chain that underlies the sequence from specimen to data. According to Woodward, however, such a causal chain by itself does not constitute explanation, which requires both generality of causal mechanisms responsible for the explanandum-event and a unification of phenomena within a general pattern (Woodward 1989, 400-403). The context of instrumental design exposes the vital contribution of iconic models to the phenomena/data interaction, and shows how theoretical explanation of the detection signal is required for reliable data.

Furthermore, the argument by Bogen and Woodward reflects a type-token confusion. If phenomena have recurring features produced regularly by some small set of factors, as Woodward states, then the notion of phenomena is that of an organized type that is instantiated by specific specimens (tokens) under scrutiny. If data assume inherently singular instances of experimental environments, then data are obviously tokens of some pattern (type). But the claim that phenomena and not data are candidates for theoretical explanation is trivialized by the contrast between phenomena as types and data as tokens. Patterned data, such as data structures, are subject to theoretical explanation.

6. Toward an Experimental Realism. For van Fraassen experimentation in physics requires that scientists fill in theoretical blanks, based primarily on the theory's empirical adequacy, with information ostensibly about electrons, neutrons, and so on. Such information reflects theoretical gaps, only; no epistemic access to an unobservable realm is warranted (van Fraassen 1980, 75). Metaphysical commitment to unobservable structures epistemically compares to belief in the influence of spiritual forces on human behavior. The theory's empirical content, its methodological evaluation by empirical adequacy, and its intended scope of application all rest on the principled identification of observable entities. Nevertheless, that which is observable must be a theory-independent question. To avoid vicious methodological circles in science, the observable/unobservable distinction is neither theory-dependent nor theory-relative (ibid., 57-58).

However, there simply are no theory-neutral observables or unobservables within the arena of scientific inquiry. Experimenters readily speak of certain hypothetical entities as unobservable relative to the state of knowledge at a given time, and restricted by the current theoretical understanding for a community of scientists. The claim that some phenomenon is instrumentally observable assumes a wide range of theoretical insights.
The discovery in 1981 of the scanning tunneling microscope enabled scientists to detect molecules at a magnification of 10^7. But it would be fruitless to criticize seventeenth-century atomists for proclaiming atoms as the unobservable corpuscles of matter. Atoms became observable only after 1981.

Much of the realism/antirealism debate this century rests on an incorrect demarcation between observable and unobservable realms. The antirealist's proscription against exploring the (in-principle) unobservable realm constitutes an arbitrary constraint on the explanatory power of scientific inquiry. Similarly, the naive realist's aspiration for unveiling the (in-principle) unobservable causal forces also suggests an arbitrary identification of a priori unobservable entities. Again, any theory-neutral observable/unobservable distinction assumes an unwarranted essentialist demarcation between ostensibly distinct realms of nature.





Hacking (1983, 265) dismisses van Fraassen's antirealism for unnecessarily restricting the experimenter's practice within instrumentation. From his practice-oriented epistemology Hacking defends the reality of electrons, for example, on the grounds that electrons can be instrumentally manipulated as tools for exploring other processes. An entity realism is grounded on the technician's manipulability of real physical events. Entity realism is not grounded on the reality of theoretical constructs per se, since a theoretical realism transcends the "home truths" of low-level generalization familiar to practicing experimenters. Engineering, not theorizing, exposes nature's secrets (ibid., 263).

Hacking's attempt to cleanse the engineer's practice of the theoretician's abstractions conveys similarities to van Fraassen's antirealism. For both philosophers the criterion for reality is fundamentally nontheoretical. But even on Hacking's own terms of a praxis epistemology, the experimenter's low-level generalizations are intimately grounded on theoretical models of higher generality (Morrison 1990). Again, innovative instrumental designs usually reflect the advanced state of theoretical knowledge for a wide array of domains of inquiry, a point Hacking seriously underestimates. Hacking's argument that a technician can manipulate the apparatus without theoretical background is misleading and epistemologically uninformative; most facets of instrumental design, calibrations of measurements, and data analysis rest on acceptance by the scientific community at large of causal models of physical reality.
Without this acceptance, the experimenter should lack confidence in the very manipulability of entities during instrument usage, and the technician serves no epistemic function.

The manipulability of entities for the purpose of interfering with hypothetical processes constitutes a vital component of most contemporary instruments, but Hacking's use of manipulability as a criterion of reality artificially demarcates theory and practice. For example, if manipulability warrants existence of electrons, scientists can legitimately countenance the specimen's chemical composition precisely because of the inescapable theory-laden character of manipulability.

The antirealist cannot explain the capacity of instruments to span extraordinary epistemic distances. Within either macroscopic or microscopic dimensions, scientists' access to unknown properties is explained by the existential continuity (Harré 1961, 54) from data to the specimen's physical structure. Such a continuity is grounded on the causal sequence of physical events within the instrument. As a result of this causal sequence, the instrument displays the markings of reference for some specimen. An experimental realism of the specimen's structure is warranted because real physical processes are nomically nested within the interpreted data. But this experimental realism does not commit the fallacy of reverse

