Top Banner
Journal of Neuroscience Methods 174 (2008) 245–258 Contents lists available at ScienceDirect Journal of Neuroscience Methods journal homepage: www.elsevier.com/locate/jneumeth A flexible software tool for temporally-precise behavioral control in Matlab Wael F. Asaad a,b,, Emad N. Eskandar a,b a Department of Neurosurgery, Massachusetts General Hospital, Boston, MA 02114, USA b Program in Neuroscience, Harvard Medical School, Boston, MA 02115, USA article info Article history: Received 9 May 2008 Received in revised form 30 June 2008 Accepted 16 July 2008 Keywords: Neurophysiology Psychophysics Matlab Behavioral control Software Cognition Human Monkey abstract Systems and cognitive neuroscience depend on carefully designed and precisely implemented behavioral tasks to elicit the neural phenomena of interest. To facilitate this process, we have developed a software system that allows for the straightforward coding and temporally-reliable execution of these tasks in Matlab. We find that, in most cases, millisecond accuracy is attainable, and those instances in which it is not are usually related to predictable, programmed events. In this report, we describe the design of our system, benchmark its performance in a real-world setting, and describe some key features. © 2008 Elsevier B.V. All rights reserved. 1. Introduction Interesting neural data are often the products of well-designed, psychophysically-rigorous behavioral paradigms. The creation and execution of these behavioral tasks relies upon a small range of applications that run on a relatively narrow range of software and hardware (Hays et al., 1982; White et al., 1989–2008; Ghose et al., 1995; Maunsell, 2008). The strengths and weakness of each applica- tion reflect the types of behaviors studied at the time of their initial development. Too often, the transition towards different types of behavioral tasks strains the flexibility of these programs, and cum- bersome workarounds layer successively upon one another. Recently, however, the performance of even a higher-level programming environment, specifically Matlab, has been demon- strated to be adequate for behavioral control at the 1 ms time-scale (Meyer and Constantinidis, 2005; Asaad and Eskandar, 2008). Thus, although no software running on Windows can attain truly deter- ministic, hard-real-time performance (Ramamritham et al., 1998), such software can nevertheless deliver high (not perfect) temporal reliability. Given those data, we now focus on the design, real-world performance, and usability such a system can achieve. Corresponding author at: Department of Neurosurgery, Edwards Building, Room 426, 55 Fruit Street, Massachusetts General Hospital, Boston, MA 02114, USA. Tel.: +1 617 905 7691; fax: +1 617 726 2310. E-mail address: [email protected] (W.F. Asaad). In particular, we sought to harness the Matlab high-level pro- gramming environment to allow the quick and efficient coding behavioral tasks. By creating a system that has minimal program- ming overhead, we hoped to allow users to focus on the essential features of experimental design and the basic elements of behav- ioral control and monitoring rather than on the often arcane details of the video presentation and data acquisition hardware. Our major goals were: To allow behavioral control with high temporal precision in Mat- lab. To allow straightforward scripting of behavioral tasks using stan- dard Matlab syntax and functions. 
To interface transparently with data acquisition hardware for input / output functions, such as eye-signal, joystick and button- press acquisition, reward delivery, digital event marker output, as well as analog and TTL output to drive stimulators and injectors. To allow the full reconstruction of task events from the behavioral data file by including complete descriptions of behavioral perfor- mance, the event markers and their text labels, the task structure, and the actual stimulus images used; as a demonstration of this goal, to allow the re-playing of any given trial from the behavioral data file alone. To provide the experimenter with an information-rich display of behavioral performance and to reflect task events in real-time to aid the assessment of on-going behavior. 0165-0270/$ – see front matter © 2008 Elsevier B.V. All rights reserved. doi:10.1016/j.jneumeth.2008.07.014
14

Art. WF Assad - A Flexible Software Tool for Temporally-precise Behavioral Control in Matlab

Sep 24, 2015

Download

Documents

Paul Rodrigo

Systems and cognitive neuroscience depend on carefully designed and precisely implemented behavioral
tasks to elicit the neural phenomena of interest. To facilitate this process, we have developed a software
system that allows for the straightforward coding and temporally-reliable execution of these tasks in
Matlab.We find that, in most cases, millisecond accuracy is attainable, and those instances in which it is
not are usually related to predictable, programmed events. In this report, we describe the design of our
system, benchmark its performance in a real-world setting, and describe some key features.
Welcome message from author
This document is posted to help you gain knowledge. Please leave a comment to let me know what you think about it! Share it to your friends and learn new things together.
Transcript
  • Journal of Neuroscience Methods 174 (2008) 245258

    Contents lists available at ScienceDirect

    Journal of Neuroscience Methods

    journa l homepage: www.e lsev ier .com

    A exib be

    Wael F. Aa Department ob Program in N

    a r t i c l

    Article history:Received 9 MaReceived in reAccepted 16 Ju

    Keywords:NeurophysioloPsychophysicsMatlabBehavioral conSoftwareCognitionHumanMonkey

    pendf intewardillisecprogra rea

    1. Introduc

    Interestipsychophysexecution oapplicationhardware (H1995;Mauntion reectdevelopmenbehavioral tbersome wo

    Recentlyprogrammistrated to b(Meyer andalthough noministic, hasuch softwareliability. Gperformanc

    Correspon426, 55 Fruit S+1 617 905 769

    E-mail add

    0165-0270/$ doi:10.1016/j.jntion

    ng neural data are often the products of well-designed,ically-rigorous behavioral paradigms. The creation andf these behavioral tasks relies upon a small range ofs that run on a relatively narrow range of software andays et al., 1982; White et al., 19892008; Ghose et al.,sell, 2008). The strengths andweaknessof eachapplica-the types of behaviors studied at the time of their initialt. Too often, the transition towards different types ofasks strains the exibility of these programs, and cum-rkarounds layer successively upon one another., however, the performance of even a higher-levelng environment, specically Matlab, has been demon-e adequate for behavioral control at the 1ms time-scaleConstantinidis, 2005; Asaad and Eskandar, 2008). Thus,software running on Windows can attain truly deter-

    rd-real-time performance (Ramamritham et al., 1998),re can nevertheless deliver high (not perfect) temporaliven those data,wenow focus on thedesign, real-worlde, and usability such a system can achieve.

    ding author at: Department ofNeurosurgery, Edwards Building, Roomtreet, Massachusetts General Hospital, Boston, MA 02114, USA. Tel.:1; fax: +1 617 726 2310.ress: [email protected] (W.F. Asaad).

    In particular, we sought to harness the Matlab high-level pro-gramming environment to allow the quick and efcient codingbehavioral tasks. By creating a system that has minimal program-ming overhead, we hoped to allow users to focus on the essentialfeatures of experimental design and the basic elements of behav-ioral control andmonitoring rather than on the often arcane detailsof the video presentation and data acquisition hardware. Ourmajorgoals were:

    To allow behavioral control with high temporal precision in Mat-lab.

    To allow straightforward scripting of behavioral tasks using stan-dard Matlab syntax and functions.

    To interface transparently with data acquisition hardware forinput / output functions, such as eye-signal, joystick and button-press acquisition, reward delivery, digital eventmarker output, aswell as analog and TTL output to drive stimulators and injectors.

    To allow the full reconstruction of task events from the behavioraldata le by including complete descriptions of behavioral perfor-mance, the eventmarkers and their text labels, the task structure,and the actual stimulus images used; as a demonstration of thisgoal, to allow the re-playing of any given trial from the behavioraldata le alone.

    To provide the experimenter with an information-rich display ofbehavioral performance and to reect task events in real-time toaid the assessment of on-going behavior.

    see front matter 2008 Elsevier B.V. All rights reserved.eumeth.2008.07.014le software tool for temporally-precise

    saada,b,, Emad N. Eskandara,b

    f Neurosurgery, Massachusetts General Hospital, Boston, MA 02114, USAeuroscience, Harvard Medical School, Boston, MA 02115, USA

    e i n f o

    y 2008vised form 30 June 2008ly 2008

    gy

    trol

    a b s t r a c t

    Systems and cognitive neuroscience detasks to elicit the neural phenomena osystem that allows for the straightforMatlab. We nd that, in most cases, mnot are usually related to predictable,system, benchmark its performance in/ locate / jneumeth

    havioral control in Matlab

    on carefully designed and precisely implemented behavioralrest. To facilitate this process, we have developed a softwarecoding and temporally-reliable execution of these tasks inond accuracy is attainable, and those instances in which it isammed events. In this report, we describe the design of ourl-world setting, and describe some key features.

    2008 Elsevier B.V. All rights reserved.

  • 246 W.F. Asaad, E.N. Eskandar / Journal of Neuroscience Methods 174 (2008) 245258

    2. Materials and methods

    Our tested systemwas composed of aDell Computerwith a Pen-tium Core 2containingsystemwasmond, WA)nVidiaQuaddual-headein full-screementers coof 102476suring 15 inthe tests repThe experimduring behaof the subje

    Matlab sMA), includcessing Tootestedhere,tions comprare directlynon-multi-display (creOpenGL witplay, low-leMicrosoft CPerry at theMatlab withmatlab n

    Anoptim(Asaad andsor time thIn addition,effectively dtask by othtrials weredows (Readuring thetime to exeaccessible tsoftware).

    Behavioacquisitiontwo PCI-62a BNC-2090ThesewereAlthough th(Asaad andcapable of dlevel behav

    We splitboards to athesedata. Tboard woulstorage of s(Asaad andically detecallocate one

    Digital eacquisitionthan the digour systemEskandar, 2

    To assess the performance of our software, we analyzed datafrom the on-going training of a subject. Specically, we employed asimple behavioral task in which a rhesus monkey (macaca mulatta,

    .1 Kgto led-erhead foresenic resata fere cof tThises, those ommilog Xof annning use behcy iss sucller foralamoe (tidard(theionsbloc

    ults

    t weentiae thtextt tese so

    ign

    inteo mringthe swo a

    ulusts toor anevertput)are si avahe wth ththanadv

    his tby

    trialDuo processor (model 6300) running at 1.86GHz and1GB of RAM (Dell Inc., Round Rock, TX). The operatingMicrosoft Windows XP, service pack 2 (Microsoft, Red-. The graphics hardware in this machine consisted of anroNVS285with256MBofvideoRAM.Output fromthisdgraphics cardwas split to twosubject displays runningnmode at pixel resolutions of 800600, and an experi-ntroldisplay, running inwindowedmodeat a resolution8. The displays were standard cathode-ray tubes mea-ches in the diagonal, also from Dell. The refresh rate fororted herewas 100Hz, and videowas double-buffered.enters displaywindowwas set to update every 100msvioral monitoring to allow near-real-time observationcts performance.oftware (version r2007b, The Mathworks Inc., Natick,ing the Data Acquisition Toolbox and the Image Pro-lbox, was used to write the behavioral control softwareand to analyze the timingdata reportedbelow.All func-ising our softwarewerewritten as simple .mles thateditable by any user. Matlab was run in the default,

    threaded mode. Matlab gures for the experimentersated using the built-in graphics functions) relied uponh hardware acceleration enabled. For the subjects dis-vel routines for video control (based on DirectX fromorp.) were obtained through the generosity of Jeffrey S.University of Texas at Austin. All tests were run withinthe Java Virtual Machine disabled (launched by typing

    ojvm at the windows command prompt).ized systemprolewas createdasdescribedpreviouslyEskandar, 2008) to minimize the amount of proces-at could be stolen by other applications and devices.increasing the process priority of Matlab in Windowsecreased the amount of time stolen fromthebehavioraler applications (Asaad and Eskandar, 2008). Therefore,run at the highest process priority allowed by Win-l Time), and the priority was lowered to Normalinter-trial-intervals to allow other pending activitiescute (Note that setting the process priority for trials iso the user through an option in the main menu of our

    ral signals were monitored using two identical databoards (a.k.a., DAQ boards) from National Instruments:29 multi-function DAQ cards were each connected toa break-out box (National Instruments, Austin, TX).interfaced toMatlabusing theDataAcquisitionToolbox.is toolbox is not intended for real-time control, our testsEskandar, 2008) have suggested that it is neverthelesselivering the performance necessary for millisecond-

    ioral monitoring.the incoming behavioral signals into two analog inputllow more rapid sampling and simultaneous storage ofhis is because logging and samplingdata fromthe samed be subject to upload delays caused by the temporaryamples in the acquisition boards local memory bufferEskandar, 2008). An option in our software automat-ts the presence of two identical DAQ boards, and willfor storage and one for on-line sampling.vent markers were written to a separate neural datasystem (Plexon,Dallas, TX) using theparallel port ratherital outputs on the DAQ card because we found that, on, the parallel ports were signicantly faster (Asaad and008).

    male, 6neededtrial-anusing arequirecue prsaccadused.Dday) wso onebelow.all timand thUse Co

    AnasistedMA) rutrackin(uniquaccurasystemcontrobehavi

    Forthe cod(a stanFig. 4condithandle

    3. Res

    Firsthe potexaminthe conabstracdescrib

    4. Des

    Theonly twmonitotive ofthese teach.

    Stimof inpubehaviany of slog ouThesestimullar to tbut wiratherhas theview, tdenedticular) was presented with four objects simultaneously, andarn to pick one of the four at the end of a short delay byror. To allow eye-tracking, head xation was achievedd-post system (Judge et al., 1980). Visual xation wasa total of2.5 s (1 sof initialxation followedbya500mstation and then a 1 s delay) before the execution of aponse to select a target. An inter-trial-interval of 2 swasromthree consecutivedaysof training (one sessioneachollected and conrmed to yield nearly identical results,hese sessions was chosen arbitrarily for presentationsession consisted of 1601 trials over 2h and 53min. Ate animal was handled in accord with NIH guidelinesf the Massachusetts General Hospital Animal Care andttee.& Y position signals conveying behavioral output con-optical eye-tracking system (Iscan, Inc., Burlington,

    g at 120Hz. Although the relatively slow speed of eye-ed here did not necessitate millisecond-level accuracyavioral sampleswere available only every 8.3ms), 1msnevertheless the standard to which behavioral controlh as ours aspire. A joystick (JC200 multi-axis inductiverom Penny & Giles, Dorset, U.K.) was used to test thetask during initial scripting.re straightforwarddemonstration, a schematicdiagram,ming script) and the conditions le for a simpler taskdelayed-match-to-sample, or DMS, task) is shown intask actually tested, described above, consisted of 96and more lines of code to assess on-line learning andk selection).

    describe the basic design strategy of the software andl strengths andweaknesses of our approaches.We thene temporal performance of the software, specically inof an actual behavioral task (rather than as in the morets described in Asaad and Eskandar, 2008). Finally weme features intended to enhance usability.

    ractive structure of any behavioral task is dened byain activities: stimulus presentation and behavioral(corresponding to input and output, from the perspec-ubject). As such, our software is designed to facilitatectivities by providing one function corresponding to

    presentation consists of the activation or inactivationthe subject that are intended to drive or constrain

    d/or neural activity. Stimuli can be delivered throughalmodalities, including visual, auditory, electrical (ana-, or chemical (digital or analog output to an injector).pecied by the experimenter in a table that lists theilable for each trial type, or condition (Fig. 3b), simi-ay that is done in CORTEX (White et al., 19892008),e direct specication of stimulus objects in this lethrough an index into another items le (while thisantage of placing all the stimulus information in directable can appear overly-dense at times). A condition isthe collection of stimuli that are employed for a par-, or the contingencies that determine the timing and

  • W.F. Asaad, E.N. Eskandar / Journal of Neuroscience Methods 174 (2008) 245258 247

    Fig. 1. Programat which blockTable 1 (the ev

    manner inindividual tplished throwith argumoff.

    While than instantasist in time)that expliciof variablesprocessed uof time hassist of electjoystick-posioral monitactivities: 12) waiting fsoftware, thitoring functhese two aget(s), the tachieve theschematic. The minimum elements that must be provided by the user are marked withs change, the selection of blocks, and the selection of conditions. The darker portions ofentmarker function is not shown, but would appear intermixed with toggle and track).

    which these stimuli are presented. Then, within anyrial, the activationor inactivationof a stimulus is accom-ugh a call to the stimulus presentation function, toggle,ents corresponding to the stimuli to be turned on or

    e activation or inactivation of a stimulus is treated likeneous event (even though the stimulus itself may per-, behavioralmonitoring, on the other hand, is an activitytly species the passage of time. Repeated observationsreecting a subjects behavioral or neural output arentil a certain condition is met or a specied amountelapsed. These variables, in practice, generally con-rical signals corresponding to such things as eye- orition, button presses, etc. Formally, the task of behav-oring can be subdivided into just two complementary) waiting for a signal to enter a target range, or goal oror a signal to leave a target range, or goal. Within ourese tasks are accomplished through a call to the mon-tion, track, with arguments corresponding to which ofctivities is required, the signal to be monitored, the tar-hreshold around the target(s), and the time allowed togoal.

    When trioral signalbe speciedtion. The tragainst oneized naturebe a scalarwas acquire

    A task istoggle and tra subjectsto be delivecalls these fconditionstypes of triover-all pro

    Calling tessary low-of the loop scalibrated cpositions ofbased uponboxes. In addition, users can write Matlab scripts to control the timethe arrows correspond to the entry and exit times measured in

    acking for target acquisition (i.e., waiting for the behav-to enter a particular goal range), multiple stimuli canas potential targets using standardMatlab vector nota-

    ack function can check the behavioral signals positionor many targets very efciently because of the vector-of theMatlab language. The output of this functionwillindicating which target was acquired, or zero if noned.constructed simply by interleaving these two functions,ack, within the requisite conditional structures relatingbehavioral or neural output to the appropriate stimulired. Concretely, the usermustwrite aMatlab script thatunctions at the appropriate times, and provide a table ofthat indicates which stimuli are to be shown on whichals. The way these user-provided elements t into thegram ow is shown in Fig. 1.he track function invokes a loop that performs the nec-level functions tomonitor the selected signal. Each cycleamples thebehavioral signal, transforms this signal intooordinates, and compares these coordinates against thethe possible targets. All of this is done automatically,the parameters specied initially by the user. This is

  • 248 W.F. Asaad, E.N. Eskandar / Journal of Neuroscience Methods 174 (2008) 245258

    in contrast to many existing behavioral control software packagesthat leave the programming of this loop partly or entirely up tothe user. We prefer the former approach for a several reasons. First,the executiso requiringdeal of dupacross diffewith someon thatmacduced meanspecial-purher specicterms of cyultimate bepose loop isscale (see s

    This strpunctuateappearanceHowever, thtemporal inpresentatiomonitoringmanner nottion. In adtrack and toing betweemonitoringporate movtemporal pthough this

    Althougfunctionalitare providebehavioralldening int

    Becauserich syntaxIn particulaideal for thinto these sbehavioralcorrect resp

    where t is tn correct tover the lasthreshold, m

    where r iszero otherwonly a certonly the cuneeded:

    se arpli

    ractivoralehavs veof thcord

    form

    nera

    he beializeo buAQtedtookhe entampsibleThesal ense thointse theI.T.Itry ase dain Trial,nd e

    ns aore, ft triwhi

    ppropeterudesusce).ing eelecmenscrion of this loop is a core feature of any behavioral task,each user to code it independently results in a great

    licated effort. Second, because the loop is standardizedrent tasks, the performance of a task can be estimated,degree of certainty, based on previous experiments runhine (for example, on ourmachine, all tasks tested pro-cycle rates between 900 and 1050Hz). Lastly, while a

    pose loop coded by an individual user tailored to his orrequirements is likely to offer superior performance, incles executed per second, this is not likely to be of anynet in most cases, as the cycle rate of this general pur-sufciently high to monitor behavior at the 1ms time

    ection 5, below).ategy works well when relatively infrequent eventsbehavioral tracking, such as the appearance and dis-of stimuli at several hundred millisecond intervals.is approach becomes increasingly cumbersome as thetervals between behavioral monitoring and stimulusns decrease. For example, presenting a movie whilebehavior requires interleaving these functions in aeasily afforded by a single, general-purpose track func-dition, because there are entry or exit costs to theggle functions (see section 5, below), rapidly alternat-n them could produce unacceptable gaps in behavioral. Nevertheless, it may be possible to nd ways to incor-ie presentation into this function without sacricingerformance (Markus Siegel, personal communication),has yet to be conrmed.h the toggle and track functions contribute the bulk ofy for the creation of any behavioral task, other functionsd to allow for things such as digital time-stamping ofy-relevant events, marking trials as correct or incorrect,eractive hot keys, etc.scripting is done within the Matlab environment, itsand function set is available for maximum exibility.r, the vectorized nature of the Matlab language ise manipulation of trial-by-trial data that naturally falltructures. For example, suppose one wanted to followperformance to nd instances of several consecutiveonses as an indication of learning:

    he number of the rst trial that begins a sequence ofrials. Alternatively, one could take a moving averaget k trials to assess if performance has surpassed some(where 0

  • W.F. Asaad, E.N. Eskandar / Journal of Neuroscience Methods 174 (2008) 245258 249

    Table 1Function timing

    Function Mean (ms) Max (ms) First trial (ms)

    TrialEntry time 0.11 0.11 0.15Exit time 14.07 28.06 45.76

    ToggleEntry time 1.02 2.21 14.94Core time 0.18 0.29 0.91Exit time 25.85a 29.99a 26.60a

    TrackEntry time 1.15 1.29 1.16Core timeb 0.98 2.27 2.04Exit time 1.09 1.24 1.09

    EventmarkerEntry time 0.24 0.35 4.30Core time 0.50 0.69 1.40Exit time 0.01 0.11 0.46

    Inter-trial-intervalPreparation time 99.51 115.51 234.43

    The measured times based on 1601 trials of the sample task (see section 2) are pre-sented. Entry time refers to the amount of time required for initialization of eachfunction, before the execution of the essential activity. The Core time in each casereects the amount of time required to execute that activity. Lastly, the Exit timeis the amount of time required to clean up and leave the function (i.e., return controlto the user) after the core activity has completed.For the Trial row, the entry time corresponds to the time required to initialize allthe trial sub-functions (i.e., toggle, track, and eventmarker, as well as others), and toinitiate analog data acquisition. The core time here would be wholly dependent onthe users timing script specifying the timing and contingencies of the behavioraltask itself, so this is not shown. The exit time reects the time from the end of theusers script to the end of the trial, during which analog data and event markers areretrieved for storage in the local data le (eventmarkerswere also sent to a separateneural data acquisition system in real-time).In the case of the toggle sub-function, the entry time is the time required to parse theusers command options and blit the appropriate stimuli to the screens back-buffer.Then, the core activity is ipping the screen to display or extinguish the requestedstimuli. Note there will be a variable delay (excluded from the values shown here)between the completion of these entry tasks and the execution of the ip; the exactdelay is inversely linearly dependent on the time remaining until the next ip at thetime this function is called. The exit time is the time required to display the controlscreen symbols corresponding to the currently visible stimuli before returning con-trol to the user.In the case of the track sub-function, the entry time corresponds to the time requiredto parse the users command options and calculate the target thresholds. The coreactivity consists of one cycle retrieving the most recent analog data samples, trans-forming these into calibrated coordinates, and comparing these coordinates againstthose of the possible targets. The exit time here corresponds to the time required toextinguish the target rings and return control to the user.In the case of the eventmarker sub-function, the entry time is the time required toparse the users command options and convert the decimal integers to binary formfor digital output. The core time is the time required to write the digital bytes tothe neural data acquisition system (two operationswere required: rst set the valuebits, then trigger the strobe bit). The exit time is the time needed to reset the digitalstrobe bit, buffer the time-stamp for local storage, and to return control to the user.The I.T.I. preparation time is the time needed to select the next trial according tothe built-in block and condition-selection options (a user-specied function couldtake longer or shorter), load the necessary stimuli from disk to videomemory (here,six 100pixels100pixels true-color images, each le 4KB in size, as well as veprogram-generated xation dots, 8 pixels8pixels in size, were used), and updatethe control-screen graphics to reect the updated behavioral performance mea-sures (e.g., percent correct over-all, per-block, and per-condition, and reaction timesover-all and per-condition). The graphical updates constitute the bulk of this time(only 24ms are required for trial selection and stimulus preparation under theseconditions).

    a These exit times for the toggle function are modiable by the user: one canelect to skip drawing the control-screen symbols corresponding to the currentlyactive visual stimuli, in which case, the exit times averaged 0.91ms (1.77msmaximum). This option is useful when many stimuli must be presented in rapid-successionwith precise timing, orwhen a25ms delay before the next trial event issub-optimal.

    b The core times in the case of the track function shown here exclude the rstcycle, during which the initial control-screen update is performed. This is the cyclein which rings around the specied targets are drawn to the experimenters display.This one initial cycle took 22.0ms on average (26.7ms maximum).

    memory and then transferred to video RAM. In addition, gures onthe experimenters display are updated to reect the statistics ofbehavioral performance (e.g., percent correct, reaction times, etc.).When assestook aboutThis time wto be proceversely to tconstant ne

    5.2. Video p

    The abilidisplays has2005; Asaavisual stimuchophysicsof any suchscreen refrethe timeofation). In ourof time-stamrelative to tphotoresistin what folla surrogate

    Four stepcomputer dputers mempassed to aitself. In outhe inter-trciated withchosen, allgoal was tofromwithinconsideredegy is the resystem memor several laplayed in anavailable viory on modto hold at lpsychophys

    Next, thtransferredthat serves aory pointerscreen. Redping the dip could oing tearingboth perforissuance of

    In our sohe or she nto be displaof visual anappropriatedrivers arelow-level gtwo publiclof video harPerry, 2008sed using our sample task (see section 2), these events99.5ms on average (115.5ms maximum, see Table 1).ill vary, however, with the number and size of stimulissed. Therefore, the scheduled I.T.I. time is varied con-his measured preparation time to keep the actual timear the users desired value.

    erformance

    ty ofMatlab to accurately control and time-stampvideobeen described previously (Meyer and Constantinidis,d and Eskandar, 2008), and Matlab is employed forlus presentation by software widely used in the psy-

    community (Brainard, 1997;Pelli, 1997). Thekey featuresystem is the ability to accurately mark the time of thesh (ip) in order to use this information to determineppearanceof avisual stimulus (basedon its screenposi-particular system, the standard deviation of the jitterps (stored upon software detection of a vertical blank)

    he actual appearance of a stimulus (measured using aor) was 0.3ms (Asaad and Eskandar, 2008). Therefore,ows, we use software detection of the vertical blank asfor the photoresistor.s are required topresent a visual stimulusona standardisplay. First, that stimulusmust be loaded into the com-ory (usually from disk). Then, the image data must be

    memory buffer created for that image on the video cardr software, these rst two steps are performed duringial-interval. Every trial type (condition) can be asso-one or more possible stimuli, and once a condition isof its stimuli are loaded into video RAM. Because ourminimize the amount of time needed to present stimulia trial itself, performing these steps during the I.T.I.wasthe best option. The potential disadvantage of this strat-latively smaller amount of video memory compared toory; it is possible that a trial using many large imagesrge movies (not all of which would necessarily be dis-y one trial, but all must be available) could exhaust thedeo memory. Fortunately, the amount of video mem-ern graphics cards (currently 128 to 512MB, enougheast several hundred medium-sized stimuli typical ofics experiments) is sufcient in most cases.e particular stimuli to appear at a given time must befrom video RAM to a specialized area of video memorys a screenbuffer (an operation termeda blit). Amem-can then indicate which buffer is currently the activeirecting this pointer to a newbuffer is described as ip-isplay. We kept the vertical sync enabled so that thisccur during only the vertical blank interval, prevent-artifacts. These processes of blitting then ipping aremed at the time a stimulus is called up to appear (at thea toggle command).ftware, these events are largely hidden from the user;eeds to be concerned only with selecting the image(s)yed, dening the screen position of each in degreesgle, and then toggling the image(s) on or off at thetimes. Because graphics hardware and the associatedconstantly changing, our system interfaces with theraphics drivers via a single gateway function. At leasty-available software packages allow low-level controldware from within Matlab (Brainard, 1997; Pelli, 1997;). All calls to video hardware act through this single

  • 250 W.F. Asaad, E.N. Eskandar / Journal of Neuroscience Methods 174 (2008) 245258

    function to execute the basic steps of visual stimulus presentation.If future generations of video hardware and associated control soft-ware necessitate a different method of interaction, changes will berequired inMatlab funcminimizing

    The temin Table 1. Nby the usertime) wasever, becausblank, a vardelay to theindicating t

    By defauplay with rstimuli befconsumingThe user caexit time av

    5.3. Behavi

    Behaviomore analocoordinatescursor), comintermittenthe experimthe ability trequired. Sobe assessedond. We fo1kHz, thereat about onepling intervthan 2ms wPreviously,cycle durinEskandar, 2trackingepothan 2.3mscall to the trlatency was

    The cyclThe distribuin Fig. 3b (oThere is ammodes (thoposition ofwas update

    Yet, simpthat these ein close temating systemthe plottingperiods of uate this posbehavioral tthan 1.5msshortest intAnother waa histogramlatency evelatencies in

    easur thebrate oyclescle raterformes (>2te thaal typrage cess timion tricycle

    ect br700 a

    t increase above the following bins or 0.11ms in absolute

    ore parametric way of showing this, at least for adjacentis depicted in Fig. 3d. Here, no threshold was applied toa. Rather, each cycle time is plotted against the subsequentelding two interesting observations. First, the slope of thentally-orientedpointswasabout12%, equivalent to the result3b, and the slope of the vertically-oriented points was aboutative to the vertical), showing that there is indeed a veryendency for increased cycle times in those cycles preceding-latency event. Second, multiple modes are clearly visible.ster that appears between 1.2 and 1.3ms (labeled as modesists nearly entirely of points corresponding to the secondithin each tracking period. The lowest mode (below 1.2ms)ed 99.0% of all points (because of a ceiling density effect inure, the relative magnitudes of these populations are moreappreciated in the logarithmic plot of Fig. 3a).se data conrm that behavioral tracking is generally verywith a period of about 1ms on our tested system, and cyclescreased latency occur at predictable times with respect tommed events. Importantly, over all 1601 trials lasting nearlyrewasnot a single non-initial cycle timegreater than2.3ms.

    ability

    ask scripting4 shows the elements necessary for constructing a simpled-match-to-sample (DMS) task. This task requires the sub-maintain xation throughout an initial xation period, auent cue period, and nally a brief delay. Two pictures areresented simultaneously and the subject must pick the oneonly this gateway routine (written as a directly editabletion) to interface with the updated drivers, in this waythe time and labor required.poral performance of the toggle function itself is shownote that the time from when this function is invokedto when it is ready to ip (i.e., the function entry

    always less than 2.3ms (except in the rst trial). How-e the ip will occur only at the time of the next verticaliable delay ensues. In our case, running at 100Hz, theip was always between 0 and 10ms, and never more,hat there were no skipped frames.lt, the toggle function updates the experimenters dis-ectangles reecting the size and color of the visualore returning control to the user. This is a time-event, taking 25.9ms on average (30ms maximum).n elect to skip this step, in which case this functionserages 0.9ms (1.8ms maximum).

    oral monitoring performance

    ral monitoring most often involves sampling one org signals, transforming those signals into calibrated(e.g., angle of gaze or screen position of a joystickparing those coordinates against possible targets, andtlyupdating a representationof thebehavioral signal onenters display. For most psychophysical applications,o perform these steps about once every millisecond is, at a rst approximation, the speed of a system canby counting the number of cycles executed per sec-

    und mean cycle rates on our test system approachedby suggesting that behavioral signals can bemonitoredpermillisecond, on average (Fig. 2). Furthermore, sam-als were generally regular; specically, delays greaterere rarely observed (99.9% of cycles were under 2ms).we identied a temporal cost associated with the rstg which a screen update was requested (Asaad and008). Here, that update is called on the rst cycle of eachch.Data fromthis task conrms that cycle timesgreaterwere observed exclusively on the rst cyclewithin eachack routine.Within this rst cycle, the highest recorded26.7ms.

    e times for a typical tracking epoch are shown in Fig. 3a.tion of individual cycle times across all trials is shownnly the rst cycle during each track call is excluded).ulti-modal distributionof cycle timeswhere thehighestse above 1.5ms) corresponded to cycles in which thea behavioral trace (a simple dot) on the control screend (in our case, this occurred about once every 100ms).ly because increased cycle timeswere rare, it is possiblepisodes of increased latencywere occasionally groupedporal proximity (as could be due to a burst of high oper-activity, or to on-going background activity related toof the behavior trace), such that there were sporadicnacceptably infrequent behavioral sampling. To evalu-sibility, we examined the data from 1600 trials of ourask (all but therst) andnotedall cycle latencies greater(i.e., those cycles in the higher mode of Fig. 3b). Theerval between such high-latency events was 81 cycles.y of representing this data is shown in Fig. 3c. Here,of cycle latencies is plotted relative to every high-

    nt (at time zero). There is only a small tendency forthe immediately following cycle to be increased (11.6

    Fig. 2. Mtrial ovethe cycleing 913 cmean cycycles pcycle rattems. Noother tri(i.e., aveatively lno-xatrange ofthe subjbetween

    percentime).

    A mcycles,the datone, yihorizoin Fig.2% (relslight ta highThe clu2) concycle wcontainthis gclearly

    Theregularwith inprogra3h, the

    5.4. Us

    5.4.1. TFig.

    delayeject tosubseqthen pred Cycle-rates This histogram shows the mean cycle-rates for eachehavioral session.Aspreviouslydescribed (AsaadandEskandar, 2008)f the rst trial is typically lower, asmarked herewith a dot, represent-per second. The next-slowest trial averaged 941 cycles per second. Thee across all these trials except the rstwas 960. These rates include alled within the track function, including the rst. Signicantly faster000 per second) have been observed on newer-generation PC sys-t only correct-choice and incorrect-choice trials are shown, becausees placed different demands on stimulus presentation and trackingycle-rates for break-xation trials tended to be slightly lower, as rel-e was spent in tracking than was spent in updating the screen, andals were slightly faster for the opposite reason). Over all trials, therates varied from 800 to 1020Hz except for 3 instances in which

    oke xation nearly instantaneously, resulting in average cycle ratesnd 800Hz.

  • W.F. Asaad, E.N. Eskandar / Journal of Neuroscience Methods 174 (2008) 245258 251

    Fig. 3. Behavioral tracking performance. (a) The cycle times for a typical behavioral tracking epoch are plotted. The y-axis is truncated at 3ms (the rst cycle time here is26.5ms). Note the periodically increased times corresponding to the 100ms interval betweenupdates to the behavioral trace in the experimenters display. (b) The distributionof individual cycle times across all epochs of behavioral tracking in 1600 trials (the entire rst trial, and the rst cycle in each tracking epoch of all subsequent trials, wereexcluded). Cycle time is plotted against the number of cycles on a logarithmic scale. Cycle times in the higher mode (above 1.5ms) were found to correspond exclusivelyto those cycles during which the eye-trace on the experimenters display was updated. (c) The relative distribution of high-latency events during behavioral tracking isshown. This histogram was triggered on the occurrence of behavioral tracking cycle times greater than 1.5ms (here at time 0). The lighter shaded region at the top of eachbar represents the area of the mean value the standard deviation. The cycles immediately following the high-latency instances tended to be slightly increased in time(increased relative to baseline by 11.6% or 0.11ms). The minimum interval in any trial between two high latencies, each greater than 1.5ms, was found to be 81 cycles. (d) Ascatter diagram plotting the time for each cycle against the time for the subsequent one (excluding the rst cycle in each tracking epoch). Multiple modes are more clearlyvisible in this plot, but the relative numbers within each cluster are more difcult to ascertain than in (a) because of density saturation. Mode 1 contained 99.0% of all points,corresponding to a typical tracking cycle. Mode 2 corresponded to the second cycle within each tracking period. Modes 3 and 4 corresponded to those cycles in which ascreen update request was made. No clear pattern of occurrence distinguished these last two modes.

  • 252 W.F. Asaad, E.N. Eskandar / Journal of Neuroscience Methods 174 (2008) 245258

    Fig. 4. Example construction of a simple delayed-match-to-sample (DMS) task. The over-all task design of a standard DMS task is shown in (a). The task consists of a xationperiod, sample period, delay period, and then the presentation of choices. The subjects goal is to select that object among the choices thatmatches the sample cue, bymakinga saccade to that object. The subject must xate on the central dot throughout the task until the choices are presented. (b) A conditions table describing this task. This tableallows for either of two pairs of objects to be used on any given trial: pictures A and B, or pictures C and D. Relative Frequency determines how likely a particular conditionis to be chosen, relative to the other conditions. Conditions in Block enumerates the blocks in which that particular condition can appear (for instance, running block #2would play only conditions 58, and so would use only pictures C and D). Timing File refers to the Matlab script that organizes the contingencies that relate behavioralmonitoring to stimulus presentation (as in (c), below). Here, all conditions use the same timing le. Object columns list the stimuli that can appear in each condition. Thesecan be visual objects, sounds, analog waveforms, or TTL pulses; any type of stimulus object can be triggered with the same toggle command. Note that, to simplify codingof the timing script, objects serving the same purpose are always in the same column (so, here, the sample object is always Object #2 and the target is always #3). (c) TheMatlab script constituting the timing le used for this DMS task is shown. Functions that are provided by our software are highlighted in bold. Several other functions exist fortime-stamping behaviorally-relevant events, repositioning-objects on-the-y, turning the joystick-cursor on and off, dening interactive hot keys that initiate user-denedfunctions when the a key is pressed, etc.

  • W.F. Asaad, E.N. Eskandar / Journal of Neuroscience Methods 174 (2008) 245258 253

    correspondconditions tof stimuli (Atimes at whit consists othese, 28 arthe trial if t(e.g., breakicient scriptian option wcome at the

    5.5. Interfac

    Using thconstructednals can bewords, analdigital inpufor reward dFig. 4. (Continued ).

    ing to the cue presented earlier by saccading to it. Theable (Fig. 4b) shows 8 conditions comprising two pairs& B or C &D). The timing script (Fig. 4c) determines theich stimuli appear in relation to the subjects behavior;f 43 lines of code (excluding the variable denitions). Ofe involved in simply saving an error code and abortinghe subject fails to perform the requisite steps properlyng xation). Therefore, it is possible that evenmore ef-ng could be achievedby incorporating error handling asithin the track function itself (though this would likelyexpense of a more complicated track syntax).

    ing with I/O devices

    e Matlab Data Acquisition Toolbox as a foundation, wea system whereby a set of behaviorally-relevant sig-mapped directly to specic inputs or outputs. In otherog input signals representing eye or joystick position,ts representing button presses, digital or analog outputselivery, and digital outputs for sending event markers

    to a separatof signals, cdigital) usinto an inputtion), thenthen the su(e.g., 1), bput recognisame mannports, althouser.

    This apprun on sepanections, orlong as the

    A potenbehavioralsystem. Insignals sucward, and oe neural data acquisition system, as well as other typesan be assigned to specic channels (if analog) or lines (ifg a straightforward GUI. For example, to assign a signalour output, simply select that signal (e.g., Eye X Posi-select the board (e.g., National Instruments PCI-6229),bsystem (e.g., Analog Input), and nally the channelefore clicking Assign. Thus, anyhardware input or out-zed by the Data Acquisition Toolbox can be used in theer. However, no explicit support is provided for serialugh these objects can be created and monitored by the

    roach allows the same task to be recongured easily torate machines that may have different hardware con-to run on different types of hardware altogether, so

    relevant channels or lines are visible to Matlab.tial disadvantage of this approach is that the types ofsignals that can be acquired are hard-coded into theother words, while interfacing with two-dimensionalh as eye-position and joystick-position is straightfor-ne-dimensional inputs such as buttons and levers are

  • 254 W.F. Asaad, E.N. Eskandar / Journal of Neuroscience Methods 174 (2008) 245258

    also recognized, more complicated signals that involve more thantwo dimensions of input are not directly supported (e.g., a signalrepresentingmultiple variables such as joint position or location in3-D space).acquisitionthese sortssupport forused to direfrom a sepaaspect of besubroutine.

    Becauseneural dataallel ports (is a menu osystems usethe polarityas well.

    Currentluously. Thithe duratioeral hours)possibly resbehavioraleach trial ising the I.T.Ithe analogyses on eyeneed to splisystem thatthen use dievents.

    5.6. Signal

    Behavioin either abrated fromin sequencetionwith a kthat take inmethod alsThis greatlymultiple, cl

    Most eyover time, eto the corneresult of somsion. To cou(similar toSpecicallyfrom the cocompared terrors are tof those tarmethod, weseveral hou

    5.7. Data

    In our extrol systemstimuli shomay contain(condition)

    ioral events, as well as a record of any acquired analog data. Thesenumbers are then referenced to tables stored in separate les ornotebooks that allow the reconstruction of trial events. Because

    ata ated,eldtructue nemebehaes cof theexpecreenallowis siny tooauserisinily. Ine onlproa qu

    ials aials curewine f

    perim

    minill

    -goineye-peter,n wis forant ton thw thaanalstimure

    n artentiriatemandit

    f thesnt mfounwas

    in the thinvoerfoplifyce adispe on

    ngu

    ingleexpeA user would need to create directly the Matlab dataobjects and code a behavioral monitoring loop to trackof behavioral variables. Similarly, there is currently nomonitoring multi-bit digital inputs (such as might bect the course of the behavioral task based upon inputrate computer system dedicated to monitoring somehavioral output); this would also require a user-coded

    the polarity of the read (or strobe) trigger bit on someacquisition systems is reversed relative to that of par-e.g., the Neuroport system from Cyberkinetics), thereption to invert this bit. Likewise, because some rewarda falling rather than a rising voltage to deliver reward,of this function can be reversed through amenu option

    y, analog data is not acquired into our system contin-s is because acquiring continuous analog signals forn of an experiment (typically many minutes to sev-would require intermittent transfer of that data to disk,ulting in delays at inopportune moments during thetask. Instead, analog data acquired to memory duringretrieved at the end of that trial and saved to disk dur-. A major disadvantage of this approach is the loss ofdata record during the I.T.I. Therefore, to perform anal--position during this interval, for example, one wouldt the analog signal into a separate neural data recordingis capable of continuous analog data acquisition, and

    gital event markers to align this signal with behavioral

    calibration

    ral signals such as eye or joystick position can be usedraw (i.e., pre-calibrated) manner, or they can be cali-within the software. This is done by presenting dotsand marking the instant of xation or joystick acquisi-ey-press. In contrast to someother calibrationmethodsto account only signal offset and gain, our calibrationo takes into account skew using a projective transform.improves the quality of the calibration, especiallywhenosely-spaced targets are used.e-tracking systems are subject to some degree of driftven with measures such as tracking the pupil relativeal reection (on optical systems). Often, this drift is thee slight degree of head-movement during the task ses-nteract this, we employed a drift-correction algorithmwhat is available in other systems, such as CORTEX)., at the completion of each trial, xations are extractedntinuous eye-position record, and these xations areo the position of targets that had been displayed; smallhen assumed to reect intended xation on the centergets, and a fraction of this error is corrected. Using thisnd that no manual intervention is needed, even overrs, to keep the calibration exact.

    le record

    perience, data les generated by most behavioral con-s contain cryptic references to the conditions run, thewn, and the timing of key events. For example, theyunlabellednumericmarkers that identify the trial type

    , the code numbers and time-stamps of critical behav-

    these dassociain thisreconscolleag

    To rof thedata ltions oentire(e.g., srecordfromthactuall

    Becunsurple easbecom

    Twoto allowover trtion, trthis ga timel

    5.8. Ex

    At ausers wthe onwhichparampositioreasonwill wvisiblewindotion ofvisualobjectspositioing poapprop

    Thistrial, coeach othe eve

    Wedisplaydrawnthe timchangefaster pby simformanof ourmachin

    5.9. Co

    A sing anre separate, it is possible that they can be lost or mis-rendering the events undecipherable.Many researchershave had the troubling experience of attempting tothe events of an experiment performed years ago by aow long gone.dy this, we included in our data les the fullest accountvioral record we thought possible and practical. Thesentain the actual stimulus images used, the text descrip-event markers, the conditions-table structure for the

    riment, and a variety of task conguration informationresolution, signal calibration matrices, etc). This richs the reconstruction and replaying of any given trialgle datale, so there is noambiguity aboutwhat eventsk place.of their relative completeness, these data les are,gly, somewhat large; a 3-h session can generate a 50MBthe long-term, however, we believe such le sizes will

    y more practical as the power of typical PCs increases.vided functions plot behavioral data in a graphical formick over-viewof events. The rst plots the performancend shows the reaction time histogram (Fig. 5a). In addi-an be re-played, andmovies of those trials created, fromindow. The second is calledup fromtherst, and showsor the currently selected trial (Fig. 5b).

    enters display

    imum, during the execution of a behavioral task, mostwant some sort of real-time feedback which reectsg behavior of the subject. For example, in a task inosition (i.e., angle of gaze) is the monitored behavioral

    amovingpoint corresponding to the instantaneous gazell allow an observer to follow behavior to determine thea subjects successes or failures. Also, the experimentero know this position relative to any objects currentlye subjects screen. Therefore, we constructed a displayt included markers corresponding to the current posi-og inputs such as eye or joystick position in relation toli currently visible to the subject (Fig. 6). In addition,

    ecting lever position or digital inputs such as buttone visible if those inputs are active, and rings reect-al targets with their threshold radii also appear at thetimes.

    in display is surrounded by text indicating the currention, and block, graphs showing the performancewithine groups, a reaction-time histogram, and a time-line ofarkers stored for the previous trial.d that the time required to update the experimenterslinearly related to the number of graphical elements

    is gurewindow. Themore polygons present, the longerat was required to update the display, even if the onlylved repositioning a small dot. Therefore, it is likely thatrmance (shorter cycle times) could have been achieveding this display. However, we chose to balance per-nd usability. Specically, we calibrated the complexitylay to maintain cycle times of about 2ms on our testthose cycles when updates were requested.

    ration menu

    user interface (Fig. 7) is the starting point for load-riment, setting and saving a variety of conguration

  • W.F. Asaad, E.N. Eskandar / Journal of Neuroscience Methods 174 (2008) 245258 255

    Fig. 5. Behaviole informatiorecord. The triX- and Y-eye odelivery (here

    parametersmanner inwto do if theselect condincreasing nals are expused in plaral graphs. A basic overview of performance is shown in (a). Behavior over time is plotten and trial-selection is at the bottom-left. In the bottom-middle is an area which shows thal can be played back in this window by pressing Play, and amovie can be created from ar joystick traces are shown at the bottom. The horizontal red bars indicate that the object, three pulses of juice).

    (e.g., video settings, I/O mappings), and for setting thehich blocks and conditions are to be selected andwhatsubject makes an error. While simple options such asitions randomly with replacement, choose blocks inumerical order or immediately repeat incorrect tri-licitly available in this menu, Matlab functions can bece of these options to execute more complicated logic,

    such as deteing blocks,arranging thconguratiotask, launch(which thenand timingd at the top, reaction times are shown at the bottom-right, and basice objects used on the currently-selected trial and the eye- or joystickny trial. (b) A time-line representation of the currently-selected trial.at left was currently visible. The vertical green lines indicate reward

    ctingwhen a subject has learned and only then switch-or always selecting certain conditions after others, ore order of blocks in some specic manner, etc. Once anhasbeencreatedand saved for aparticular behavioraling an experiment involves loading the conditions leautomatically loads the associated conguration le

    script(s)), entering a data le name, and clicking Run.

  • 256 W.F. Asaad, E.N. Eskandar / Journal of Neuroscience Methods 174 (2008) 245258

Fig. 6. Experimenter's and subject's screens. The experimenter's display (a) contains a representation of the stimuli currently visible on the subject's display (b). In addition, a red ring marks the boundaries of the current fixation (or joystick target) window and a dot represents the current eye- or joystick-position (updates are generally set to occur every 50 or 100 ms, depending on user preferences). In the top-right, the experimenter's display relays information about the current trial, condition, and block numbers and performance over-all, over the current block, and for the current condition. Reaction time histograms over-all and for the current condition are plotted at the lower-right.

5.10. Troubleshooting

To aid set-up, several diagnostic functions can be called from the main menu (Fig. 7). Video tests are available to assess currently-selected video display options (e.g., resolution and refresh rate) and stimulus appearance at those settings. I/O tests are available to assess current DAQ assignments, to acquire or deliver individual analog or digital signals, and to test each digital line used for event-marker output. In addition, the maximum achievable on-line sampling rate can be estimated for the type and number of DAQs present. For optimizing on-line drift correction, an option exists to collect 3 s of eye data and show where saccades and fixations are detected according to the user's current settings. Lastly, one can collect a few hundred thousand latencies corresponding to open-loop cycle times to assess, grossly, Matlab's performance in the context of the current system settings (a minimal version of this measurement is sketched below). In our experience, these functions address some of the more common problems that arise when first building a behavioral control system.
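The last of these diagnostics amounts to timing a bare polling loop. A minimal stand-alone version of the idea in plain Matlab is sketched below (this illustrates the measurement, not the diagnostic code shipped with the software; the cycle count and the 1 ms reporting threshold are arbitrary choices):

    % Collect open-loop cycle latencies to see how often Matlab/Windows
    % stalls an otherwise empty polling loop.
    nCycles   = 100000;                    % a few hundred thousand in practice
    latencyMs = zeros(1, nCycles);
    t0    = tic;
    tPrev = toc(t0);
    for k = 1:nCycles
        tNow = toc(t0);
        latencyMs(k) = 1000 * (tNow - tPrev);
        tPrev = tNow;
    end
    fprintf('median = %.4f ms, max = %.3f ms, %.3f%% of cycles > 1 ms\n', ...
        median(latencyMs), max(latencyMs), 100 * mean(latencyMs > 1));
    hist(latencyMs, 0:0.05:5);             % distribution of cycle times
    xlabel('cycle latency (ms)'); ylabel('count');

Long tails in this distribution usually reflect background processes or driver activity competing for the CPU.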

6. Discussion

Being unsatisfied with currently available options for behavioral control, we sought to develop a high-level software system that simplifies task design and execution while maintaining a high degree of temporal precision. Matlab turned out to be an excellent platform for this project, as the timing constraints could be met while providing the user access to the simplicity and flexibility of that environment. Nevertheless, there are notable limitations.


Fig. 7. The configuration menu. This menu is where an experiment and all of its configuration parameters can be loaded, modified, and saved. Video settings are organized within the left panel. Input-output assignments and other settings are found within the right panel. Task-execution settings (e.g., condition- and block-selection criteria) are in the middle panel. This menu is the point from which an experiment is launched.

Windows cannot support hard real-time operation. Therefore, while sub-millisecond jitter is acceptable in many, if not most, psychophysical settings, there are nonetheless many potential applications for which the software described here would not be suitable. In particular, experiments that must provide feedback within a very small temporal window (for example, to influence an on-going synaptic event) would find 1–2 ms jitter simply too variable. Likewise, delivering feedback that requires a great deal of processing could potentially incur unacceptably long delays unless these computations are programmed in a lower-level language.

There is a 25 ms blind period at the beginning of each behavioral tracking episode. If a subject were to respond within that interval, it would not be appreciated until the end of this period. Therefore, in tasks in which behavioral responses are expected to occur very early in each tracking epoch and must be measured precisely, this software as it is currently designed would not be adequate. It would be possible to disable experimenter's display updates, but that would significantly hinder one's ability to follow behavioral events in real-time.

Other limitations include the current inability to display movies or translating visual stimuli while simultaneously tracking behavioral signals. In addition, behavioral signals are not currently stored during the inter-trial interval. The ability to store analog signals continuously would benefit not only behavioral signals, but neurophysiological ones as well. In other words, although many acquisition cards are clearly capable, in terms of numbers of channels, sampling rates, and PC storage, of recording neural data alongside behavioral signals, no support has been built in for this purpose. Fortunately, most users so far have preferred relying on a separate neural data acquisition system (e.g., Plexon). Nonetheless, such a capability would likely be useful for many potential applications.

We use this software on several different machines dedicated to neurophysiology in humans or non-human primates. This software has been very adept at the creation of basic sensori-motor tasks, and is especially useful for the creation of cognitive tasks with greater numbers of stimuli and contingencies. These tasks are often coded within an hour, and modifications are simple to test.

As with any endeavor, abstraction layers hiding lower-level details have certain benefits and potential pitfalls. For example, while such abstraction can improve ease-of-use and encourage adoption and innovation, it may also isolate one from those technical details that are critical to the task at hand; this could result in misapplication or a false sense of limitation (ultimately, computers are capable of much more than any particular software system allows). Because our software constitutes a highly-abstracted environment, we hope that the benefits outweigh these costs. We also hope that the lower the barrier to designing and executing behavioral paradigms, the more likely it is that one will explore the space of possible variations and implementations.


For lack of imagination and in the absence of a clever acronym, we refer to our software as "MonkeyLogic." The software is available by request to the authors. Documentation for users is available at www.monkeylogic.net.

    Acknowledgements

The authors thank David Freedman, Tim Buschman, Camillo Padoa-Schioppa, Valerie Yorgan, Markus Siegel, and John Gale for contributions to the software, beta testing, and helpful discussions. We also thank Jeffrey Perry for making the low-level graphic drivers publicly available and for helpful advice regarding their implementation. Anne-Marie Amacher, Ming Cheng, Jason Gerrard, Rollin Hu, Earl Miller, Andrew Mitz and Ziv Williams are appreciated for offering useful ideas for program testing and execution. We are indebted to the CORTEX development team (http://www.cortex.salk.edu) for their widely used behavioral control software, which provided an initial framework for the creation of our software. Funding was provided by a Tosteson Fellowship from the Massachusetts Biomedical Research Council to WFA and NEI grant 1R01DA026297, NSF IOB 0645886 and the HHMI to ENE.

    References

Asaad WF, Eskandar EN. Achieving behavioral control with millisecond resolution in a high-level programming environment. J Neurosci Methods 2008;173:235–40.

Brainard DH. The psychophysics toolbox. Spat Vis 1997;10:433–6.

Ghose GM, Ohzawa I, Freeman RD. A flexible PC-based physiological monitor for animal experiments. J Neurosci Methods 1995;62:7–13.

Hays AV, Richmond BJ, Optican LM. A UNIX-based multiple-process system for real-time data acquisition and control. In: WESCON Conference Proceedings; 1982. p. 1–10.

Judge SJ, Wurtz RH, Richmond BJ. Vision during saccadic eye movements. I. Visual interactions in striate cortex. J Neurophysiol 1980;43:1133–55.

Maunsell JHR. LabLib, http://maunsell.med.harvard.edu/software.html; 2008.

Meyer T, Constantinidis C. A software solution for the control of visual behavioral experimentation. J Neurosci Methods 2005;142:27–34.

Pelli DG. The VideoToolbox software for visual psychophysics: transforming numbers into movies. Spat Vis 1997;10:437–42.

Perry JS. XGL Toolbox, http://fi.cvis.psy.utexas.edu/software.shtml; 2008.

Ramamritham K, Shen C, Sen S, Shirgurkar S. Using Windows NT for real-time applications: experimental observations and recommendations. In: IEEE Real Time Technology and Applications Symposium; 1998.

White TM, Norden-Krichmar TM, Benson J, Boulden E, Macknik S, Mitz A, Mazer J, Miller EK, Bertini G, Desimone R. Computerized Real-Time Experiments (CORTEX), http://www.cortex.salk.edu/; 1989–2008.
