Computer Methods and Programs in Biomedicine 112 (2013) 607–623

journal homepage: www.intl.elsevierhealth.com/journals/cmpb

An open-source, FireWire camera-based, Labview-controlled image acquisition system for automated, dynamic pupillometry and blink detection

John Kennedy Schettino de Souza a, Marcos Antonio da Silva Pinto a, Pedro Gabrielle Vieira b, Jerome Baron a,b,d,∗, Carlos Julio Tierra-Criollo a,c,e,∗∗

a Graduate Program in Electrical Engineering, Universidade Federal de Minas Gerais, Av. Antônio Carlos 6627, 31270-010 Belo Horizonte, MG, Brazil
b Graduate Program in Physiology and Pharmacology, Universidade Federal de Minas Gerais, Av. Antônio Carlos 6627, 31270-010 Belo Horizonte, MG, Brazil
c Department of Electrical Engineering, School of Engineering, Laboratory of Biomedical Engineering, Universidade Federal de Minas Gerais, Av. Antônio Carlos 6627, 31270-010 Belo Horizonte, MG, Brazil
d Department of Physiology and Biophysics, Institute of Biological Sciences, Universidade Federal de Minas Gerais, Av. Antônio Carlos 6627, 31270-010 Belo Horizonte, MG, Brazil
e Biomedical Engineering Program COPPE, Federal University of Rio de Janeiro, Av. Horácio Macedo 2030, Bloco H, Sala 327, Cidade Universitária, Caixa Postal 68510, CEP 21941-972 Rio de Janeiro, Brazil

a r t i c l e   i n f o

Article history:
Received 21 May 2012
Received in revised form 9 July 2013
Accepted 17 July 2013

Keywords:
Image acquisition system
Pupillometry
Blink detection

a b s t r a c t

The dynamic, accurate measurement of pupil size is extremely valuable for studying a large number of neuronal functions and dysfunctions. Despite tremendous and well-documented progress in image processing techniques for estimating pupil parameters, comparatively little work has been reported on the practical hardware issues involved in designing image acquisition systems for pupil analysis. Here, we describe and validate the basic features of such a system, which is based on a relatively compact, off-the-shelf, low-cost FireWire digital camera. We successfully implemented two configurable modes of video recording: a continuous mode and an event-triggered mode. The interoperability of the whole system is guaranteed by a set of modular software components hosted on a personal computer and written in Labview. An offline analysis suite of image processing algorithms for automatically estimating pupillary and eyelid parameters was assessed using data obtained in human subjects. Our benchmark results show that such measurements can be done in a temporally precise way at a sampling frequency of up to 120 Hz and with an estimated maximum spatial resolution of 0.03 mm. Our software is made available free of charge to the scientific community, allowing end users to either use the software as is or modify it to suit their own needs.

© 2013 Elsevier Ireland Ltd. All rights reserved.

∗ Corresponding author at: Department of Physiology and Biophysics, Institute of Biological Sciences, Universidade Federal de Minas Gerais, Av. Antônio Carlos 6627, 31270-010 Belo Horizonte, MG, Brazil. Tel.: +55 31 3409 2921; fax: +55 31 3409 5480.
∗∗ Corresponding author at: Biomedical Engineering Program COPPE, Federal University of Rio de Janeiro, Av. Horácio Macedo 2030, Bloco H, Cidade Universitária, Caixa Postal 68510, CEP 21941-972 Rio de Janeiro, Brazil. Tel.: +55 21 2562-8601; fax: +55 21 2562-8591.
E-mail addresses: [email protected] (J. Baron), [email protected], [email protected] (C.J. Tierra-Criollo).
0169-2607/$ – see front matter © 2013 Elsevier Ireland Ltd. All rights reserved.
http://dx.doi.org/10.1016/j.cmpb.2013.07.011
1. Introduction

Pupillometry aims at producing accurate measurements of tonic and phasic changes in pupil size. Over the last few decades, the breadth of application of this methodological approach has considerably expanded in both basic research and clinical practice, due to increasing evidence demonstrating the validity of using pupillary response as an objective and non-invasive physiological marker of normal and abnormal functioning of the nervous system (for reviews see [1–4]). Pupil abnormalities have indeed been shown to be correlated with a large number of physiological and mental disorders such as multiple sclerosis [5,6], migraine [7], diabetes [8], alcoholism [9], depression [10], anxiety/panic disorder [11], Alzheimer's disease [12], Parkinson's disease [13], autism [14], and schizophrenia [15]. In ophthalmology, pupillometry is now part of routine protocols for the preoperative assessment of refractive surgery (e.g. [16–18]) and is considered a valuable aid for screening visual field defects and diagnosing lesions of the anterior visual pathway (e.g. [19–24]). It is also considered essential to distinguish physiological anisocoria from the much less frequently occurring syndrome of Horner (for review see [25]). Among other clinical fields of application of pupillometry outside ophthalmology are the monitoring of central states in anesthesiology [26,27], the follow-up of medicamentous drug addiction [28,29], and the evaluation of cognitive processes [30–33]. Several reports have also successfully detected drowsiness and fatigue states on the basis of pupil-motility profiles [34–43].

Today, modern pupillometers generally consist of an infrared-sensitive (IR) imaging sensor coupled with a digital interface for recording, processing and reporting pupil data in an automated fashion. Although the operational principles of these sensors differ, they share the same basic requirement pioneered by Lowenstein and Loewenfeld [44], which is to allow pupil measurements to be performed under scotopic conditions. Several models of pupillometers are available commercially, and their widespread use in clinical practice is obviously interesting for standardizing procedures and comparing data. Efforts in this direction are actually evidenced by the considerable number of published reports on the relative performance of such commercial systems (e.g. [45–52]). However, being generally designed for specific applications, commercial devices typically lack versatility of use and, in addition to their relatively high cost, offer little prospect for extensibility, due to their proprietary nature. Because of these shortcomings, several research groups have developed and described their own solutions for automated IR pupillometry (e.g. [53–56]). Though exhibiting high performance in certain aspects, such custom prototypes also have their share of disadvantages and limitations, which may include, among others: poor spatial resolution; low sampling frequency; no built-in synchronization capabilities with other devices; a relatively high degree of complexity for assembling hardware parts that are themselves often highly specialized and not so easily available; use of high-end, off-the-shelf proprietary components, raising the overall cost of the system; and the necessity of solid knowledge in low-level language programming (e.g. C/C++), narrowing the realm of development to expert programmers.

In an attempt to circumvent many of these problems, we developed a low-cost, easily assembled and reproducible image acquisition system based on a compact, off-the-shelf, plug-and-play FireWire digital camera capable of autonomous, real-time image capture and digitalization. A modular software suite running in standard Windows-based PC platforms ensures the interoperability of the camera, the streaming and storage of raw image data, and the off-line analysis of pupil size. Developed in LabVIEW, a high-level graphical programming environment, the software offers easily extendable, out-of-the-box functionality. The design and technical characteristics of our system are described such that any developer with a minimum of technical expertise in hardware integration and high-level programming will be able to implement a similar or perhaps even better solution. We also report results of tests aimed at benchmarking our system against three important application criteria: (i) time accuracy of synchronization procedures; (ii) hardware/software constraints for real-time video acquisition; and (iii) spatial resolution of pupil size measurements.

In an effort to ensure and motivate the reproducibility of our image acquisition system, complementary information and source code can be obtained free of charge by contacting [email protected] or [email protected]. We also created a publicly accessible discussion forum through which questions or comments about the system can be posted. This discussion forum is hosted by Google Groups and is accessible at http://groups.google.com/group/image-acquisition-for-pupillometry?src=email&hl=pt-BR&pli=1.

2. Materials and methods

2.1. Overview

The system hardware is composed of a FireFlyMV camera (Point Grey Research, Richmond, Canada, www.ptgrey.com/products/fireflymv/), a desktop computer and an illumination source. The camera consists of a 1/3″ progressive scan CMOS sensor, an IEEE1394a standard interface for data transmission and a 4-pin general purpose I/O (GPIO) connector for device control and powering [57]. It is important to bear in mind that this digital camera enables autonomous image capture and digitalization, thereby eliminating the need of a dedicated acquisition board. The desktop computer is responsible for receiving and storing the data sent by the camera. For illumination, our system uses a controllable infrared (IR) source but can also carry out pupil measurements under ambient lighting conditions.

The software was developed and tested under Windows XP and Vista operational systems (Microsoft Corporation, Washington, USA) using LabView 8.5, a high-level graphical programming language environment developed by National Instruments (www.ni.com, Texas, USA). We also took advantage of two add-on software tools from National Instruments: (1) the Vision Development Module 8.5, a comprehensive library of image processing routines, and (2) the NI-IMAQdx driver library, which handles the low-level interfacing with the IEEE1394 (FireWire) camera bus. Although Windows operating systems are not deterministic hard real-time systems, the autonomy of the acquisition module based on standard IIDC–IEEE 1394a interfaces ensures the identification of missing frames. In our design, image processing as well as sampling gap identification and correction are performed offline. In the following sections, we provide a more detailed description of the hardware and software components used in our development project.
All human protocols used in this study were approved by the Ethics Committee of the Federal University of São Paulo under the license no. 0763/10. All subjects signed a consent form.

2.2. Hardware for image acquisition

The core functionality of our system is provided by a relatively compact, low-cost FireWire camera (around US$ 275 at the time of writing). Like any digital camera with on-board standardized IEEE1394a communication protocols, it offers appreciable advantages for robust and flexible image acquisition, especially when timely controlled applications are required. It supports isochronous data transfer, thereby guaranteeing real-time image transmission with very low, reliable latency time. It also permits the user to set, flexibly and on the fly, a wide variety of parameters such as video modes, gain, exposure time, or area of interest defined by the industry standard known as 1394 TA Digital Camera Specification (IIDC for short; for more details see [58]). Although the availability of specific functionalities is tied to the firmware of the camera provided by a particular manufacturer, our implementation relies on fairly standard interoperability features that should be encountered in most IIDC–IEEE1394a-based cameras.

Of particular importance for our project is the possibility to inlay in each captured frame a sequence of characters, thereby registering the time of occurrence of that frame. This feature, known as timestamps, is important for non real-time systems because it guarantees the post hoc identification of non-captured frames. Note that the independence of timestamps from the computer bus clock, a feature not encountered in all USB cameras but intrinsic to digital cameras, increases the accuracy of this process.

In our project, the main utility of the camera GPIO connectivity was either to allow the trigger of the camera by external devices or to transform the camera itself into a triggering device. The firmware of the camera actually allows three modes of synchronization with other external devices: (1) a frame-by-frame mode, where an external pulse must be generated to capture each frame; in this case, the acquisition rate is unable to reach its maximum nominal rate of 120 Hz in free mode [57]; (2) a video clip mode, which relies on a single pulse to activate the I/O channel responsible for initiating the record of a sequence of frames; this mode starts the capture at random within a duration period of less than one frame, as will be demonstrated further below in Section 3; (3) a strobe generation mode, which emits a TTL pulse of adjustable width at the beginning of the image integration process when the camera sensor is exposed; through this mode, the strobe signal can be used to initiate the stimulation process. For modes (2) and (3), the camera is able to work at its maximum acquisition rate.

A potential drawback of the standard IIDC–IEEE 1394a interface is that point-to-point connections between the camera and the computer are in theory limited to 5 m, with longer distances being possible only using hubs and repeaters. Moreover, while desktop connectivity is done via a 6-pin connector, interfacing with a laptop can only be done with a 4-pin connector. This means that, for the latter case, an external power supply is required. To overcome this limitation, we successfully powered our camera via a Universal Serial Bus (USB, 5 V) through a simple adaptation of the IEEE1394 cable. Note that the IEEE1394 specifications recommend voltage supplies ranging between 8 and 35 V, which is a priori incompatible with our adaptation, but in practice, at least in our hands, this solution worked well. For portable PCs without a native IEEE1394 port, it is possible to use readily available expansion cards known as PCMCIA or Express cards.

Considering the aforementioned characteristics of the camera and the diversity of experimental paradigms we envisioned to establish, we decided to develop two monocular prototypes of image acquisition. The first one is portable, suitable to human anatomy and designed to perform pupil analysis under controlled scotopic and/or photopic environments. The second prototype is in principle more versatile since the camera is not tied to the subject and can therefore be used to film the eye of human subjects from a wide range of distances and angles.

2.2.1. Head-mounted arrangement
This set-up consists of a scuba diving mask (Fig. 1A) as well as a camera, one white light-emitting diode (LED) and four IR LEDs, all mounted on a printed circuit board (Fig. 1B). The maximum power emitted by each IR LED was fixed to 0.12 mW/cm². This value is in conformity with the security range determined by the International Commission on Non-Ionizing Radiation Protection (10 mW/cm² for a period of 17 min). The white LED was planned for providing pupillary reflex stimulation. All LEDs are controlled by software through the I/O channels of the camera. The diving mask guarantees a precise adjustment to the anatomy of an adult human head and good isolation from ambient light (Fig. 1C). The white circular patch apposed on the upper eyelid of the subject shown in Fig. 1D is used to convert pixel values into metric units (see Section 2.3.1.2).

2.2.2. Desk-mount camera arrangement
The FireFlyMV camera is sold with protective encapsulation (24.4 mm × 44 mm × 34 mm), which allows easy mounting on a desktop stand for remote pupillometric measurements. Since our software was not designed to compensate for head movements, it is necessary to restrict such movements by accommodating the subject on a chin rest and forehead rest. A lens with adjustable focal length and aperture was also added in order to account for variation of the distance between the subject and the camera. We recommend choosing a lens with manual zoom since the FireFlyMV firmware does not support automatic focus control. For illumination, ambient light or/and an IR light source may be used. For visual stimulation, we have successfully employed a CRT monitor as well as a multi-channel LED device described by Pinto et al. [59].
Fig. 1 – Portable, head-mounted prototype for pupil analysis. (A) Overview of the internal part of the goggle. (B) Close up of the electronic hardware of the prototype, showing how the IIDC-IEEE 1394a FireFlyMV camera is assembled onto a small printed circuit board together with four IR LEDs distributed around the micro-lens of the camera (for a uniform illumination), and a white LED for visual stimulation. (C) Subject wearing the goggle during a recording session. (D) Circular patch affixed to the upper eyelid in order to estimate the pixel/mm relationship required for absolute measurements of pupil size.

Fig. 2 – Block diagram summarizing the basic operations of the system software, emphasizing in particular the relationship between data acquisition as well as analysis modules and their associated data storing files.

2.3. System software

The software is made of two independent, albeit cooperative, components that communicate through a database (Fig. 2): (1) an image acquisition component, responsible for feeding data to the system in a timely controlled way, and (2) a pupil analysis component, responsible for manual correction of pupil estimation as well as for replenishing the databank by inserting estimated values of pupil diameter. The communication among the two software components is accomplished through a databank whose structure is formed by AVI movie files and data about timestamps, frame indices, and I/O channel status, all saved in the Technical Data Management Streaming (TDMS) file format. The latter has been developed relatively recently by National Instruments for rapidly streaming large amounts of data to the hard drive (for more details see http://zone.ni.com/devzone/cda/tut/p/id/3727).

Although the autonomy of the two software components allows their partial or entire modification without compromising the system functional structure, it is worth mentioning the existence of a hierarchical dependency between them due to a certain degree of serialization in the data processing procedure.

Fig. 3 schematizes the internal structure of the two components, which is defined by intermediate modules with their own independent user interface. Below, we describe each of these modules in more detail.

2.3.1. Image acquisition
2.3.1.1. System configuration module. This module presents three selectable 8-bit achromatic video modes to the user: (1) 640 × 480 at 30 Hz; (2) 640 × 480 at 60 Hz; and (3) 320 × 240 at 120 Hz. For each video mode, several image properties like gain, brightness and gamma correction have been set as default values though, in principle, all of these properties are configurable.

Fig. 3 – System software architecture (see Section 2 for details).
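As a sanity check on these three video modes, note that each keeps the raw 8-bit pixel stream well within the 400 Mbit/s nominal bandwidth of the IEEE1394a bus. A back-of-the-envelope calculation in Python (our own illustration, with isochronous packet overhead ignored):

```python
def data_rate_mbit_s(width, height, fps, bits_per_pixel=8):
    """Raw video payload rate in Mbit/s (1 Mbit = 1e6 bits; overhead ignored)."""
    return width * height * fps * bits_per_pixel / 1e6

modes = {
    "640x480 @ 30 Hz": data_rate_mbit_s(640, 480, 30),
    "640x480 @ 60 Hz": data_rate_mbit_s(640, 480, 60),
    "320x240 @ 120 Hz": data_rate_mbit_s(320, 240, 120),
}
for name, rate in modes.items():
    print(f"{name}: {rate:.1f} Mbit/s")

# All three modes stay below the 400 Mbit/s nominal rate of IEEE1394a.
assert all(rate < 400 for rate in modes.values())
```

The 640 × 480 @ 60 Hz mode is the most demanding at roughly 147 Mbit/s; halving the resolution in each dimension is what allows the frame rate to double to 120 Hz at an even lower data rate.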
Fig. 4 – Correction of the lens radial distortion. (A) Original image. (B) Image after radial distortion correction.

2.3.1.2. Pixel-to-metric unit conversion module. This module enables the user to define the relationship between the image size in pixels and its real-world metric size, a necessary step for later reporting pupil size measurements in absolute terms. When the camera is at a distance from the subject, a simple procedure is used: first, a snapshot of an object of known size (e.g. a ruler) is placed on the same vertical plane of view as that of the to-be-measured eye; then, the user selects two points on this image, whose distance in pixels will be used for metric conversion according to the real-world size of this image segment, a datum to be entered beforehand by the software operator. Due to the practical difficulty of flexibly introducing (and removing) a reference object within the head-mounted goggle, pixel-to-metric unit conversion is accomplished by sticking over the subject's eyelid a thin, circular patch of known size (6 mm diameter, see Fig. 1D), which adheres to the eyelid without application of adhesive products, due to its adequate concavity and rigidity. Next, continuous video capture is started, and the software detects the boundaries of the circular patch and estimates its diameter on a frame-by-frame basis. This operation ends when the patch size estimated in a user-defined number of consecutive frames falls below one standard deviation of the sample. It is worth mentioning at this point that the eyelid patch is disposable and does not deform when sterilized with alcohol. Furthermore, according to subject reports, it does not cause discomfort and becomes unperceivable very quickly.

Due to the small focal length (3.6 mm) and the short distance between the subject's eye and the lens of the head-mounted camera (30 mm), images are more susceptible to radial distortion of the lens (Fig. 4A). An optional procedure of the software module makes it possible to circumvent this problem. It is based on NI Vision Assistant routines (for more details, see NI [60]), which calculate an appropriate correction matrix that has to be applied to each image frame before the process of detection and estimation of the pupil (e.g. Fig. 4B).

2.3.1.3. Plug-in for event-triggered video clip acquisition. Two different image acquisition modes, herein referred to as plug-ins, were developed. The first one is controlled by the finite state machine (FSM) depicted in Fig. 5A and consists in capturing video frames for a user-defined number of short-duration trials varying from milliseconds to seconds. A configurable inter-trial interval is also executed. Due to the relatively small number of frames acquired in each trial, data storage is made initially in primary memory (RAM), thereby minimizing loss during acquisition. Note that, to use this approach, the amount of memory required for each trial needs to be allocated before the initiation of the recording session (CaptureStartUp). The detection of camera failures aborts the acquisition process, which in turn flushes out allocated memory and initializes the FSM.

The FSM for a single trial is shown in Fig. 5B. Initially, the system waits for an external trigger to start capturing frames (Synchronization state). The end of a trial occurs when the total number of allocated frames is effectively reached or when the duration of the trial configured by the user has elapsed, both events being controlled by the Timing state. As the system returns to the Record Session FSM (Movie record state), the buffer is emptied by transferring the captured frames from RAM to disk.

2.3.1.4. Plug-in for long-term continuous recording. This approach is controlled by the FSM shown in Fig. 6. It performs software-triggered continuous acquisition, in theory, for as long as disk space is available (longest time tested: 12 min). Here, buffering of image data in RAM is impossible because of the large volume of frames to be stored. Instead, the writing-to-disk operation is performed on a frame-by-frame basis. A problem with this approach is that the system is more susceptible to frame loss due to data-saving bottlenecks in speed normally introduced by slow writes to disk. The CaptureStartUp state initializes the different parameters of the recording session, like stimulus (LED) onset and offset. If the run mode of the camera is not verified, the recording session is aborted. The presentation of stimuli during a session is defined by the Stimulus event configuration state. Information about timestamps, frame index, and subject data are stored at the end of the session. A drawback of this recording mode is that it is based on the clock of the operating system, meaning that small cumulative delays in recording duration may be introduced depending on the priority of the acquisition process.

2.3.1.5. Protocol validation module. A validation module was built to allow users to identify missing frames in the recorded image sequence as a pre-processing screening step before initiating the more computationally demanding task of detecting and estimating pupil parameters in a large stack of images. This module indicates the time of occurrence of lost frames for each trial of the record session. Trial validation can also be done on the basis of three additional variables: the acquisition rate, the total number of frame loss and the largest consecutive failure.
Fig. 5 – Finite state machine (FSM) for the hardware-triggered trial-based video recording mode. Transition dependency is shown among states that define (A) the whole recording session and (B) a single trial. The beginning of a FSM is indicated by a black dot; the end is indicated by the same black dot but encircled.

The acquisition rate, calculated as the total number of frames divided by the time duration of the acquisition, makes it possible to check if the nominal rate configured for a given recording session was achieved. The total number of frames lost, evaluated together with the largest consecutive failure, is an important indicator for deciding on the strategy to be adopted for recovering the lost samples. Note that, in addition to compromising pupil analysis, errors in the recovery process can potentially lead to misestimation of significant events, like blinks.

2.3.2. Offline plug-ins for pupil analysis
In our design, extraction of pupil parameters is performed offline. In this respect, it is important to stress that numerous image-processing techniques already exist, ranging from essentially heuristic methods to mathematically sophisticated model-based estimation algorithms (for more detailed information, see for example [61,62]). Our solution did not involve the construction of genuinely new algorithms, but rather the implementation of rather standard image-processing techniques used for this type of application, such as intensity thresholding and edge detection. Our aim was to obtain a balanced trade-off between processing efficiency and estimation robustness. Obviously, the success of our approach depends on the imaging quality of the eye. The pupil extraction algorithm implemented herein assumes that the outline of the pupil is circular. A priori information on the range of pupil sizes for each subject category was also used in the procedure.
Fig. 6 – Finite state machine (FSM) for the long-term continuous recording mode. As in Fig. 5, the black dot with and without the circle indicates the end and beginning of the FSM, respectively.

Fig. 7 – Flowchart of the pupil segmentation algorithm. (A) For each raw image of a video footage, the procedure increases the signal-to-noise ratio of the pupil by (B) delimiting an ROI centralized on the pupil and (C) cropping the resultant image into a circular shape. (D) It then computes an intensity profile of the remaining pixels and (E) determines a pupil segmentation threshold as the second zero-crossing of the derivative of the gray-scale intensity profile. (F) The threshold is used to binarize the image. Result examples are shown on the right-hand side of each processing step box.

2.3.2.1. Pupil segmentation. The procedure starts by loading an AVI data file. For the purpose of improving processing speed, the size of each input image (Fig. 7A) is reduced to a rectangular region of interest (ROI, Fig. 7B), which is set as a function of the pupil center and radius values obtained in the previous frame. As a result, the pupil in the current image gets automatically centralized, and this process is dynamically updated. For this to work, it is assumed that the input image contains only one eye as well as at least a portion of the pupil. Note that no ROI-based image reduction is performed when prior information about the locus and size of the pupil is not available, as is the case, for example, when the first frame of the film is being considered or when the pupil is being entirely occluded by the eyelid (blinks). Further improvements in signal-to-noise ratio are obtained by cropping the ROI into a circular shape (Fig. 7C), which effectively removes most of the low-intensity pixels typically clustered around the corners of the ROI. From the viewpoint of pupil detection, such pixels represent a noise source because of their similarity in intensity with pupil-defining pixels.

The next step in the procedure is to construct a gray-scale histogram (Fig. 7D) in order to analyze the pixel intensity distribution of the resultant image. This approach relies on the strong response of the iris to infrared illumination, which creates a peculiar histogram whose first peak corresponds by and large to the pupil (low-intensity pixels). A heuristically defined moving average filter (n = 7 bins) is applied over the histogram to smooth noisy peaks and valleys. The derivative of the histogram is then computed (Fig. 7E), which generates a typical curve with a point of maximum (positive peak) and minimum (negative peak) for each histogram peak. A
Fig. 8 Floestimates tthe other oprocess. (Gmodied vshown on
pupil-segmsecond zerple shown of the rst pthen obtainall the
rem
2.3.2.2. Pupthe pupil, twwere furthbinary imaaims at esThe
centroobject provrespect to the image area that lighting
conspurious pis applied. black blobsmorphologAfter haviwchart of
the pupil estimation algorithm. The latter subdivides ihe centroid
and radius of the pupil from binarized images obtaine generates an
edge-map of each image obtained after gain inc) Results from both
concurrent algorithms serve as input data foersion of the Randomize
Circle Detect (RCD) algorithm (see bodythe right-hand side of each
processing step box.
entation threshold is obtained by detecting theo-crossing of the
derivative (bin 48 in the exam-in Fig. 7E), which corresponds to
the rightward taileak in the gray scale histogram. Binary images
areed by setting all pixels below threshold to one, andaining
pixels to zero (Fig. 7F).
2.3.2.2. Pupil size estimation. For robust size estimation of the pupil, two independent, albeit complementary, algorithms were implemented. The first one is applied on the binary images obtained earlier (Fig. 8A, same as Fig. 7F) and aims at estimating the centroid and radius of the pupil. The centroid of an object is the center of mass of that object, provided that the object has a constant density with respect to the background. Following the segmentation process, the binary image often contains spurious pixels outside the pupil area, which are generated by artifacts such as non-uniform lighting conditions, eyelashes and shadows. To remove these spurious pixels, a morphological filter known as erosion [63] is applied. Within the pupil area, noise is characterized by black blobs caused by IR illumination. To remove it, another morphological filter known as dilatation [63] is applied. After having used these two morphological operations (Fig. 8B), more accurate estimates of the centroid and radius of the pupil can be computed (Fig. 8C). Because both estimates are not so accurate when derived by the above algorithm, especially when the pupil is partially occluded by eyelashes and eyelids, they are not used as final values. Rather, they serve as input parameters to a more robust algorithm for finding the circumference of the pupil (see below).

A concurrent algorithm was implemented to provide an edge map of each gray-scale ROI image obtained before binarization (see Fig. 8D, same as Fig. 7C). A conventional Canny operator [64] is used to obtain this map, which accurately highlights the border of the pupil (Fig. 8E). Noisy pixels within the circumscribed pupil area are removed by applying a logical AND operation onto the binary edge-map using a mask filter. The latter is created by inverting the contrast polarity of the binary images obtained after application of the morphological operations (Fig. 8B). Beforehand, these binary images are further eroded up to three times, so that they get reduced in size. This guarantees that the mask filter does not encroach on the border of the pupil edge. As can be seen in Fig. 8F, the end result is a well-delineated pupil border enclosing a noise-free region.
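The morphological clean-up followed by the centroid/radius estimate can be sketched as follows. This is an illustrative NumPy reconstruction: the 3 × 3 structuring element and the equal-area radius formula are our assumptions, since the original implementation relies on NI Vision operators in Labview.

```python
import numpy as np

def erode(b):
    """3x3 binary erosion: keep a pixel only if its full neighborhood is set."""
    p = np.pad(b, 1)
    out = np.ones_like(b)
    h, w = b.shape
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out &= p[dy:dy + h, dx:dx + w]
    return out

def dilate(b):
    """3x3 binary dilation: set a pixel if any pixel of its neighborhood is set."""
    p = np.pad(b, 1)
    out = np.zeros_like(b)
    h, w = b.shape
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            out |= p[dy:dy + h, dx:dx + w]
    return out

def centroid_and_radius(b):
    """Center of mass of the foreground and radius of a disc of equal area."""
    ys, xs = np.nonzero(b)
    return (xs.mean(), ys.mean()), np.sqrt(b.sum() / np.pi)
```

Applied to a binarized pupil with a few spurious pixels, erosion removes the isolated outliers and dilation restores the pupil boundary before the centroid is computed.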
computer methods and programs in biomedicine 112 (2013) 607–623 | 615
The circumference of any round shapes present in the edge-maps is identified by a Randomized Circle Detect (RCD) algorithm, described in details by Chen and Chung [65]. Briefly, this algorithm uses three points, p1 (x1, y1), p2 (x2, y2) and p3 (x3, y3), along the edges detected by the Canny filter and defines the center of circumference as:

a_{123} = \frac{\begin{vmatrix} x_2^2 + y_2^2 - (x_1^2 + y_1^2) & 2(y_2 - y_1) \\ x_3^2 + y_3^2 - (x_1^2 + y_1^2) & 2(y_3 - y_1) \end{vmatrix}}{4\left[(x_2 - x_1)(y_3 - y_1) - (x_3 - x_1)(y_2 - y_1)\right]}

b_{123} = \frac{\begin{vmatrix} 2(x_2 - x_1) & x_2^2 + y_2^2 - (x_1^2 + y_1^2) \\ 2(x_3 - x_1) & x_3^2 + y_3^2 - (x_1^2 + y_1^2) \end{vmatrix}}{4\left[(x_2 - x_1)(y_3 - y_1) - (x_3 - x_1)(y_2 - y_1)\right]}

and the radius as:

r_{123} = \sqrt{(x_1 - a_{123})^2 + (y_1 - b_{123})^2}

Validation of the circumference is accomplished by calculating the distance between a fourth point p4 (x4, y4) and the circumference center (a123, b123), which can be mathematically formulated as:

d_{4 \to 123} = \left| \sqrt{(x_4 - a_{123})^2 + (y_4 - b_{123})^2} - r_{123} \right|

Thus, p4 will belong to the circumference if, and only if, the distance d4→123 is zero, or close to zero, since the edge of the pupil is wider than a single pixel.

Nonetheless, due to the large number of pixels making up the Canny-filtered images and the random selection process accomplished by the RCD algorithm, the identification of pupil circumference becomes cumbersome when thousands of video frames need to be processed. Moreover, for the human eye, this method is more susceptible to error due to the existence of another circumference corresponding to the edge formed by the iris and the sclera (limbus).

To circumvent these problems, we modified the RCD algorithm by making the search for points on a potential circle a non-random process. The goal of this modification is to force the search for 12 points present in the putative pupil border defined by the edge map. The point of origin for this search is the centroid estimated in binarized and morphologically filtered images, as illustrated in Fig. 8C. From this point, the software makes a pixel-by-pixel walk in 12 cardinal directions until a border is found (Fig. 9A). On the basis of predefined configurations (Fig. 9B–F), the algorithm then selects four sets of quadruple points out of the 12 points detected. A minimum of one and a maximum of five different sets of quadruples can be chosen, each potentially associated with its respective circle. Last, the algorithm selects, among all candidate sets of quadruple points, the one that will allow drawing the best pupil-fitting circle. This selection process first rejects all candidate sets that do not configure a circle and elects the set with the shortest d4→123 (see above). As a tie-break criterion, the set yielding a circle with the smallest proportional area difference in relation to the area obtained after binarization and morphological filtering is retained. An example of best pupil-fitting circle derived from the above algorithmic steps can be seen in Fig. 8G. Note that increasing the number of quadruple sets usually improves estimation performance, but lowers processing speed. The user is free to determine the desired configurations beforehand, using heuristic criteria of optimality for coping with the widest possible spectrum of situations putatively generated by the motility of the pupil and eyelid of the particular group of subjects under study. For adult human subjects, we found that choosing all configurations except that shown in Fig. 9E usually yields the best results. Fig. 10 illustrates which one of the candidate sets is usually selected in function of different topologies of ocular structures and demonstrates the capacity of our algorithm to cope with adverse conditions for estimating pupil size. In that regard, it is interesting to mention that in addition to providing radius and center estimates of the pupil, our algorithm also provides robust information about the aperture of the eyelid.

Fig. 9 – Configurations of candidate points for defining best pupil-fitting circles according to the Directed Circle Detect algorithm. (A) Ensemble of candidate points used to define different sets of quadruple points. (B and D) Configurations of quadruple points commonly retained by the algorithm in the absence of pupil occlusion. (C, E and F) Configurations commonly used by the algorithm when the pupil is occluded, during an eyeblink for instance.
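The three-point construction and the d4→123 test translate directly into code. The following Python sketch evaluates the same determinant formulas; it is an illustration of the construction, not the authors' Labview implementation.

```python
import math

def circle_from_three(p1, p2, p3):
    """Center (a, b) and radius r of the circle through p1, p2, p3,
    using the determinant form of the RCD algorithm."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    denom = 4.0 * ((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1))
    if abs(denom) < 1e-12:
        return None                      # collinear points do not configure a circle
    s2 = x2 ** 2 + y2 ** 2 - (x1 ** 2 + y1 ** 2)
    s3 = x3 ** 2 + y3 ** 2 - (x1 ** 2 + y1 ** 2)
    a = (s2 * 2 * (y3 - y1) - s3 * 2 * (y2 - y1)) / denom
    b = (2 * (x2 - x1) * s3 - 2 * (x3 - x1) * s2) / denom
    return a, b, math.hypot(x1 - a, y1 - b)

def d4(p4, a, b, r):
    """Distance of a fourth edge point to the candidate circumference."""
    return abs(math.hypot(p4[0] - a, p4[1] - b) - r)
```

For three points taken on a circle of center (3, 4) and radius 5, the function recovers center and radius exactly, and d4 measures how far any fourth candidate point lies from that circumference.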
Fig. 10 – Representative examples of pupil estimation performance under different ocular conditions. First row shows the real image of the pupil with the circumference derived from the Directed Circle Detect (DCD) algorithm. Second row shows the result of the binarization and morphological filtering process, as illustrated in Fig. 8C. Third row shows the edge-map derived after a Canny filtering operation and intra-pupillary noise removal. The black arrows going in centrifugal directions point toward the 12 candidate points detected by the DCD algorithm, on the basis of which one or two sets of quadruples will be optimal for determining pupil circumference. Optimal sets are indicated at the top of the figure and differ depending on whether the pupil is entirely visible (A–C), partially occluded in its upper and bottom part (D–F) or only covered by the upper eyelid (G–I). The circular patch, indicated by the white arrow in (G), refers to the round flexible plastic piece used to convert pixel values into metric units (see Section 2.3.1.2).

2.3.2.3. Post-processing artifact rejection module. The automatic procedure described above evidently reduces the overall analysis time that a manual procedure would actually take. Yet, it is not completely immune to estimation errors. Certain contingencies such as large pupil displacements across consecutive frames, non-homogeneous illumination or pupil occlusion may indeed potentially lead to erroneous estimates. For this reason, we decided to build an error visualization and correction module that allows the user to scroll back and forth through the whole video record and to visualize, on a frame-by-frame basis, the estimated outline of the pupil superimposed on the raw image of the eye. New values of pupil center and radius can be assigned using two numerical fields available for this purpose. The option to save all the applied changes is given to the user when exiting the module. Alternatively, depending on the number of total frames to be manually corrected, reprocessing automatically the whole video record, or part of it, after having redefined some pupil analysis criteria (e.g. inclusion of more than four points for defining candidate vectors) might be a more efficient option. For further analysis of pupillary and palpebral responses, it is also important to correct for the temporal artifacts introduced by occasional frame acquisition failures. This information is contained in the vector of timestamps saved in the TDMS file together with the estimation results of pupil diameter.

3. Results

3.1. Timing accuracy of synchronization

Many experimental paradigms require accurate synchronization between image acquisition of the eye and other bio-signal recording and/or stimulus presentation devices. To evaluate up to what point our system can fulfill this requirement, we first verified the temporal fidelity of frame capture. For this test, the camera was set to acquire images continuously (free mode) at a 120 Hz sampling rate and to generate a strobe for each frame build. The strobe duration is configurable between 0 and 4095 ticks of the camera time base Tclock (1.024 µs per tick).
We chose a value of 512, which generated a strobe duration of 0.524 ms. The strobe signal was displayed on a precision digital oscilloscope (Agilent Technologies, model DSO 3202A, Santa Clara, USA), together with an aperiodic pulse signal of 1 ms that was outputted by a signal generator to simulate an externally triggered signal. As depicted in Fig. 11, which shows two representative epochs of both signal traces, strobe pulses (bottom row) were emitted as expected by the camera specification: they occurred at a regular interval and with a correct width (temporal jitter on the order of tenths of µs) and a voltage output of 3.3 V. This means that the strobe output of the camera can be securely used for triggering other devices that need to be synchronized with image acquisition of the eye.

Next, we tested the reliability of triggering image acquisition externally. Fig. 11 illustrates an inherent limitation of this approach, which is that the trigger signal (simulated in the upper row) may arrive at any time during the build-up process of a frame. This means that if an external device is to trigger the camera, the response-timing uncertainty of the camera cannot be less than the duration of a single video frame. The point to be tested is whether such an uncertainty can be more than one frame. During frame assembly, the camera verifies the status of all I/O channels. However, depending on the width of the TTL pulse and its temporal offset at the beginning of first frame acquisition, the I/O signalization may be lost. Results of experiments in which we applied trigger pulses of different widths (from 5 to 10 ms, in steps of 1 ms) show that a minimum pulse width of 6 ms was actually necessary to avoid loss of synchronism.

Fig. 11 – Temporal accuracy of the strobe signal of the camera, measured with a digital oscilloscope during two different recording epochs displayed column wise. (A) Signal generated by a signal generator to simulate a trigger pulse. (B) Strobe signals outputted by the camera. For both recording epochs, the duration between strobe pulses was around 8.33 ms, with only tens of microseconds jitter. Lack of synchronization between signals shown in (A) and (B) exemplifies a problem inherent in using a TTL pulse to trigger the camera, which is that timing uncertainty cannot be less than the duration of a single video frame. The digital oscilloscope (Agilent Technologies, model DSO 3202A) has a 200 MHz bandwidth and a 1 Gs/s sampling rate, with accuracy of 100 ppm on the horizontal scale and 3% on the vertical scale.
3.2. Assessing hardware constraints for reliable real-time video acquisition

Because our Labview-based software operates in a non real-time Windows environment, our video acquisition methods present an opportunity for a higher priority thread to take precedence over the control loop of our data acquisition software. In theory, any Windows application thread, regardless of its priority, can indeed be preempted by one of many software and hardware sources. This introduces the possibility for jitter in our control loop once the data are being brought into Labview and saved to disk. In this respect, it is important to mention that the number of concurrent applications and processes running at the time of recording is likely to worsen the occurrence of such an outcome and should be minimized before starting video acquisition. Hardware features of the host PC such as RAM and processor speed should also have a significant impact on image acquisition reliability. To assess the extent to which the aforementioned factors may in effect deteriorate the performance of our prototypes, we performed a battery of tests on standard PCs and for both acquisition plug-ins described in Section 2.
Table 1 – Computer configurations used to evaluate the system performance.

Configuration  Processor              RAM (Gbytes)  OS                 No. of opened processes
1              Intel Core 2, 1.6 GHz  2             Windows Vista SP2  70
2              Pentium IV, 3.0 GHz    1             Windows XP SP2     30
3              Intel Core 2, 1.8 GHz  1             Windows XP SP2     33

Fig. 12 – System timing performance of the triggered, trial-based video acquisition mode for different desktop configurations (see Table 1). (A) Histogram count of the number of frames lost for trials of duration varying between 1 and 50 s. (B) Same as (A), but for durations between 60 and 80 s.

For the trial-based triggered recording method, the experiment was performed on three different platforms chosen for having distinct features, especially with respect to their operational system, memory capacity and processor speed (see Table 1). For this purpose, we tested seven different recording periods (1, 10, 30, 50, 60, 70 and 80 s), each repeated 10 times. The camera was set to a spatial resolution of 320 × 240 and a sample frequency of 120 Hz, yielding a total of 9.216 Mbytes/s of data. Fig. 12A shows the results obtained for recording durations between 1 and 50 s. Clearly, the highest consecutive failures occurred for configuration 1, a problem that is most likely related to the operating system and the number of processes running in the background. On the other hand, note that there is no frame loss when tests are run under configuration 3. For time durations between 50 and 80 s (Fig. 12B), configuration 2 showed a clear drop in performance compared to configuration 1. This can be straightforwardly explained by the fact that recording durations longer than 60 s required allocation of virtual memory in hard disk. The difference observed between configuration 1 and configuration 3, which have rather similar hardware features, again reflects the workload of the operating system and the influence of concurrent processes. The excellent performance of configuration 3 for a recording period of up to 70 s should fulfill the demand of most trial-based experimental paradigms.

Fig. 13 – System timing performance of the long-term recording mode. Recording session consisted in 24 repeats of 12 min video frame acquisition. Black dots linked up by a solid line depict the total number of frames lost identified for each repeat. Open circles linked up by a dotted line show the highest number of frames lost in a row for each repeat.

For the long-term continuous recording approach, we used only one platform. In this case, the inability to store in buffer such large data blocks inevitably increased the probability of frame loss. To minimize this problem, we chose a computer running Windows XP and featuring an Intel Core i5 750 processor as well as 2 Gb of RAM. Data were gathered using 26 repeats, all performed at a sampling frequency of 120 Hz and lasting twelve minutes, a duration that is often chosen, for example, in drowsiness studies (e.g. [34,35]). Results were evaluated by calculating the average proportion of frames lost and/or the mean highest number of frames consecutively lost during a given acquisition process. Fig. 13 shows the highest number of frames lost in a row and the proportion of frames missed for each repeat. It is evident that, considering the large number of frames acquired in this protocol (n = 87,600), only very few frames were actually lost (average across all repeats: 0.03%). Most often, such frame drops occurred singly, allowing their interpolation using immediately flanking data points, especially at a sampling frequency of 120 Hz. Instances of more serious data loss were rare.
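Because frame drops are mostly single, isolated frames, the saved timestamp vector makes their repair straightforward. Below is an illustrative Python sketch of such flanking-point interpolation (function and variable names are ours, not part of the published software), resampling a diameter trace onto the nominal 120 Hz grid.

```python
import numpy as np

def fill_dropped_frames(timestamps, diameters, period):
    """Resample a pupil-diameter trace onto the nominal frame grid,
    linearly interpolating samples wherever the timestamp vector
    reveals a dropped frame."""
    grid = np.arange(timestamps[0], timestamps[-1] + period / 2, period)
    return grid, np.interp(grid, timestamps, diameters)
```

A single missing frame is then recovered as the mean of its two flanking samples.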
In our tests, the worst case was repeat 19, with a sampling gap of 256 ms (31 frames at a sampling frequency of 120 Hz) that may have resulted in masking important information such as the occurrence of a blink.

Fig. 14 – Representative pupillogram of a normal adult subject recorded under constant dim light conditions. (A) Pupil diameter estimated with two blink events for a 60 s period. (B) Zoom in on the first large negative peak denoting an eyeblink.

3.3. Pupil size measurements

Experiments were also carried out to assess the spatial accuracy of our pupillometer. By filming a circular patch of known diameter (the same as that shown in Fig. 1D) positioned on the eye of a plastic human-head model and averaging circular estimates of that patch (n = 79,684 frames), we calculated a spatial resolution of 0.07 mm/pixel when the camera was set to video mode 320 × 240 at 120 Hz. Reducing the sampling frequency of the camera by half (60 Hz) and increasing the pixel density to 640 × 480 pixels resulted in an improved resolution of 0.03 mm/pixel.

We also recorded several long-lasting sessions in human volunteers, a situation which obviously imposes more severe conditions for analyzing pupil size, since not only variations of pupil diameter but also eye and eyelid movements need to be dealt with. Overall, the algorithm detailed in Section 2 yielded robust size estimates of human pupils. Fig. 14A shows an example of fluctuations in pupil diameter, measured in darkness during a period of 60 s (no prior dark adaptation), using the head-mounted goggle system. Filtering out the points on the curve where the blinks occurred, the pupil diameter ranged from 4.6 to 5.8 mm, with an average value of 5.4 ± 0.40 mm, in agreement with other studies carried under similar conditions (see, for example, [66]). As can be seen in Fig. 14A, the algorithm is also able to detect blinks (pupil diameter = 0) and estimate their duration (Fig. 14B). In future work, more sophisticated algorithms may be implemented in order to recover the kinematics of eyelids, taking for example the proportion of ocular area being covered as a referential.

To assess the accuracy of the estimation algorithm, we manually fit a circle to the perceived pupil boundary, using the ImageJ 1.44 software (NIH, http://rsb.info.nih.gov/ij/), and compared it with the system's automatic process. A total of 120 frames were randomly chosen from pupillometric video sequences obtained from three human subjects (40 frames per subject). As can be seen in Fig. 15, we found a high correlation (ρ = 0.91; p < 0.0001; Spearman's rank correlation test) between manual and automatic estimates. The pairs of points were distributed near the line of equality (solid line), indicating a high level of agreement between measurements. The slope of the regression line (dotted line) was 1.0, with a 95% confidence interval of 0.98–1.02. Note that manual measurements tend to provide slightly higher values than the automatic ones. This is confirmed by looking at the median values of the two types of estimates (25th and 75th percentiles were used as a measure of population dispersion): 54 pixels (39–56) for the automated method and 55.5 pixels (40–57) for the ImageJ software. However, when statistically compared, these values were not different (p > 0.05; Wilcoxon signed-rank test). Lu and collaborators [67] also observed this tendency and conjectured that the minor difference is not necessarily an error of the automatic method but an inconsistency in the manual measurements.

Fig. 15 – Comparison between manual and automatic measurements of pupil diameter. Correlation between manual and automatic estimation, manual measurement being performed using NIH ImageJ software. The solid line corresponds to the x = y line, and the dashed line indicates the slope of a linear fit; n = 120.

4. Discussion

The prototypes described herein are now in routine use in our laboratory. They provide a complete, flexible, low-budget and powerful solution for pupillometric measurements, on the basis of which numerous research paradigms and clinical applications can be developed.
Our benchmark results show that changes in pupil size can be recorded in a temporally precise way and at a sampling frequency of up to 120 Hz. In the spatial domain, our estimation shows that a resolution of 0.03 mm can be achieved. This situates our system in a competitive position compared to commercial pupillometers, whose nominal resolution ranges between 0.1 and 0.01 mm. Blink detection is also accurate, but the measurement of its duration needs improvements. This problem occurs because the algorithm does not estimate precisely the beginning and the end of eyelid closure. However, the system is capable of measuring eyelid opening (Fig. 10G), which can be used to derive precise metrics of blink duration, based on the relationship between eyelid opening and estimated pupil diameter.

A distinguishing feature of our pupillometric system lies in the use of an independent acquisition module (FireWire digital camera) based on the IIDC–IEEE1394a interface, which eliminates the necessity of acquisition board interfacing. In general, for biomedical image processing applications, FireWire cameras have recognized potentials for providing comprehensive, high-performance and cost-effective platform solutions. Moreover, they are widely available in the market and count on a large body of user expertise and standards that help to drive and sustain economies of scale. More specifically in relation to our application, several features associated with the FireWire camera interface proved to be of great benefit, such as: (i) power supply capability of the camera via the IEEE1394 connection, thereby reducing risks of noise contamination from external sources, a feature of particular importance when pupil measurements are to be combined with biopotential recordings; (ii) retrievable timestamps, enabling the identification and correction of frame loss inherent to non real-time operating systems, such as Windows; (iii) insertion of digital labels to identify experimental variables such as sensory stimuli or behavioral events; (iv) output digital lines for accurate synchronization and triggering of other devices; (v) different built-in synchronization methods (see methodology for details). Furthermore, the particular FireWire camera used in this work has the great advantage of being flexibly incorporable into different experimental set-ups due to its compact physical format and demountable casing. As a result, we were successful in building two prototypes, each targeting a specific domain of application: (1) a head-mounted goggle prototype, ideally suited for pupil tests necessitating stable mesoscopic or scotopic light conditions such as those typically applied in refractive surgery evaluations (e.g. [16,17]) or sleepiness-related evaluations (e.g. [35,36,41,42]); and (2) a desk-mount prototype for remote recordings at a fixed and sufficiently long pupil-camera distance, such that a variety of visual stimulation devices (e.g. video monitors, LEDs) can be placed in front of a subject without causing obstruction in its field of view. The latter experimental arrangement is actually used in a wide range of experimental scenarios designed, for example, to diagnose lesions at various processing stages of the visual system (reviewed in [1,3]) or to evaluate pupillary metrics for indexing psychiatric disorders (e.g. [68]).

The reproducibility and modifiability of the herein described prototypes are facilitated not only by their hardware structures based on PC platforms and IIDC–IEEE1394a standard cameras, but also by the programming environment, i.e. Labview and the independent pluggable module philosophy used in our software implementation. Labview is a relatively easy-to-learn, high-level graphical programming language optimized for fast development and run-time debugging. It has a rich set of data visualization options, sophisticated libraries to capture and process images (NI Vision) and supports interactions with several programming languages (e.g. MatlabTM, C/C++), providing great flexibility for programming strategies heavily based on code reuse. Labview is also inherently modular, a feature that has helped us to implement a complete software solution for pupillometry as an integrated set of specific and independent components (plug-ins) and user interfaces. This modular architecture, combined with our choice to clearly separate online image acquisition from offline data analysis, offers interesting options for adapting pupillometric methods in function of specific experimental needs. We actually hope to help the research community in that direction by making all the modules that manage the two different image acquisition modes described in this work open-access and open-source software. This software package should deliver a self-sufficient and ready-to-use solution for time-controlled image acquisition, provided that the camera specified herein is wired properly. It should also be helpful as a starting point for the implementation of new image acquisition solutions incorporating, for example, other FireWire digital cameras. At this point, it is important to point out that our system was developed for Windows XP using Labview 8.5. In order to guarantee compatibility of this system with more recent operating systems like Windows 7 and 8, Labview version 8.5 needs to be upgraded following National Instruments instructions (version 2009 SP1 or later ones for Windows 7; version 2012 for Windows 8).

Another important point to be discussed, and for which validation results have been presented in this paper, is that direct data transfer from FireWire cameras to Windows-based PC platforms presents some limitations for time-critical, deterministic video acquisition. The problem does not reside in the camera itself, but in popular Windows-based operating environments, which are not designed to support preemptive priority scheduling and minimal interrupt latency as real-time operating systems do. As demonstrated in Section 3, this lack of determinism does not hamper rigorous analysis of pupillary behavior. Notwithstanding, certain hardware and software precautions need to be taken. As a general rule of thumb, timing performance scales up with computer hardware features such as RAM capacity and processor speed. Operating systems that allow users to minimize the number of running processes competing for resources should also be preferred (e.g. Windows XP). If more recent operating systems like Windows 7 or 8 are to be chosen, particular attention will need to be paid in matching the hardware features of the host computer with the specific demands of the operating system. Serious system performance drops are to be expected otherwise. Moreover, it is strongly recommended to disable tasks such as anti-virus scanning, disk defragmentation and other applications during recording sessions.
Moreover, although no specific tests have been carried out to test the influence of hard disk storage capacity on system timing performance, our experience shows that it is preferable to reserve a large amount of memory (at least 5 times the size demanded by a recording session) to minimize frame loss and signal discontinuity during video acquisition. To overcome this high demand for disk space (76 Kbytes per frame for 8-bit, 320 × 240 pixel images), different strategies of data reduction can be adopted. One strategy is to apply compression filters during online data streaming and saving. However, most of these filters degrade the original image (modify raw pixel values) and prevent the recovery of the timestamps embedded in the frame. A second approach is to use the kind of interlacing methods widely employed in the analog camera, television or VHS video world. According to a battery of tests in which we downscaled batches of images using conventional interlacing protocols (one- and two-dimensional line reduction), no significant variations in pupil diameters were found. In general, the strategies cited above reduce significantly the data volume, but increase frame loss due to the time necessary for processing on the computers described in Table 1. Non-destructive compactors such as ZIP, ARJ or 7z can reduce the file size by up to 60%, but can only be applied after acquisition and storage of the film.

Finally, it is worth mentioning that although originally developed as monocular systems, our head- and desk-mount prototypes can be modified to support a second (FireFly MV) camera. This binocular configuration can be achieved simply by installing a two-port FireWire PCI card in the PC and by slightly modifying a part of the image acquisition software. We actually tested this new configuration and observed no significant changes in performance. This again demonstrates another advantage of using FireWire cameras, in that both the scheduled real time (isochronous) and demand-driven asynchronous I/O capabilities are in principle guaranteed for multiple IIDC–IEEE1394a devices and with minimal CPU loading.

Conflict of interest statement

The authors declare that the present research development was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Acknowledgments

Financial support for this work was provided by Fundação de Amparo à Pesquisa do Estado de Minas Gerais (FAPEMIG), Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq) and Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES) of Brazil. We would like to thank Lucas Pinto and Dr. Dirceu de Campos Valladares Neto for helpful comments on the manuscript. We are also grateful to Sabine Pompéia and Giuliano Emereciano Ginani for providing pupillometric data (FAPESP, process no. 2011/01286-0).

References

[1] I.E. Loewenfeld, Otto Lowenstein: neurologic and ophthalmologic testing methods during his lifetime, Doc. Ophthalmol. 98 (1999) 3–20.
[2] H. Wilhelm, B. Wilhelm, Clinical applications of pupillography, J. Neuroophthalmol. 23 (2003) 42–49.
[3] J. Barbur, S. Moro, J.A. Harlow, B.L. Lam, M. Liu, Comparison of pupil responses to luminance and colour in severe optic neuritis, Clin. Neurophysiol. 115 (2004) 2650–2658.
[4] F. Bremner, Pupil evaluation as a test for autonomic disorders, Clin. Auton. Res. 19 (2009) 88–101.
[5] G. Pozzessere, P. Rossi, E. Valle, C.P. Froio, A.F. Petrucci, C. Morocutti, Autonomic involvement in multiple sclerosis: a pupillometric study, Clin. Auton. Res. 7 (1997) 315–319.
[6] B. Frauscher, R. Egg, E. Brandauer, H. Ulmer, T. Berger, W. Poewe, B. Hogl, Daytime sleepiness is not increased in mild to moderate multiple sclerosis: a pupillographic study, Sleep Med. 6 (2005) 543–547.
[7] V. Mylius, H.J. Braune, K. Schepelmann, Dysfunction of the pupillary light reflex following migraine headache, Clin. Auton. Res. 13 (2003) 16–21.
[8] G.L. Ferrari, J.L. Marques, R.A. Gandhi, S.R. Heller, F.K. Schneider, S. Tesfaye, H.R. Gamba, Using dynamic pupillometry as a simple screening tool to detect autonomic neuropathy in patients with diabetes: a pilot study, Biomed. Eng. Online 9 (2010) 26.
[9] L.S. Rubin, Pupillometric studies of alcoholism, Int. J. Neurosci. 11 (1980) 301–308.
[10] G.J. Siegle, S.R. Steinhauer, M.E. Thase, Pupillary assessment and computational modeling of the Stroop task in depression, Int. J. Psychophysiol. 52 (2004) 63–76.
[11] M. Kojima, T. Shioiri, T. Hosoki, H. Kitamura, T. Bando, T. Someya, Pupillary light reflex in panic disorder. A trial using audiovisual stimulation, Eur. Arch. Psychiatry Clin. Neurosci. 254 (2004) 242–244.
[12] D.F. Fotiou, V. Stergiou, D. Tsiptsios, C. Lithari, M. Nakou, A. Karlovasitou, Cholinergic deficiency in Alzheimer's and Parkinson's disease: evaluation with pupillometry, Int. J. Psychophysiol. 73 (2009) 143–149.
[13] N. Hori, M. Takamori, M. Hirayama, H. Watanabe, T. Nakamura, F. Yamashita, H. Ito, N. Mabuchi, G. Sobue, Pupillary supersensitivity and visual disturbance in Parkinson's disease, Clin. Auton. Res. 18 (2008) 20–27.
[14] X. Fan, J.H. Miles, N. Takahashi, G. Yao, Abnormal transient pupillary light reflex in individuals with autism spectrum disorders, J. Autism Dev. Disord. 39 (2009) 1499–1508.
[15] E. Granholm, S.P. Verney, Pupillary responses and attentional allocation problems on the backward masking task in schizophrenia, Int. J. Psychophysiol. 52 (2004) 37–51.
[16] E.M. Schnitzler, M. Baumeister, T. Kohnen, Scotopic measurement of normal pupils: Colvard versus Video Vision Analyzer infrared pupillometer, J. Cataract Refract. Surg. 26 (2000) 859–866.
[17] E.S. Rosen, C.L. Gore, D. Taylor, D. Chitkara, F. Howes, E. Kowalewski, Use of a digital infrared pupillometer to assess patient suitability for refractive surgery, J. Cataract Refract. Surg. 28 (2002) 1433–1438.
[18] R.J. Duffey, D. Leaming, Trends in refractive surgery in the United States, J. Cataract Refract. Surg. 30 (2004) 1781–1785.
[19] H.S. Thompson, R.C. Watzke, J.M. Weinstein, Pupillary dysfunction in macular disease, Trans. Am. Ophthalmol. Soc. 78 (1980) 311–317.
[20] J.C. Folk, H.S. Thompson, S.G. Farmer, T.W. O'Gorman, R.F. Dreyer, Relative afferent pupillary defect in eyes with retinal detachment, Ophthalmic Surg. 18 (1987) 757–759.
[21] R.H. Kardon, P.A. Kirkali, H.S. Thompson, Automated pupil perimetry. Pupil field mapping in patients and normal subjects, Ophthalmology 98 (1991) 485–495 (discussion).
[22] R. Schmid, B. Wilhelm, H. Wilhelm, Naso-temporal asymmetry and contraction anisocoria in the pupillomotor
-
622 c o m p u t e r m e t h o d s a n d p r o g r a m s i n b i
o m e d i c i n e 1 1 2 ( 2 0 1 3 ) 607623
system, Graefes Arch. Clin. Exp. Ophthalmol. 238
(2000)123128.
[23] K. SkorpostchAugenh
[24] S. Thomsorting4548.
[25] H. Wilh[26] M.D. La
SympapupillaAnesth
[27] A.E. IbrSimultand higon the fentan85386
[28] J.E. Richevaluadetecti17518
[29] S.G. PatechniqEng. M
[30] S. MoreJolles, PPsycho
[31] E. Grancogniti(2004) 1
[32] G. PortsearchPsycho
[33] J. Beattand th(1982) 2
[34] J.W. Mcof pupiVis. Sci
[35] H. WilhtestingExp. Op
[36] B. WilhWilhelactivatSleep R
[37] P.P. Cafevaluameasu
[38] S.L. MePupil sPsycho
[39] P. BitsioAlegakis a valapplicaapnea,
[40] A. NikoSiafakasensitiwith sl16717
[41] B.J. WilObjectiphysici30731
[42] O. Lowenstein, R. Feinberg, I.E. Loewenfeld,
Pupillarymovements during acute and chronic fatigue, Invest.
htha. Nakeepinspon08, p
Loww inhtha
. Chaeasud my. Poprd anfract
Kohngital termrg. 2. Bra
colveasutaraL. Ching tith a 006) 2
Bootmpar refr432. Schonocesop010) 6. Bramonobinocrg. 3
Watas. ReFotiolikarticalsearc
Braccordiringbbit,
Hachwojsygleding fint Ganuachmoade Agital94 Tr.A. Piw-coulti-c7
(20.I.C. N
De Spilloramkovska, H. Wilhelm, Afferent pupillary disorders
iniasmal lesions of the visual pathways, Klin. Monbl.eilkd. 226
(2009) 886890.pson, S.F. Pilley, Unequal pupils. A ow chart for
out the anisocorias, Surv. Ophthalmol. 21 (1976)
elm, The pupil, Curr. Opin. Neurol. 21 (2008) 3642.rson, F.
Tayefeh, D.I. Sessler, M. Daniel, M. Noorani,thetic nervous system
does not mediate reexry dilation during desurane
anesthesia,esiology 85 (1996) 748754.ahim, J. Feldman, A. Karim,
E.D. Kharasch,aneous assessment of drug interactions with
low-h-extraction opioids: application to parecoxib
effectspharmacokinetics and pharmacodynamics ofyl and alfentanil,
Anesthesiology 98 (2003)1.man, K.G. McAndrew, D. Decker, S.C.
Mullaney, An
tion of pupil size standards used by police ofcers forng drug
impairment, Optometry 75 (2004)2.til, T.J. Gale, C.R. Stack, Design
of novel assessmentues for opioid dependent patients, Conf. Proc.
IEEE
ed. Biol. Soc. 2007 (2007) 37373740.si, J.J. Adam, J. Rijcken,
P.W. van Gerven, H. Kuipers, J.upil dilation in response
preparation, Int. J.physiol. 67 (2008) 124130.holm, S.R.
Steinhauer, Pupillometric measures ofve and emotional processes,
Int. J. Psychophysiol. 526.
er, T. Troscianko, I.D. Gilchrist, Effort during visual and
counting: insights from pupillometry, Q. J. Exp.l. (Hove) 60 (2007)
211229.y, Task-evoked pupillary responses, processing load,e
structure of processing resources, Psychol. Bull. 9176292.Laren,
J.C. Erie, R.F. Brubaker, Computerized analysisllograms in studies
of alertness, Invest. Ophthalmol.. 33 (1992) 671676.elm, H. Ludtke,
B. Wilhelm, Pupillographic sleepiness
in hypersomniacs and normals, Graefes Arch. Clin.hthalmol. 236
(1998) 725729.elm, H. Giedke, H. Ludtke, E. Bittner, A. Hofmann,
H.m, Daytime variations in central nervous systemion measured by a
pupillographic sleepiness test, J.es. 10 (2001) 17.er, U. Erdmann,
P. Ullsperger, Experimentaltion of eye-blink parameters as a
drowsinessre, Eur. J. Appl. Physiol. 89 (2003) 319325.rritt, H.C.
Schnyders, M. Patel, R.C. Basner, W. ONeill,taging and EEG
measurement of sleepiness, Int. J.physiol. 52 (2004) 97112.s, S.E.
Schiza, S.G. Giakoumaki, K. Savidou, A.K.
is, N. Siafakas, Pupil miosis within 5 min in darknessid and
sensitive quantitative measure of alertness:tion in daytime
sleepiness associated with sleep
Sleep 29 (2006) 14821488.laou, S.E. Schiza, S.G. Giakoumaki, P.
Roussos, N.s, P. Bitsios, The 5-min pupillary alertness test isve
to modanil: a placebo controlled study in patientseep apnea,
Psychopharmacology (Berlin) 196 (2008)5.helm, A. Widmann, W. Durst,
C. Heine, G. Otto,ve and quantitative analysis of daytime
sleepiness inans after night duties, Int. J. Psychophysiol. 72
(2009)3.
Op[43] M
slre20
[44] O.neOp
[45] Wman
[46] McaRe
[47] T.dideSu
[48] J.CofmCa
[49] E.usw(2
[50] S.Cofo32
[51] Mmm(2
[52] J.Ca a Su
[53] T.Vi
[54] F. Paopre
[55] V.redura
[56] A.ZaWus
[57] PoMRi
[58] TrDi13
[59] Mlom19
[60] N[61] A.
pupalmol. 2 (1963) 138157.ayama, K. Yamamoto, F. Kobayashi,
Estimation ofess using frequency components of pupillaryse, in:
Biomedical Circuits and Systems Conference,p. 357360.enstein, I.E.
Loewenfeld, Electronic pupillography. Astrument and some clinical
applications, Arch.lmol. 59 (1958) 352363.idaroon, W.
Juwattanasomran, Colvard pupillometerrement of scotopic pupil
diameter in emmetropesopes, Jpn. J. Ophthalmol. 46 (2002)
640644.
, Y. Payette, E. Santoriello, Comparison of the pupild
pupillometer in measuring pupil size, J. Cataract.. Surg. 28 (2002)
283288.en, E. Terzi, J. Buhren, E.M. Kohnen, Comparison of aand a
handheld infrared pupillometer forining scotopic pupil diameter, J.
Cataract. Refract.9 (2003) 112117.dley, J.E. Anderson, K.T. Xu,
S.M. Brown, Comparisonard pupillometer and infrared digital
photography forrement of the dark-adapted pupil diameter, J.ct
Refract. Surg. 31 (2005) 21292132.aglasian, S. Akbar, L.E. Probst,
Pupil measurementhe colvard pupillometer and a standard pupil
cardcobalt blue lter penlight, J. Cataract Refract. Surg.
3255260.sma, N. Tahzib, F. Eggink, J. de Brabander, R. Nuijts,rison
of two pupillometers in determining pupil sizeactive surgery, Acta
Ophthalmol. Scand. 85 (2007)8.effel, C. Kuehne, T. Kohnen,
Comparison ofular and binocular infrared pupillometers underic
lighting conditions, J. Cataract Refract. Surg. 3625630.dley, C.D.
Cohn, P.W. Wu, S.M. Brown, Comparison ofcular pupillometer and the
pupillometry function ofular free-viewing autorefractor, J.
Cataract Refract.7 (2011) 12571262.nabe, S. Oono, A solid-state
television pupillometer,s. 22 (1982) 499505.u, K.N. Fountoulakis,
A. Goulas, L. Alexopoulos, A.as, Automated standardized
pupillometry with
method for purposes of clinical practice andh, Clin. Physiol. 20
(2000) 336347.ha, W. Nilaweera, G. Zenitsky, K. Irwin, Videong
system for the measurement of eyelid movements
classical conditioning of the eyeblink response in theJ.
Neurosci. Meth. 125 (2003) 173181.ol, W. Szczepanowska-Nowak, H.
Kasprzak, I.ka, A. Dudzinski, R. Kinasz, D.owska-Promienska,
Measurement of pupil reactivityast pupillometry, Physiol. Meas. 28
(2007) 6172.rey Research, FireyMV Technical Referencel, Copyright
2009 Point Grey Research Inc.,nd (2009) 53.ssociation, TA Document
2003017 IIDC 1394-based
Camera Specication, Copyright 1996-2004 by theade Association,
Grapevine (2004) 85.nto, J.K. de Souza, J. Baron, C.J.
Tierra-Criollo, Ast, portable, micro-controlled device forhannel
LED visual stimulation, J. Neurosci. Methods11) 8291.I, NI Vision
Assistant Tutorial, Austin (2005).antis, D. Iacoviello, Optimal
segmentation ofmetric images for estimating pupil shapeeters,
Comput. Meth. Prog. Biomed. 84 (2006) 174187.
-
c o m p u t e r m e t h o d s a n d p r o g r a m s i n b i o m
e d i c i n e 1 1 2 ( 2 0 1 3 ) 607623 623
[62] D. Iacoviello, M. Lucchetti, Parametric characterization
ofthe form of the human pupil from blurred noisy images,Comput.
Meth. Prog. Biomed. 77 (2005) 3948.
[63] R.C.E. Gonzalez, R.E. Woods, Digital Image
Processing,Addison-Wesley Publishing Company, 1992.
[64] J. Canny, A computational approach to edge detection,
IEEETrans. Pattern Anal. Mach. Intell. 8 (1986) 679698.
[65] T.-C. Chen, K.-L. Chung, An efcient randomized algorithmfor
detecting circles, Comput. Vis. Image Understand. 83(2001)
172191.
[66] B.C. Goldwater, Psychological signicance of
pupillarymovements, Psychol. Bull. 77 (1972) 340355.
[67] W. Lu, J. Tan, K. Zhang, B. Lei, Computerized mouse
pupilsize measurement for pupillary light reex analysis,Comput.
Meth. Prog. Biomed. 90 (2008) 202209.
[68] E. Granholm, S. Morris, D. Galasko, C. Shults, E. Rogers,
B.Vukov, Tropicamide effects on pupil size and pupillary
lightreexes in Alzheimers and Parkinsons disease, Int.
J.Psychophysiol. 47 (2003) 95115.
An open-source, FireWire camera-based, Labview-controlled image acquisition system for automated, dynamic pupillometry and blink detection

Contents

1 Introduction
2 Materials and methods
  2.1 Overview
  2.2 Hardware for image acquisition
    2.2.1 Head-mounted arrangement
    2.2.2 Desk-mount camera arrangement
  2.3 System software
    2.3.1 Image acquisition
      2.3.1.1 System configuration module
      2.3.1.2 Pixel-to-metric unit conversion module
      2.3.1.3 Plug-in for event-triggered video clip acquisition
      2.3.1.4 Plug-in for long-term continuous recording
      2.3.1.5 Protocol validation module
    2.3.2 Offline plug-ins for pupil analysis
      2.3.2.1 Pupil segmentation
      2.3.2.2 Pupil size estimation
      2.3.2.3 Post-processing artifact rejection module
3 Results
  3.1 Timing accuracy of synchronization
  3.2 Assessing hardware constraints for reliable real-time video acquisition
  3.3 Pupil size measurements
4 Discussion
Conflict of interest statement
Acknowledgments
References