Life: The Excitement of Biology 4(3) 174
An Illustrated, Step by Step Workflow for Digitizing Video
Home Systems, Enhancing Their Visual Quality, Placing a
Screen-Visible Time Stamp, and Tracking Movement
Using Computer Vision Technology1
Jorge A. Santiago-Blay2, Michael A. Caprio3, Kevin McLane4,
Rebecca Maawad5, Patrick Blake Leeper6, Cody Andrew Henry6,
Joseph Royer6, and Loren Glen Brewster6
Abstract: We present an illustrated workflow for digitizing Video Home Systems (VHS)
videos, enhancing the visual quality of the digitized videos, placing a screen-visible time
stamp, and tracking the movements of imaged entities with computer vision technology.
We illustrate this protocol using videos of adult Drosophila melanogaster reproductive
behavior originally stored as VHS videos. A vast amount of biological information is now
potentially easily analyzable by unleashing the awesome power of digital technology.
Key Words: VHS tapes, analog format, digital format, digital video quality improvement,
digitally time stamping, computer vision technology, digital automatic tracking,
Drosophila, behavior, evolution, sexual selection, quantification of observations
Movement is generally considered tantamount to life. When curious about
whether something that looks biological is alive, we tend to jolt at the slightest
sign of movement, realizing that if it moves, it is likely alive. Scientific,
quantifiable answers to numerous questions pertaining to motion are now more
available than ever owing to rapid advances in digital technology. For instance,
to ascertain differences between typical and extreme motion of living things (be
it of excellence, as in that of sports superstars, or of underachievers), markers are
attached to various joints and bony prominences. Thereafter, scientists use
computerized optical motion analysis systems to identify the markers as they move,
creating a detailed report (Maheswaran 2015, O’Sullivan et al. 2014). The
machines that garner these data not only have become increasingly miniaturized
(as in wearable biometric devices, such as the Fitbit®), abler to learn, faster, more
1 Submitted on September 1, 2016. Accepted on November 5, 2016. Last revisions received on November 28, 2016.
2 217 Wynwood Road, York, Pennsylvania 17402 USA. E-mail: [email protected].
3 Department of Biochemistry, Molecular Biology, Entomology, & Plant Pathology, Mississippi State University. Mississippi State, Mississippi 39762 USA. E-mail: [email protected].
4 432 Gun Club Road, York, Pennsylvania 17406 USA. E-mail: [email protected].
5 Department of Psychology, Philadelphia, Pennsylvania 19104 USA. E-mail: [email protected].
6 Information Sciences and Technology Center, Office of Computer and Information Systems, The Pennsylvania State University, York, Pennsylvania 17403 USA. E-mails: [email protected] (CAH, current affiliation unknown), [email protected] (JR), and [email protected] (LGB), respectively.
DOI: 10.9784/LEB4(3)SantiagoBlay01 Electronically available on November 28, 2016. Mailed on November 28, 2016.
little ruler would assist in the eventual digital estimation of x and y positions of
the flies. The flies were videotaped until shortly after the last remaining virgin
female in a vial mated. As is commonly done in cinematography, we refer to each
of the videotaped portions as a "take". Although the videos were qualitatively
analyzed (JASB unpublished data), Santiago-Blay and coworkers began
digitizing the tapes in 2015 with the long-term goal of quantifying the behaviors.
Some 150-200 hours of videotaped courtship from approximately 1,000 flies have
now been digitized.
Figure 2. A digitized (and unimproved) VHS
frame showing six Drosophila melanogaster flies,
three flies per vial. Note the ruler eventually
used to generate x and y coordinates. Compare
the quality of this image with that in Figure 12.
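The ruler imaged alongside the vials (Figure 2) is what makes the later digital estimation of x and y positions physically meaningful. As an illustrative sketch only, not part of the original workflow, the pixel-to-millimeter conversion can be expressed as follows; all function names and numeric values here are hypothetical:

```python
# Convert tracked pixel coordinates to millimeters using the imaged ruler.
# Calibration needs two pixel positions that lie a known physical distance
# apart on the ruler (hypothetical values below: a 200-pixel span == 50 mm).

def mm_per_pixel(p1, p2, known_mm):
    """Scale factor (mm per pixel) from two pixel points a known distance apart."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    pixel_dist = (dx * dx + dy * dy) ** 0.5
    return known_mm / pixel_dist

def to_mm(point, origin, scale):
    """Map a pixel coordinate to mm relative to a chosen origin point."""
    return ((point[0] - origin[0]) * scale, (point[1] - origin[1]) * scale)

scale = mm_per_pixel((100, 50), (300, 50), 50.0)   # 200 px span == 50 mm
fly_mm = to_mm((150, 80), origin=(100, 50), scale=scale)
print(scale)    # 0.25 (mm per pixel)
print(fly_mm)   # (12.5, 7.5)
```

In practice, the two calibration points would be read off the imaged ruler once per take, and the resulting scale applied to every tracked coordinate in that take.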
Digitizing analog VHS videos using Elgato Video Capture software7
Surprisingly, the VHS videos remained in excellent
condition after 20 years inside cardboard boxes located in
unspecialized storage facilities. Each video was reviewed on a
VHS cassette recorder, and the length of each take was quickly
noted using the fast-forward control. Takes were digitized
using the Elgato Video Capture software (https://www.elgato.com/en/video-
capture, San Francisco, California, USA; cost approximately 100 US Dollars,
excluding taxes and other charges)8. Numerous other software packages are
available (see Table 1). The Elgato Video Capture software was chosen for its
ease of use, going directly from VHS to digital, without intermediate steps.
Below, we show a series of screen shots (Figures 3-11) illustrating the steps
we followed to digitize an analog video stored in VHS format. Sometimes, takes
longer than 45 minutes crashed our computer, necessitating a second (or, rarely,
a third) digitizing attempt.
7 The range of video capture options is very large, as there are a variety of quality devices to capture
the output of an analog VHS player and convert it to digital PC input, along with many different
software packages. These products are in most cases of good quality and easy to use, but they would
not be immediately recognized by the typical consumer. Additionally, there are options for all major
operating systems. A range of video capture products available for sale can be seen on newegg.com,
a popular computer hardware vendor. The ranges of complexity and price are rather wide. Other
mainstream devices come from Hauppauge or Blackmagic. We consider Elgato probably one of the
easiest and most widely used options, and it is a complete bundle.
8 As the digitizing time is 1:1 with respect to the VHS time, a small, battery-operated visual alarm
was set to blink (go off) shortly before the end of the take. The visual alarm alerted us to prepare to stop digitizing.
who wishes to run the program will have to re-enter the entire texts. The compiler
then compiles all those files into a new executable file. Although the files
available through the link are listed alphabetically, they are called into action as
the main tracking program is invoked.
The techniques used are based on routines for video surveillance. The goal
of most tracking software is to isolate objects that do not move, or the
“background”, from objects that are moving, the “foreground”. These terms have
nothing to do with the relative position of objects and only refer to still or moving
objects. Tracking objects in videos normally starts by taking a standard reference
frame with no subjects in it as in previous work with bedbugs (Cimex lectularius
L., Insecta: Hemiptera: Cimicidae) (Goddard et al. 2015). This frame serves as a
static background reference model or comparison frame. Each subsequent frame
is compared to this initial frame and altered pixel values are assumed to represent
moving objects. In the case of these videos, there were no blank frames without
insects (Figure 27A). OpenCV includes several routines that allow for dynamic
background model updating.
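The reference-frame comparison described above can be sketched in a few lines. The following is a conceptual illustration only (the actual analyses used the OpenCV library, not this code): frames are modeled as grids of grayscale values, and any pixel that differs from the reference frame by more than a threshold is marked foreground; the threshold value is a hypothetical sensitivity setting.

```python
# Foreground detection by differencing each frame against a static
# reference (background) frame, as described in the text.
# Frames are lists of rows of grayscale values (0-255).

def foreground_mask(frame, reference, threshold=25):
    """Return a binary mask: 1 where the frame differs from the reference."""
    return [
        [1 if abs(p - r) > threshold else 0 for p, r in zip(frow, rrow)]
        for frow, rrow in zip(frame, reference)
    ]

reference = [[10, 10, 10],
             [10, 10, 10]]
frame     = [[10, 200, 10],   # a bright "fly" appears at row 0, column 1
             [10, 10,  12]]   # small sensor noise stays below threshold

mask = foreground_mask(frame, reference)
print(mask)   # [[0, 1, 0], [0, 0, 0]]
```

This is why a blank reference frame matters: without one, every frame already contains insects, and simple differencing cannot separate them from the background.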
Author Caprio developed a hybrid system, using the
cv::BackgroundSubtractorMOG210 class (Zivkovic 2004) on the initial 1800
frames of each video file to build a statistical model of the reference background.
The difference between frame 1800 and the background model constructed from
the first 1,800 frames clearly shows that, for the most part, the insects have been
eliminated from the background model (Figure 27B). While the
cv::BackgroundSubtractorMOG2 class can also segment images into background
9 For more information on the OpenCV library, please visit this link:
http://docs.opencv.org/trunk/annotated.html#gsc.tab=0 Footnotes 9-15 refer
to sites on the huge OpenCV library (OpenCV (Open Source Computer
Vision). 2016. http://opencv.org/). 10 http://docs.opencv.org/trunk/d7/d7b/classcv_1_1BackgroundSubtractorMOG2.html#gsc.tab=0
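The idea of building a background model from the opening frames can be approximated, for illustration only, with a simple running average; the actual cv::BackgroundSubtractorMOG2 class maintains a mixture of Gaussians per pixel and is considerably more robust. In this hedged Python sketch, the learning rate alpha is a hypothetical value:

```python
# Running-average background model: each new frame nudges the model toward
# the current pixel values, so objects that keep moving (the flies) fade
# out of the background, conceptually like Figure 27B.

def update_background(model, frame, alpha=0.05):
    """Blend a new frame into the background model with learning rate alpha."""
    return [
        [(1 - alpha) * m + alpha * p for m, p in zip(mrow, frow)]
        for mrow, frow in zip(model, frame)
    ]

# Simulate 100 frames of a static scene (gray value 10) in which a "fly"
# occupies the first pixel for only the first 5 frames (gray value 200).
model = [[0.0, 0.0]]
for t in range(100):
    frame = [[200 if t < 5 else 10, 10]]
    model = update_background(model, frame)

# By frame 100 the transient fly has largely faded from the model and
# both pixel estimates sit close to the scene value of 10.
print(round(model[0][0], 1), round(model[0][1], 1))
```

A real implementation would then threshold the difference between each incoming frame and this model to obtain the foreground mask, as OpenCV's background subtractors do internally.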
_Peafowl.gif). By carefully dissecting the behavior of the Drosophila flies with the
workflow we have shown in this paper, we hope to detect the elements of those
behaviors that make male flies more (or less) successful with their conspecific
females.
Although nobody knows how many hours of VHS tapes with potentially
valuable information there are out there, two anonymous reviewers of an earlier
version of this paper said “there must be huge amounts of videotaped research
material sitting on shelves gathering dust” and “I am certain a great many
individuals will be interested in the technical aspects of the work”. A vast
amount of data is now potentially easily analyzable by unleashing the awesome
power that digital technology provides. Although the tracking software is not
perfect (errors caused by objects, i.e., the flies, moving on top of or close to one
another, known as occlusions, must be corrected manually), it does the tedious
tracking job more quickly and accurately than a human can and frees the
investigator to do what he or she is best at: annotating the spreadsheet with the
observed behaviors and interpreting them.
In summary, the workflow (Figure 1) herein presented is, as follows. First,
transform analog data into digital (Figures 3-11), potentially unleashing the
awesome power of digital technology. Second, improve the visual appeal of the
newly digitized video (Figures 12-15). This is also important because tracking
software operating on low-quality digitized video will yield nearly useless
results. Third, time-stamp the digitized video so that the stamp is visible on the
computer screen, as the user will need to study, correct, and interpret the video
alongside the spreadsheet (Figures 16-25). Fourth, track the movement data,
correcting it as needed16, using computer vision technology, as shown in this
paper (Figures 26-33).
Acknowledgments
Coauthor JASB thanks Dr. Chung-I Wu (Department of Ecology and Evolution, University of
Chicago) for providing the laboratory facilities in which the original videotapes were generated in the mid-1990s. We would like to wholeheartedly thank Mr. James Oplinger and Ms. Suzanne Shaffer
16 The tracking software yields x, y and time coordinates automatically but, as always, the human user needs to check the output as the tracking is not perfect.
(both at Pennsylvania State University, York) for their help during several technology-related
stages of this project. Ms. Jessica Petrie and Dr. Robert Farrell facilitated the assistance of Mr. Papa Kojo Kuranchie (The Pennsylvania State University, York, Pennsylvania) during the earliest phase of
the digitization project. Dr. Wayne Rasband (National Institutes of Health, National Institute of
Mental Health, Bethesda, Maryland, USA) and Dr. Nico Stuurman (University of California, San Francisco, California, USA), who created and made available the MTrack2 plugin, made us aware of
pertinent references. Mr. Adam Steinberg (Elgato Systems, San Francisco, California, USA) granted
permission to use Figures 3-11 and a colleague from Blackmagic Design who requested anonymity granted permission to use Figures 16-25. Mr. Andy Ghozali (“Zocster”, Christchurch, New Zealand),
a volunteer for iMore, a community of Apple product users, was available for discussions with author
JASB on the use of Figures 12-15. Blay Publishers LLC assumes responsibility for their use. Robert Costello (National Museum of Natural History, Smithsonian Institution, Washington, District of
Columbia, USA) provided pertinent references. Mr. Jie Jun Zhu (theITSupportCenter, LLC,
Conshohocken, Pennsylvania, USA) created the cross-reference links that facilitate navigation between the major sections of this paper. Four anonymous colleagues and a Guest Editor reviewed
several iterations of this paper and offered numerous constructive suggestions. We are profoundly
grateful to all.
Literature Cited
Abramoff, M. D., P. J. Magalhaes, and S. J. Ram. 2004. Image Processing with ImageJ. Biophotonics International 11(7):36-42. https://imagej.nih.gov/ij/docs/pdfs/Image_Processing_with_ImageJ.pdf
Andersson, M. 1994. Sexual Selection. Princeton University Press. Princeton, New Jersey, USA. 599 pp.
Block, B. 2010. Tagging tuna in the deep ocean. Filmed in April 2010 at Mission Blue Voyage. 20’06”
https://www.ted.com/talks/barbara_block_tagging_tuna_in_the_deep_ocean?language=en Bradski, G. 2000. The OpenCV Library. Dr. Dobb's Journal: Software Tools for the Professional
Programmer (Redwood City, California, USA). No volume (issue) or inclusive pagination available.
Burke, R. 2014. How stores track your shopping behavior. TEDxIndianapolis. TEDx Talks. 16’14”.
https://www.youtube.com/watch?v=jeQ7C4JLpug
Crump, M. 2014. The small and surprisingly dangerous detail the police track about you. Filmed in
Davidson, S., G. Bohrer, R. Weinzierl, R. Kays, and M. Wikelski. 2014. Scaling up the impact of local
animal telemetry studies using Movebank. American Fisheries Society 144th Annual Meeting. Conference Paper. (Centre des congrès de Québec // Québec City Convention Centre).
_Telemetry_Studies_Using_Movebank Goddard, J., M. Caprio, and I. I. Goddard. 2015. Diffusion rates and dispersal patterns of unfed versus
recently fed bed bugs (Cimex lectularius L.). Insects 6:792–804.
http://dx.doi.org/10.3390/insects6040792 Gould, J. L. and C. G. Gould. 1996. Sexual Selection. Mate Choice and Courtship in Nature. Scientific
American Library. A Division of HPHLP. New York, NY, USA. 277 pp.
He, Z., R. Kays, Z. Zhang, G. Ning, C. Huang, T. X. Han, J. Millspaugh, T. Forrester, and W. McShea. 2016. Visual Informatics Tools for Supporting Large-Scale Collaborative Wildlife Monitoring
with Citizen Scientists. IEEE Circuits and Systems Magazine 16(1):73-86.
10.1109/MCAS.2015.2510200 Humphreys, T. 2012. How to fool a GPS. Filmed in February 2012 at TEDxAustin. 15’45”.
uage=en#t-30837
ImageJ. Image Processing and Analysis in Java. [2016]. https://imagej.nih.gov/ij/index.html
Kays, R., M. C. Crofoot, W. Jetz, and M. Wikelski. 2015. Terrestrial animal tracking as an eye on life and planet. Science 348(6240):1222, aaa2478-1 to aaa2478 DOI: 10.1126/science.aaa2478
Killingsworth. M. 2011. Want to be happier? Stay in the moment. Filmed in November 2011 at
Klin, A. 2011. A new way to diagnose autism. TEDxPeachtree. Filmed in September 2011. 19’44”. https://www.ted.com/talks/ami_klin_a_new_way_to_diagnose_autism?language=en#t-814944
Klopfenstein, D. R. and R. D. Vale. 2004. The lipid binding pleckstrin homology domain in UNC-104 kinesin
is necessary for synaptic vesicle transport in Caenorhabditis elegans. Molecular Biology of the Cell 15:3729-3739.
Koh, L. P. 2013. A drone's-eye view of conservation. Filmed in June 2013 at TEDGlobal 2013. 13’27”.
https://www.ted.com/talks/lian_pin_koh_a_drone_s_eye_view_of_conservation?language=en Kovacs, G. 2012. Tracking Our Online Trackers. TED2012. Filmed in February 2012. 6’39”.
Laskin, D. 2013. Tracking grizzly bears from space. TED-Ed. 4’14” https://www.youtube.com/watch?v=mW1xBc1dwqI , http://ed.ted.com/lessons/tracking-grizzly-bears-
from-space-david-laskin
Maheswaran, R. 2015. The math behind basketball's wildest moves. TED2015. Filmed in March 2015. 12’08”. https://www.ted.com/talks/rajiv_maheswaran_the_math_behind_basketball_s_wildest_moves?languag
e=en#t-73090
Movebank for Animal Tracking Data. 2016. https://www.movebank.org/ Last accessed on September 1, 2016.
OpenCV (Open Source Computer Vision). 2016. http://opencv.org/ Last accessed on September 1, 2016.
O'Sullivan, S. B., T. J. Schmitz, and G. D. Fulk. 2014. Physical Rehabilitation. Sixth Edition. F. A.
Davis Company. Philadelphia, Pennsylvania, USA. 1505 pp.
Oxman, N. 2015. Design at the intersection of technology and biology. TED2015. Filmed in March 2015. 17’36”
=en Parker, J. E. A., N. Angarita-Jaimes, M. Abe, C. E. Towers, D. Towers, and P. J. McCall. 2015.
Infrared video tracking of Anopheles gambiae at insecticide-treated bed nets reveals rapid
decisive impact after brief localised net contact. Scientific Reports 5(13392). http://dx.doi.org/10.1038/srep13392
Price, P. W. 1996. Biological Evolution. Saunders College Publishing. Harcourt Brace College
Publishers. Fort Worth, Texas, USA. 418 pp. Rasband, W. S. 1997-2015. ImageJ. United States National Institutes of Health. Bethesda, Maryland,
USA http://imagej.nih.gov/ij/ , doi:10.1038/nmeth.2089
Ren, X., T. X. Han, and Z. He. 2013. Ensemble video object cut in highly dynamic scenes. 2013 IEEE Conference on Computer Vision and Pattern Recognition (CVPR). 23-28 June 2013. Portland
Oregon, USA. pp. 1947-1954. 10.1109/CVPR.2013.254
Rowcliffe, J. M., P. A. Jansen, R. Kays, B. Kranstauber, C. Carbone. 2016. Wildlife speed cameras: measuring animal travel speed and day range using camera traps. Remote Sensing in Ecology
and Conservation 2(2):84-94. DOI: 10.1002/rse2.17 ,
http://onlinelibrary.wiley.com/doi/10.1002/rse2.17/epdf Samadani, U. 2015. Eye-tracking: adding insight to injury. TEDMED. 6’23”.
https://www.youtube.com/watch?v=Pq3PPcXE4Xc
Schneider, C. A., W. S. Rasband, and K. W. Eliceiri. 2012. NIH Image to ImageJ: 25 years of image analysis. Nature Methods 9:671-675.
Shinozuka, K. 2014. My simple invention, designed to keep my grandfather safe .
TEDYouth 2014. Filmed in November 2014. 5’46”. https://www.ted.com/talks/kenneth_shinozuka_my_simple_invention_designed_t
o_keep_my_grandfather_safe?language=en#t -125956
Spitz, M. 2012. Your phone company is watching. TEDGlobal 2012. Filmed in June 2012. 9’56” https://www.ted.com/talks/malte_spitz_your_phone_company_is_watching?language=en#t-
33031
Stuurman, N. 2003. MTrack2. http://valelab.ucsf.edu/~nstuurman/ijplugins/MTrack2.html . (Accessed on May XX, 2016.)
WINanalyze. Motion Tracking & Analysis Software WINanalyze. 2016. http://winanalyze.com/ Last
accessed on September 1, 2016. A substantial listing of references to recent uses of WINanalyze can be found here, http://winanalyze.com/motion-tracking-references/ .
Wolf, G. 2010. The Quantified Self. TED@Cannes. 5’10”.
https://www.ted.com/talks/gary_wolf_the_quantified_self?language=en Zimmer, C. and D. J. Emlen. 2016. Evolution. Making Sense of Life. Second Edition. Roberts and
Company. Greenwood Village, Colorado, USA. 707 pp.
Zivkovic, Z. 2004. Improved adaptive Gaussian mixture model for background subtraction. pp. 28–31. In, Kittler, J., M. Petrou, M. S. Nixon, and E. R. Hancock (Editors). Proceedings of the 17th
International Conference on Pattern Recognition. Volume 2. August 23-26, 2004. (Cambridge,
England, United Kingdom). Institute of Electrical and Electronics Engineers (IEEE) Computer Society Press. Los Alamitos, California, USA. xxxiv, 1005
pp. http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=1333992&tag=1,