LeafView: A User Interface for Automated Species Identification and Data Collection

Overview

LeafView is a Tablet-PC–based user interface for automated identification of botanical species in the field, developed by the Columbia University, University of Maryland, and Smithsonian Institution Electronic Field Guide Project. Botanists participated in the design process and user testing. The prototype has been field tested on Plummers Island, MD, and is currently in use by Smithsonian botanists.

Funded in part by NSF Grant IIS-03-25867. Any opinions, findings, conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the official views, opinions, or policy of the National Science Foundation.

Collection

A leaf image is captured with a wireless camera and transferred to the tablet. Context, such as GPS coordinates, date/time, and collector, is saved.
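The collection step pairs each captured image with its context metadata. A minimal sketch of such a record, assuming hypothetical field names (`image_path`, `collector`, etc.) not specified in the poster:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class CollectionRecord:
    """One field sample: the leaf image plus its collection context."""
    image_path: str
    latitude: float
    longitude: float
    collector: str
    # Capture time defaults to "now" when the record is created.
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

# Hypothetical sample; coordinates approximate Plummers Island, MD.
record = CollectionRecord(
    image_path="samples/leaf_0042.jpg",
    latitude=38.970,
    longitude=-77.177,
    collector="swhite")
```

The actual LeafView data model is not described at this level of detail; this only illustrates the kind of context saved with each image.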

Segmentation and Search

The image is automatically segmented and displayed to the user to verify segmentation quality. The matching algorithm is initiated in the background. (The segmentation and matching algorithms are developed by our colleagues.)

History

A browsable history of samples and their collection context is maintained for tracking and comparison during a collection trip.
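A browsable trip history can be as simple as an ordered list of sample records. A minimal sketch, assuming a hypothetical `SampleHistory` class (the poster does not describe the data structure):

```python
class SampleHistory:
    """Ordered history of samples collected on a trip."""

    def __init__(self):
        self._samples = []

    def add(self, sample):
        self._samples.append(sample)

    def browse(self):
        # Most recent first, which is how a field user would
        # typically review and compare recent samples.
        return list(reversed(self._samples))

history = SampleHistory()
history.add({"id": 1, "site": "Plummers Island"})
history.add({"id": 2, "site": "Plummers Island"})
```

Keeping the raw collection context in each entry is what allows later comparison across samples from the same trip.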

Sean M. White, Dominic M. Marino, Steven K. Feiner
Columbia University, Department of Computer Science

{swhite,dmm2141,feiner}@cs.columbia.edu

http://herbarium.cs.columbia.edu

Working Prototypes

The field prototypes use Motion Computing LE1600 and Lenovo ThinkPad X41 Tablet PCs, a DeLorme Earthmate GPS, a Nikon Coolpix P1 Wi-Fi camera, and a Sony Ericsson T616 Bluetooth camera phone.

Inspection, Comparison, and Matching

The specimen is displayed along with ranked results from the matching algorithm. Text and images of matched species from the Smithsonian collection are inspected through zooming to examine venation and additional data. Prospective identifications are recorded in the history with sample data.
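Recording a prospective identification amounts to attaching the user's chosen match (by default the top-ranked one) to the sample's history entry. A minimal sketch with hypothetical names (`record_identification`, the ranked list contents):

```python
def record_identification(sample, ranked_matches, chosen_index=0):
    """Attach a prospective identification to a sample record.

    ranked_matches is a list of (species, score) pairs, best first;
    chosen_index lets the user override the top match after inspection.
    """
    species, score = ranked_matches[chosen_index]
    sample["identification"] = {"species": species, "score": score}
    return sample

# Hypothetical ranked output from the matcher.
ranked = [("Acer rubrum", 0.91), ("Quercus alba", 0.74)]
sample = {"id": 42}
record_identification(sample, ranked)
```

The poster leaves the recording mechanism unspecified; this only illustrates linking a chosen match back to the sample data.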

To appear in ACM UIST 2006 Conference Companion, Montreux, Switzerland, October 15-18, 2006.