Wearable Computing First Semester Report
Fall Semester 2010
by
Nick Brantley
Ethyn Feldman
Celia Pietsch
Prepared to partially fulfill the requirements for
ECE401
Department of Electrical and Computer Engineering
Colorado State University
Fort Collins, Colorado 80523
Project advisor: Sudeep Pasricha
ABSTRACT
With the growing fascination with mobile and portable devices comes the question: what is the easiest way to access information throughout the day? Having to pull out a laptop, tablet, or cell phone every time you need information can be frustrating. What if there were a device, worn every day, that put this information at your fingertips without having to dig into a bag or pocket? Our wearable computer allows users to interact with their system via simple hand gestures and view data in an easy-to-use graphical user interface that can be displayed virtually anywhere.
We have taken the color tracking techniques used in the Touchless software development kit (SDK) and adapted them so that hand gestures can be recognized using only a webcam. The webcam is worn on an apparatus that hangs from the user’s neck and also holds a pico-projector. The webcam captures gestures that manipulate the graphical user interface projected onto nearby surfaces by the pico-projector. This system allows users to easily access information using nothing but their hands. Not only is this a new and exciting way of using computers, but the hand gestures used to control the system are simple, enabling the system to be used by almost anyone.
Human-computer interaction (HCI) is a heavily researched area in today’s technology
driven world. Most people experience HCI using a mouse and keyboard as input devices. We
wanted a more natural way to interact with the computer that also allows instant access to
information. We are developing a real-time system that is always on and available to collect data
at any moment but also is accessible “on-the-fly”. Our aim is to make our machine usable and
receptive to the user’s needs in a timely manner. We do this by developing algorithms to detect
hand gestures that map to controlling the graphical user interface (GUI) in an expected manner.
This allows the user to access information without needing to turn on a laptop or reach into a pocket for a device. In the future we will tether multiple health sensors to our system, giving the user, their physician, and family members access to up-to-the-minute health statistics. We will also look at shrinking our apparatus and moving from a laptop to a smaller portable device for our processing engine.
TABLE OF CONTENTS
Title i
Abstract ii
Table of Contents iii
List of Figures iv
I. Introduction 1
II. Review of Previous Work 4
III. Technical Features 7
IV. Technical Problems 11
V. Design Decisions and Alternative Approaches 12
VI. Conclusion and Future Work 13
VII. Product Marketing 15
VIII. Ethics 16
References 17
Bibliography 18
Appendix A – Glossary 19
Appendix B – Budget 21
Appendix C – IEEE Grant Proposal 22
Acknowledgements 24
LIST OF FIGURES
Figure 1 Microsoft Kinect 5
Figure 2 PlayStation Move 5
Figure 3 Wii Remote 5
Figure 4 SixthSense 6
Figure 5 GUI Main Menu 8
Figure 6 Map Application 8
Figure 7 Wearable Computing Prototype 10
Chapter I – INTRODUCTION
Modern society has quickly embraced the world of technology and has openly accepted the integration of computers into our everyday tasks. According to the International Telecommunication Union, close to 30% of the 2010 world population connects to the internet on a regular basis [1]. With the ever-increasing popularity of the internet and the use of computers in our daily lives, developers are creating useful and manageable technology that is easily accessible whenever and wherever it is needed.
As senior Electrical and Computer Engineering majors at Colorado State University, we
have taken on the task of developing a computing system that will effectively integrate a
computer with the user’s daily life. A successful project will yield a wearable computing system that anyone can easily carry every day and that includes a sensor network to monitor and report information about the user.
Our wearable computing system is being developed with the incorporation of a display
and controls that can easily be seen and used without requiring a laptop or other cumbersome
device. By wirelessly connecting a smartphone with a webcam and a pocket projector, we are
creating a device that will be non-intrusive and non-restrictive. The projector will project a
graphical user interface in front of the user, and the webcam will take in video of the user’s
surroundings. With the webcam, the user will have the ability to control the system by using
simple hand gestures. This wearable interface will allow the user to manage the different
functions of the system and wirelessly communicate with the smartphone device that will never
need to leave the user’s pocket. Our prototype can take many forms: a pendant worn around the
neck, a clip-on device that can be attached at numerous locations, or even a single-strap backpack
with the hardware embedded in the strap.
A key attribute of the developed system will be the incorporation of various health
monitoring sensors. These devices may include a heart rate monitor, pulse oximeter, skin
thermometer, accelerometer, and/or gyroscope. These sensors will allow the user to monitor their personal health in real time and will enable the system to send alert notifications and health information to healthcare professionals.
Ideally, the wearable system will be used in a way that will allow the user to keep track
of their health and easily call for emergency help if needed. A heart rate monitor and pulse oximeter will monitor pulse and cardiac function, and a skin thermometer will check the user’s body temperature. An accelerometer or gyroscope will observe the user’s movements and will use a “free-fall detection” protocol to identify if the user collapses or falls down. Emergency assistance can be alerted immediately based on the collected sensor data.
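The report does not specify the free-fall detection protocol itself; a common approach, sketched below in C# with hypothetical names, flags free-fall when the magnitude of the acceleration vector stays near 0 g (rather than the roughly 1 g measured at rest) for a sustained run of samples.

```csharp
// Illustrative sketch only: flag free-fall when the acceleration magnitude
// stays well below 1 g for a window of consecutive samples. The type names
// and threshold values are assumptions, not the project's actual code.
using System;
using System.Collections.Generic;
using System.Linq;

struct AccelSample
{
    public double X, Y, Z;  // acceleration components, in g's
    public double Magnitude() { return Math.Sqrt(X * X + Y * Y + Z * Z); }
}

class FreeFallDetector
{
    const double FreeFallThreshold = 0.3;  // g's; tuning value, assumed
    const int WindowSize = 10;             // consecutive samples required, assumed

    private readonly Queue<double> _window = new Queue<double>();

    // Feed one sample per update; returns true once an entire window of
    // recent samples reads below the free-fall threshold.
    public bool Update(AccelSample s)
    {
        _window.Enqueue(s.Magnitude());
        if (_window.Count > WindowSize) _window.Dequeue();
        return _window.Count == WindowSize && _window.All(m => m < FreeFallThreshold);
    }
}
```

Requiring a full window of low-magnitude samples, rather than a single reading, helps reject momentary sensor noise before alerting emergency contacts.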
Within this report, we will describe in detail all of the different processes and procedures
we have been working on in our effort to develop a working and effective prototype of our
wearable computing design. In Chapter II, we discuss various similar projects that are being developed concurrently with our wearable system. Today’s most innovative
interactive computing systems include features that are comparable to our design. Similar
devices that incorporate a gesture-aware interface include cutting edge video gaming systems
like the Wii, Xbox, and PlayStation. The SixthSense gestural interface project from the MIT Media Lab has many elements we were looking for in our prototype and has had a major
influence on the development of our project.
Using the related work of other interactive computer designs as reference for our project
design, we describe in Chapter III the technical features that we have integrated into our project.
Within this section we will describe our creation of an easy-to-use graphical user interface (GUI),
our software coding process and details, and our physical project design specifications.
While developing our wearable computing system and its features, there were some
technical problems encountered. Chapter IV specifies the technical issues that we have
encountered in the design process of our prototype. Our concerns include poor video quality, unreliable color identification and tracking, inaccessible webcam features, and programmatic difficulties with the pico-projector.
The problems that we have encountered in the development of our project have not
deterred us from the exciting direction our design is headed. Chapter V explains the decisions
and approaches we have taken to arrive at our project’s current state. The strenuous trial and
error process led us to the realization that a great deal of work would be needed to complete this
exciting design project. After much research and work, we came up with a plausible and marketable product, and are very excited to see the end result.
After making a decision as to what we wanted to create for our project, we set goals for
the development of our design. Chapter VI discusses these goals and plans that will hopefully be
set in motion over the coming spring semester. Our primary task is to effectively incorporate
medical sensors into our prototype’s list of functions. In addition to working on the sensors, we
also will be attempting to downscale our system to make it more portable. The gestural interface
design currently uses a laptop as a computing engine, but ideally should run off a smartphone or
tablet PC. After we have a working and compact design, we would like to investigate different
power consumption algorithms so that we can decrease our power usage and increase our battery
life.
Once we have created a working and efficient product, we will move on to the business
side of the project. The marketing demographics for our design project are described in Chapter
VII. One working design will be marketed to the general health services community so they can
get constant real-time health information. The second target market that we would want to
pursue would include the elderly population. Senior citizens would be able to use our product to
decrease the number of trips to the doctor, and more importantly, have the ability to notify
emergency personnel if needed.
Every engineering organization has a written “Code of Ethics” that all professional
engineers need to abide by. This code is set in place so that every engineer and client can be
protected from unethical practices such as copyright infringement and breaches of confidentiality.
Chapter VIII touches on some ethical concerns that we may encounter with our design, and
discusses the actions that are being taken to prevent any violation of the Engineering Code of
Ethics.
Chapter II – REVIEW OF PREVIOUS WORK
Gesture recognition is prevalent in numerous types of technology today. Currently, all major gaming consoles implement some type of gesture-based game play: Microsoft via the Kinect, Sony through the PlayStation Move and PlayStation Eye, and Nintendo with its Wii Remote. Pranav Mistry, a PhD candidate at the MIT Media Lab, created
SixthSense, a groundbreaking wearable computer, which was the inspiration for our wearable
computing project.
Microsoft’s Kinect is a controller-free gaming experience for the Xbox 360. The Kinect
sensor device uses a simple RGB camera along with 3D depth sensors to adapt to the gamer’s environment [2]. It accomplishes this by mapping the environment into a 3D picture and locating the player’s body, which can then be used to control games on the console. This popular and fast-selling technology shows that people are very interested in replacing input devices for their entertainment machines with gesture control. The Wii and
PlayStation also accomplish similar gesture-based gaming with the use of gesture control
devices. The Nintendo Wii uses a wireless controller that has an accelerometer to sense motion.
The PlayStation uses a uniquely colored wand that their PlayStation Eye camera tracks to detect
motion. The aforementioned gesture-based devices are shown in figures 1, 2 and 3, respectively.
Pranav Mistry’s SixthSense is a wearable gestural interface that lets a user interact with
digital information using natural hand gestures. (See figure 4.) We used the SixthSense design
when creating our own prototype as it has very similar qualities that we wanted to implement.
SixthSense also enables hand gesture recognition using color fiducial tracking and computer-
vision techniques. SixthSense incorporates the following applications that demonstrate how well
this type of gesture-based system works: a map application that uses natural hand gestures to
zoom and pan the images, a drawing application that allows a user to draw anything using their
fingers, a camera application that takes pictures if the user presents a “frame” gesture, and the
ability to draw symbols in the air using the index finger to accomplish certain tasks.
Mistry’s wearable device incorporates a pendant-like design that hangs from the neck.
He uses a plastic ruler to support a digital webcam, pico-projector and mirror. He wears four
different colored fiducials on the tips of his fingers for his color tracking algorithms. SixthSense
has gone through many design phases, including one built into a hat, another on a helmet, and the
one we are temporarily using, the around-the-neck pendant. We are designing a similar
mechanism to incorporate hand gesture interaction between digital information and the user.
However, we will be tethering medical sensors such as a heart-rate monitor, pulse oximeter and
accelerometer to collect real-time statistics on the user’s health and make it accessible by a few
flicks of their hands.
Figure 1: Microsoft Kinect
Figure 2: PlayStation Move Figure 3: Wii Remote
Figure 4: Pranav Mistry’s SixthSense
NOTE: It can be seen that we borrowed Pranav Mistry’s pendant design while producing a
prototype of our project. This design is a temporary installation as we test our software package
and will reevaluate our design in the coming semester.
Chapter III – TECHNICAL FEATURES
The first element of our wearable computer involves the interaction layer. We first had to
design an easy-to-use graphical user interface. This GUI needed to be easy to read and
straightforward in order to manipulate with simple hand gestures, all the while being
aesthetically pleasing. We first developed a home screen on paper that is used to select from
various applications and has a local clock at the top center of the screen.
Before we implemented our GUI in code, we first had to find a way to implement our
hand gesture interaction layer. After numerous hours of research we discovered the Touchless SDK [3]. This software development kit provides the color tracking algorithms that we
desired for our project. This C# solution analyzes each image captured from the webcam’s video
feed and searches for the RGB value of each colored marker. It then provides access to current
data on each marker that we can use for recognizing gestures. Visual Studio was our development environment and our code was written in C#. The kit also includes a webcam library that enables us to easily capture and manipulate the webcam’s video feed. Touchless SDK was added to our project using the statement: using TouchlessLib;
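As a rough illustration of the per-frame color search described above (the Touchless SDK’s actual implementation differs, and all names below are hypothetical), a tracker can classify each pixel by its squared distance from the marker’s RGB value and report the centroid of the matching pixels:

```csharp
// Simplified sketch of webcam color tracking, not the Touchless SDK's code.
// A pixel "matches" a marker if its RGB distance to the target color is
// within a tolerance; the marker position is the centroid of matching pixels.
class ColorTracker
{
    // rgbFrame holds 3 bytes (R, G, B) per pixel, row-major.
    public static (int X, int Y, bool Present) FindMarker(
        byte[] rgbFrame, int width, int height,
        byte targetR, byte targetG, byte targetB, int tolerance)
    {
        long sumX = 0, sumY = 0;
        int count = 0;
        for (int y = 0; y < height; y++)
        {
            for (int x = 0; x < width; x++)
            {
                int i = (y * width + x) * 3;
                int dr = rgbFrame[i] - targetR;
                int dg = rgbFrame[i + 1] - targetG;
                int db = rgbFrame[i + 2] - targetB;
                // Compare squared distances to avoid a square root per pixel.
                if (dr * dr + dg * dg + db * db < tolerance * tolerance)
                {
                    sumX += x; sumY += y; count++;
                }
            }
        }
        if (count == 0) return (0, 0, false);  // marker not visible this frame
        return ((int)(sumX / count), (int)(sumY / count), true);
    }
}
```

Running this search once per marker color on every captured frame yields the kind of per-marker position data the SDK exposes through its marker objects.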
After researching and testing the Touchless library we began working on our GUI’s main
menu panel. Figure 5 shows a screen shot of this main screen. It was then important to map the
location of the colored markers onto the main screen so the user knows how their fingers are
manipulating the GUI. The user can select applications by moving their fingers left or right from
the center of the screen and then pinching their thumb and index finger together to select the
desired application. Once the fingers are pinched, a routine is called to bring the selected application’s panel to the front. A current-application variable keeps track of the active application layer and is used to determine which user interaction maps to which action in the
GUI. An example map application is shown in Figure 6.
Figure 5: GUI Main Menu
Figure 6: Map Application
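The selection flow described above can be sketched as follows. This is an illustrative C# sketch under assumed names, not the project’s actual code: the marker’s horizontal offset from screen center moves a highlight left or right, and a pinch confirms the choice.

```csharp
// Hypothetical sketch of main-menu selection driven by a tracked marker.
using System;

enum App { MainMenu, Map, Drawing, Camera }   // application layers, assumed

class MenuController
{
    private App _currentApp = App.MainMenu;   // current-application variable
    private int _highlighted = 0;
    private readonly App[] _choices = { App.Map, App.Drawing, App.Camera };

    // Called once per frame with the marker's X position and pinch state.
    public void Update(int markerX, int screenWidth, bool pinched)
    {
        if (_currentApp != App.MainMenu) return;  // menu gestures only on home screen

        // Moving fingers left or right of center shifts the highlighted entry.
        int center = screenWidth / 2;
        if (markerX < center - 100)
            _highlighted = Math.Max(0, _highlighted - 1);
        else if (markerX > center + 100)
            _highlighted = Math.Min(_choices.Length - 1, _highlighted + 1);

        // Pinching thumb and index finger selects the highlighted application.
        if (pinched) _currentApp = _choices[_highlighted];
    }
}
```

The dead zone of 100 pixels around center is an assumed tuning value; some hysteresis like this is typically needed so small hand tremors do not change the selection.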
To update the markers from the video feed we added the following marker event handlers:
_touchMgr.Markers[0].OnChange += new EventHandler<MarkerEventArgs>(UpdateMarkerBlue);
_touchMgr.Markers[1].OnChange += new EventHandler<MarkerEventArgs>(UpdateMarkerRed);
With these event handlers we can add implementations for hand gesture recognition as well as
calling methods related to handling GUI updating. So far we have only employed simple
gestures for choosing and starting an application as well as exiting the application to return to the
home screen. The pinched method returns true if the user’s fingers are pinched and is implemented as follows:
private bool pinched()
{
    if (!_touchMgr.Markers[0].CurrentData.Present || !_touchMgr.Markers[1].CurrentData.Present)
        return false;

    int blueX = _touchMgr.Markers[0].CurrentData.X;
    int blueY = _touchMgr.Markers[0].CurrentData.Y;
    int redX = _touchMgr.Markers[1].CurrentData.X;
    int redY = _touchMgr.Markers[1].CurrentData.Y;

    if ((Math.Abs(redX - blueX) < 10) && (Math.Abs(redY - blueY) < 15))
    {
        return true;
    }
    return false;
}
The exit method determines if the user has pinched their fingers in the upper-right corner of the application, which signals the software to exit the current application and return to the main menu.

APPENDIX A – GLOSSARY

Accelerometer
An electronic device that measures proper acceleration, or the acceleration experienced relative to free-fall.
C#
A general-purpose object-oriented programming language made by Microsoft. C# is part of the
.NET framework and has syntax very similar to Java.
COM Interface
The component object model is a binary interface that enables inter-process communication and
dynamic creation in a wide range of programming languages.
Event Handler
A method implemented in source code that handles actions initiated outside the scope of the program.
Fiducial
An object, usually colored, that is used in video processing to be tracked in a live video feed.
Colored fiducials can be tracked and then handled in code to represent hand gestures.
Gesture Recognition
A programmatic solution that uses mathematical algorithms to analyze the movements of the
face or hand and determine which gesture is being presented to the system.
Graphical User Interface (GUI)
A user interface implemented in software that allows a user to interact with the program in an
intuitive manner.
Heart Rate Monitor
An electronic personal monitoring device that allows a person to measure their heart rate in real
time.
Human-Computer Interaction
The study of interaction between people and computers. This is commonly accomplished using
a mouse and keyboard as input devices.
Pulse Oximeter
An electronic medical device that indirectly measures the oxygen saturation of a person’s blood.
Real-Time System (RTS)
The practice of making software react within strict time constraints and incorporate appropriate response mechanisms.
RGB
An additive color model composed of red, green, and blue color values.
Software Development Kit (SDK)
A set of development tools that aids in the creation of applications in a software package.
SixthSense
A wearable gestural interface designed by Pranav Mistry at the MIT Media Lab.
Touchless SDK
A webcam multi-touch object tracking software development kit.
Visual Studio
An integrated development environment (IDE) from Microsoft that is used to develop console
and graphical user interface applications.
Wearable Computer
Computers that are worn on the body of the user that can be used in behavioral modeling, health
monitoring systems and reality augmenters.
APPENDIX B – BUDGET
Expenses             Purchased    Future
Webcam               $80
Pocket Projector     $130
Heart Rate Monitor   $100
Accelerometer                     $40
Pulse Oximeter                    $50-$100
Wireless Interface                $50-$100
Smart Phone                       $300
Tablet PC                         $150-$200
Total                $310         ~$440

Funding              Amount
ECE Project Budget   $300
IEEE Mini Grant      $500
Total                $800
APPENDIX C – IEEE GRANT PROPOSAL
Application for IEEE Mini-Grant for Student Application Papers Applying Industry Standards
1) DATE OF APPLICATION: October 15, 2010
2) Project Title: Wearable Computing
3) Student(s) Name(s) and contact Information, including email and postal address:
Celia Rose Pietsch
email: [email protected]
p: 623 S Grant Ave, Fort Collins, CO 80521

Nick Brantley
email: [email protected]
p: 850 S Overland Trl. #4, Fort Collins, CO 80521

Ethyn Feldman
email: [email protected]
p: 1705 Heatheridge Dr UNIT C202, Fort Collins, CO 80526
4) Name of Student Project Leader: Nick Brantley
5) Name of Faculty Advisor/Mentor and contact Information. Must include email and postal mailing address:
Sudeep Pasricha
email: [email protected]
p: 1373 Campus Delivery, Electrical and Computer Engineering, Colorado State University, Fort Collins, CO 80523-1373
6) Institution: Colorado State University
7) Program/Course: Embedded Systems and Senior Design