APPARATUS AND METHOD FOR GESTURE RECOGNITION

A PROJECT REPORT

Submitted by
PRATIK BHATT (080110111003)
DEVAL MEHTA (080110111021)
NIRAV FARASWAMI (080110111012)

In fulfillment for the award of the degree of
BACHELOR OF ENGINEERING
in Electronics and Communication

G.H.PATEL COLLEGE OF ENGINEERING AND TECHNOLOGY, V.V.NAGAR
Gujarat Technological University, Ahmedabad
MAY, 2012
CERTIFICATE
Date: 23-MAY-2012.
This is to certify that the dissertation entitled “APPARATUS AND METHOD FOR
GESTURE RECOGNITION” has been carried out by PRATIK BHATT,
DEVAL MEHTA and NIRAV FARASWAMI under my guidance in
fulfillment of the degree of Bachelor of Engineering in Electronics and
Communication Engineering (8th Semester) of Gujarat Technological
University, Ahmedabad, during the academic year 2011-12.

Project Guide: Head of Department:
Prof. Nilesh H. Desai Dr. Chintan K. Modi
G.H.PATEL COLLEGE OF ENGINEERING AND TECHNOLOGY
ELECTRONICS AND COMMUNICATION
2012
ACKNOWLEDGEMENT
We are extremely thankful to our project guide, Prof. Nilesh H. Desai, for being a
constant source of inspiration and encouragement. We would like to express our gratitude
towards him for solving our queries and helping us with the project. We are also thankful
to our friends and families for their support and assistance.
Abstract
In a world where every operation performed by humans is getting simpler, we present an innovation that makes them simpler still by using the human GESTURE, the very definition of being human. The invention can give speech to people who cannot speak, let disabled people operate a wheelchair, and ease everyday life through a HOME-AUTOMATION program, e.g. controlling your television and air-conditioner just by making the appropriate gesture.

The biggest advantage of this invention is that it is cost effective: since image processing is not used, the hardware is reduced and hence the cost. The device is built from basic electronic components, such as an IR sensor (IR LED and photodiode) to recognize the gesture, and a basic processing unit such as a microcontroller. The sensors are essentially proximity sensors which convert the position of a finger, either open or closed, into a proper electrical signal or logic level, “0” or “1”.

This data is then processed by the microcontroller unit to generate different codes or signals for remote applications. For the application of a speaking aid, with the further use of a text-to-speech converter the device can convert gestures into sound, and hence give speech to people who cannot speak.
List of Figures

Fig 3.1   Finger is open ... 6
Fig 3.2   Finger is closed ... 6
Fig 3.3   Gesture for alphabet A with Code: 10000 ... 7
Fig 3.4   Gesture for alphabet B with Code: 11110 ... 7
Fig 3.5   Gesture for alphabet C with Code: 1111111 ... 7
Fig 3.6   Gesture for alphabet D with Code: 111 ... 8
Fig 3.7   Gesture for alphabet E with Code: 100011 ... 8
Fig 3.8   Gesture for alphabet G with Code: 00001 ... 8
Fig 3.9   Gesture for alphabet H with Code: 101001 ... 8
Fig 3.10  Gesture for alphabet I with Code: 1000 ... 9
Fig 3.11  Gesture for alphabet J with Code: 1000001000 ... 9
Fig 3.12  Kinect gesture recognizing device ... 10
Fig 3.13  eyeSight’s gesture recognizing device ... 10
Fig 3.14  AcceleGlove gesture recognizing device ... 11
Fig 4.1   General Block Diagram ... 14
Fig 4.2   General LED Construction ... 16
Fig 4.3   Surface mount LED ... 16
Fig 4.4   Characteristics of Photodiode ... 17
Fig 4.5   89c2051 Interfacing figure ... 21
Fig 4.6   NE555 Interfacing figure ... 22
ST3654 Interfacing figure
TTS256 DIP
Speakjet DIP
TTS256 and Speakjet Interfacing figure
Character LCD
HT6230 DIP
Right hand Microcontroller unit along with sensors
Left hand Microcontroller unit along with sensors
Neck Microcontroller unit along with TTS256 and Speakjet
Finger open condition detailing
Finger closed condition detailing
Gesture operated T.V.
Front view of sensors
Front view of left hand with touchpad
Front view of right hand with touchpad
Position of touchpad for thumb
Proteus simulation
Right hand module
Left hand module
Completed module
Breadboard implementation of TTS256 module
Table of Contents

Chapter 1: Introduction 1
Chapter 2: Brief history of Gesture Recognizing methods 2
Chapter 3: Literature review 5
3.1 Gesture Recognition 5
3.1.1 Basics of Gesture Recognition 5
3.1.2 Principles of operation of Gesture Recognition Technology 6
3.1.3 Alphabetic Gestures and corresponding binary codes 7
3.2 Availability of system in market 9
3.2.1 Kinect for Windows, Gesture recognizing UI 9
3.2.2 eyeSight’s Hand Gesture Recognizing Technology 10
3.2.3 AcceleGlove 11
3.3 Applications 12
3.4 Motivation and objectives 13
Chapter 4 : Gesture Recognition system and its components 14
4.1 Block Diagram 14
4.2 Sensors 15
4.2.1 IR LED 15
4.2.2 Photodiode 17
4.3 Left hand microcontroller unit 18
4.3.1 89c2051 18
4.3.2 NE555 Timer IC 21
4.3.3 RF Module 22
4.3.3.1 ST3654- Serial Interface IC 22
4.3.3.2 RF Transceiver 23
4.3.4 Right hand Microcontroller unit 24
4.3.5 TTS256 24
4.3.6 Speakjet IC 25
4.3.7 Character LCD 28
4.3.8 HT6230 IC 29
Chapter 5: Working/Implementation of Project Work 31
5.1 Drawings of method using IR 31
5.2 Working of the model using IR 34
5.3 Drawings of the method using touchpad 36
5.4 Working of the model using touchpad 38
5.5 Algorithm of C-program 39
Chapter 6: Result Analysis 41
6.1 Proteus Simulation 41
6.2 Hardware Results 42
Chapter 7: Conclusion 46
References 47
Rewards 48
Chapter 1: Introduction
A technological revolution has given us a powerful tool to control our surroundings:
the remote control. Today everything can be operated using remote controls, from
TVs to door locks. Want to change the channel? Just click a button. But in the
process we have lost something much more valuable: being gesture oriented.
Gestures are an important means for human beings to express their thoughts in the
best possible way. So we came up with these questions: why can’t we operate the
things around us just with our hand gestures? Why can’t we change the channel or
switch on the lights with a gesture?
After detailed research on gesture recognition, we were certain about one thing: gesture
recognition needs a camera and a digital processor (mainly a laptop). The idea of
carrying a laptop on the back and a camera on the head did not appeal to us much, so
we challenged this notion and started developing an innovative concept for gesture
recognition that uses only basic electronic components and no cameras. The end
result is what we now call “Gesture speaks”.
“Gesture speaks” is a device which recognizes gestures made by the fingers of the hand.
The device is worn on the hands; the sensors take the form of rings, one worn
on each finger. The sensors are essentially proximity sensors, discussed in
more detail later. They constantly monitor the position of each finger, i.e. whether it
is open or closed. From the positions of all the fingers, the device decides which
gesture is invoked.
Chapter 2: Brief history of Gesture Recognizing methods
Gesture recognition is a topic in computer science and language technology with the
goal of interpreting human gestures via mathematical algorithms. Gestures can originate
from any bodily motion or state but commonly originate from the face or hand. Current
focuses in the field include emotion recognition from the face and hand gesture
recognition. Many approaches have been made using cameras and computer vision
algorithms to interpret sign language. However, the identification and recognition of
posture, gait, and other human behaviors are also the subject of gesture recognition
techniques.

Gesture recognition enables humans to interface with machines (HMI) and
interact naturally without any mechanical devices. Using the concept of gesture
recognition, it is possible to point a finger at the computer screen so that the cursor
moves accordingly. This could potentially make conventional input devices such as the
mouse, keyboard and even the touch-screen redundant. Up till now, gesture
recognition has mostly been conducted with techniques from computer vision and
image processing.

The literature includes work done on gesture recognition without image
processing and computer vision, and without any kind of memory for storage, using
only very basic sensors: an IR transmitter and receiver pair.
In computer interfaces, two types of gestures are distinguished:
• Offline gestures: those gestures that are processed after the user’s interaction with the
object. An example is the gesture to activate a menu.
• Online gestures: direct manipulation gestures. They are used to scale or rotate a
tangible object.

Gesture recognition has found uses in a wide range of areas:
• For socially assistive robotics: by using proper sensors (accelerometers and gyros)
worn on the body of a patient and by reading the values from those sensors, robots
can assist in patient rehabilitation. The best example is stroke rehabilitation.
• Directional indication through pointing: pointing has a very specific purpose in our
society, to reference an object or location based on its position relative to ourselves. The
use of gesture recognition to determine where a person is pointing is useful for
identifying the context of statements or instructions. This application is of particular
interest in the field of robotics.
• Control through facial gestures: Controlling a computer through facial gestures is a
useful application of gesture recognition for users who may not physically be able to use
a mouse or keyboard. Eye tracking in particular may be of use for controlling cursor
motion or focusing on elements of a display.
• Alternative computer interfaces: Foregoing the traditional keyboard and mouse setup
to interact with a computer, strong gesture recognition could allow users to
accomplish frequent or common tasks using hand or face gestures to a camera.
• Immersive game technology: Gestures can be used to control interactions within
video games to try and make the game player's experience more interactive or
immersive.
• Virtual controllers: For systems where the act of finding or acquiring a physical
controller could require too much time, gestures can be used as an alternative control
mechanism. Controlling secondary devices in car or controlling a television set are
examples of such usage.
• Affective computing: In affective computing, gesture recognition is used in the
process of identifying emotional expression through computer systems.
• Remote control: Through the use of gesture recognition, "remote control with the
wave of a hand" of various devices is possible. The signal must indicate not only the
desired response, but also which device is to be controlled.
Input devices used in this discipline so far are as follows:
• Wired gloves: These can provide input to the computer about the position and rotation
of the hands using magnetic or inertial tracking devices. Furthermore, some gloves
can detect finger bending with a high degree of accuracy (5-10 degrees), or even provide
haptic feedback to the user, which is a simulation of the sense of touch. The first
commercially available hand-tracking glove-type device was the Data Glove, a glove-
type device which could detect hand position, movement and finger bending. This uses
fiber optic cables running down the back of the hand. Light pulses are created and
when the fingers are bent, light leaks through small cracks and the loss is registered,
giving an approximation of the hand pose.
• Depth-aware cameras: Using specialized cameras such as time-of-flight cameras, one
can generate a depth map of what is being seen through the camera at a short range, and
use this data to approximate a 3d representation of what is being seen. These can be
effective for detection of hand gestures due to their short range capabilities.
• Stereo cameras: Using two cameras whose relations to one another are known, a 3D
representation can be approximated by the output of the cameras. To get the cameras'
relations, one can use a positioning reference such as a lexian-stripe or infrared
emitters. In combination with direct motion measurement (6D-Vision) gestures can
directly be detected.
• Controller-based gestures: These controllers act as an extension of the body so that
when gestures are performed, some of their motion can be conveniently captured by
software. Mouse gestures are one such example, where the motion of the mouse is
correlated to a symbol being drawn by a person's hand, as is the Wii Remote, which can
study changes in acceleration over time to represent gestures. Devices such as the LG
Electronics Magic Wand, the Loop and the Scoop use Hillcrest Labs' Freespace
technology, which uses MEMS accelerometers, gyroscopes and other sensors to translate
gestures into cursor movement. The software also compensates for human tremor and
inadvertent movement.
• Single camera: A normal camera can be used for gesture recognition where the
resources/environment would not be convenient for other forms of image-based
recognition. Although not necessarily as effective as stereo or depth aware cameras,
using a single camera allows a greater possibility of accessibility to a wider audience.
Chapter 3: Literature Survey
3.1 Gesture recognition
3.1.1 Basics of gesture recognition:
With the use of sensors fabricated from an IR LED and a photodiode, the position of the
fingers of both hands is detected. The sensors are worn on each individual finger of
both hands, in the form of a ring. All these rings are connected to the
microcontroller and interfacing unit, where the fingers are continuously polled to
determine whether each is open or closed. This procedure is repeated periodically.
Corresponding to the polled position of each finger, a unique code is generated for
each combination of finger positions. The code generated is different for each alphabet
(A-Z) and for the numbers 0 to 9. The finger positions and the corresponding codes are
designed according to the American standard sign language of deaf and
speech-impaired people. A code is also assigned for the SPACE between words in a
sentence, and for any kind of erroneous gesture made by the user, other than the
predefined finger positions.
Care has been taken, considering normal human tendencies in moving the hands,
especially taking into account the thumb and the little finger, for each gesture.
The polled finger data of the left hand is sent via RF transmission, using the RF module,
to the right-hand RF module, and is further processed in the right-hand
microcontroller unit.
Taking into consideration the finger positions of both hands, a unique code
is generated by the right-hand microcontroller unit and is again sent serially, via RF
communication, to the RF detector module worn on the neck. There the code is
converted into ASCII letters and numbers using a microcontroller. The TTS256 IC
converts the ASCII letters into allophone codes, and the allophone IC (Speakjet)
converts these codes into corresponding electrical signals, which are sent to a speaker.
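The decision step described above can be sketched as a table lookup from a polled finger-code string to its letter. The helper names below are ours, not from the report; the codes are taken from the figures in section 3.1.3, with each bit meaning 1 = closed and 0 = open:

```c
/* Sketch of the decision step: map a polled finger-code string to
   its letter. Helper names are ours; codes are from Fig 3.3 - 3.11. */
#include <string.h>

struct gesture { const char *code; char letter; };

static const struct gesture table[] = {
    { "10000",  'A' }, { "11110",  'B' }, { "1111111",    'C' },
    { "111",    'D' }, { "100011", 'E' }, { "00001",      'G' },
    { "101001", 'H' }, { "1000",   'I' }, { "1000001000", 'J' },
};

/* Returns the letter for a code, or '?' for an erroneous gesture
   (any finger position outside the predefined set). */
char decode_gesture(const char *code)
{
    size_t i;
    for (i = 0; i < sizeof table / sizeof table[0]; i++)
        if (strcmp(table[i].code, code) == 0)
            return table[i].letter;
    return '?';
}
```

An unmatched code falls through to '?', corresponding to the erroneous-gesture case mentioned above.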
3.1.2 Principles of operation of gesture recognition:
Fig 3.1 Finger is open
Initially, when the finger is open, the IR LED emits its ray in the direction shown in
the figure. The ray is not sensed by the photodiode, so there is no change in the
photodiode's resistance and no change in the voltage level.
Fig 3.2 Finger is closed
When the finger is closed, the ray from the IR LED is reflected back and detected by
the photodiode. This changes the resistance of the photodiode and correspondingly
changes the voltage level. The change in voltage level is polled by the microcontroller
and processed further.
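The polling step described above can be sketched in C. The threshold and poll counts here are illustrative assumptions, not measured values from the circuit; a finger state is accepted only after the same reading on several consecutive polls, so a momentary flicker does not register as a gesture:

```c
/* Sketch of the polling step (assumed values, not from the board). */
#define THRESHOLD_MV 1500   /* assumed switching level, millivolts  */
#define STABLE_POLLS 3      /* consecutive agreeing samples needed  */

/* 1 = closed (ray reflected, voltage raised), 0 = open. */
int level_from_mv(int mv) { return mv > THRESHOLD_MV ? 1 : 0; }

/* Feed n successive voltage samples; return the debounced state. */
int debounce(const int *mv, int n)
{
    int state = 0, candidate = 0, run = 0, i;
    for (i = 0; i < n; i++) {
        int bit = level_from_mv(mv[i]);
        if (bit == candidate) run++;
        else { candidate = bit; run = 1; }
        if (run >= STABLE_POLLS) state = candidate;   /* stable now */
    }
    return state;
}
```

Three steady high samples report a closed finger; a single spike between low samples is ignored.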
3.1.3 Alphabetic gestures and corresponding binary codes
Fig 3.3 Gesture for alphabet A with Code:10000
Fig 3.4 Gesture for alphabet B with Code: 11110
Fig 3.5 Gesture for alphabet C with Code: 1111111
Fig 3.6 Gesture for alphabet D with Code: 111
Fig 3.7 Gesture for alphabet E with Code: 100011
Fig 3.8 Gesture for alphabet G with Code: 00001
Fig 3.9 Gesture for alphabet H with Code: 101001
Fig 3.10 Gesture for alphabet I with Code: 1000
Fig 3.11 Gesture for alphabet J with Code: 1000001000
3.2 Availability of systems in market
There are many models and brands of gesture recognizing systems in different fields.
They are described with their brief features as followed.
3.2.1 Kinect for Windows, Gesture recognizing UI
After Microsoft's evolved game controller, the Kinect for Xbox 360, came to the fore,
speculation in tech circles right away saw a future for the device on the PC platform,
beyond being just a game controller. It looks like Redmond is taking steps in that
direction, with the groundwork for Kinect's arrival on the Windows PC platform
underway. Microsoft will release the Kinect for Windows software development kit
(SDK) this spring, so developers can start work on it right away. This could include
giving Games for Windows applications the Kinect support they enjoy on the
Xbox 360 platform.
Fig 3.12 Kinect gesture recognizing device
The possibilities are endless for non-game applications to make use of Kinect as a
gesture-recognizing and face-recognizing device. Gesture recognition UI and face-
recognition are pitched to be some of the defining features of Microsoft's next version of
Windows. The Kinect for Windows SDK will be released on a non-commercial basis.
"The hope is that the SDK will unleash a wave of creativity to add to the already
exciting developments we’ve seen on top of Kinect. The SDK will provide access to
Kinect’s sensor as well as audio and system APIs," the company commented.
3.2.2 eyeSight’s Hand Gesture Recognition Technology
eyeSight’s technology allows people to interact with devices using simple hand
gestures. It tracks the user’s hand motions in front of the device’s camera and converts
these gestures into user input commands that control the device and its applications.
eyeSight’s solution is based on advanced image processing and machine vision
algorithms, which analyze real-time video input from common built-in cameras. The
technology is independent of the underlying processor and camera hardware and
produces high-quality gesture recognition using low-end VGA cameras.
Fig 3.13 eyeSight’s gesture recognizing device
Moreover, the technology is designed for embedded platforms. It is optimized to operate
utilizing minimal CPU and power consumption and supports challenging user
environments with poor as well as direct lighting conditions. eyeSight’s Hand Gesture
Recognition Technology is available for integration at any layer, either as part of the
device software stack or as low as the camera sensor silicon level.
3.2.3 AcceleGlove
The AcceleGlove is an instrumented glove that can be used as a
communication and control device for recognizing hand gestures or controlling
computers and robotic devices.
Fig 3.14 AcceleGlove gesture recognizing device
The AcceleGlove contains six accelerometers on the fingers and the back of the hand
that sense hand and finger orientation and movements. Pattern recognition software
captures and recognizes static hand positions, as well as dynamic gestures.
What will it accomplish?
• Provides an ideal interface for recognizing gesture-based communication such as sign
language or military hand signals
• Provides an intuitive interface for computers, gaming devices, virtual environments
and visual displays
• Provides an intuitive interface for controlling robotic devices such as unmanned
vehicles and payloads such as robotic arms or cameras
• Provides a means for tracking hand movements for medical assessments and
rehabilitation.
3.3 Applications
1. The user can operate home appliances like the TV, AC or microwave just by
making gestures. For example, to change the channel to 24, just gesture “2” with
two fingers and “4” with four fingers and the channel will change. Want to increase
the volume? Just point your thumb up until the volume reaches the level you want. The
same goes for changing the temperature of ACs and microwaves. Thus we do not need
to carry different remotes for different appliances: by selecting different modes on this
device you can switch from operating the AC to the TV. Anything that can be
controlled with a remote control can be controlled with our device.
2. Disabled people using a wheelchair can operate it with gestures. Want to go
straight? Point your index finger. Want to turn left? Point your right thumb. Want to
turn right? Point your left thumb. This helps the user operate the wheelchair better
than manual effort or joysticks do. The same technique is applicable to controlling a
robot with gestures.
3. The most striking feature of this device is its ability to convert “gesture into
speech”. People who cannot speak resort to sign language, but it is very difficult to
convey their message to a person not familiar with it. This innovation will give them
a voice: they will be able to convey their message via speech rather than signs. For
example, a user just has to sign “C” “A” “T” and the device will speak “CAT”. It is
like adding voice to their sign language, and it will be nothing less than a boon to
them.
4. Kindergarten kids and small children can learn the English alphabet and the number
system with gestures: “Gesture speaks” understands the sign for every English letter
and number. Gesturing “1” and then “4” will speak “fourteen”; gesturing “A” will
speak “A for Apple”.
5. The device can also work as a gesture-controlled calculator. Gesture “1”, then “+”
with two index fingers, then “5” and “=”, and the device will either display or speak
the answer “six”. This saves the time of pressing buttons repeatedly.
6. The applications listed above are just a small part of what can be done with this
device; applications are limited only by human imagination. Future applications can
vary from controlling your laptop to change songs or presentation slides, to a
gesture-controlled mouse. Another striking application is the gesture mobile:
interfacing a GSM module with this device can place a call or send an SMS to any
number without the need for an actual mobile. Just gesture the number, gesture
“dial”, and the call will be connected.
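The gesture-calculator idea from application 5 can be sketched as a tiny accumulator that consumes one recognized gesture at a time. This is a hypothetical helper of ours, not code from the report, and only + and - are handled:

```c
/* Hypothetical sketch of the gesture calculator: digits and operators
   arrive one recognized gesture at a time; '=' yields the value to
   display or speak. */
int calc_feed(const char *gestures)
{
    int acc = 0, cur = 0;
    char op = '+';
    for (; *gestures; gestures++) {
        char g = *gestures;
        if (g >= '0' && g <= '9') {
            cur = cur * 10 + (g - '0');   /* build multi-digit number */
        } else {
            /* apply the pending operator, then remember the new one */
            acc = (op == '+') ? acc + cur : acc - cur;
            cur = 0;
            if (g == '=') return acc;
            op = g;
        }
    }
    return acc;
}
```

Feeding the sequence "1+5=" from the example above yields 6.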
3.4 Motivation and Objectives
The present invention is in the technical field of electronics; more particularly, in the
field of optical gesture recognition through the use of simple sensors, an IR LED and
an IR photodiode. Gesture recognition is today mainly done through image
processing, but the difficulty with that technique is the complexity of the hardware and
software required. The cost of such a technique is also high, so it cannot be used very
efficiently for home applications.

This device can control different home appliances such as televisions, air-conditioners
and compact-disc players. Further, by assigning every English letter a unique gesture,
it can convert “gestures into speech” and thus be used as a speaking aid for people
who cannot speak. The invention allows them to communicate easily with others, who
can understand their sign language once it is converted into speech. The device can
also be used in a gesture-controlled wheelchair for disabled people.
Chapter 4: Gesture Recognition System and its Components
4.1 Block Diagram:
Fig 4.1 General Block Diagram
The list of system components is as follows:
1 Sensors
2 Left hand microcontroller unit
3 RF modem
4 Right hand microcontroller unit
5 TTS256 IC
6 Speakjet IC
7 Character LCD.
8 HT6230.
4.2 Sensors
4.2.1 IR LED
The LED consists of a chip of semiconducting material doped with impurities to create a p-n junction. As in other diodes, current flows easily from the p-side, or anode, to the n-side, or cathode, but not in the reverse direction. Charge-carriers— electrons and holes—flow into the junction from electrodes with different voltages. When an electron meets a hole, it falls into a lower energy level, and releases energy in the form of a photon.
The wavelength of the light emitted, and thus its color, depends on the band gap energy of the materials forming the p-n junction. In silicon or germanium diodes, the electrons and holes recombine by a non-radiative transition, which produces no optical emission, because these are indirect band gap materials. The materials used for LEDs have a direct band gap with energies corresponding to near-infrared, visible, or near-ultraviolet light.
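The relation between band gap and emitted wavelength can be made concrete: lambda (nm) ≈ 1240 / Eg (eV). A minimal check, using typical literature band-gap values rather than anything measured from this device:

```c
/* lambda(nm) = 1239.84 / Eg(eV), where 1239.84 eV*nm is h*c/q.
   Band-gap examples are typical literature values. */
double led_wavelength_nm(double bandgap_ev)
{
    return 1239.84 / bandgap_ev;
}
```

For gallium arsenide (Eg ≈ 1.42 eV) this gives about 873 nm, i.e. the near-infrared emission the IR LED above relies on.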
Fig 4.2 General LED Construction
LED development began with infrared and red devices made with gallium arsenide. Advances in materials science have enabled the making of devices with ever-shorter wavelengths, emitting light in a variety of colors.
LEDs are usually built on an n-type substrate, with an electrode attached to the p-type
layer deposited on its surface. P-type substrates, while less common, occur as well. Many
commercial LEDs, especially GaN/InGaN, also use sapphire substrate.
Most materials used for LED production have very high refractive indices. This
means that much light will be reflected back into the material at the material/air surface
interface. Thus, light extraction in LEDs is an important aspect of LED
production, subject to much research and development.
Fig 4.3 Surface mount LED
4.2.2 Photodiode
A photodiode is a p-n junction or PIN structure. When a photon of sufficient energy
strikes the diode, it excites an electron, thereby creating a free electron (and a positively
charged electron hole). This mechanism is also known as the inner photoelectric effect.
If the absorption occurs in the junction's depletion region, or one diffusion length away
from it, these carriers are swept from the junction by the built-in field of the depletion
region. Thus holes move toward the anode, and electrons toward the cathode, and a
photocurrent is produced. This photocurrent is the sum of both the dark current (without
light) and the light current, so the dark current must be minimized to enhance the
sensitivity of the device.
Fig 4.4 Characteristics of Photodiode
Critical performance parameters of a photodiode include:

Responsivity-
The ratio of generated photocurrent to incident light power, typically expressed in
A/W when the photodiode is used in photoconductive mode. Responsivity may also be
expressed as quantum efficiency, the ratio of the number of photogenerated carriers to
incident photons, a unitless quantity.
Dark current-
The current through the photodiode in the absence of light, when it is operated in
photoconductive mode. The dark current includes photocurrent generated by background
radiation and the saturation current of the semiconductor junction. Dark current must be
accounted for by calibration if a photodiode is used to make an accurate optical power
measurement, and it is also a source of noise when a photodiode is used in an optical
communication system.
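The two figures of merit above can be checked numerically. Responsivity is photocurrent over incident power, and quantum efficiency follows from responsivity and wavelength; the example values in the tests are typical for a silicon photodiode near 940 nm, not measurements from this device:

```c
/* Responsivity R = I_ph / P (A/W); quantum efficiency
   eta = R * h*c / (q * lambda) = R * 1239.84 / lambda_nm. */
double responsivity_aw(double photocurrent_a, double power_w)
{
    return photocurrent_a / power_w;
}

double quantum_efficiency(double resp_aw, double lambda_nm)
{
    return resp_aw * 1239.84 / lambda_nm;
}
```

For instance, 0.5 mA of photocurrent from 1 mW of incident light is a responsivity of 0.5 A/W; a responsivity of 0.6 A/W at 940 nm corresponds to a quantum efficiency of roughly 0.79.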
4.3 Left hand microcontroller unit
4.3.1 89c2051
The 89c2051 is a low-voltage, high-performance CMOS 8-bit microcomputer with 2 KB
of flash programmable and erasable read-only memory. The device is manufactured
using Atmel's high-density nonvolatile memory technology and is compatible with the
industry-standard MCS-51 instruction set. This versatile 8-bit CPU with flash provides
a highly flexible and cost-effective solution for many embedded control applications.
Features:
• Compatible with MCS®-51Products
• 2K Bytes of Reprogrammable Flash Memory
– Endurance: 1,000 Write/Erase Cycles
• 2.7V to 6V Operating Range
• Fully Static Operation: 0 Hz to 24 MHz
• Two-level Program Memory Lock
• 128 x 8-bit Internal RAM
• 15 Programmable I/O Lines
• Two 16-bit Timer/Counters
• Six Interrupt Sources
• Programmable Serial UART Channel
• Direct LED Drive Outputs
• On-chip Analog Comparator
• Low-power Idle and Power-down Modes
• Green (Pb/Halide-free) Packaging Option
Role of Microcontroller Unit:
The microcontroller unit continuously polls the fingers to determine whether they are
closed or open. Depending on the result, it generates a unique code which is sent via
the RF module to the right-hand controller unit.
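A minimal sketch of how the five polled finger bits could be packed into a single byte before being handed to the RF module; the bit order (thumb first) is our assumption, not stated in the report:

```c
/* Pack the five polled finger bits into one byte for the RF link.
   With 1 = closed, the gesture 10000 (letter A) becomes the byte
   value 16. Bit order is an assumption. */
unsigned char pack_fingers(const int bits[5])
{
    unsigned char b = 0;
    int i;
    for (i = 0; i < 5; i++)
        b = (unsigned char)((b << 1) | (bits[i] & 1));
    return b;
}
```

The right-hand unit can then combine this byte with its own polled byte to select the gesture code.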
Selection of Microcontroller:
Various types of microcontrollers with different features are available in the market,
such as the 8051, PIC, AVR, ARM and TI MSP430. Parameters like power consumption,
cost, availability of ADC/DAC, UART and timers, and processing speed determine the
choice of microcontroller, which must be in accordance with the requirements of the
project.
Table 4.1 Comparison of different MCUs

Feature          | 89c2051                  | PIC16F87XA            | AVR                             | MSP430F2618                   | MSP430FG43X
Supply voltage   | 5 V                      | 5 V                   | 4.5-5.5 V                       | 1.8-3.6 V                     | 1.8-3.6 V
Frequency        | 0-24 MHz                 | DC-20 MHz             | 0-16 MHz                        | 0-16 MHz                      | 0-8 MHz
RAM              | 128 bytes                | 368 bytes             | 1 KB                            | 4 KB                          | 2 KB
ROM              | 2 KB flash               | 128 bytes EEPROM      | 512 bytes EEPROM                | 120 KB + 256 B flash          | 60 KB + 256 B flash
I/O lines        | 15                       | 22                    | 32                              | 64                            | 48
Architecture     | CISC                     | RISC                  | RISC                            | RISC                          | RISC
Instruction set  | 111                      | 35                    | 131                             | 51                            | 51
Timers           | Two 16-bit               | Timer0 (8-bit), Timer1 (16-bit), Timer2 (8-bit) | Three flexible timers with compare mode | 16-bit Timer A with three capture/compare registers; 16-bit Timer B | 16-bit Timer A with three capture/compare registers; 16-bit Timer B with seven compare/shadow registers
ADC              | NA                       | 10-bit                | 10-bit                          | 12-bit                        | 12-bit
DAC              | NA                       | NA                    | NA                              | Dual 12-bit                   | Dual 12-bit with synchronization
Inbuilt op-amp   | NA                       | NA                    | NA                              | NA                            | 3 configurable op-amps
Power saving     | Two power-saving modes   | Brown-out reset       | Six software power-saving modes | 5 low-power modes             | 5 power-saving modes
Serial interface | Programmable full-duplex serial channel | USART  | Programmable serial USART       | Four USCI                     | USART, asynchronous UART
Justifying the choice of 89c2051:
The requirements for the left-hand microcontroller call for only 5 port lines for polling
the fingers, only one timer, and a full-duplex serial channel. Considering the cost
requirement against the other microcontrollers above, the 89c2051 justifies its presence
in the hardware.

The interfacing circuit used for the 89c2051 is as follows:
Fig 4.5 89c2051 Interfacing figure
4.3.2 NE555 Timer IC
The timer IC is used to generate a 38 kHz square wave that pulses the IR LED so that
it emits modulated infrared light. The interfacing figure of this IC for generating the
38 kHz pulse is shown below.
Fig 4.6 NE555 Interfacing figure
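For reference, the astable 555 frequency follows f = 1.44 / ((R1 + 2·R2)·C). The component values in the test below are illustrative ones of ours that land near 38 kHz, not necessarily those used on the board:

```c
/* Astable 555 oscillation frequency: f = 1.44 / ((R1 + 2*R2) * C). */
double ne555_astable_hz(double r1_ohm, double r2_ohm, double c_farad)
{
    return 1.44 / ((r1_ohm + 2.0 * r2_ohm) * c_farad);
}
```

With R1 = 1 kΩ, R2 = 18.45 kΩ and C = 1 nF, the formula gives approximately 38 kHz, the modulation frequency most IR receiver circuits expect.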
4.3.3 RF module
The RF module is used for quick serial transmission and reception, operating in
USART mode. The module provides secure serial transmission over a short range of
30 meters, which meets the requirements of the project.
4.3.3.1 ST3654 – Serial Interface IC
The ST3654 serial interface IC provides a simple UART interface for transmission and
reception of serial data at various baud rates. It can be used in applications that need
two-way wireless data transmission. The communication protocol is self-controlled and
completely transparent to the user interface. The IC can be embedded into an existing
design so that wireless communication can be set up easily.
Features:
• Automatic switching between TX and RX mode with LED indication
• Adjustable baud rate settings of 4800, 9600, 19200 and 38400
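For the 8051-family controller used here, each baud rate maps to a timer-1 reload value via TH1 = 256 - Fosc / (12 · 32 · baud) in UART mode 1 with SMOD = 0. The 11.0592 MHz crystal assumed below is the classic choice that makes the standard rates come out exactly; the report does not state the actual crystal value:

```c
/* Timer-1 reload for an 8051 UART, mode 1, SMOD = 0:
   TH1 = 256 - Fosc / (12 * 32 * baud).
   11.0592 MHz is an assumed (classic) crystal value. */
int th1_reload(long fosc_hz, long baud)
{
    return (int)(256L - fosc_hz / (12L * 32L * baud));
}
```

At 11.0592 MHz, 9600 baud gives a reload of 253 (0xFD) and 4800 baud gives 250, both exact divisions, which is why that crystal frequency is the usual pairing for serial work on this family.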