OBJECT RECOGNITION AND SORTING
ROBOT
By
Umang Sharma
Km Swati
College of Engineering
University of Petroleum & Energy Studies
Dehradun
May, 2012
Object Recognition and Sorting Robot
A project report submitted in partial fulfilment of the requirements for the Degree of Bachelor of Technology (Electronics Engineering)
By
Umang Sharma
Km Swati
Under the guidance of
Mr. Amit Kumar Mondal, Doctoral Research Fellow
Electronics, Electrical and Instrumentation Department, COES
Approved
.. Director
College of Engineering
University of Petroleum & Energy Studies
Dehradun
May, 2012
CERTIFICATE
This is to certify that the work contained in this report titled Object Recognition
and Sorting Robot has been carried out by Umang Sharma and Km Swati under my
supervision and has not been submitted elsewhere for a degree.

Date
Abstract/ Synopsis
Colour is the most common feature used to distinguish objects for sorting,
recognition and tracking, and colour detection has proven useful for face
detection, localization and tracking. The developed algorithm can be divided into
three main parts: detection and recognition of the object, and movement of the robot. The
objective of the project is to separate differently coloured objects from a set. This
technology can be used in industry, where objects moving along a conveyor belt, or
carried by human beings, can be separated by a colour-detecting robot. The
detection of the colour is done through an image processing technique using
MATLAB. The project consists of a MATLAB-based movable robot for separating
differently coloured objects from a set. The MATLAB-based system recognizes the
colour and sends a command to the controller. The controller uses the incoming
signal to control the movements of the robot and its arm. The robot consists of four DC
motors and one free wheel: two DC motors are used for the arm, one for the base and
one for the gripper, while the other two motors and the free wheel are used for the
movement of the robot.
Acknowledgement
The project bears the imprints of the efforts extended by many people to whom we are deeply
indebted. We take this opportunity to express our great sense of gratitude to Mr. Adesh Kumar for
providing us the opportunity to carry out this project.
We also extend our heartiest thanks to our project guide, Mr. Amit Kumar Mondal, for his
invaluable guidance, support and encouragement while working on the project Object
Recognition and Sorting Robot for Material Handling in Packaging and Logistic Industries. His
effective planning, coordination, skill, knowledge and experience have made it possible to
progress successfully in the project within the stipulated time. We are also indebted to all the
faculty members of the Electronics Department as a whole, and for everything we got from here.
We also thank our friends and everyone who extended a helping hand towards us.
Last but not least, we remember God Almighty, without whose blessings the thoughts behind the
project would not have gone forth.
Km Swati
Umang Sharma
Table of Contents

1. Introduction
2. Literature Review
3. Theoretical Development
4. Methodology
5. Image Acquisition
6. Color Recognition
7. Flow Chart of Process
8. Hardware Development
9. Development of System
10. Software Development
11. AVR Studio
12. MATLAB
13. Proteus
14. Programming in MATLAB
15. Programme for Motor Control
16. Applications
17. Result and Discussions
18. Conclusion and Suggestion
19. References
List of Figures

1. Flow Diagram of Methodology
2. Flow Diagram of Image Acquisition
3. Flow Diagram of Color Recognition
4. Flow Chart of the Process
5. Basic Diagram of Hardware Development
6. Block Diagram of Hardware Development
7. DB9 Connector
8. MAX 232
9. ATmega 16
10. Pin Description
11. L293D Connection
12. Crystal Oscillator
13. DC Motor
14. Power Supply
15. Software Development Cycle
16. Coding in AVR Studio
17. Compiling in AVR Studio
18. Burning of Program in Micro Controller
19. MATLAB Desktop
20. Systematic Diagram Using Proteus
21. Programming in MATLAB
Nomenclature
1) V: Volt
2) DC: Direct Current
3) AC: Alternating Current
4) MATLAB: Matrix Laboratory
5) IR sensor: Infrared sensor
6) USB: Universal Serial Bus
7) IC: Integrated Circuit
8) RGB: Red, Green, Blue
9) PCB: Printed Circuit Board
10) RISC: Reduced Instruction Set Computing
11) CMOS: Complementary Metal Oxide Semiconductor
12) I/O: Input/Output
13) A/D: Analog/Digital
Chapter 1: Introduction
From the 1960s until the present, the field of image processing has grown vigorously. In addition
to applications in medicine and the space program, digital image processing techniques now are
used in a broad range of applications. Computer procedures are used to enhance the contrast or
code the intensity levels into colour for easier interpretation of X-rays and other images used in
industry, medicine, and the biological sciences. Geographers use the same or similar techniques
to study pollution patterns from aerial and satellite imagery. Image enhancement and restoration
procedures are used to process degraded images of unrecoverable objects or experimental results
too expensive to duplicate.
Many real world applications require real-time image processing like motion and color
recognition. Performance of a color recognition system should be fast enough so that objects in
video can be detected and processed in real time. Once object in a video is detected, object
tracking, image data mining, semantic meaning extraction, and other video and image processing
techniques can be performed. To extract the maximum benefit from this recorded digital data,
detection of any object from the scene is needed without engaging any human eye to monitor
things all the time. Real-time colour detection and recognition of images is a fundamental step
in many vision systems [1].
Image processing is a form of signal processing in which the input is an image, such as a
photograph or video frame. The output of image processing may be either an image or, a set of
characteristics or parameters related to the image. Most image-processing techniques involve
treating the image as a two-dimensional signal and applying standard signal-processing
techniques to it. This project aims at processing real-time images captured by a webcam for
motion detection and colour recognition using MATLAB programming. The results of this
processing can be used to sense a particular coloured block, automatically pick it up, and
place it into a bin according to its colour.
Chapter 2: Literature Survey
1.1 This paper appears in: Real-time capable system for hand gesture recognition Using hidden
Markov models in stereo color image sequences.
Issue Date: 2008
Author(s): Mahmoud Elmezain; Ayoub Al-Hamadi; Bernd Michaelis.
ISBN: 978-80-86943-14-5
Product type: Article
Abstract
This paper proposes a system to recognize the alphabets and numbers in real time from color
image sequences by the motion trajectory of a single hand using Hidden Markov Models
(HMM). Our system is based on three main stages; automatic segmentation and preprocessing of
the hand regions, feature extraction and classification. In automatic segmentation and
preprocessing stage, YCbCr color space and depth information are used to detect hands and face
in connection with morphological operation where Gaussian Mixture Model (GMM) is used for
computing the skin probability. After the hand is detected and the centroid point of the hand
region is determined, the tracking will take place in the further steps to determine the hand
motion trajectory by using a search area around the hand region. In the feature extraction stage,
the orientation is determined between two consecutive points from hand motion trajectory and
then it is quantized to give a discrete vector that is used as input to the HMM. In the final
stage, classification, the Baum-Welch (BW) algorithm is used to fully train the HMM
parameters. The gestures of alphabets and numbers are recognized using a Left-Right Banded
(LRB) model in conjunction with the Forward algorithm. In our experiment, 720 gestures are
used for training and 360 gestures for testing. Our system recognizes the alphabets
from A to Z and numbers from 0 to 9, and achieves an average recognition rate of 94.72% [2].
1.2 This paper appears in: A real-time color feature tracking system using color histograms
Date of Conference: 17-20 Oct. 2007
Author(s): Jung Uk Cho (Sungkyunkwan Univ., Suwon); Seung Hun Jin; Xuan Dai Pham;
Dongkyun Kim; Jae Wook Jeon
Page(s): 1163-1167
Product Type: Conference Publications
Abstract
Color feature tracking is based on pattern matching algorithms where the appearance of the
target is compared with a reference model in successive images and the position of the target is
estimated. The major drawback of these methods is that such operations require expensive
computation power, which is a bottleneck when implementing a real-time color feature tracking system. The
probabilistic tracking methods have been shown to be robust and versatile for a modest
computational cost. However, the probabilistic tracking methods break down easily when the
object moves very fast because these methods search only the regions of interest based on the
probability density function to estimate the position of the moving object. In this paper, we
propose a real-time color feature tracking circuit. We propose a window-based image processing
structure to improve the processing speed of the tracking circuit. The tracking circuit searches all
regions of the image to perform a matching operation in order to estimate the position of the
moving object. The main results of our work are that we have designed and implemented a
physically feasible hardware circuit to improve the processing speed of the operations required
for real-time color feature tracking. Therefore, this work has resulted in the development of a
real-time color feature tracking system employing an FPGA (field programmable gate array)
implemented circuit designed in VHDL (the VHSIC hardware description language). Its
performance has been measured and compared with the equivalent software implementation. [3]
1.3 This paper appears in: Automatic Number Plate Recognition System for Vehicle
Identification Using Optical Character Recognition.
Date of Conference: 17-20 April 2009
Author(s): M. T. Qadri (Dept. of Electronic Eng., Sir Syed Univ. of Eng. & Technol., Karachi,
Pakistan); M. Asif
Page(s): 335-338
Product Type: Conference Publications
Abstract
Automatic number plate recognition (ANPR) is an image processing technology which uses
number (license) plate to identify the vehicle. The objective is to design an efficient automatic
authorized vehicle identification system by using the vehicle number plate. The system is
implemented on the entrance for security control of a highly restricted area like military zones or
area around top government offices e.g. Parliament, Supreme Court etc. The developed system
first detects the vehicle and then captures the vehicle image. Vehicle number plate region is
extracted using the image segmentation in an image. Optical character recognition technique is
used for the character recognition. The resulting data is then used to compare with the records on
a database so as to come up with specific information like the vehicle's owner, place of
registration, address, etc. The system is implemented and simulated in MATLAB, and its
performance is tested on real images. It is observed from the experiment that the developed
system successfully detects and recognizes vehicle number plates in real images. [4]
1.4 This paper appears in: Recognition, Analysis, and Tracking of Faces and Gestures in
Real-Time Systems
Date of Conference: 2001
Author(s): Jianhuang Lai (Dept. of Comput. Sci., Hong Kong Baptist Univ., China); P. C. Yuen;
Wensheng Chen; Shihong Lao
Page(s): 168-174
Product Type: Conference Publications
Abstract
Addresses the problem of facial feature point detection under different lighting conditions. Our
goal is to develop an efficient detection algorithm, which is suitable for practical applications.
The problems that we need to overcome include (1) high detection accuracy, (2) low
computational time and (3) nonlinear illumination. An algorithm is developed and reported in the
paper. One of the key factors affecting the performance of feature point detection is the accuracy
in locating face boundary. To solve this problem, we propose to make use of skin color, lip color
and also the face boundary information. The basic idea to overcome the nonlinear illumination is
that, each person shares the same/similar facial primitives, such as two eyes, one nose and one
mouth. So the binary images of each person should be similar. Again, if a binary image (with
appropriate thresholding) is obtained from the gray scale image, the facial feature points can also
be detected easily. To achieve this, we propose to use the integral optical density (IOD) on the
face region. We propose to use the average IOD to detect feature windows. As all the above-
mentioned techniques are simple and efficient, the proposed method is computationally effective
and suitable for practical applications. 743 images from the Omron database, with different facial
expressions, different glasses and different hairstyles, captured indoors and outdoors, have been
used to evaluate the proposed method, and the detection accuracy is around 86%. The computational
time on a Pentium III 750 MHz using MATLAB for implementation is less than 7 seconds.
Chapter 3 Theoretical Development
1.1 Technical Specifications
Domain : Embedded System
Software : AVR, MATLAB and Proteus
Microcontroller : Atmega 16
Power Supply : 12V from the battery and 5V from the board
Communication : Wired
Application : Industry process, Home automation
1.2 Methodology
Most image-processing techniques involve treating the image as a two-dimensional signal and
applying standard signal-processing techniques to it. The video captured by the camera is
processed by a MATLAB program that performs colour recognition. The results of this
processing can be used in numerous security applications such as intrusion detection, spy robots,
fire alarms, person finders, sorting of objects, etc. Generally, signal processing is used in the
analysis of the colour of an object. In this project the detection of different colours is done
through an image processing technique using MATLAB. The goal of the project is to develop a
bot: a typical model used to pick and place the desired colour objects from one location to
another. This robot is used for sorting the objects in a mixture of differently coloured objects.
The project consists of a MATLAB-based robotic arm and a controller for controlling the
mechanical movements. An Objrec algorithm was developed in MATLAB to recognize the
colour and send a command to the controller using serial communication. The controller uses
the incoming signal to control the movements of the robot. The arm of the robot uses two DC
motors, one for the base and another for the gripper. The controller used is the ATMEGA 16.
RS232 communication was used for MATLAB to communicate with the microcontroller.
Figure 1: Flow diagram of methodology
1.3 Image Acquisition
The first stage of any vision system is the image acquisition stage. After the image has been
obtained, various methods of processing can be applied to the image to perform the many
different vision tasks required today.
Figure 2: Flow diagram of Image Acquisition
1.3.1 Algorithm for Image Acquisition
1. Install the image acquisition device, e.g. by installing the frame grabber board in your
computer. These devices connect directly to your computer via a USB or FireWire port. After
installing and configuring your image acquisition hardware, start MATLAB on your computer.
2. With the command imaqhwinfo we can get all the information about our camera.
3. Create a video input object. To create a video input object, use the videoinput function at
the MATLAB prompt.
4. Acquire image data.
5. Start acquiring the image or video.
1.4 Need for colour detection
Colour detection is of great importance to humans, as we use colour daily to differentiate
objects in the environment, recognize them and communicate using this information.
Analysis of colour in image processing is basically signal processing. Image processing is a
technique to improve the images received from cameras acting as sensors placed on robots,
satellites, space probes and aircraft, or images taken in daily life, for various applications.
1.5 Color Recognition
Color plays an important role in image processing. Each image is composed of an array of M*N
pixels with M rows and N columns of pixels. Each pixel contains a certain value for red, green
and blue. By varying these values for red, green and blue (RGB) we can get almost any color.
Images:
picture (row, column, rgb value)
Videos:
frames(row, column, rgb value, frame)
Figure 3: Flow diagram of Color Recognition in MATLAB
1.6 Flow chart for the process
Figure 4: Flow chart of the process
2.1 Hardware Development
A large volume of data is produced when a camera is used as a sensor. Other sensors, such as
position sensors, encoders and IR sensors, give out their output in terms of 0s and 1s. The power
supply provides 5 V for the controller to operate, and includes a bridge rectifier, a voltage
regulator, a capacitor and an LED. The ATMEGA 16 is the microcontroller, which receives
commands from MATLAB and sends commands to the L293D for driving the motors. The
MAX 232 IC is used for serial communication in order to communicate with the PC. A USB-to-serial
cable is used between the MAX 232 and the PC for the flow of data. To drive the two DC
motors the IC L293D is used.
Figure 5: Basic diagram of hardware development
The hardware implementation deals in:
Drawing the schematic on the plane paper according to the application
Testing the schematic design over the breadboard using the various ICs to find if the
design meets the objective
Designing the PCB layout of the schematic tested on the breadboard.
Finally preparing the board and testing the designed hardware
Hardware development of Bot is divided into two parts.
Interfacing section
Power supply
Figure 6: Development Board
The hardware board, as shown in Figure 6, consists of:
Power supply
ATMEGA 16
MAX 232
L293D
DB9
USB to serial cable
Webcam, with the following features:
Focus range: 3 cm to infinity
Clip type mounting to clamp on any surface
Active night vision with backlit LEDs
Integrated microphone for sound recording
Figure 7: Block diagram of hardware development
Once the colour is detected, the microcontroller will initiate the following actions on the robot.
Gripper open
Gripper close
Left
Right
2.2 Development of the System
A simple approach for developing the object recognition system is shown below:
Decide the ideal position of the object with respect to the camera
Figure out the distinguishing feature of the object to be picked
Decide the robot's movement as planned
2.3 Hardware description:
2.3.1 DB Connector: The DB9-USB-RS232 connector can be used to upgrade an RS232 port to
an active USB port without the need to redesign the PCB. These active connectors contain all the
USB-to-RS232 (and vice versa) conversion electronics and are designed to fit directly into the
same PCB footprint as a PC-compatible RS232 DB9 connector. The FTDI DB9-USB-RS232
connectors come in two types, DB9-USB-RS232-M and DB9-USB-RS232-F. A DB9-USB-RS232-M
can be used to replace a male DB9 connector that is wired in a PC-compatible RS232 manner.
A DB9-USB-RS232-F can be used to replace a female DB9 connector that is wired in a
PC-compatible RS232 manner. The purpose of the modules is to provide a simple method of
adapting legacy serial devices with RS232 interfaces to modern USB ports by replacing the DB9
connector with a miniaturized module which closely resembles a DB9 connector. This is
accomplished by incorporating the industry-standard FTDI FT232R USB-serial bridge IC plus
the required level shifters inside the module.
Fig 7: DB connector
2.3.2 MAX232: The MAX232 is an integrated circuit that converts signals from an RS-232
serial port to signals suitable for use in TTL-compatible digital logic circuits. The MAX232 is a
dual driver/receiver and typically converts the RX, TX, CTS and RTS signals. The drivers
provide RS-232 voltage level outputs (approx. ±7.5 V) from a single +5 V supply via on-chip
charge pumps and external capacitors. This makes it useful for implementing RS-232 in devices
that do not need any voltages outside the 0 V to +5 V range, as the power supply design does not
need to be made more complicated just for driving RS-232 in this case. The receivers reduce
RS-232 inputs (which may be as high as ±25 V) to standard 5 V TTL
levels. These receivers have a typical threshold of 1.3 V and a typical hysteresis of 0.5 V. The
later MAX232A is backwards compatible with the original MAX232 but may operate at a higher
baud rate and can use smaller external capacitors, 0.1 µF in place of the 1.0 µF capacitors used
with the original device. The newer MAX3232 is also backwards compatible, but operates over a
broader voltage range, from 3 to 5.5 V. Pin-to-pin compatible alternatives include the ICL232,
ST232, ADM232 and HIN232.
Fig 8: Max 232
2.3.3 ATmega 16: The ATmega16 is a low-power CMOS 8-bit microcontroller based on the AVR enhanced RISC
architecture. By executing powerful instructions in a single clock cycle, the ATmega16 achieves
throughputs approaching 1 MIPS per MHz allowing the system designer to optimize power
consumption versus processing speed.
(a) Features: High-performance, low-power AVR 8-bit microcontroller
130 powerful instructions, most with single clock cycle execution
32 x 8 General Purpose Working Registers
Fully Static Operation
Up to 16 MIPS Throughput at 16 MHz
On-chip 2-cycle Multiplier
512 Bytes EEPROM
512 Bytes Internal SRAM
Fig 9: ATmega 16
(b) Pin description of ATmega 16:
VCC: Digital supply voltage.
GND: Ground.
PORT A (PA7-PA0): Port A serves as the analog input to the A/D converter. Port A also serves
as an 8-bit bidirectional I/O port if the A/D converter is not used.
PORT B (PB7-PB0): Port B is an 8-bit bidirectional I/O port.
PORT C (PC7-PC0): Port C is an 8-bit bidirectional I/O port with internal pull-up resistors.
PORT D (PD7-PD0): Port D is an 8-bit bidirectional I/O port with internal pull-up resistors.
RESET: Reset input. A low level on this pin for longer than the minimum pulse length will
generate a reset, even if the clock is not running.
XTAL1: Input to the inverting oscillator amplifier and input to the internal clock operating
circuit.
XTAL2: Output from the inverting oscillator amplifier.
AVCC: Supply voltage pin for Port A and the A/D converter.
AREF: Analog reference pin for the A/D converter.
Fig 10: Pin description
2.3.4 L293D:
To control a DC motor we first have to convert the digital output into a signal that can drive the
motor, so we use an H-bridge. An H-bridge is an electronic circuit which enables a DC electric
motor to be run forwards or backwards. These circuits are often used in robotics. H-bridges are
available as integrated circuits, or can be built from separate components.
Truth Table

High Left | High Right | Low Left | Low Right | Description
On        | Off        | Off      | On        | Motor runs clockwise
Off       | On         | On       | Off       | Motor runs anti-clockwise
On        | On         | Off      | Off       | Motor stops or decelerates
Off       | Off        | On       | On        | Motor stops or decelerates
Fig 11 : L293D connection
2.3.5 Crystal Oscillator:
A crystal oscillator is an electronic oscillator circuit that uses the mechanical resonance of a
vibrating crystal of piezoelectric material to create an electrical signal with a very precise
frequency. This frequency is commonly used to keep track of time (as in quartz wristwatches), to
provide a stable clock signal for digital integrated circuits, and to stabilize frequencies for
radio transmitters and receivers. The most common type of piezoelectric resonator used is the
quartz crystal, so oscillator circuits designed around them became known as "crystal
oscillators".
Fig 12: Crystal oscillator
2.3.6 DC Motors:
A DC motor is a machine that produces rotation: an arrangement of coils and magnets that
converts electric current into mechanical rotation. In robotics applications they are preferred
over AC motors, as the motor and the rest of the circuit require the same kind of supply, i.e. a
DC supply.
Features:
Easy to control speed.
Easy to control torque.
Motors can have an external gear arrangement attached to the motor.
A gear box increases torque and decreases speed.
Most commonly used in robotics, as they provide considerable torque.
Fig 13: DC motor
2.3.7 Power supply
Generally, a battery is the source of power for a robot. The battery power is not sufficient for
driving the motors of the robot, hence a power supply unit should be added to the circuit. A
step-down transformer is used for converting the higher voltage into a lower voltage, a bridge
rectifier is used for converting the AC voltage into pulsating DC voltage, and an RC filter is
used for converting the pulsating DC into pure DC voltage.
Figure14: Power supply
Chapter 4 Experimental/ Computational
1. Software Development
1.1 AVR Studio: A microcontroller often serves as the brain of a mechatronic system. Like a
mini, self-contained computer, it can be programmed to interact with both the hardware of the
system and the user. We use AVR Studio for programming our microcontroller and use the C
language for the coding. There are several ways to write, compile, and download a program to
the ATmega16 microcontroller. WinAVR consists of a suite of executable, open-source software
development tools for the Atmel AVR series of RISC microprocessors hosted on the Windows
platform. It includes the GNU GCC compiler for C and C++, which is sometimes referred to as
avr-gcc.
The coding is done in AVR Studio 4 in embedded C. The actions performed by the robot are
programmed into the ATMEGA16 microcontroller. The following steps are involved in the software
development:
Coding/debugging
Compiling
Burning
Figure 15: software development cycle
1.1.1 Coding / Debugging: In a high-level language (such as C or Java), a compiler helps to
reduce production time. To program the microcontroller, WinAVR was used. Although inline
assembly was possible, the programming was done strictly in the C language. The source code
has been commented to facilitate any occasional future improvement and maintenance. WinAVR
contains all the tools for developing on the AVR, including avr-gcc (the compiler) and avr-gdb
(the debugger). Test source code was written in C to test the microcontroller.
For the bot, coding is done in AVR Studio 4 using embedded C. The code is written to move
the motors of the robot according to the image acquired.
Figure 16: coding in AVR studio
1.1.2 Compiling
Compilation converts the C program into a machine-language (.hex) file. This is the only language the microcontroller understands, as it contains the original program code converted into hexadecimal format. During this step there were some warnings about possible errors in the program.
Although inline assembly was possible, the programming was done strictly in C; source code for the USART has also been included. The microcontroller understands only machine-level language.
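The .hex file produced by the toolchain is conventionally in the Intel HEX format, in which every record ends with a checksum byte: the two's complement of the low byte of the sum of all preceding record bytes (byte count, address, record type, and data). A small sketch of that checksum rule:

```c
#include <stdint.h>
#include <stddef.h>

/* Intel HEX record checksum: two's complement of the low byte of
   the sum of the byte count, address, record type and data bytes. */
uint8_t ihex_checksum(const uint8_t *record_bytes, size_t n)
{
    uint8_t sum = 0;
    for (size_t i = 0; i < n; i++)
        sum += record_bytes[i];
    return (uint8_t)(-sum);   /* two's complement of the running sum */
}
```

For example, in the record `:0300300002337A1E` the bytes 03 00 30 00 02 33 7A sum to 0xE2, so the checksum byte is 0x1E.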
Figure 17: Compiling in AVR Studio
1.1.3 Burning
Burning the machine-language file into the microcontroller's program memory is achieved with a dedicated programmer attached to the PC. The PC's serial port has been used for this purpose.
Figure 18: Burning the programme into the microcontroller
1.2 MATLAB
MATLAB (matrix laboratory) is a numerical computing environment and fourth-generation programming language developed by MathWorks. MATLAB allows matrix manipulations, plotting of functions and data, implementation of algorithms, creation of user interfaces, and interfacing with programs written in other languages, including C, C++, Java, and Fortran. It is an interactive program for numerical computation and data visualization.
Although MATLAB is intended primarily for numerical computing, an optional toolbox uses the MuPAD symbolic engine, allowing access to symbolic computing capabilities. An additional package, Simulink, adds graphical multi-domain simulation and model-based design for dynamic and embedded systems.
The MATLAB desktop is the main MATLAB application window. As the figure below shows, the desktop contains five sub-windows: the Command Window, the Workspace Browser, the Current Directory Window, the Command History Window, and one or more Figure Windows, which are shown only when the user displays a graphic.
Figure 19: MATLAB Desktop
MATLAB command descriptions
A) Syntax: imaqtool
Description: imaqtool launches an interactive GUI to allow you to explore, configure, and acquire data from installed and supported image acquisition devices.
B) Syntax: vid = videoinput('winvideo',1);
Description: vid is a video input object (a custom MATLAB class); 'winvideo' is the adaptor name for the image acquisition hardware, and 1 is the device ID (given by imaqhwinfo).
C) Syntax: set(vid)
Description: Use the set function to display all configurable device object properties.
D) Syntax: get(vid)
Description: Use the get function to return the current device object property values.
E) Syntax: inspect(vid)
Description: Use the image acquisition property editor to view and edit the device object's properties.
F) Syntax: imshow(filename);
Description: The imshow function displays the image in a MATLAB figure window.
G) Syntax: triggerconfig(obj,type)
Description: Configures the value of the TriggerType property of the video input object obj to the value specified by the text string type.
H) Syntax: start(vid);
Description: When you start an object, you reserve the device for your exclusive use and lock the configuration. Thus, certain properties become read only while running.
I) Syntax: ycbcrmap = rgb2ycbcr(map)
Description: This command converts the RGB values in map to the YCbCr color space. map must be an M-by-3 array. ycbcrmap is an M-by-3 matrix that contains the YCbCr luminance (Y) and chrominance (Cb and Cr) color values as columns. Each row in ycbcrmap represents the equivalent color to the corresponding row in the RGB colormap, map.
J) Syntax: YCBCR = rgb2ycbcr(RGB)
Description: This command converts the truecolor image RGB to the equivalent image in the YCbCr color space. RGB must be an M-by-N-by-3 array.
K) Syntax: getdata(obj);
Description: Extracts the number of samples specified by the SamplesPerTrigger property for each channel contained by obj. data is an m-by-n array, where m is the number of samples extracted and n is the number of channels.
L) Syntax: data = getdata(obj,samples,'type')
Description: Extracts the number of samples specified by samples, in the format specified by type, for each channel contained by obj.
M) Syntax: A = imread(filename);
Description: Reads a grayscale or truecolor image named filename into A. If the file contains a grayscale intensity image, A is a two-dimensional array. If the file contains a truecolor (RGB) image, A is a three-dimensional (m-by-n-by-3) array.
N) Syntax: flushdata(obj)
Description: Removes all the data from the memory buffer used to store acquired image frames. obj can be a single video input object or an array of video input objects.
O) Syntax: getsnapshot(obj);
Description: Immediately returns one single image frame from the video input object obj. The frame of data returned is independent of the video input object's FramesPerTrigger property and has no effect on the value of the FramesAvailable or FramesAcquired property.
P) Syntax: triggerinfo(obj)
Description: Displays all available trigger configurations for the video input object obj. obj can only be a 1-by-1 video input object.
Q) Syntax: imaq.VideoDevice(adaptorname)
Description: Creates a VideoDevice System object, obj, using the first device of the specified adaptor name. adaptorname is a text string that specifies the name of the adaptor used to communicate with the device. Use the imaqhwinfo function to determine the adaptors available on your system.
R) Syntax: set(obj)
Description: set(obj) displays property names and any enumerated values for all configurable properties of the image acquisition object obj. obj must be a single image acquisition object.
S) Syntax: obj2mfile(obj,filename)
Description: obj2mfile(obj,filename) converts the video input object obj into an M-file with the name specified by filename. The M-file contains the MATLAB code required to create the object and set its properties. obj can be a single video input object or an array of objects.
T) Syntax: flushdata(obj)
Description: flushdata(obj) removes all the data from the memory buffer used to store acquired image frames. obj can be a single video input object or an array of video input objects.
U) Syntax: start(obj)
Description: start(obj) obtains exclusive use of the image acquisition device associated with the video input object obj and locks the device's configuration. Starting an object is a necessary first step to acquire image data, but it does not control when data is logged.
V) Syntax: delete(obj)
Description: delete(obj) removes obj, an image acquisition object or array of image acquisition objects, from memory. Use delete to free memory at the end of an image acquisition session.
W) Syntax: triggerinfo(obj)
Description: Displays all available trigger configurations for the video input object obj. obj can only be a 1-by-1 video input object.
X) Syntax: load filename obj1 obj2 ...
Description: load filename obj1 obj2 ... returns the specified image acquisition objects (obj1, obj2, etc.) from the MAT-file specified by filename to the MATLAB workspace.
Y) Syntax: imaqhelp(obj)
Description: imaqhelp(obj) displays a listing of functions and properties for the image acquisition object obj, along with the online help for the object's constructor. obj must be a 1-by-1 image acquisition object.
Z) Syntax: src = getselectedsource(obj)
Description: src = getselectedsource(obj) searches all the video source objects associated with the video input object obj and returns the video source object, src, that has the Selected property value set to 'on'.
Figure 20: Video input session
1.3 Proteus
Proteus is software for microprocessor simulation, schematic capture, and printed circuit board (PCB) design, developed by Labcenter Electronics. The package splits into three parts:
(a) ISIS (Intelligent Schematic Input System): for drawing circuit diagrams.
(b) ARES (Advanced Routing and Editing Software): for producing PCB layout drawings.
(c) LISA (Labcenter Integrated Simulation Architecture): for simulation of the circuit diagram.
Figure 21: Schematic diagram using Proteus
2.1 Programming in MATLAB
imaqtool
vid = videoinput('winvideo',1);
triggerconfig(vid,'manual');
set(vid,'framespertrigger',1);
set(vid,'triggerRepeat',Inf);
start(vid);
total_pixels = 240*320;
default = 0;
while(1)
trigger(vid);
im = getdata(vid,1);
imshow(im);
default = 0;
im_new = rgb2ycbcr(im);
%imshow(im_new(:,:,3));
sz = size(im_new);
m = sz(1,1);
n = sz(1,2);
binred = zeros(m,n);
num = 0;
I = 0;
J = 0;
for i = 1:m
    for j = 1:n
        if (im_new(i,j,3) > 180 && im_new(i,j,2) < ...
...
if (J > (n/2 + 0.05*n))
    disp('R');
end
if (J > (n/2 - 0.05*n)) && (J < ...
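The listing thresholds the YCbCr image, accumulates the coordinates of matching pixels, and compares the mean column J of the detected object with the image centre n/2 using a 5% dead band. Only the 'R' (right) branch survives in the listing, so the 'L' and 'C' outcomes below are assumptions; a host-testable C sketch of that decision rule:

```c
/* Decide steering from the mean column J of the detected pixels in an
   n-column image, mirroring the check against n/2 with a 5% dead band.
   'R' = object right of centre, 'L' = left of centre (assumed),
   'C' = within the dead band (assumed). */
char direction_from_centroid(double J, int n)
{
    double centre = n / 2.0;
    double band = 0.05 * n;
    if (J > centre + band) return 'R';
    if (J < centre - band) return 'L';
    return 'C';
}
```

For the 320-column frames used here (total_pixels = 240*320), the dead band is 160 ± 16 columns.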
2.2 Programme for Motor Control:

#include <avr/io.h>
#include <util/delay.h>

int main(void)
{
    DDRB = 0xFF;              /* configure all PORTB pins as outputs */
    while (1)
    {
        PORTB = 0b00000101;
        _delay_ms(20);
        PORTB = 0b00001010;
        _delay_ms(20);
        PORTB = 0b00000001;
        _delay_ms(20);
        PORTB = 0b00000100;
        _delay_ms(20);
        PORTB = 0b00000000;
        _delay_ms(20);
    }
}
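The five port values cycled by the program above can also be kept in a lookup table and stepped cyclically, which keeps the drive sequence in one place if it ever needs changing. A host-testable sketch (the PORTB write itself is omitted; the table values are copied from the program):

```c
#include <stdint.h>

/* The five PORTB drive values from the motor-control program,
   in the order they are emitted. */
static const uint8_t step_table[] = {
    0x05,   /* 0b00000101 */
    0x0A,   /* 0b00001010 */
    0x01,   /* 0b00000001 */
    0x04,   /* 0b00000100 */
    0x00    /* 0b00000000 */
};

/* Return the drive pattern for a given step index, wrapping around
   the table; on the ATmega16 this value would be written to PORTB
   between 20 ms delays. */
uint8_t next_pattern(unsigned step)
{
    return step_table[step % 5];
}
```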
3.1 Applications
1. This concept can be implemented in a person-finder application. Pfinder is a real-time system for tracking people and interpreting their behaviour.
2. Alarm system: the system sounds a buzzer whenever motion is detected, along with the glowing of a red LED.
3. Colour detection, together with pattern recognition and speech recognition, will play a vital role in many industries and will also increase the accuracy of tasks in the logistics and packaging industry.
Chapter 5 Results and Discussion
Implementation of the robot was successfully done with the help of MATLAB (image processing) and AVR Studio. The red-coloured object was successfully picked and dropped into the desired bin by the robot.
Figure: Snapshot of the robot
Chapter 6 Conclusion and Suggestions
The developed robot is able to detect the colour of the object and place it in the desired location. The colour detection capability can be extended to blue and green along with red, which would allow a wider range of objects to be sorted. There is wired communication between the robot and the PC; this could be improved by making the communication wireless. The robot could then be controlled wirelessly in industries with hazardous environments. Colour detection along with pattern recognition will play a vital role in many industries and will also increase the accuracy of the task.
References
1. http://www.ijvspa.net/docs/IJVSPA20120105.pdf
2. https://otik.uk.zcu.cz/handle/11025/1315
3. http://ieeexplore.ieee.org/xpl/articleDetails.jsp?reload=true&Citations&tp=&arnumber=4406509&contentType=Conference+Publications&sortType%3Dasc_p_Sequence%26filter%3DAND(p_IS_Number%3A4406493)
4. http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=5169511&url=http%3A%2F%2Fieeexplore.ieee.org%2Fxpls%2Fabs_all.jsp%3Farnumber%3D5169511
5. http://ieeexplore.ieee.org/xpl/login.jsp?tp=&arnumber=938927&url=http%3A%2F%2Fieeexplore.ieee.org%2Fxpls%2Fabs_all.jsp%3Farnumber%3D938927