Page 1: Isa's Portfolio - Thesis

On E-Puck Mobile Robots for Distributed Robotics

Submitted by

Mohamed Isa Bin Mohamed Nasser

Department of Electrical & Computer Engineering

In partial fulfillment of the

requirements for the Degree of

Bachelor of Engineering

National University of Singapore


ABSTRACT

This project provides tools that enable the E-Puck to be used for distributed robotics experiments. The E-Puck robot has limited capabilities for distributed robotics in terms of communication, motion planning and localization.

For motion planning, tools to implement grid-based motion paths have been created, including software for path planning and synchronization code; a demonstration of a sample path is presented. To address the limitations in communication and localization, a second robot is developed to provide pose information and a communication channel. The development of vision-based localization is incomplete and left for future work.


ACKNOWLEDGEMENTS

I am thankful for the patience, understanding and flexibility of my supervisor, Dr. Hai Lin, throughout this project.

I would also like to thank my graduate assistant, Mr. Mohammad Karimadini, who has, throughout this project, given much encouragement, constructive input and technical help.


TABLE OF CONTENTS

ABSTRACT........................................................................................................................ I

ACKNOWLEDGEMENTS.................................................................................................... II

TABLE OF CONTENTS ...................................................................................................... III

LIST OF FIGURES............................................................................................................. III

CODE FRAGMENTS ........................................................................................................VII

LIST OF TABLES..............................................................................................................VII

LIST OF ABBREVIATIONS...............................................................................................VIII

CHAPTER 1 INTRODUCTION ....................................................................................... 1

1.1 LITERATURE REVIEW ............................................................................................ 3

CHAPTER 2 CAPABILITY OF THE E-PUCK ROBOT .......................................................... 4

2.1 E-PUCK'S SENSORS .................................................................................................. 4

2.1.1 Infrared Sensor............................................................................................... 4

2.1.2 3d Accelerometer.............................................................................................. 5

2.1.3 CCD Camera ..................................................................................................... 6

2.1.4 Microphones..................................................................................................... 6

2.2 LOCALIZATION OF THE E-PUCK ................................................................................ 7

2.2.1 Global Localization ........................................................................................... 7

2.2.2 Relative Localization ......................................................................................... 7

CHAPTER 3 BLUETOOTH COMMUNICATION OF THE E-PUCK........................................ 9

3.1 STRUCTURE OF A PACKET ...................................................................................... 10

3.2 BASIC COMMUNICATION IN THE E-PUCK ............................................................... 11

3.2.1 Overview of Operation in Transparent Mode............................................ 11

3.2.2 Setting to Transparent Mode ...................................................................... 12

3.2.2.1 Factory Reset ...................................................................................... 13

3.2.2.2 Set Device Name .............................................................................. 15

3.2.2.3 Set Device PIN ..................................................................................... 16

3.2.2.4 Set Transparent Mode......................................................................... 16

3.2.3 Establish Connection....................................................................................... 17

3.2.3.1 Inquiry................................................................................................. 18

3.2.3.2 Issues With The Inquiry........................................................................ 19

3.2.3.3 Selection of Devices............................................................................. 19

3.2.3.4 Establish SPP Link................................................................................ 22

3.2.3.5 Issues With SPP Link ............................................................................ 23


3.2.3.6 Results From Forcing Establish Connection .......................................... 24

3.2.4 Sending and Receiving Data in Transparent Mode .......................................... 24

3.3 OVERVIEW OF OPERATIONS IN COMMAND MODE................................................. 26

3.3.1 Setting to Command Mode ............................................................................. 27

3.3.2 Open Ports...................................................................................................... 28

3.3.3 Establish Connection....................................................................................... 30

3.3.4 Sending and Receiving in the Command Mode................................................ 32

3.3.4.1 Sending ............................................................................................... 33

3.3.4.2 Receiving............................................................................................. 33

3.4 ISSUES WITH BLUETOOTH COMMUNICATION ................................................. 34

CHAPTER 4 MOTION PLANNING OF THE E-PUCK ROBOT ........................................... 35

4.1 DYNAMIC VS STATIC............................................................................................... 35

4.2 KINEMATICS OF THE E-PUCK.................................................................................. 35

4.3 PHYSICAL SPECIFICATION ...................................................................................... 36

4.4 MODEL FOR DIFFERENTIAL DRIVE.......................................................................... 36

4.5 FORWARD KINEMATICS......................................................................................... 37

4.6 INVERSE KINEMATICS............................................................................................ 38

4.6.1 Inverse Kinematics using Curved Trajectories .................................................. 38

4.6.2 Inverse Kinematics using straight Trajectories................................................. 38

4.7 KINEMATICS SOFTWARE FOR THE E-PUCK............................................................. 39

4.7.1 Tracking the E-Pucks ....................................................................................... 39

4.7.2 Forward Kinematics (software) ....................................................................... 40

4.7.3 Inverse Kinematics using straight paths (software) ......................................... 41

4.7.4 Inverse Kinematics using Curved paths (software)........................................... 42

4.7.5 Graphing feature ............................................................................................ 43

4.7.6 Command list ................................................................................................. 44

4.7.7 Sending Command to E-Pucks ......................................................................... 44

4.7.8 Pending Developments ................................................................................... 44

4.8 GRID BASED PATH PLANNING .............................................................................. 45

4.8.1 Simple Software to convert Grid movements into Code ................................... 45

4.8.2 Add................................................................................................................. 46

4.8.3 Move .............................................................................................................. 46

4.8.4 Wait ............................................................................................................... 47

4.8.5 Generate Code................................................................................................ 47

CHAPTER 5 DEVELOPMENT OF A LAPTOP ROBOT...................................................... 49

5.1 DESIGN OF THE LAPTOP ROBOT............................................................................. 49

5.2 IMPLEMENTING STEREO VISION ............................................................................ 50

5.2.1 Camera Calibration......................................................................................... 50

5.2.2 Matching........................................................................................................ 52

5.2.3 Disparity Map................................................................................................. 52

5.3 STEREO SLAM........................................................................................................ 53


5.4 COMMUNICATION CHANNEL FOR THE E-PUCK ...................................................... 54

5.5 LOADING TASKS INTO THE E-PUCK......................................................................... 55

5.6 SYNCHRONIZATION OF THE E-PUCK....................................................................... 56

CHAPTER 6 RESULTS ................................................................................................ 58

6.1 DIRECT COMMUNICATION RESULTS FOR E-PUCK................................................... 58

6.1.1 Single Point Through Transparent Mode ......................................................... 58

6.1.2 Multipoint Through Command Mode .............................................................. 59

6.1.3 Quality of Communication Through the Bluetooth Module.............................. 59

6.1.4 Mirroring using a Single point connection ....................................................... 60

6.1.5 Mirroring using a Multipoint Connection......................................................... 60

6.1.6 Slave to Master communication in Command Mode ....................................... 61

6.2 COMMUNICATION THROUGH THE LAPTOP ROBOT ............................................... 61

6.2.1 Grid Based Motion Planning Results................................................................ 62

6.3 RESULTS OF THE LAPTOP ROBOT’S VISION ............................................................ 64

6.3.1 Stereo Vision................................................................................................... 64

6.3.1.1 Matching ............................................................................................ 64

6.3.1.2 Disparity Map ..................................................................................... 66

6.3.1.3 Object Recognition .............................................................................. 69

6.3.1.4 Camera calibration parameters........................................................... 71

6.3.1.5 Triangulation ...................................................................................... 71

CHAPTER 7 CONCLUSION ......................................................................................... 73

7.1 FUTURE WORK...................................................................................................... 73

REFERENCES.................................................................................................................. 75

APPENDIX A - E-PUCK BLUETOOTH MODULE CODES....................................................... 77

APPENDIX B – VISION DATA AND CODES........................................................................ 81

APPENDIX C – E-PUCK SYNCHRONIZATION CODE............................................................ 77


LIST OF FIGURES

Figure 2.1 : E-Puck IR Sensor Position............................................................................... 5

Figure 2.2 E-Puck 3d Accelerometer Position................................................................... 5

Figure 2.3 E-Puck Microphone Position ........................................................................... 6

Figure 3.1 : Overview of transparent mode .................................................................... 12

Figure 3.2 : Initializing to Transparent Mode .................................................................. 13


Figure 3.3 : Actual Packet for Factory Reset.................................................................... 14

Figure 3.4 : Flowchart for Establishing Connection.......................................................... 17

Figure 3.5 : Flowchart for Establishing Connection.......................................................... 20

Figure 3.6 : Flowchart for Forcing SPP Link Connection ................................................... 23

Figure 3.7 : Flowchart of operation in Command Mode .................................................. 26

Figure 3.8 : Initialization for Command Mode................................................................. 27

Figure 3.9 : Establishing Connection in Command Mode................................................. 31

Figure 4.1 : Wheel Base Length ...................................................................................... 36

Figure 4.2 : Model of Differential Drive .......................................................................... 37

Figure 4.3 : Overhead Camera Tracking.......................................................................... 40

Figure 4.4 : Forward Kinematic Controls......................................................................... 40

Figure 4.5: Inverse Kinematics Controls (straight path) ................................................... 41

Figure 4.6 : Straight Line Path ........................................................................................ 42

Figure 4.7 : Inverse Kinematics using Curved Paths......................................................... 42

Figure 4.8 : Trajectory From inverse kinematics using Curve Path ................................... 43


Figure 4.9 : Grid based Path Planning Software .............................................................. 46

Figure 4.10 : Generated Command from Software.......................................................... 48

Figure 5.1 : Setting up Bluetooth Connection for both E-Puck ......................................... 55

Figure 5.2 : Synchronization flowchart ........................................................................... 57

Figure 6.1 : Mirroring using Multipoint connection......................................................... 61

Figure 6.2 : Grid Movement Sequence for Experiment.................................................... 62

Figure 6.3 : Actual Movement by the E-Pucks................................................................. 63

Figure 6.4 : Matching using SIFT descriptors ................................................................... 65

Figure 6.5 : Matching using Multi Scale algorithm .......................................................... 65

Figure 6.6 : Dense Disparity Map using SIFT matches...................................................... 67

Figure 6.7 : Dense disparity map using multiscale matching........................................... 68


Figure 6.8 : Objects selected from corresponding Scene ................................................. 68

Figure 6.8 : Object Recognition, Left image is taken at an earlier time frame................... 70

Figure 6.9 : Stereo Triangulation.................................................................................... 72

CODE FRAGMENTS

Code Fragment 2.1 Get Infrared Sensor ........................................................................... 5

Code Fragment 2.2 Get 3d Accelerometer........................................................................ 5

Code Fragment 2.3 Get CCD Camera ................................................................................ 6

Code Fragment 2.4 Get Microphone ................................................................................ 6

Code Fragment 3.1: Inquiry ........................................................................................... 21

Code Fragment 3.2: Get friendly names of Devices......................................................... 21

Code Fragment 3.3: Select E-Pucks From List.................................................................. 22

Code Fragment 3.4: Sending of the request packet......................................................... 24

Code Fragment 3.5: Send data in Transparent Mode ...................................................... 25

Code Fragment 3.6: Receive data in Transparent Mode.................................................. 25

Code Fragment 3.7: Receive data in Transparent Mode.................................................. 25

Code Fragment 3.8: Establishing multiple SPP link.......................................................... 32


LIST OF TABLES

Table 3.1 : Values in a Packet Structure.......................................................................... 10

Table 3.2 : Restore Factory Settings ............................................................................... 13

Table 3.3 : Write Local Name ......................................................................................... 15

Table 3.4 : Write Device PIN .......................................................................................... 16

Table 3.5 : Set to Transparent Mode .............................................................................. 17


Table 3.6 : Inquiry Packets............................................................................................. 18

Table 3.7 : Get device friendly Name.............................................................................. 21

Table 3.8 : Establish SPP Link ......................................................................................... 23

Table 3.9 : Write Operation Mode ................................................................................. 28

Table 3.10 : Opening of Ports......................................................................................... 30

Table 3.11 : SPP link for Command Mode....................................................................... 31

Table 3.12 : SPP Send Data ............................................................................................ 33

Table 5.1 : Grid Based Task Codes .................................................................................. 56

Table 6.1 : Generated Code for the Sequence................................................................. 63

Table 6.2 : Comparison between Disparity Map Value with Manual Measurements........ 69

LIST OF ABBREVIATIONS

RSSI Received Signal Strength Indicator

SIFT Scale Invariant Feature Transform

SLAM Simultaneous Localization and Mapping

SPP Serial Port Profile

1 Introduction

The E-Puck robot is a differential drive robot developed by the Ecole Polytechnique Fédérale de Lausanne (EPFL), originally for educational purposes [1]. The E-Puck's relatively low cost and its many sensors have made it popular in swarm robotics experiments [2].

Research in distributed robotics has been very active in recent years. Multi-agent solutions are often more robust and scalable than single-agent ones. Emergence in a swarm of robots creates complex behaviours from simple rules followed by every agent [3]. The hardware requirements of these robots are relatively low, resulting in low-cost solutions.

The objective of this project is to develop tools to enable easy implementation of

distributed robotics experiments using the E-Puck robots. This platform

concentrates on providing the E-Puck with robust communication, motion

planning and localization capabilities.

The main constraints of the E-Puck robot are its low processing power (a 16 MHz clock) [1] and a Bluetooth module that can accommodate only three slaves at a time [2]. The extent to which the Bluetooth module can support a network has been studied.

It is found that commands must be resent to the Bluetooth module to establish a connection between E-Pucks, which is not described in the Bluetooth manual [2]. Furthermore, while a single point-to-point connection works flawlessly, in a multipoint connection the slaves are unable to send data back to the master.

For the localization problem, sensor-wise the E-Puck can support relative localization [4], achieved using the three onboard microphones and the speaker. Using the provided code for locating the bearing of a sound source together with Bluetooth communication, it is found that although the bearing of the sound source is located well, its distance cannot be determined accurately.

As a solution to these constraints, a second robot is designed which uses a laptop

mounted on a wheel base and equipped with two webcams. The role of this robot

is to provide the E-Puck with global localization and a communication channel.

The communication channel has been written in MATLAB and has been used in the implementation of grid-based motion planning. To coordinate the grid-based movement, synchronization code is developed for the E-Pucks. A demonstration of an example of this movement is presented.

To enable this system to be implemented in the real world, the laptop robot will use vision-based techniques to achieve SLAM. Algorithms to achieve stereo vision using two regular webcams are explored. For the correspondence problem, the Scale Invariant Feature Transform (SIFT) [5] is compared with a multi-scale feature matching technique that uses Fisher information analysis to decide on correspondence [6].

The sparse disparity maps from this matching are converted into dense maps by weighting segments of the original image. These segments are obtained using the mean shift segmentation algorithm over the original pixels in greyscale. The accuracy of this dense map is assessed.

1.1 Literature Review

An approach to overcoming these constraints is to increase the capability of the E-Pucks by building hardware extensions fitted to the free UART provided. Gutiérrez [7] creates an extension that provides situated communication, i.e. communication that carries both the message and the physical properties of its origin. This addresses some of the E-Puck's constraints in communication and localization.

Xiaolei and Changbin [8] have created a testbed for swarm robotics with the E-Pucks which has goals similar to this project's. This testbed gives the E-Puck global positioning through an overhead camera, and control algorithms for navigation are studied using this information.

That testbed provides the global position to the E-Pucks through a static world camera. In this project, the cameras are mounted on a laptop robot which localizes itself globally through vision. Also, the synchronization of the E-Pucks is achieved through inter-E-Puck communication and is not coordinated by the laptop robot.

2 Capability of the E-Puck Robot

The E-Puck robot is a two-wheeled differential drive robot running on a dsPIC microcontroller, developed by the Ecole Polytechnique Fédérale de Lausanne (EPFL), France. The robot was designed primarily as an educational tool [1]. Due to its relatively low cost and sensor capabilities, it has also been used for research in swarm robotics.

In the following sections, the sensors that the E-Pucks are equipped with are discussed, and the corresponding code to extract their values is listed.

2.1 E-Puck’s sensors

In this section, the details of the E-Puck sensors are discussed.

2.1.1 Infrared Sensor

8 Infrared proximity sensors with an effective range of about 20 cm are positioned

around the E-Puck. The position is illustrated in Figure 1.1. These sensors like the

Khapera robot [9] is not positioned in an equidistant manner around the robot.

Therefore a fusion method needs to be used to assess a sensed obstacle position

relative to the robot’s orientation. The weights used in Breitenberg fashion is

provided in the library.


Figure 2.1 : E-Puck IR Sensor Position

Code to extract value:

int e_get_prox(unsigned int sensor_number)

Code Fragment 2.1: Get Infrared Sensor

where sensor_number is an integer from 0 to 7, as labelled in the figure.

2.1.2 3d Accelerometer

The accelerometer in the E-Puck is placed in the position illustrated in Figure 2.2, which also shows the axis convention used.

Figure 2.2 : E-Puck 3d Accelerometer Position

Code to extract value:

int e_get_acc(unsigned int captor)

Code Fragment 2.2: Get 3d Accelerometer

where captor is 0 for the x axis, 1 for the y axis and 2 for the z axis.


2.1.3 CCD Camera

The CCD colour camera has a resolution of 640x480; however, the fidelity of the pictures is fairly low. Due to the clock speed of the E-Puck, complex image processing is not possible, so this camera is often used for basic colour extraction.

Code to extract value:

e_po3030k_launch_capture(char *buffer);

Code Fragment 2.3: Get CCD Camera

where buffer is a character array to store the image.

2.1.4 Microphones

There are three microphones on the E-Puck. They are located symmetrically on the E-Puck in an isosceles triangle configuration, as shown in Figure 2.3. Three microphones are available so that a sound source can be localized: the differences in the times at which the three microphones detect a sound can be used to calculate the bearing of the sound source [4].

Figure 2.3 : E-Puck Microphone Position

Code to extract value:

int e_get_micro(unsigned int micro_id)

Code Fragment 2.4: Get Microphone

where micro_id selects one of the three microphones (0 to 2).


2.2 Localization of the E-Puck

In order to carry out distributed robotics experiments, the robot has to be aware of its pose. This pose information can either be relative to the other robots, hence relative localization, or global, which is global localization.

2.2.1 Global Localization

Global localization of the E-Puck is not implementable. The 3d accelerometer and the IR proximity sensors are not capable of tracking features over distances. Although there are techniques for landmark localization and SLAM with a single camera, the 16 MHz microcontroller is not powerful enough for the required calculations.

2.2.2 Relative Localization

Relative localization of the E-Puck can be achieved in a few ways. There are three microphones located at different points on the E-Puck. A sound wave travelling from a certain direction will be detected by the microphones at different times, and the differences between these times determine the bearing of the sound source. The first E-Puck sends a Bluetooth signal to the second E-Puck commanding it to emit a sound, which the second E-Puck then emits from its speaker. Upon sending the Bluetooth signal, the first E-Puck immediately starts counting and stops the moment it detects the sound. The number of counts divided by the clock speed gives the time for which the sound wave travels, and this time multiplied by the speed of sound gives the distance between the two E-Pucks. With the bearing and distance known, the two E-Pucks can swap roles to determine the second E-Puck's orientation. This whole process achieves relative localization for the two E-Pucks [4].

If there are three or more E-Pucks with at least the initial distance between two of them known, we can simply find the bearings of the third E-Puck from the two known E-Pucks. The intersection of the two bearing lines can then be calculated, since two angles and one side of the triangle are known.

Lastly, the received signal strength indication (RSSI) of the Bluetooth communication module can be used: the larger the distance between two communicating E-Pucks, the lower the RSSI of the link [10]. With a configuration of four E-Pucks, three of which have known positions, the position of the fourth can be determined through trilateration. This method will not be used, since only three E-Pucks are currently available.

3 Bluetooth Communication of the E-Puck

The E-Puck communicates using the LMX9820A Bluetooth module. It runs on

two main operating modes. The transparent mode is designed for single point to

single point link. It emulates a cable between two E-Pucks; every message pushed

into the Tx buffer of one E-Puck will be sent directly to the Rx buffer of the

linked E-Puck. Bluetooth is designed in a master-slave structure. Using

transparent mode, this structure is ignored.

The command mode is for multipoint communication. In conventional Bluetooth modules, a multipoint connection of 8 agents can be established at one time. This means that the master is connected to 7 slaves in a piconet. This enables scatternet formation when bridge nodes link one piconet to another. Due to memory constraints, the E-Puck's Bluetooth module is only able to accommodate a piconet of 1 master and 3 slaves. This makes ad hoc networks based on scatternet structures more impractical, since one of the slaves in the piconet has to function as a bridge node. A higher number of bridge nodes generally implies slower network performance [11]. Also, since two piconets are asynchronous, scheduling a high number of bridge nodes can be difficult.

In order to use any of the module's functions, a packet needs to be written manually according to the format listed in the software user guide. However, this guide is verbose and does not provide an implementation in any specific language.

In the following sections, a step-by-step guide to the format of the packets, as well as fragments of the actual code in C, is presented.

3.1 Structure of a packet

Byte                   Value                                               Description
01                     0x02                                                Start delimiter
02                     0x52 (REQ) / 0x43 (CFM) / 0x69 (IND) / 0x72 (RES)   Packet type
03                     <unique command identifier>                         Opcode
04 - 05                <data length>                                       Data length
06                     <Packet type> + <Opcode> + <Data length>            Checksum
07 - <last byte - 1>   <command specific values>                           Packet data
<last byte>            0x03                                                End delimiter

Table 3.1 : Values in a Packet Structure

Referring to Table 3.1 for this section, the start delimiter is always 0x02. It marks the start of the packet. There are four types of packet, namely request (REQ), confirm (CFM), indication (IND) and response (RES). Most operations are made up of sequential sending and receiving of different packet types. For example, to establish an SPP connection, a request type followed by a confirm type packet is required. These sequences are found on page 102 onwards of the software user guide (SWUG).

The opcode is a command identifier. It is therefore unique for each command, and the list of available commands can be found on page 100 of the SWUG.

The checksum is a Block Check Character (BCC) checksum of the packet type,

opcode and data length.

Byte 7 onwards is called the "packet data" area; it holds the command specific values stated for the corresponding command in the list on page 102 onwards of the SWUG. Finally, an end delimiter marks the end of the packet.
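As a concrete illustration, the layout in Table 3.1 can be wrapped in a small helper that assembles a packet into a buffer. This is a sketch, not E-Puck library code: the BCC is computed here as the low byte of the sum of the packet type, opcode and data-length bytes, which should be verified against the SWUG, and any opcode used in a real call must be taken from the SWUG list.

```c
#include <stddef.h>

/* Packet type bytes from Table 3.1. */
#define PKT_REQ 0x52
#define PKT_CFM 0x43
#define PKT_IND 0x69
#define PKT_RES 0x72

/* Assemble a packet following Table 3.1 into buf, which must hold
 * data_len + 7 bytes.  Returns the total packet length.  The BCC
 * checksum is taken as the low byte of the sum of the packet type,
 * opcode and data-length bytes (verify against the SWUG). */
size_t bt_build_packet(unsigned char *buf, unsigned char type,
                       unsigned char opcode,
                       const unsigned char *data, size_t data_len)
{
    size_t i;
    unsigned char len_lo = (unsigned char)(data_len & 0xFF);
    unsigned char len_hi = (unsigned char)((data_len >> 8) & 0xFF);

    buf[0] = 0x02;             /* start delimiter             */
    buf[1] = type;             /* REQ / CFM / IND / RES       */
    buf[2] = opcode;           /* unique command identifier   */
    buf[3] = len_lo;           /* data length, low byte first */
    buf[4] = len_hi;
    buf[5] = (unsigned char)((type + opcode + len_lo + len_hi) & 0xFF);
    for (i = 0; i < data_len; i++)
        buf[6 + i] = data[i];  /* command specific values     */
    buf[6 + data_len] = 0x03;  /* end delimiter               */
    return data_len + 7;
}
```

A request with no packet data is then 7 bytes long: the 6-byte header plus the end delimiter.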

There are many commands listed for the Bluetooth module. However, for normal use, a few commands are more important than the rest: those related to the actual sending and receiving of packets, as well as the establishing of connections. From the experience of implementing these commands, certain problems and workarounds that are not explicitly described in the guide have been encountered. The following sections will therefore discuss these functions, together with code fragments of the actual implementation.

3.2 Basic communication in the E-Puck

Basic communication here refers to communication that does not have any energy saving or security requirements beyond those already set in the factory settings.

As mentioned earlier, the Bluetooth module runs in two operating modes, namely the transparent mode and the command mode.

3.2.1 Overview of Operation in Transparent Mode

Figure 3.1 illustrates a high level view of the operation in transparent mode. Both the E-Pucks involved must be set to transparent mode. Either of the E-Pucks then initiates the connection establishment command sequence, after which both E-Pucks have equal ability to transmit data.

Figure 3.1 : Overview of transparent mode

3.2.2 Setting to transparent mode

The Bluetooth module has an EEPROM memory to store its configuration even after the E-Puck has been switched off. If the state of the configuration is not known at any point in time, a quick way to ensure that the right configuration is used is a factory reset command. This command is then followed by the setting of the required configuration parameters.

In the case of transparent mode, the required configuration after the factory reset

is the setting of the device name and PIN, followed by the setting of operation

mode to transparent mode. This sequence of configuration is shown in Figure 3.2.

Figure 3.2 : Initializing to Transparent Mode

Now we will proceed to the actual low-level commands to achieve this flowchart.

3.2.2.1 Factory Reset

The factory reset involves a request followed by a confirm.

Packet Transmissions

Table 3.2 : Restore Factory Settings

As the starting example, some parameters here shall be clarified. From Table 3.2, under the "Opcode" row, "RESTORE_FACTORY_SETTINGS" is provided instead of the hexadecimal number required for a programmatic implementation. This phrase refers to the name of the opcode from the list of opcodes on page 98 of the SWUG. In the same way, ERROR_OK and ERROR_INVALID_NO_OF_PARAMETERS refer to error codes on page 180 of the SWUG.

Since the second packet is a confirmation type packet, these error codes refer to the possible data that can be in the corresponding byte. ERROR_OK refers to a successful operation and takes the value 0x00. To clarify this operation, Figure 3.3 shows the actual request packet that is being sent, and the important parts of the received packet.

Figure 3.3 : Actual Packet for Factory Reset

The confirm packet is as discussed. Notice that for the received packet, the important portion is the error code. The data length and end delimiter can also be used to check that the full packet has been received.
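These checks can be collected into one validation helper. The following is a sketch (bt_confirm_ok is a hypothetical name, not a library function), assuming that the error code sits in the first data byte (byte 7) as in Figure 3.3 and that a full packet is data length + 7 bytes long:

```c
#include <stdbool.h>
#include <stddef.h>

#define BT_ERROR_OK 0x00

/* Validate a received confirm packet: the full packet must be present
 * (data length from byte 4, plus the 6-byte header and end delimiter),
 * the end delimiter must be 0x03, and the first data byte (byte 7)
 * must hold ERROR_OK. */
bool bt_confirm_ok(const unsigned char *pkt, size_t received)
{
    size_t expected = (size_t)pkt[3] + 7;  /* header + data + delimiter */

    if (received < expected)
        return false;                      /* incomplete packet */
    if (pkt[expected - 1] != 0x03)
        return false;                      /* bad end delimiter */
    return pkt[6] == BT_ERROR_OK;          /* error code check  */
}
```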

The rest of the code for initializing the E-Puck is written in a similar manner by referring to the SWUG for the format. To keep this report concise, each subsequent function will have its importance described, followed by a table with the request and confirm packets needed, which can be implemented in the same manner as the factory reset example. The full code used to achieve all the commands for a normal connection is included in the appendix.

3.2.2.2 Set Device Name

Device name is the name of the E-Puck used when it is being discovered. By

convention, the E-Pucks are named as “epuck_<ID>” where ID refers to the

corresponding ID physically labelled on the E-Puck. Setting the device name is

important so that the E-Pucks can differentiate themselves in the network.

Packet Transmissions

To set the device name, the packet transmission in Table 3.3 has to be followed.

Send request:

Receive confirm:

Table 3.3 : Write Local Name

3.2.2.3 Set Device PIN

Bluetooth devices are connected to each other through a process called pairing. This involves the discovery of the Bluetooth device followed by the entry of a PIN. This PIN acts as a security mechanism to protect the device from unauthorized access. By convention, the E-Puck's PIN is its ID. Setting this after a factory reset is useful to ensure the right device is being connected to.

Packet Transmission

To set the device PIN, the packet transmission in Table 3.4 has to be followed.

Send Request:

Receive Confirm:

Table 3.4 : Write Device PIN

3.2.2.4 Set Transparent Mode

In order to operate in transparent mode, this configuration has to be set on both E-

Pucks involved in the communication.

Packet Transmission

To set to Transparent Mode, the packet transmission in Table 3.5 has to be

followed.

Send Request:

Receive Confirm:

Table 3.5 : Set to Transparent Mode

3.2.3 Establish connection

Since Bluetooth uses frequency hopping to provide a connection between two devices, the mechanism for establishing a connection is relatively complex.

Figure 3.4 : Flowchart for Establishing Connection

From Figure 3.4, it is shown that E-Puck 1 connects to E-Puck 2. After configuring its settings, E-Puck 2 simply waits for a connection from E-Puck 1. There are three steps to establish a connection in transparent mode. Firstly, the inquiry step is the "finding" step, where Bluetooth devices within network range are inquired and their details saved in the E-Puck. This list of Bluetooth devices then has to be parsed in the second step, and only E-Puck 2 selected. Lastly, with the right E-Puck known, a connection can be made. This process is shown in Figure 3.4.

3.2.3.1 Inquiry

The inquiry command has three steps: a request, followed by a confirm, and then the indicator packets, which give the information on the actual devices found.

Packet transmission

Send Request:

Receive Confirm:

Receive Indicator:

Table 3.6 : Inquiry Packets

3.2.3.2 Issues with the Inquiry

One issue with the inquiry process is that sometimes devices within range are not found. This problem can be solved by checking if a particular device has been found and running the inquiry again if it has not. However, in scenarios where the targeted devices within range are not known exactly, this error must be taken into consideration.

Alternatively, multiple runs of the inquiry command can be made and any new devices found saved. Since this process is time consuming and the discussion involves connections between known E-Pucks, this approach is not covered. Instead, a check for the targeted E-Puck will be done. This will be covered in the selection section.

3.2.3.3 Selection of Devices

Among the ways to identify a Bluetooth device using information from the

inquiry scan, using the friendly device name has been chosen. Friendly device

name refers to the same device name covered in the preceding configuration

section. One requirement for selection through this process is that the devices of

interest must be named uniquely. Since the E-Pucks are named uniquely, this

requirement is met.

The first step is to parse the list of devices for E-Pucks. This is a useful process since the list of E-Pucks can be used either to find a particular E-Puck or to be updated when new runs of the inquiry are made. Updating all the non E-Puck devices found is computationally wasteful since they are not of interest.

Figure 3.5 : Flowchart for Establishing Connection

The packet transmission that is involved to acquire the device friendly name is

covered next.

Packet Transmission

Send Request:

Receive Confirm:

Table 3.7 : Get device friendly Name

Firstly, an inquiry is made, as shown in Code Fragment 3.1.

int device_find=e_bt_inquiry(e_bt_present_device);

Code Fragment 3.1: Inquiry

Where device_find is the number of Bluetooth devices found and

e_bt_present_device is an array of a structure in the library called BtDevice. The

exact specification of this structure is less important in this discussion; only the

required components will be discussed. This method now populates

e_bt_present_device with information of the devices found.

Next, the array of devices e_bt_present_device is used to get the friendly names of

all the devices found, as shown in Code Fragment 3.2. This of course uses the command discussed in the previous section.

for (i = 0; i < device_find; i++)
{
    e_bt_get_friendly_name(&e_bt_present_device[i]);
}

Code Fragment 3.2: Get friendly names of Devices

Where device_find is the same variable from the inquiry process, holding the number of devices found through the inquiry. e_bt_get_friendly_name is the command to acquire the friendly name of the Bluetooth device pointed to by the array element e_bt_present_device[i].

It will assign the friendly_name char array inside the BtDevice structure a character array of the device's name. This array can therefore be checked for whether it starts with the characters 'e', '-', 'p', 'u', 'c', 'k'. In most cases the first three letters are sufficient. This condition is checked using the conditional statement shown in Code Fragment 3.3.

if ((e_bt_present_device[i].friendly_name[0] == 'e') &&
    (e_bt_present_device[i].friendly_name[1] == '-') &&
    (e_bt_present_device[i].friendly_name[2] == 'p'))
{
    // It is an E-Puck. Store the device information
}

Code Fragment 3.3: Select E-Pucks From List

Where i refers to the counting variable in the for loop. When this condition is met, the device is indeed an E-Puck and the next step is to store the device information. This step is trivial and will not be covered; the code is in the appendix.

3.2.3.4 Establish SPP Link

Now that the information (in the form of the BtDevice structure) of the device to be connected to is known, the next step is to use the command to establish a Serial Port Profile (SPP) link. This command makes the actual connection to the E-Puck itself.

The term Serial Port Profile is used because the link emulates a serial connection between the connected devices.

Packet Transmission

Send Request:

Receive Confirm:

Table 3.8 : Establish SPP Link

3.2.3.5 Issues with SPP Link

Similar to the problem with the inquiry command, this command often receives an error in the confirmation. This occurs when the E-Puck referred to by the BtDevice structure cannot be found during a particular connection attempt.

A way to handle this is to keep sending the request packet until the confirmation has no errors (ERROR_OK).

Figure 3.6 : Flowchart for Forcing SPP Link Connection

Code snippets for forcing establish SPP link command

The Code Fragment 3.4 shows the sending of the request packet followed by the

receiving of the confirm packet.

do
{
    e_send_uart1_char(send, 15);       // send the request packet
    i = 0;
    c = 0;
    do
    {
        if (e_getchar_uart1(&read[i])) // read the confirm packet
        {
            c = read[i];
            i++;
        }
    } while (((char)c != 0x03) && (i < (read[3] + 6)));
    // until the end of the confirm packet has been reached
} while (read[6] != 0x00);             // keep sending the request packet until
                                       // the confirm packet has ERROR_OK (0x00)

Code Fragment 3.4: Sending of the request packet

Where e_send_uart1_char is the command to send the packet to the Bluetooth module through the UART, send is the character array of the request packet, and read is the character array for the confirm packet.

Notice the last line of the snippet, "while (read[6] != 0x00)". The seventh byte of the confirm packet is the error code. This code therefore keeps sending the request packet (send) until ERROR_OK is received.

3.2.3.6 Results from forcing establish connection

Initially, this command was not successful even once in a total of 50 tries. After the addition of this code, it never fails to establish a connection. This is therefore a crucial code fragment for establishing a connection.

3.2.4 Sending and Receiving data in Transparent Mode

The transparent mode emulates a cable link between two devices. A packet pushed into the Tx of the UART will automatically be sent to the Rx of the corresponding linked UART. In this sense, transmission in this mode is simple.

Code snippets for data transmission in transparent mode

Sending

void e_send_uart1_char(char *send, unsigned int datalength );

Code Fragment 3.5: Send data in Transparent Mode

Where send is a character array and datalength is its length.

This method in Code Fragment 3.5 will push the data in the send array into the Tx

of the UART.

Receiving

bool e_getchar_uart1(char *read)

Code Fragment 3.6: Receive data in Transparent Mode

The method in Code Fragment 3.6 receives 1 byte of the data in the Rx channel of the UART. It returns true when a byte is received and zero if there is no data to be received. Therefore a loop is required, and a check on the end delimiter OR the data length information in the packet must be used to separate one received packet from another in the Rx. Code Fragment 3.7 shows this operation.

do
{
    if (e_getchar_uart1(&read[i])) // read the confirm packet
    {
        c = read[i];
        i++;
    }
} while (((char)c != 0x03) && (i < (read[3] + 6))); // check if the packet has ended

Code Fragment 3.7: Receive data in Transparent Mode

Here, receiving either the end delimiter of value 0x03 OR data length + header length bytes means the end of the packet has been reached. The header length is always 6 bytes. The data length can be read from the fourth byte of the confirm packet.

The use of an OR condition instead of an AND condition is conservative, in the sense that either condition alone is sufficient to end the read. This prevents deadlocks in case packet loss causes one of the conditions never to be met.

3.3 Overview of Operation in Command Mode

The transparent mode is ideal for a single point to point connection. For multipoint connections, the command mode has to be used. Since most of the methods used in the transparent mode are also used in the command mode, this section will focus on the additional methods needed for operation in this mode. The following flowchart shows a high level representation of communication in this mode.

Figure 3.7 : Flowchart of operation in Command Mode

The similarities of the operation under the two modes are many. Both modes require the initializing of the EEPROM configuration to the respective mode for all the involved devices. Both require only the Master E-Puck to establish connections with the rest of the E-Pucks.

The main command-level difference between the modes is that the command mode has to open up multiple ports for connections with multiple E-Pucks. Establishing a connection to these E-Pucks requires a free port for each E-Puck. The sending and receiving of data use the port number corresponding to the connection with a specific E-Puck to specify its destination and to differentiate the received packets.

3.3.1 Setting to command mode

The procedure to set the E-Puck to command mode is almost identical to that of

setting it to transparent mode.

Figure 3.8 : Initialization for Command Mode

The only difference lies in the last block where instead of setting it to transparent

mode, the appropriate command for command mode is used. For completeness,

the packet transmission to achieve this is covered.

Packet Transmission

Send Request:

Here the data in the seventh byte of the packet must be 0x00 for command mode.

Receive Confirm:

Table 3.9 : Write Operation Mode

The seventh byte is checked for ERROR_OK.

3.3.2 Open Ports

In Bluetooth communication, each SPP link is assigned a unique port. These ports act as channels to identify the E-Puck involved in a transmission. The ports have 2 states; they can either be open or closed. Hence, before a connection can be established, a command to open the required ports must be executed first.

There are 30 available ports in the Bluetooth module. They are assigned using a 32-bit mask (4 bytes). The 31st and 32nd bits must be '0'. The opening of the ports is best explained with an example.

The 4 bytes of hexadecimal data used to open the ports are, in this example,

0x00 0x00 0x00 0x07

This is equivalent to 0x00000007. It is represented as elements of a character array of 1 byte each.

The binary representations of all the bytes are zeros except for the right-most byte, which is

00000111 (binary)

These binary digits correspond to ports that ascend from right to left:

Binary digit:        0       0       0       0       0       1       1       1
Corresponding port:  Port 8  Port 7  Port 6  Port 5  Port 4  Port 3  Port 2  Port 1

This mask, therefore, opens port 1 to port 3 while closing the rest of the ports.

There is a last twist to be aware of: the 4 bytes must be byte-inverted in the packet. Hence, they must be sent to the UART in the following byte-inverted order:

0x07 0x00 0x00 0x00
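The mask construction can be sketched as a helper that sets bit (p - 1) for every port p to be opened and stores the result least significant byte first, which is exactly the byte-inverted order described above (bt_port_mask is a hypothetical name, not part of the E-Puck library):

```c
/* Fill mask[4] with the port bitmask in the byte-inverted
 * (least-significant-byte-first) order expected in the packet:
 * bit (p - 1) set opens port p, for ports 1 to 30. */
void bt_port_mask(unsigned char mask[4], const int *ports, int nports)
{
    unsigned long bits = 0;
    int i;

    for (i = 0; i < nports; i++)
        if (ports[i] >= 1 && ports[i] <= 30)   /* bits 31 and 32 stay 0 */
            bits |= 1UL << (ports[i] - 1);

    mask[0] = (unsigned char)(bits & 0xFF);          /* sent first */
    mask[1] = (unsigned char)((bits >> 8) & 0xFF);
    mask[2] = (unsigned char)((bits >> 16) & 0xFF);
    mask[3] = (unsigned char)((bits >> 24) & 0xFF);  /* sent last  */
}
```

For ports 1 to 3 this produces 0x07 0x00 0x00 0x00, the byte order shown above.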

Packet transmission

Send Request:

The "PORTS 4 bytes" field is the 32-bit mask described above.

Receive Confirm:

Table 3.10 : Opening of Ports

3.3.3 Establish Connection

The differences in establishing a connection in the command mode compared to the transparent mode are the number of SPP links to be made and the port numbers to be specified in the SPP link.

Figure 3.9 : Establishing Connection in Command Mode

In Figure 3.9, the inquiry and selection blocks are identical to those in transparent mode. The difference lies in the establishing of the SPP link.

Table 3.11 : SPP link for Command Mode

In the request packet shown in Table 3.11, there are two port parameters. In the 7th byte, there is the local port number, which defines the port of the master E-Puck used for the connection. In the 14th byte, there is the remote port number, which defines the port of the slave E-Puck.

Assuming the slave E-Pucks have only 1 master, the remote port number can always be set to port 1, or 0x01. Since the Master E-Puck has to connect to multiple slaves, the local port numbers have to correspond to those set to "open" earlier. They must also be unique for each E-Puck connection.

The SPP connections, which involve a request and a confirm packet as before, are made sequentially for every E-Puck to be connected to by the Master E-Puck. Code Fragment 3.8 shows the establishing of multiple SPP links.

e_bt_write_local_pin_number(&e_bt_present_epuck[0].number[0]);
error = e_bt_establish_SPP_link(&e_bt_present_epuck[0].address[0], 0x01);

e_bt_write_local_pin_number(&e_bt_present_epuck[1].number[0]);
error = e_bt_establish_SPP_link(&e_bt_present_epuck[1].address[0], 0x02);

Code Fragment 3.8: Establishing multiple SPP links

Where e_bt_present_epuck[i].address[n] is the nth byte of the 4 byte address for the ith E-Puck of the inquiry list, and e_bt_present_epuck[i].number[n] is the nth byte of the 4 byte PIN for the ith E-Puck of the inquiry list.

In order to authenticate the PIN for every slave, the Master E-Puck changes its own PIN to that of the slave and reverts it at the end of the code. The e_bt_write_local_pin_number method writes the PIN of the slave E-Puck temporarily to the master E-Puck so that the establish SPP link command can occur. The e_bt_establish_SPP_link method handles the packet transmissions, using the 2nd parameter as the local port for the connection. It always sets the remote port to port 1.

3.3.4 Sending and Receiving in the Command Mode

Unlike in the transparent mode, direct linking of the Tx and Rx channels of the UART between two E-Pucks is not possible, due to the multiple connections from the Master to the Slaves. A packet needs to be constructed with enough information to distinguish its source, when receiving, or destination, when sending.

3.3.4.1 Sending

In the transparent mode, there is no designated command to send data between E-Pucks. The command mode requires a command for the reasons mentioned, and this command is SPP_SEND_DATA.

Packet Transmission

Send Request:

Here the payload is the character array to be sent and the payload size is its

corresponding size. Sample code has been included in the appendix to clarify the

syntax of constructing this packet.

Receive Confirm:

Table 3.12 : SPP Send Data

3.3.4.2 Receiving

The data received in transparent mode is the actual raw data sent by the sender. For example, when sending the character array {'t', 'e', 's', 't'}, the data in the Tx of the sender is sent directly to the receiver. In command mode, however, using the SPP_SEND_DATA command, the data is sent byte by byte in inverted order, in the form of packets.

3.4 Issues with Bluetooth communication

In command mode, the data that is received by the slaves, contrary to the manual, does not contain the delimiters, nor does it come in byte by byte. The data received is the raw data itself, just as in transparent mode. Master to multiple Slaves communication is successful: the master runs an obstacle avoidance code while the slaves mirror its movements.

However, a persisting problem is getting the Slaves to send data back to the master. The master is unable to receive packets from slaves in command mode using the command SEND_SPP_DATA. This is a major problem, since scatternets cannot be formed and a robust communication network for multi-agent communication cannot be built.

4 Motion Planning of the E-Puck Robot

Motion planning or the navigation problem is the process of translating a task that

involves movement into actual motor commands.

4.1 Dynamic vs Static

Motion planning can be divided into two categories, static and dynamic. Static motion planning is often computed offline. The term offline computation refers to computations that are done before the task starts, often by a central unit, and then transferred to the agents involved in the task. Static motion planning does not take into account changes in the environment after the commencement of the task.

Dynamic motion planning is carried out online, during the task itself. It is more versatile and takes into account changes in the environment. In most cases, robots used in distributed robotics, including the E-Pucks, have processors with low clock speeds. This keeps the cost of duplicating the robot low. Consequently, dynamic motion planning algorithms often have to be kept simple and computationally cheap.

4.2 Kinematics of the E-Puck

The E-Puck is a differential drive robot. Differential drive refers to two wheels that can be driven independently in both directions, clockwise and anticlockwise. It is important to study the kinematics of the E-Puck to identify useful models that can be used for motion planning. We first look at the physical specifications of the E-Puck that affect its kinematics.

4.3 Physical Specification

The following are the physical specifications of the E-Puck that affects its

kinematics.

Motor type: Stepper Motor

Motor Speed: 1000 steps / second, 1000 steps / revolution

Wheel Diameter: 41 mm

Wheelbase length: 53 mm

The wheelbase length is defined as the distance between the two points of contact that the two wheels make with the ground. This is very close to the length shown in the following picture:

Figure 4.1 : Wheel Base Length

4.4 Model for Differential Drive

Figure 4.2 : Model of Differential Drive

Where l is the wheelbase length, Vl is the left wheel velocity, Vr is the right wheel velocity, R is the turning radius, ICC is the instantaneous center of curvature and ω is the angular velocity. Figure 4.2 shows the model for a differential drive trajectory.

The wheel input values are integers from -1000 to 1000, where negative values steer "backwards", zero stops the wheel and positive values steer "forwards". These input values correspond conveniently to the physical quantity of steps/second. Therefore, to obtain the Vl and Vr values from the input values, the following can be used:

Vl = Il / 1000 × π × 41

Vr = Ir / 1000 × π × 41

Where Il and Ir are the wheel input values and the resulting unit is millimetres per second. With this, we can proceed to use the rest of the equations that model the differential drive trajectory.
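This conversion is small enough to state as a helper (a sketch; wheel_speed_mm_s is not part of the E-Puck library):

```c
#define WHEEL_DIAMETER_MM 41.0
#define STEPS_PER_REV     1000.0

/* Convert a wheel input value (-1000..1000, in steps/s) to a linear
 * wheel velocity in mm/s:  v = steps/s / steps/rev * circumference. */
double wheel_speed_mm_s(int input)
{
    return (double)input / STEPS_PER_REV
           * 3.14159265358979 * WHEEL_DIAMETER_MM;
}
```

A full-speed input of 1000 therefore corresponds to about 128.8 mm/s.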

4.5 Forward Kinematics

If the current position and orientation of the robot are known, forward kinematics is the calculation of the new position and orientation of the robot after a set of known inputs is applied over known time intervals.

The above equation relates the turning radius and the angular velocity to the input

wheel velocities.

Where x is the initial horizontal position, y is the vertical position and θ is the orientation at time t. The set (x, y, θ) is collectively known as the pose of the robot, and (x', y', θ') is the pose at time (t + δt).

These equations can be used to predict the motion and pose of the E-Puck after an

array of given inputs [12].
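For reference, the standard differential-drive relations for the quantities defined above (a textbook formulation consistent with Figure 4.2, not a reproduction of the thesis's original equation images) are:

```latex
\omega = \frac{V_r - V_l}{l}, \qquad
R = \frac{l}{2}\,\frac{V_r + V_l}{V_r - V_l}, \qquad
ICC = \left(x - R\sin\theta,\; y + R\cos\theta\right)

\begin{pmatrix} x' \\ y' \end{pmatrix} =
\begin{pmatrix} \cos\omega\delta t & -\sin\omega\delta t \\
                \sin\omega\delta t & \cos\omega\delta t \end{pmatrix}
\begin{pmatrix} x - ICC_x \\ y - ICC_y \end{pmatrix} +
\begin{pmatrix} ICC_x \\ ICC_y \end{pmatrix}, \qquad
\theta' = \theta + \omega\,\delta t
```

When Vl = Vr, the motion degenerates to a straight segment of length Vl δt along the heading θ.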

4.6 Inverse Kinematics

Inverse kinematics is the process of determining the inputs in order to achieve a

desired pose.

4.6.1 Inverse Kinematics using Curved Trajectories

There are only 2001² combinations of left and right wheel speeds, since each wheel input is an integer from -1000 to 1000. This number can be reduced by taking steps of 10. With this, solutions for a given destination point can be obtained by exhaustively running through these trajectories.

4.6.2 Inverse Kinematics using Straight Trajectories

A more practical method for dead reckoning is through straight trajectories, since they are less error prone, especially to slippage. The formulas for these trajectories can be derived from simple trigonometry.

T1 = atan2(y' − y, x' − x) rad

T2 = θ' − T1 rad

D = ((x' − x)² + (y' − y)²)^(1/2) mm

The process is hence: rotate T1 rad to face the destination → translate D mm to the destination → rotate T2 rad to achieve the orientation of the destination pose.

4.7 Kinematics software for the E-Puck

A software has been partially developed for offline path planning. This will be

useful in order to automate the calculations especially for inverse kinematics and

also visually selecting destination points by using an overhead camera. This

section will be dedicated to the current working features of the software and its

pending features that can be further developed by later project groups.

4.7.1 Tracking the E-Pucks

In order to track the E-Pucks, reacTIVision, a toolkit for tangible multi-touch surfaces, is used. This toolkit tracks fiducial markers using an overhead webcam.

Figure 4.3 : Overhead Camera Tracking

The E-Pucks that are detected are shown as black boxes in the basic program provided, as in Figure 4.3. Since this display is now used for robotics, certain visuals have been added so that analysis of the movement is enhanced. Error is represented in red text; it is the difference between the position calculated from the differential drive model and the actual position of the E-Puck. Also, a green line is drawn to represent the velocity vector of each robot. This makes it easier to assess how fast the E-Pucks are moving relative to each other.

4.7.2 Forward Kinematics (software)

Figure 4.4 : Forward Kinematic Controls

The controls used for forward kinematics are rather intuitive and are shown in Figure 4.4. One uses the drop-down boxes to set the right wheel speed (RW speed), the left wheel speed (LW speed) and the time for which the wheels run at those speeds. The trajectory that the E-Puck will travel is drawn on the display area as a green line.

4.7.3 Inverse Kinematics using straight paths (software)

Figure 4.5: Inverse Kinematics Controls (straight path)

The algorithm used for this box is the same as in the preceding section on inverse kinematics. The control is also intuitive, as shown in Figure 4.5. Destination X, Y and angle define the final pose required. The "Get from Marker" dropbox uses a marker already detected by the camera as the destination pose. Figure 4.6 shows a straight path planned using this control; the green line defines this path.

Figure 4.6 : Straight Line Path

4.7.4 Inverse Kinematics using Curved paths (software)

Figure 4.7 : Inverse Kinematics using Curved Paths

The algorithm used for inverse kinematics with curved paths is an exhaustive search through all the left and right wheel speeds, with the maximum time for each combination defined in the "time for each route" field. A dropbox of possible solutions is presented at the end of the search. This dropbox is shown in Figure 4.7 with the label "Choose Trajectory".

Alternatively, the user can simply click on the display to define the destination point; the destination angle is then set to zero. This method was used to generate the trajectory in Figure 4.8.

Figure 4.8 : Trajectory From inverse kinematics using Curve Path
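The exhaustive search above can be sketched as follows (illustrative Python, not the tool's C#.net code; the speed grid, time discretization and goal tolerance are assumed values):

```python
import math

# Sketch of the curved-path inverse kinematics search: try every
# (left, right) wheel speed pair, simulate the differential drive model
# forward for increasing times up to t_max, and keep pairs that pass
# near the goal. All numeric choices here are illustrative.
def simulate(v_left, v_right, t, L=5.3, dt=0.05):
    x = y = theta = 0.0
    for _ in range(int(t / dt)):
        v = (v_left + v_right) / 2.0
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += (v_right - v_left) / L * dt
    return x, y

def search(goal_x, goal_y, t_max, speeds=range(-20, 21, 5), tol=1.0):
    solutions = []
    for vl in speeds:
        for vr in speeds:
            for k in range(1, 11):          # sample 10 times up to t_max
                t = t_max * k / 10
                x, y = simulate(vl, vr, t)
                if math.hypot(x - goal_x, y - goal_y) < tol:
                    solutions.append((vl, vr, t))
                    break                   # keep first hit for this pair
    return solutions
```

The list returned corresponds to the "Choose Trajectory" drop-down: each entry is one feasible wheel-speed pair together with its run time.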

4.7.5 Graphing feature

With the information on the agents' movement on screen, it is useful to save the data and plot it as X, Y and angle.

The ZedGraph library for C#.net is currently used to track the position of one agent. This feature is not fully functional, since task organization for multiple markers has not been developed yet. However, the code for one robot can be extended to multiple agents once task organization for multiple agents has been coded.


4.7.6 Command list

The command list saves the commands that have been made from the three movement options. The list is made up of two arrays: one that is displayed, showing the motor speeds required for every task, and another, hidden array that holds the actual messages to be sent to the E-Pucks.

4.7.7 Sending Command to E-Pucks

Termie is a serial RS232 terminal developed in C#.net. Its code has been integrated into the motion planning software in order to send the planned movement commands to the respective E-Pucks over Bluetooth.

Although the software can occasionally send commands to a single E-Puck, sending is still buggy. The issue lies in coordinating the thread that runs Termie: data from the main form must be passed to the terminal's thread, and this has created potential deadlock situations.

4.7.8 Pending Developments

The main pending features that are to be developed are:

• Task management for multiple agents

• Organize the visuals for multiple agents (which task to show at a given point in time for each agent)

• Replacing Termie with a reliable way to send data to the E-Pucks


4.8 Grid based path planning

Because of the lab's interest in developing grid-based navigation algorithms by decomposing global tasks, a major emphasis of the project is to develop a way to plan, coordinate and quickly implement grid-based paths on the E-Puck robots. This was not previously implementable due to the issues of multipoint connection over Bluetooth. The synchronization code between the E-Pucks had also yet to be developed. Lastly, a quick way to convert a given sequence of grid-based movements into code that is easy to load onto the E-Puck was not available.

4.8.1 Simple Software to convert Grid movements into Code

Keying in repetitive code for repetitive translation and rotation motions is very inefficient, so software was developed to automate this code generation. The main requirement is that it must be quick and intuitive to use. A screenshot of the software is shown in Figure 4.9.


Figure 4.9 : Grid based Path Planning Software

4.8.2 Add

Based on the paper, there are four types of squares: empty squares, squares containing an E-Puck, avoid squares and goal squares. Squares are empty by default. Each type can be added by selecting the appropriate radio button and double-clicking on the desired placement.

E-Puck squares are shown in blue, avoid squares in grey, goal squares in yellow and empty squares in white.

4.8.3 Move

Movement is input using the four directional buttons. First, select the move radio button, then click on a blue box, which represents an E-Puck robot. Use the four directional buttons to move the square around the grid.

The move feature only uses simple forward motion over a distance of one square. Reverse motion is omitted to avoid the issue of deciding optimal places to rotate versus reverse. It also gives the motion better clarity, since the desired motion is always led by the front of the robot.

In the software, rotations are handled automatically and are always represented as clockwise rotations in multiples of 90 degrees. Where a shorter anticlockwise rotation exists, for example when a clockwise rotation of 270 degrees is requested, it is used instead, handled by the rotation algorithm on the E-Puck itself.
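The shortest-rotation rule can be sketched as a small lookup (illustrative Python; the actual handler runs in C on the E-Puck):

```python
# Sketch of the shortest-rotation rule: the planner encodes clockwise
# multiples of 90 degrees, and the robot converts a 270-degree clockwise
# request into a single 90-degree anticlockwise turn.
def shortest_rotation(quarter_turns_cw):
    """Map a clockwise 90-degree count to (direction, number of turns)."""
    q = quarter_turns_cw % 4
    if q == 3:                 # 270 deg clockwise = 90 deg anticlockwise
        return ("anticlockwise", 1)
    return ("clockwise", q)
```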

4.8.4 Wait

The wait command is for synchronization. Whenever an E-Puck needs to synchronize its next movement with another E-Puck, this command can be used. The procedure is simple: select the wait radio button, double-click on the E-Pucks that need to sync with the currently selected E-Puck, and then click the add button. Whatever movement is chosen next will be synchronized between the selected E-Pucks.

4.8.5 Generate Code

The generate code button generates code that can be programmed into the E-Puck, as seen in Figure 4.10. Alternatively, number combinations are also generated for use with the communication and synchronization program discussed in Chapter 5.


Figure 4.10 : Generated Command from Software


5 Development of a Laptop Robot

Although the original aim of this project was to create tools for distributed robotics using a homogeneous group of robots, difficulties with the communication and localization of the E-Puck robots prompted the design of a second robot to fill these requirements.

This second robot has to be able to carry out global localization and also act as a communication channel for the E-Pucks. The exact roles that the E-Pucks and the laptop robot take may differ and will not be discussed in this report; we discuss only the localization and communication capabilities gained by adding the laptop robot.

In order to accommodate experiments in multi-agent systems and swarm robotics, the cost of building the laptop robot must be kept low. An analysis of the cost is given after the assessment of the robot's components.

Besides low cost, the robot has to be easy to build. Large numbers of the robot may be needed, so its structural design must be simple to duplicate. The robots are designed to be hand-built, with aluminium sheets, which are low cost and easy to machine, as the primary building material.

5.1 Design of the laptop robot


Laptops are used as the brain of this robot due to their availability, their computing power for image processing, and the software available on the PC platform.

The sensors on this robot must support localization, basic collision avoidance and tracking of the E-Pucks.

One obvious sensor is a webcam. With a single webcam, global localization can be achieved using monocular SLAM, which involves tracking features and building a sparse map.

Two webcams for stereo vision have been chosen instead. Stereo vision enables us to determine the pose of features directly, and the resulting depth map can also be used for collision avoidance.

5.2 Implementing stereo vision

Two cheap CCD webcams with a maximum resolution of 800x600 are used. The first step is calibration, which involves determining the intrinsic and extrinsic parameters of the cameras.

5.2.1 Camera Calibration

Camera calibration is the process of finding the camera's intrinsic and extrinsic parameters in order to correctly map a point in world space to a pixel in the image. The intrinsic parameters describe the mapping from the camera plane to the image plane. They are as follows.


• u0, v0 : the principal point on the image plane

• γ : the shear (skew) factor

• f : the focal length

• m : the scale factor relating pixels to distance
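In the standard pinhole model these parameters are collected into a single intrinsic matrix; writing m_u and m_v for the pixel scale factors along the two image axes (splitting m per axis is a standard convention, not explicit in the list above):

```latex
K =
\begin{bmatrix}
f m_u & \gamma & u_0 \\
0     & f m_v & v_0 \\
0     & 0     & 1
\end{bmatrix}
```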

These intrinsic parameters form the camera matrix, which maps a point in the world frame onto the image plane. For stereo triangulation, we also need to know where the cameras are relative to each other, represented as a transformation consisting of a rotation and a translation. These are the extrinsic parameters.

The intrinsic parameters can be found easily through an automated process using the GML C++ Camera Calibration Toolbox [13], which automatically detects a checkerboard pattern of known dimensions in the image.

After calibration with the toolbox, each calibration image has its own set of extrinsic parameters, relating the pose of the camera to the checkerboard pattern. The calibration images from the two cameras must therefore be taken at the same time, so that two corresponding images have extrinsic parameters that correspond to each other. Multiplying by the inverse of one of the transformation matrices then sets that camera as the world origin.
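This re-basing of the extrinsics can be sketched as follows (illustrative Python with 4x4 homogeneous transforms; the toolbox itself returns rotation and translation separately):

```python
# Sketch of composing per-image extrinsics so the left camera becomes
# the world origin. Each extrinsic is a 4x4 rigid transform [R t; 0 1]
# mapping checkerboard coordinates into that camera's frame.
def mat_mul(a, b):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rigid_inverse(T):
    """Invert a rigid transform: inv([R t]) = [R^T  -R^T t]."""
    R = [[T[j][i] for j in range(3)] for i in range(3)]   # R transposed
    t = [-sum(R[i][j] * T[j][3] for j in range(3)) for i in range(3)]
    return [R[0] + [t[0]], R[1] + [t[1]], R[2] + [t[2]], [0, 0, 0, 1]]

def left_to_right(T_left, T_right):
    """Transform from the left-camera frame to the right-camera frame:
    go back to the checkerboard frame, then into the right camera."""
    return mat_mul(T_right, rigid_inverse(T_left))
```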


5.2.2 Matching

In order to triangulate, we need to match features that appear in both images and correspond to the same point in world space. This is the correspondence problem.

A common approach is to first extract keypoints in both images using the Scale Invariant Feature Transform (SIFT). SIFT features are obtained from a difference of Gaussians across scales [5]. The resulting features are robust and invariant to scaling and to some degree of rotation [14], making them ideal for stereo matching, where the projective transform of each camera may scale the keypoints differently. A SIFT descriptor, which is essentially gradient data of the region around the keypoint, is attached to each keypoint. These descriptors are then compared between the keypoints of the two images to find correspondences. The SIFT implementation used is VLFeat [15] in MATLAB.
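Descriptor matching of this kind can be sketched as a nearest-neighbour search with a ratio test (an illustration of the general technique, not the VLFeat implementation used here):

```python
# Sketch of descriptor matching: for each left descriptor, find the two
# closest right descriptors and accept the best only when it is clearly
# better than the runner-up (Lowe's ratio test).
def match_descriptors(desc_left, desc_right, ratio=0.8):
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    matches = []
    for i, d in enumerate(desc_left):
        ranked = sorted(range(len(desc_right)),
                        key=lambda j: dist2(d, desc_right[j]))
        best, second = ranked[0], ranked[1]
        if dist2(d, desc_right[best]) < ratio ** 2 * dist2(d, desc_right[second]):
            matches.append((i, best))
    return matches
```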

With this method alone, the matching is still rather poor. The usual next step is to use the epipolar constraints to eliminate bad matches. In this implementation, however, a multiscale feature matching program written by Lorenzo Sorgi is used, which matches using the variance of the position estimate error extracted from the Fisher information matrix.

5.2.3 Disparity Map

The difference between the horizontal components of the matched points forms the disparity map. There are two types of disparity map: sparse and dense. Dense maps can be obtained directly from rectified images, in which the epipolar lines of the two images are aligned, by pixel-by-pixel comparison. The map we obtain from keypoint matching is a sparse map.

A sparse map is inconvenient for collision avoidance, since depth information is known for only a small number of points in the image, which makes avoiding an extended obstacle hard. To tackle this, the sparse map must be converted into a dense one.

The approach is to first convert the colour image into grayscale, then apply mean shift segmentation to divide the image into regions. These regions are then weighted by the points of the sparse disparity map.

The weighting policy can be made conservative for collision avoidance, in the sense that points indicating a near obstacle are given more weight. One good property of dense disparity maps obtained this way is that the ground plane often ends up at an average distance: the ground is segmented as one piece and weighted across many distances, which removes it as a near obstacle, something that can occur in more accurate dense-map approximations [16].

Although this method imposes the false assumption that a given segment has the same depth everywhere, it gives good, fast results in most cases.
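The segment-weighting step can be sketched as follows (illustrative Python; the weighting shown, with weights growing with disparity so that near points dominate, is one possible conservative choice rather than the exact policy used):

```python
# Sketch of densifying a sparse disparity map: every pixel of a segment
# inherits a weighted average of the sparse matches inside it. Segment
# labels would come from mean shift segmentation; here they are given.
def densify(labels, sparse_points, power=2.0):
    """labels: 2D list of segment ids; sparse_points: {(row, col): disparity}.
    Returns a 2D list of disparities (None for unweighted segments)."""
    sums, weights = {}, {}
    for (r, c), d in sparse_points.items():
        w = d ** power          # larger disparity = nearer = heavier weight
        seg = labels[r][c]
        sums[seg] = sums.get(seg, 0.0) + w * d
        weights[seg] = weights.get(seg, 0.0) + w
    return [[sums[s] / weights[s] if s in weights else None for s in row]
            for row in labels]
```

Segments containing no sparse matches stay unweighted, which is exactly the NaN effect visible in the SIFT-based dense map discussed in Chapter 6.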

5.3 Stereo SLAM


With the intrinsic and extrinsic parameters, stereo triangulation of the disparity map can be carried out using the method included in the MATLAB Camera Calibration Toolbox.

To track features between stereo image pairs over time, SIFT descriptors are attached to keypoints in either image, and these keypoints are then matched between successive stereo pairs. Horn's method for absolute orientation using unit quaternions, already available as MATLAB code, finds the rotation, translation and optionally scaling that maps one set of 3D coordinates onto another in a least-squares sense. This transformation is the same transformation that the robot has undergone.

This method accumulates error from the sensed poses of the features. A database of features containing their global positions can therefore be constructed: once the robot revisits a place and detects a previously seen keypoint, it can update its pose using the earlier, less error-prone measurement. This project focuses only on object recognition; a more sophisticated but similar implementation by Se, Lowe and Little [17] can build upon these results.

5.4 Communication channel for the E-Puck

The laptop comes with an integrated Bluetooth module that is able to connect to multiple E-Pucks at the same time. It can therefore act as a hub that routes packets to the right recipients.

To do this, the laptop first has to pair with each E-Puck. Each E-Puck has its own Bluetooth device name as well as its own PIN, which by convention is its 4-digit number. Figure 5.1 shows the Bluetooth window panes after pairing; each E-Puck is assigned two COM ports and appears in the list of devices.

Figure 5.1 : Setting up Bluetooth Connection for both E-Pucks

This has been implemented in MATLAB. The packets are character arrays of the form “C,&lt;recipient id&gt;(message)\r\0”, where the recipient id is a pre-assigned unique number for each E-Puck, ‘\r’ is the carriage return character and ‘\0’ is the null character.
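Packet construction and parsing for this format can be sketched as follows (illustrative Python helpers, not the MATLAB implementation; the function names are made up for the example):

```python
# Sketch of building and parsing the routing packet format
# "C,<recipient id>(message)\r\0" used by the laptop hub.
def build_packet(recipient_id, message):
    return "C,%d(%s)\r\0" % (recipient_id, message)

def parse_packet(packet):
    body = packet.rstrip("\r\0")        # drop the terminator characters
    assert body.startswith("C,")
    rid, rest = body[2:].split("(", 1)  # split id from the payload
    assert rest.endswith(")")
    return int(rid), rest[:-1]
```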

5.5 Loading Tasks into the E-Puck

Tasks are loaded into the E-Puck from the PC as integers. Table 5.1 shows the tasks created for the implementation of grid-based path planning:

Task Type      Task Id   No of Parameters      Task Parameters
Move Forward   1         0                     -
Rotation       2         1                     &lt;no of 90 degree clockwise rotations&gt;
Synchronize    3         &lt;number of E-Pucks&gt;   &lt;E-Puck1&gt;, &lt;E-Puck2&gt;, &lt;E-Puck3&gt;, ... , &lt;E-Puck n&gt;

Table 5.1 : Grid Based Task Codes

For example, “2,2” indicates a turn of 180 degrees clockwise. Note that the internal E-Puck function written to handle this task will turn anticlockwise by 90 degrees if “2,3” is encountered, instead of the inefficient move of turning clockwise by 270 degrees.

An example of the synchronization command for three E-Pucks is “3,1,1,1” for E-Puck1. The synchronization ensures that the next task is carried out synchronously with E-Puck2 and E-Puck3, where the corresponding command of the same string exists.

Notice that the first parameter ‘1’ is redundant: the fact that the task is loaded onto E-Puck1 already implies that E-Puck1 is part of the synchronization. The format is nevertheless kept so that the same code string can be used for multiple E-Pucks; the digit corresponding to the programmed E-Puck is automatically overwritten to 1 on the E-Puck itself. The tasks “3,1,1,1” and “3,0,1,1” for E-Puck1 are therefore equivalent.
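Decoding such an integer stream into tasks can be sketched as follows (illustrative Python; the real decoder is C firmware on the E-Puck, and it is assumed here that the synchronization task always carries one flag per E-Puck in the network):

```python
# Sketch of decoding the flat integer task stream of Table 5.1 into
# (task_id, parameters) tuples.
def decode_tasks(stream, n_epucks):
    """stream: flat list of ints, e.g. [3, 1, 1, 1, 2, 1, 1]."""
    tasks, i = [], 0
    while i < len(stream):
        task_id = stream[i]
        if task_id == 1:        # Move Forward: no parameters
            n_params = 0
        elif task_id == 2:      # Rotation: one parameter (90-deg count)
            n_params = 1
        elif task_id == 3:      # Synchronize: one flag per E-Puck
            n_params = n_epucks
        else:
            raise ValueError("unknown task id %d" % task_id)
        i += 1
        tasks.append((task_id, stream[i:i + n_params]))
        i += n_params
    return tasks
```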

5.6 Synchronization of the E-Puck

Synchronization is an important component of distributed systems. Since the communication channel now uses the laptop robot as a “hub”, the network no longer has to tackle the inter-piconet synchronization and scatternet formation problems.

Figure 5.2 : Synchronization flowchart

When an E-Puck executes a synchronization task, it sends a “waiting” signal to the other E-Pucks involved in the synchronization. Each E-Puck holds an integer variable for every E-Puck in the network, keeping track of which E-Pucks are currently waiting for it. When all the E-Pucks involved in the synchronization are “waiting”, these variables are reset and the next task proceeds.
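The waiting logic can be sketched as a small state machine (illustrative Python; on the robot this is implemented in C):

```python
# Sketch of the per-robot synchronization barrier described above: one
# flag per E-Puck records received "waiting" signals, and the barrier
# releases (and resets) once every involved E-Puck is waiting.
class SyncState:
    def __init__(self, n_epucks):
        self.waiting = [0] * n_epucks   # 1 = that E-Puck sent "waiting"

    def on_waiting_signal(self, sender_id):
        self.waiting[sender_id] = 1

    def barrier_released(self, involved_ids):
        """True once every E-Puck involved in this sync task is waiting."""
        if all(self.waiting[i] for i in involved_ids):
            for i in involved_ids:      # reset before the next task
                self.waiting[i] = 0
            return True
        return False
```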


6 Results

This project has addressed a diverse range of problems in order to give the E-Pucks the ability to localize themselves, communicate robustly and execute coordinated grid-based motion paths. From the E-Puck communication code for the Bluetooth module to the vision algorithms of the laptop robot, these efforts have produced a correspondingly diverse range of results, some more significant than others.

This section therefore first covers results relating to the E-Puck, in terms of communication, motion planning and localization, followed by results for the laptop robot, namely the vision algorithms used to achieve localization.

6.1 Direct Communication Results for E-Puck

Two methods of communication have been tested. The first is direct Bluetooth communication between E-Pucks, which has two modes: transparent mode and command mode. The second is through the laptop robot acting as a “hub” that receives packets and forwards them to the right E-Puck.

6.1.1 Single point through transparent mode


The last part of the communication procedure is establishing an SPP link. The standard method of sending a single request packet is not effective: the SPP link is not made and no packets can be transferred.

Transparent-mode single point connection is achieved by forcing the SPP connection to persistently resend “request” packets until the received “confirm” packet carries no error (refer to Chapter 3 for details). After this code change, discovered by trial and error, an E-Puck is always able to connect to a target E-Puck.

6.1.2 Multipoint through Command mode

Establishing SPP links between multiple E-Pucks succeeds using the same persistent sending method described for transparent mode. Initially, following the SWUG, which states that slave E-Pucks receive not raw data but packets containing byte-by-byte data, the slave E-Pucks failed to receive the sent messages.

Through trial and error, it was discovered that the data received by the slave E-Pucks is in fact raw data. This approach is used in the following experiment to test the quality of the multipoint connection.

6.1.3 Quality of Communication through the Bluetooth module

Since the UART is used for Bluetooth communication between E-Pucks, there is no means to probe the E-Puck directly. A test was therefore designed around using the connection in a dynamic motion planning setting: an obstacle avoidance algorithm runs on one E-Puck, and every motor speed update from this algorithm is sent to a second E-Puck as a motion update command. On receiving the command, the second E-Puck sets its motors to the same speeds, producing a mirroring action.

The more latency and packet loss there is on the connection, the less coordinated the mirroring will be.

6.1.4 Mirroring using a Single point connection

The mirroring in transparent mode between two E-Pucks is very good: there is no observable difference in movement between the E-Pucks. The only failures occur when the E-Pucks are so far apart that the connection breaks. This result is of limited use, however, since distributed robotics requires synchronization between E-Pucks, which in turn requires a multipoint connection.

6.1.5 Mirroring using a Multipoint Connection

In command mode, the Master E-Puck runs the obstacle avoidance algorithm and sends its motor update data to the two slave E-Pucks sequentially, first to slave 1 and then to slave 2. Figure 6.1 shows the movement sequence.


Figure 6.1 : Mirroring using Multipoint connection

It can be observed that as the mirroring carries on, the slave E-Pucks become increasingly less synchronized with the master, especially slave 2. This can be attributed to packet loss, since certain rotational motions of the Master E-Puck did not occur on the slave E-Pucks at all, which cannot be explained by latency.

6.1.6 Slave to Master communication in Command Mode

Using the same Bluetooth module command that the Master used to send to the slaves in the preceding experiment, the Master is unable to receive any packet from the slaves. The SWUG approach of receiving data byte by byte as packets also failed. This leaves a gap in any form of E-Puck-to-E-Puck multipoint communication.

6.2 Communication through the laptop robot


Due to the lack of a multipoint communication method, this is the only viable

method of communication.

The mirroring experiment has not been adapted to communication through the laptop robot due to the time constraints of this project. The channel is, however, extensively tested for synchronization in the grid-based path planning, whose results are discussed in the next section.

6.2.1 Grid Based Motion Planning Results

Given the lab's current research interests, one emphasis of this project is to enable fast implementation of grid-based motion paths that are planned offline. These motions require synchronization of multiple E-Pucks. The example path chosen to test the communication channel, the movement and the synchronization code developed for the E-Puck is given in (Kloetzer and Belta 08) [18].

Figure 6.2 : Grid Movement Sequence for Experiment

The next step is to enter this scenario and motion into the grid-based motion planner software, as shown in Figure 6.3. The generated code is given in Table 6.1.


Figure 6.3 : Sequence entered into Software

E-Puck 1 : 3,1,1,1,2,1,3,1,1,1,1,3,1,1,1,1,2,3,3,1,1,1,1,2,1,3,1,1,1,1

E-Puck 2 : 3,1,1,1,1,3,1,1,1,1,2,1,3,1,1,1,1,3,1,1,1,1,2,1,3,1,1,1,1

E-Puck 3 : 3,1,1,3,1,1,1,1,2,1,3,1,1,1,1,3,1,1,1,1,2,1,3,1,1,1,1,2,3,3,1,1,1,1

Table 6.1 : Generated Code for the Sequence

The side of each square is 12 cm and the motor speed for this motion is set to 500, so 12 and 500 are prepended to each number sequence. The sequences are then copied into the MATLAB code, which loads them into the E-Pucks and then acts as the communication channel. The resulting movement is shown in Figure 6.4, in the same form as Figure 6.1.

Figure 6.4 : Actual Movement by the E-Pucks

The synchronization and movement codes perform very well. The movement code for moving one square forward is designed to stop after travelling the distance.


From frame 2 to 3, E-Puck 3 makes two consecutive “move forward” movements with no observable stoppage. Even from frame 3 to 4, where the same E-Puck has to synchronize with the rest of the E-Pucks before moving synchronously to the next position, there is no observable delay between the two forward motions involved. This means that the latency of the synchronization process, as well as of the transmissions over the communication channel, is negligible relative to the robot's motor response.

6.3 Results of the Laptop Robot’s Vision

The requirement on the laptop robot is the ability to localize itself. The approach taken is vision-based localization using landmarks. It must also be able to determine its relative distance and bearing to the E-Pucks in order to send the E-Pucks their global poses.

6.3.1 Stereo Vision

Experiments in stereo vision were conducted on matching, sparse-to-dense disparity map estimation for obstacle avoidance, object recognition and triangulation. These experiments are ultimately geared towards achieving global localization by landmarks.

6.3.1.1 Matching

The correspondence problem is approached by first applying the Scale Invariant Feature Transform (SIFT) and matching SIFT features between the two images, as described in Chapter 5. The results of the matching using SIFT are shown in Figure 6.5.


Figure 6.5 : Matching using SIFT descriptors

The left image shows the left camera image with the corresponding SIFT keypoints marked in blue. The right image shows the right camera image, with the keypoints matched between the two images connected by blue lines.

It is clear that many of the keypoints are matched wrongly. The obvious cases are long lines, which cannot be correct given the small distance between the two cameras, and zig-zag lines, which cannot be correct since the images have no large distortions.

The multiscale feature matching algorithm that uses the Fisher information to identify matches is then applied, with the following results:

Figure 6.6 : Matching using Multi Scale algorithm


Clearly more features are matched: 1887 keypoints are matched in this case for a 640x380 image, compared to only 131 matches using comparison of SIFT descriptors. The more points there are, the denser the sparse disparity map, and consequently the more accurate the weighting of image segments used to form the dense map.

The matching is visibly accurate. Since the images are not rectified, the epipolar lines are not aligned between them. However, it is clear that most points are matched correctly, since the blue lines connecting matched features are parallel to each other within each portion of the image. The directions of these lines depend mainly on the extrinsic parameters of the stereo cameras; in this case, the cameras are not well aligned.

6.3.1.2 Disparity Map

The sparse disparity map is inversely proportional to the radial distances of the corresponding objects from the camera [5]. The proportionality factor can be computed as a function of the focal length and the pixel-to-focal-plane scale factors. In this experiment, however, the factor was determined experimentally by measuring actual object distances and their corresponding disparity values, giving a scale factor of 2100 for measurements in centimetres.
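With this experimentally determined factor, the conversion is simply the following sketch (the inverse relation between distance and disparity, Z = fB/d, is the standard stereo model; 2100 is the measured value quoted above):

```python
# Sketch of the calibrated disparity-to-distance conversion.
SCALE_CM_PX = 2100.0   # experimentally measured factor, cm * pixels

def distance_cm(disparity_px):
    """Radial distance in cm from a disparity in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return SCALE_CM_PX / disparity_px
```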

After matching, a sparse disparity map is obtained, i.e. a map that does not cover nearly every pixel in the image. Mean shift segmentation is then used to segment the image based on its grayscale pixel values, and these segments are weighted by the sparse disparity map to form a dense map.


The dense disparity map for the SIFT matches is shown in Figure 6.7.


Figure 6.7 : Dense Disparity Map using SIFT matches

Due to the low number of matches, most segments are not weighted; the NaN value marks pixels that have not been weighted. Even with enough matches, the disparity map would clearly be inaccurate due to the mismatches visible earlier.


Figure 6.8 shows the dense disparity map obtained using the multiscale method.


Figure 6.8 : Dense disparity map using multiscale matching


Figure 6.9 : Objects selected from corresponding Scene

The lighter blue regions are further away, while the darker blue regions are nearer. Four objects were selected, which can be seen in the corresponding image of the scene in Figure 6.9: a can, a curtain, a drum shell and a fan (in grey). The distances to the corresponding points were measured manually with a tape measure.

The results are tabulated in the following table:

             Manual measurement (cm)   From Disparity Map (cm)
Can          35                        35.6
Curtain      200                       350.0
Drum shell   149                       110.5
Fan          79                        56.7

Table 6.2 : Comparison between Disparity Map Values and Manual Measurements

Most results are not too far from the manual measurements. This segmentation method cannot be used for localization, but it can be used for obstacle avoidance, where a fuller map is required.

6.3.1.3 Object Recognition

The main intended method for localization is through landmarks. To carry out landmark localization, the laptop robot must be able to recognize the landmarks. One way to achieve this is to keep a database of objects and their corresponding SIFT features.

Keypoints, divided into segments, are compared against a database of features grouped by the segments in which they were originally found. If the keypoints in one of the segments of the current image do not match any in the database, that segment is saved into the database.

One problem is the growing database and the growing number of comparisons that results from it. Database maintenance should therefore also be implemented, deleting objects that are less likely to be prominent. Due to time constraints, this maintenance has not been implemented; the focus is on object recognition using a fixed-size database maintained in a first-in-first-out fashion.


Figure 6.10 : Object Recognition, left image taken at an earlier time frame

This method does produce some good object matches. However, the segmentation changes with every viewpoint, which causes some segments to be selected that are subsets of segments already in the database. Because such a subset contains few features, the segment is not recognized. In Figure 6.10, this situation has occurred: the second segment, being a subset of the first, is not recognized as the same object and is saved as object 21 instead of being recognized as object 1.

One way to solve this is to check whether the surrounding segments have been recognized; if they have, it implies that the segment simply failed to match well and should not be saved.


However, this project does not delve into the policy used for matching segments or into database maintenance; these issues are left for future work.

6.3.1.4 Camera calibration parameters

The GML Calibration Toolbox was used to calibrate the cameras. It has no option to find the extrinsic parameters of a two-camera rig, so the left camera's extrinsic parameters obtained from a stereo calibration pair are inverted and multiplied with the right camera's extrinsic parameters. Although both sets of parameters correspond to the same world frame before this inversion, that frame is an inconvenient, arbitrary one based on the free position of the cameras relative to the calibration pattern.

6.3.1.5 Triangulation

Stereo triangulation is the process of transforming matched points in perspective space into Cartesian coordinates in Euclidean space. The stereo triangulation function included in the MATLAB calibration toolbox has been used, taking the matched points from the multi-scale feature matching algorithm together with the calibration parameters. The results are in millimetres and are shown in Figure 6.9.
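A minimal linear (DLT) triangulation has the following shape. It assumes 3x4 projection matrices P = K[R|t] formed from the calibration parameters above; this is the standard construction as a generic sketch, not the MATLAB toolbox function itself:

```python
import numpy as np

# Linear (DLT) stereo triangulation: each matched pixel contributes two
# linear constraints on the homogeneous 3D point; the point is the
# null-space direction of the stacked system (smallest singular vector).
def triangulate(P_left, P_right, xl, xr):
    """xl, xr: matched pixel coordinates (x, y) in the two images.
    P_left, P_right: 3x4 projection matrices. Returns the 3D point."""
    A = np.vstack([
        xl[0] * P_left[2] - P_left[0],
        xl[1] * P_left[2] - P_left[1],
        xr[0] * P_right[2] - P_right[0],
        xr[1] * P_right[2] - P_right[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]   # dehomogenize
```

A useful sanity check on a calibration is exactly this round trip: project a known point through both cameras, triangulate, and confirm the point comes back at a plausible scale.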


Figure 6.9 : Stereo Triangulation

The results are poor: judging by scale alone, the coordinates obtained do not resemble the physical positions of any of the objects in the scene. Because of this, stereo SLAM cannot be achieved at this stage. Once this issue is resolved, an algorithm that finds the rigid transformation between point sets in Euclidean space across two frames can be used to update the pose of the robot.
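The point-set alignment alluded to here is commonly solved in closed form (the Kabsch/Procrustes method). The following is a sketch of how such a pose update could look, offered as an assumption rather than the thesis's method:

```python
import numpy as np

# Closed-form rigid alignment (Kabsch): estimate the rotation R and
# translation t that map corresponding 3D points from one frame onto
# the next; the robot's pose change could then be read off (R, t).
def rigid_transform(src, dst):
    """src, dst: Nx3 arrays of corresponding 3D points. Returns R (3x3), t (3,)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)             # cross-covariance of centred sets
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                        # proper rotation (det = +1)
    t = cd - R @ cs
    return R, t
```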



7 Conclusion

This project has provided significant tools and hardware to address the limitations of the E-Puck robots in communication, motion planning and localization.

The step-by-step guide to achieving Bluetooth communication with the E-Puck has been important. Crucial tweaks to the commands sent, needed to achieve both multipoint and single-point connections, were discovered; these commands are not reflected in the Software User Guide of the Bluetooth module.

In motion planning, the synchronization method used to coordinate the grid-based movement is shown to be fast and robust, which implies that the communication channel is functional and fast. More importantly, it solves the problem of poor transmission quality in multipoint Bluetooth connections that was demonstrated in the mirroring algorithm.

For localization, significant progress has been made towards realizing a SLAM algorithm to provide the E-Pucks with positional information. A method to convert a sparse disparity map into a dense one has shown decent results, which can be used for obstacle avoidance.

7.1 Future Work

Future work primarily focuses on the vision based localization techniques.


The laptop robot must also be able to calculate the relative pose of the E-Puck with respect to itself. One solution is to use the Augmented Reality Toolkit (ARToolKit). It tracks fiducial markers and returns the transformation matrix that maps the camera axes to each marker's position. From these markers the relative pose can be obtained directly, using the translation vector as the position and the Euler angles of the rotation matrix as the orientation.


REFERENCE

[1] Mondada, F., Bonani, M., Raemy, X., Pugh, J., Cianci, C., Klaptocz, A., Magnenat, S., Zufferey, J.-C., Floreano, D. and Martinoli, A. (2009) The e-puck, a Robot Designed for Education in Engineering. Proceedings of the 9th Conference on Autonomous Robot Systems and Competitions, 1(1) pp. 59-65.

[2] LMX9820/LMX9820A Bluetooth Serial Port Module - Software Users Guide, National Semiconductor, October 2006, Revision 1.6.4

[3] Sharkey, A. J. 2007. “Swarm robotics and minimalism” Connect. Sci 19, 3 (Sep. 2007), 245-260

[4] Jean-Marc Valin and François Michaud and Jean Rouat and Dominic Létourneau, “Robust Sound Source Localization Using a Microphone Array on a Mobile Robot”, Proceedings International Conference on Intelligent Robots and Systems (2003)

[5] David G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, 60, 2 (2004), pp. 91-110.

[6] Zhijun Pei, Ping Zhang, Shoumei Sun, Jinqing Gu, "Fisher Information Analysis for Matching Feature Extraction," itcs, vol. 1, pp.425-428, 2009 International Conference on Information Technology and Computer Science, 2009

[7] Gutiérrez, Á., Campo, A., Dorigo, M., Donate, J., Monasterio-Huelin, F., and Magdalena, L. 2009. Open E-puck range & bearing miniaturized board for local communication in swarm robotics. In Proceedings of the 2009 IEEE international Conference on Robotics and Automation (Kobe, Japan, May 12 - 17, 2009). IEEE Press, Piscataway, NJ, 1745-1750.

[8] Xiaolei Hou, Changbin Yu, “On the implementation of a robotic SWARM testbed”, Autonomous Robots and Agents, 2009. ICARA 2009. 4th International , pp. 27 – 32

[9] Narongdech Keeratipranon, Joaquin Sitte,”Beginners Guide to Khepera Robot Soccer”, Queensland University of Technology, 2003

[10] A.K.M. Mahtab Hossain and Wee-Seng Soh, “A Comprehensive Study Of Bluetooth Signal Parameters for Localization”, 18th Annual IEEE International Symposium on Personal, Indoor and Mobile Radio Communications (PIMRC’07)

[11] Wang, Z., Thomas, R. J., and Haas, Z. J. 2009. Performance comparison of Bluetooth scatternet formation protocols for multi-hop networks. Wirel. Netw. 15, 2 (Feb. 2009), 209-226

[12] Dudek and Jenkin, Computational Principles of Mobile Robotics.

[13] V. Vezhnevets and A. Velizhev, "GML C++ Camera Calibration Toolbox", http://research.graphicon.ru/calibration/gml-c++-camera-calibration-toolbox.html, 2005

[14] Scale Invariant Feature Transform (SIFT): Performance and Application

[15] A. Vedaldi and B. Fulkerson, "VLFeat: An Open and Portable Library of Computer Vision Algorithms", 2008

[16] Wided Miled, Jean-Christophe Pesquet and Michel Parent, "Robust Obstacle Detection Based on Dense Disparity Maps", Lecture Notes in Computer Science, Computer Aided Systems Theory, EUROCAST 2007


[17] Stephen Se and David Lowe and Jim Little, “Local and Global Localization for Mobile Robots using Visual Landmarks”, 2001

[18] Marius Kloetzer and Calin Belta, “Distributed implementations of global temporal logic motion specifications”, ICRA 2008, pp.393-398

[19] Luke Fletcher, “An Introduction to Computer Vision”, http://users.cecs.anu.edu.au/~luke/cvcourse_files/online_notes/lectures_RV_1_dmaps_6up.pdf


Appendix A - E-Puck Bluetooth Module Codes

Code for Factory Reset

char e_bt_factory_reset(void)
{
    char send[8];
    char read[10];
    int i;
    char c;

    send[0] = 0x02;
    send[1] = 0x52;
    send[2] = 0x1A;
    send[3] = 0x00;
    send[4] = 0x00;
    send[5] = 0x6c;
    send[6] = 0x03; // ETX
    e_send_uart1_char(send, 7);
    i = 0;
    c = 0;
    do
    {
        if (e_getchar_uart1(&read[i])) // read response
        {
            c = read[i];
            i++;
        }
    } while (((char)c != 0x03) || (i < (read[3] + 6)));
    read[i] = '\0';
    return read[6]; // return error, 0 = no error
}
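The routines in this appendix all share the same request framing: STX (0x02), packet type 0x52, an opcode, a 16-bit little-endian payload length, a one-byte checksum over the type/opcode/length bytes, the payload, and ETX (0x03). As a cross-check of that structure, here is a small sketch inferred from the code itself (not taken from the LMX9820 documentation):

```python
# Command framing used by the routines in this appendix, inferred from
# the code: STX, type 0x52, opcode, 16-bit little-endian length,
# checksum over the four preceding header bytes, payload, ETX.
STX, ETX, REQ = 0x02, 0x03, 0x52

def build_packet(opcode, payload=b""):
    length = len(payload)
    header = [REQ, opcode, length & 0xFF, (length >> 8) & 0xFF]
    checksum = sum(header) & 0xFF        # e.g. 0x52 + 0x1A = 0x6C for factory reset
    return bytes([STX] + header + [checksum]) + payload + bytes([ETX])
```

This reproduces the hard-coded byte sequences above: the factory-reset request carries no payload, while the transparent-mode request carries the link number as a one-byte payload.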

Code for setting Transparent Mode

char e_bt_tranparent_mode(void)
{
    char send[8];
    char read[10];
    int i;
    char c;

    send[0] = 0x02;
    send[1] = 0x52;
    send[2] = 0x11;
    send[3] = 0x01;
    send[4] = 0x00;
    send[5] = 0x64;
    send[6] = 0x01; // link number 1-30
    send[7] = 0x03;
    e_send_uart1_char(send, 8);
    i = 0;
    c = 0;
    do
    {
        if (e_getchar_uart1(&read[i])) // read response
        {
            c = read[i];
            i++;
        }
    } while (((char)c != 0x03) || (i < (read[3] + 6)));
    read[i] = '\0';
    return read[6]; // return error, 0 = no error
}

Code for reading Device Name

char e_bt_read_local_name(char *name)
{
    char send[7];
    char read[40];
    int i;
    char c;

    send[0] = 0x02; // send name request
    send[1] = 0x52;
    send[2] = 0x03;
    send[3] = 0x00;
    send[4] = 0x00;
    send[5] = 0x55;
    send[6] = 0x03;
    e_send_uart1_char(send, 7);
    i = 0;
    c = 0;
    do
    {
        if (e_getchar_uart1(&read[i])) // read response
        {
            c = read[i];
            i++;
        }
    } while (((char)c != 0x03) || (i < (read[3] + 6)));
    read[i] = '\0';
    for (i = 0; i < read[7]; i++) // extract name
        name[i] = read[i + 8];
    return read[6]; // return error, 0 = no error
}

Code for setting Device PIN

char e_bt_write_local_PIN(char *PIN)
{
    char send[30];
    char read[10];
    int i;
    char c;
    int numberlenght;

    numberlenght = strlen(PIN);
    send[0] = 0x02;
    send[1] = 0x52;
    send[2] = 0x17;
    send[3] = numberlenght + 1;
    send[4] = 0x00;
    send[5] = (send[1] + send[2] + send[3]);
    send[6] = numberlenght;
    for (i = 0; i < numberlenght; i++)
        send[i + 7] = PIN[i];
    send[7 + numberlenght] = 0x03;
    e_send_uart1_char(send, numberlenght + 8);
    i = 0;
    c = 0;
    do
    {
        if (e_getchar_uart1(&read[i])) // read response
        {
            c = read[i];
            i++;
        }
    } while (((char)c != 0x03) || (i < (read[3] + 6)));
    read[i] = '\0';
    return read[6]; // return error, 0 = no error
}

Code for setting Command Mode

char e_bt_set_commandmode(void)
{
    char send[120];
    char read[10];
    int i;
    char c;

    send[0] = 0x02; // send command-mode request
    send[1] = 0x52;
    send[2] = 0x4A;
    send[3] = 0x01;
    send[4] = 0x00;
    send[5] = (send[1] + send[2] + send[3]);
    send[6] = 0x00;
    send[7] = 0x03;
    e_send_uart1_char(send, 8);
    i = 0;
    c = 0;
    do
    {
        if (e_getchar_uart1(&read[i])) // read response
        {
            c = read[i];
            i++;
        }
    } while (((char)c != 0x03) || (i < (read[3] + 6)));
    return read[6];
}

Code for setting open ports

char e_bt_read_local_name(char *name)
{
    char send[7];
    char read[40];
    int i;
    char c;

    send[0] = 0x02; // send name request
    send[1] = 0x52;
    send[2] = 0x03;
    send[3] = 0x00;
    send[4] = 0x00;
    send[5] = 0x55;
    send[6] = 0x03;
    e_send_uart1_char(send, 7);
    i = 0;
    c = 0;
    do
    {
        if (e_getchar_uart1(&read[i])) // read response
        {
            c = read[i];
            i++;
        }
    } while (((char)c != 0x03) || (i < (read[3] + 6)));
    read[i] = '\0';
    for (i = 0; i < read[7]; i++) // extract name
        name[i] = read[i + 8];
    return read[6]; // return error, 0 = no error
}


Appendix B – Vision Data and Codes

Calibration parameter in MATLAB syntax

%Left Camera Extrinsic Parameters
[-0.994216 0.041057 0.099245 59.725145;
 0.001058 0.927750 -0.373201 80.114263;
 -0.107397 -0.370937 -0.922427 833.387468];

%Right Camera Extrinsic Parameters

[-0.999576 -0.024518 0.015704 138.503738;
 -0.029009 0.884785 -0.465095 159.633384;
 -0.002492 -0.465354 -0.885121 777.938168];

%Left Camera Intrinsic Parameters
[ 436.120192 0 184.507674;
  0 435.466024 85.620939;
  0 0 1 ];

%Right Camera Intrinsic Parameters

[ 458.547601 0 175.314928;

0 459.008550 46.938338;

0 0 1 ];

Object Recognition MATLAB Code

clear all;

webcamL = videoinput('winvideo',3,'YUY2_320x240');

set(webcamL,'ReturnedColorSpace','rgb');

preview(webcamL);

pause(2);

count1 = 1; %total number of objects in dictionary

count = 1;

nooffeatures = 3;

library_size = 30;

library_features = cell(library_size);

library_keypoints = cell(library_size);

library_hits = cell(library_size);

library_h = cell(library_size);

while(1)

im1c = double(getsnapshot(webcamL))/255;

im1c = CoherenceFilter(im1c);

im1c = imresize(im1c,.4);

im1c = uint8(im1c * 255);

im1 = single(rgb2gray(im1c)) ;

%We compute the SIFT frames (keypoints) and descriptors by

[segs labels] = msseg(im1c,10,7,1000); %-- mean shift segmentation

imcompare = zeros(size(im1));

im1temp = zeros(320,20);

count = count1;


%========================================

% Extract out a segment from the image

%========================================

for(i = 0:length(unique(labels))-1) %for segment i

lab_idx =find((labels == i)); %extract out index of segment

imcompare = ones(size(im1));

imcompare = single(imcompare .* 255); %put white background

imcompare(lab_idx)= single(im1(lab_idx)); %paste segment over background

flag = 0;

matches = [0;0]; %initialize matches

count3 = 1; %number of "decent matches"

historymatch = zeros(1,100); % number of features in ith "decent match"
historyobject = zeros(1,100); % library index of "decent match", i.e. object<index>.jpg

[fa, da] = vl_sift(imcompare) ;

%========================================

% compare segment all of the dictionary

%========================================

for (j = 1:count)

if(count>1 && j < count)

filename = strcat('objects',int2str(j),'.jpg');

%load dictionary image

imdatabase = imread(filename);

[matches, scores] = vl_ubcmatch(da, library_features{j});
%[x1,x2,corr_vec, match]=features_matching(imdatabase,uint8(imcompare), 0, 1, 3, 5e-3); %compare with segment

%found 1 "decent match"

if (length(matches(1,:))>nooffeatures)

flag = 1;

historymatch(1,count3) = length(matches(1,:));

historyobject(1,count3) = j;

count3= count3+1;

% pause(2);

end

end

end

%========================================

% if there is at least 1 decent match

%========================================

if (flag ==1)

[numberofmatches, matchid] = max(historymatch(1,:)); %get max match
fprintf('\nObject %d found. %d features\n',historyobject(matchid),numberofmatches);
filename = strcat('objects',int2str(historyobject(matchid)),'.jpg')

%========================================

% Traces the object

%========================================

BW = imread(filename);

indx1 = find(BW == 255);

indx2 = find(BW ~= 255);


BW(indx1) = 0; % getoutline

BW(indx2) = 1;

%subplot(1,1,1);

imshow(im1c);

s=size(BW);

for row = 2:55:s(1)

for col=1:s(2)

if BW(row,col),

break;

end

end

contour = bwtraceboundary(BW, [row, col], 'W', 8, inf, 'counterclockwise');

if(~isempty(contour))

hold on;

%subplot(1,1,1);

plot(contour(:,2),contour(:,1),'g','LineWidth',2);

hold on;

%subplot(1,1,1);

plot(col, row,'gx','LineWidth',2);

end

end

text(col,row,filename,'FontSize',20,'Color','w');

hold off;

pause(.1);

end

%========================================

%========================================

% Nothing Found. Save to Dictionary

%========================================

if (length(lab_idx)>3 && flag ==0)

imdatabase = ones(size(im1));

imdatabase = imdatabase .* 255;

imdatabase(lab_idx)= im1(lab_idx);

fprintf('\nObject not found \n');

library_features{count1} = da;

library_keypoints{count1} = fa;

filename = strcat('objects',int2str(count1),'.jpg');

imwrite(uint8(imdatabase),filename,'jpg');

count1 = mod(count1,library_size)+1;

end

%=========================================

end

end


Appendix C – E-Puck Synchronization Code
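The synchronization appended to BTcom.c below implements a rendezvous barrier through the 'J' (start wait) and 'I' (peer ready) commands: a robot resumes its task list only once every peer listed in `epucktowaitfor` has announced itself. A minimal sketch of that bookkeeping (hypothetical Python mirroring the `waitEpuck`/`epucktowaitfor` arrays; not the C code itself):

```python
# Hypothetical sketch of the rendezvous barrier realized by the 'J' and
# 'I' commands in the listing below: waiting ends only when every peer
# we wait for has reported ready, after which the state is reset.
class SyncBarrier:
    def __init__(self, n_robots):
        self.wait_for = [False] * n_robots   # like epucktowaitfor
        self.arrived = [False] * n_robots    # like waitEpuck

    def start_wait(self, peers):
        """'J' command: begin waiting for the given peer ids."""
        for p in peers:
            self.wait_for[p] = True

    def peer_ready(self, p):
        """'I' command: peer p reports ready.
        Returns True when the barrier is released."""
        self.arrived[p] = True
        released = all(not w or a for w, a in zip(self.wait_for, self.arrived))
        if released:
            self.wait_for = [False] * len(self.wait_for)
            self.arrived = [False] * len(self.arrived)
        return released
```

Note that a peer may report ready before the local robot starts waiting; like the C code, the sketch records the early arrival so the barrier still releases correctly.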

This code is mainly BTcom.c from the E-Puck code library with appended code for synchronization.

/********************************************************************************

The reference program to control the e-puck with the PC

Version 2.0

Michael Bonani

This file is part of the e-puck library license.

See http://www.e-puck.org/index.php?option=com_content&task=view&id=18&Itemid=45

(c) 2004-2007 Michael Bonani

Robotics system laboratory http://lsro.epfl.ch

Laboratory of intelligent systems http://lis.epfl.ch

Swarm intelligent systems group http://swis.epfl.ch

EPFL Ecole polytechnique federale de Lausanne http://www.epfl.ch

**********************************************************************************/

#include <p30f6014A.h>

//#define FLOOR_SENSORS // define to enable floor sensors

//#define LIS_SENSORS_TURRET

#define IR_RECIEVER

#include <string.h>

#include <ctype.h>

#include <stdio.h>

#include <math.h>

#include <stdlib.h>

#include <motor_led/e_epuck_ports.h>

#include <motor_led/e_init_port.h>

#include <motor_led/advance_one_timer/e_led.h>

#include <motor_led/advance_one_timer/e_motors.h>

#include <uart/e_uart_char.h>

#include <a_d/advance_ad_scan/e_ad_conv.h>

#include <a_d/advance_ad_scan/e_acc.h>

#include <a_d/advance_ad_scan/e_prox.h>

#include <a_d/advance_ad_scan/e_micro.h>

#include <motor_led/advance_one_timer/e_agenda.h>

#include <camera/fast_2_timer/e_poxxxx.h>

#include <codec/e_sound.h>

#ifdef LIS_SENSORS_TURRET

//#include <contrib/LIS_sensors_turret/e_devantech.h>

#include <contrib/LIS_sensors_turret/e_sharp.h>

#include <contrib/LIS_sensors_turret/e_sensext.h>

#include <I2C/e_I2C_master_module.h>

#include <I2C/e_I2C_protocol.h>

#endif

#ifdef FLOOR_SENSORS

#include <./I2C/e_I2C_protocol.h>

#endif

#ifdef IR_RECIEVER

#include <motor_led/advance_one_timer/e_remote_control.h>

#define SPEED_IR 600

#endif

#define uart_send_static_text(msg) do { e_send_uart1_char(msg,sizeof(msg)-1); while(e_uart1_sending()); } while(0)

#define uart_send_text(msg) do { e_send_uart1_char(msg,strlen(msg)); while(e_uart1_sending()); } while(0)

static float PI = 3.14159265f;


static char buffer[39*52*2+3+80];

static int waitEpuck[3];

static int count = 0;

static int stillwaiting = 0;

static int tempEpuck = 0;

static int waitEpuck[3] = {0,0,0};

static int epucktowaitfor[3] = {0,0,0};

static float lengthofsquare = 7.0f;

static int travelspeed = 500;

static int bypass = 0;

static int gotopositionX = 0;

static int gotopositionY = 0;

static int gotoorientation = 0;

static int positionX = 0;

static int positionY = 0;

static int orientation = 0;

static float distance = 0;

static float angle = 0;

static int tasklist[100];

static int tasksize = 0;

static int currenttask = 0;

static float threshold = 50.0f;

extern int e_mic_scan[3][MIC_SAMP_NB];

extern unsigned int e_last_mic_scan_id;

void loadtask(char *buffer);

void nexttask(void);

void loadtask(char *buffer)

{

int count1 = 0;

int myConvertedInt = 0;

char * tok = strtok(buffer, ",");

while (tok != NULL)

{

myConvertedInt = atoi(tok);

tasklist[count1] = myConvertedInt; //fill tasklist with integers.

count1++;

tok = strtok(NULL,",");

}

tasksize = count1;

currenttask = 3;

lengthofsquare = (float)tasklist[1];

travelspeed = tasklist[2];

nexttask();

}

void nexttask(void)

{

int i = 0 ;

for (i=currenttask;i<tasksize;i++)

{

switch (tasklist[i])

{

case 1: //move forward

move_cm(lengthofsquare,travelspeed);

uart_send_static_text("m(move cm activated)\r\n");

break;

case 2: //turn

i++;

if(tasklist[i] != 4)

turn_to_direction(PI/2 * tasklist[i]);

uart_send_static_text("m(turn to direction activated)\r\n");

break;

case 3: //wait

bypass = 1;


sprintf(buffer,"J,%d,%d,%d\r",tasklist[i+1],tasklist[i+2],tasklist[i+3]);

i+=4;

currenttask = i;

uart_send_static_text("m(wait activated)\r\n");

break;

default:

break;

}

if(bypass==1) break;

}

}

/* \brief The main function of the program */

int main(void) {

char c,c1,c2,wait_cam=0;

int i,j,n,speedr,speedl,positionr,positionl,LED_nbr,LED_action,accx,accy,accz,selector,sound;

int cam_mode,cam_width,cam_heigth,cam_zoom,cam_size;

static char first=0;

char *address;

char *ptr;

#ifdef LIS_SENSORS_TURRET

int sensext_pres=1, sensext_param[2], sensext_debug=0;

unsigned int sensext_value[2];

#endif

#ifdef IR_RECIEVER

char ir_move = 0,ir_address= 0, ir_last_move = 0;

#endif

TypeAccSpheric accelero;

e_init_port(); // configure port pins

e_start_agendas_processing();

e_init_motors();

e_init_uart1(); // initialize UART to 115200 Kbaud

e_init_ad_scan(ALL_ADC);

#ifdef FLOOR_SENSORS

#ifndef LIS_SENSORS_TURRET

e_i2cp_init();

#endif

#endif

#ifdef IR_RECIEVER

e_init_remote_control();

#endif

if(RCONbits.POR) { // reset if power on (some problem for few robots)

RCONbits.POR=0;

RESET();

}

/*Cam default parameter*/

cam_mode=RGB_565_MODE;

cam_width=40;

cam_heigth=40;

cam_zoom=8;

cam_size=cam_width*cam_heigth*2;

e_poxxxx_init_cam();

e_poxxxx_config_cam((ARRAY_WIDTH-cam_width*cam_zoom)/2,(ARRAY_HEIGHT-cam_heigth*cam_zoom)/2,cam_width*cam_zoom,cam_heigth*cam_zoom,cam_zoom,cam_zoom,cam_mode);

e_poxxxx_set_mirror(1,1);

e_poxxxx_write_cam_registers();

#ifdef LIS_SENSORS_TURRET //check if sensor extension is present and initalizes ports accordingly

e_i2cp_init();

e_i2cp_enable();

sensext_debug = e_i2cp_write (I2C_ADDR_SENSEXT , 0, 49);

e_i2cp_disable();

// Wait for I2C eeprom on tourret to answer.

e_activate_agenda(e_stop_sensext_wait,0);

e_start_sensext_wait();

e_set_agenda_cycle(e_stop_sensext_wait, 100);

while(e_get_sensext_wait());

e_set_agenda_cycle(e_stop_sensext_wait, 0);


e_i2cp_enable();

sensext_debug = e_i2cp_read(I2C_ADDR_SENSEXT, 0);

e_i2cp_disable();

if(sensext_debug!=49) // no SENSORS_TURRET reply

{

sensext_pres=0;

}

else

{

sensext_pres=1;

e_init_sensext();

e_init_sharp(); // to execute if the turret is present

// uart_send_static_text("ePic\r\n");

}

#endif

e_acc_calibr();

e_calibrate_ir();

uart_send_static_text("\f\a"

"WELCOME to the SerCom protocol on e-Puck\r\n"

"the EPFL education robot type \"H\" for help\r\n");

while(1) {

if (bypass == 1 ) {c = buffer[0]; }

else{

while (e_getchar_uart1(&c)==0)

#ifdef IR_RECIEVER

{

ir_move = e_get_data();

ir_address = e_get_address();

if (((ir_address == 0)||(ir_address == 8))&&(ir_move!=ir_last_move)){

switch(ir_move)

{

case 1:

speedr = SPEED_IR;

speedl = SPEED_IR/2;

break;

case 2:

speedr = SPEED_IR;

speedl = SPEED_IR;

break;

case 3:

speedr = SPEED_IR/2;

speedl = SPEED_IR;

break;

case 4:

speedr = SPEED_IR;

speedl = -SPEED_IR;

break;

case 5:

speedr = 0;

speedl = 0;

break;

case 6:

speedr = -SPEED_IR;

speedl = SPEED_IR;

break;

case 7:

speedr = -SPEED_IR;

speedl = -SPEED_IR/2;

break;

case 8:

speedr = -SPEED_IR;

speedl = -SPEED_IR;

break;

case 9:

speedr = -SPEED_IR/2;

speedl = -SPEED_IR;

break;

case 0:


if(first==0){

e_init_sound();

first=1;

}

e_play_sound(11028,8016);

break;

default:

speedr = speedl = 0;

}

ir_last_move = ir_move;

e_set_speed_left(speedl);

e_set_speed_right(speedr);

}

}

#else

;

#endif

}

if (c<0) { // binary mode (big endian)

i=0;

do {

switch(-c) {

case 'a': // Read acceleration sensors in a non filtered way, some as ASCII

accx=e_get_acc(0);

accy=e_get_acc(1);

accz=e_get_acc(2);

ptr=(char *)&accx;

buffer[i++]=accx & 0xff;

buffer[i++]=accx>>8;

buffer[i++]=accy & 0xff;

buffer[i++]=accy>>8;

buffer[i++]=accz & 0xff;

buffer[i++]=accz>>8;

break;

case 'A': // read acceleration sensors

accelero=e_read_acc_spheric();

ptr=(char *)&accelero.acceleration;

buffer[i++]=(*ptr);

ptr++;

buffer[i++]=(*ptr);

ptr++;

buffer[i++]=(*ptr);

ptr++;

buffer[i++]=(*ptr);

ptr=(char *)&accelero.orientation;

buffer[i++]=(*ptr);

ptr++;

buffer[i++]=(*ptr);

ptr++;

buffer[i++]=(*ptr);

ptr++;

buffer[i++]=(*ptr);

ptr=(char *)&accelero.inclination;

buffer[i++]=(*ptr);

ptr++;

buffer[i++]=(*ptr);

ptr++;

buffer[i++]=(*ptr);

ptr++;

buffer[i++]=(*ptr);

break;

case 'D': // set motor speed

while (e_getchar_uart1(&c1)==0);

while (e_getchar_uart1(&c2)==0);

speedl=(unsigned char)c1+((unsigned int)c2<<8);

while (e_getchar_uart1(&c1)==0);

while (e_getchar_uart1(&c2)==0);

speedr=(unsigned char)c1+((unsigned int)c2<<8);


e_set_speed_left(speedl);

e_set_speed_right(speedr);

break;

case 'E': // get motor speed

buffer[i++]=speedl & 0xff;

buffer[i++]=speedl >> 8;

buffer[i++]=speedr & 0xff;

buffer[i++]=speedr >> 8;

break;

case 'I': // get camera image

e_poxxxx_launch_capture(&buffer[i+3]);

wait_cam=1;

buffer[i++]=(char)cam_mode&0xff;//send image parameter

buffer[i++]=(char)cam_width&0xff;

buffer[i++]=(char)cam_heigth&0xff;

i+=cam_size;

break;

case 'L': // set LED

while (e_getchar_uart1(&c1)==0);

while (e_getchar_uart1(&c2)==0);

switch(c1) {

case 8:

e_set_body_led(c2);

break;

case 9:

e_set_front_led(c2);

break;

default:

e_set_led(c1,c2);

break;

}

break;

case 'M': // optional floor sensors

#ifdef FLOOR_SENSORS

e_i2cp_enable();

for (j=0; j<3; ++j) {

buffer[i++] = e_i2cp_read(0xC0,2*j+1);

buffer[i++] = e_i2cp_read(0xC0,2*j);

}

e_i2cp_disable();

#else

for(j=0;j<6;j++) buffer[i++]=0;

#endif

break;

case 'N': // read proximity sensors

for(j=0;j<8;j++) {

n=e_get_calibrated_prox(j);

buffer[i++]=n&0xff;

buffer[i++]=n>>8;

}

break;

case 'O': // read light sensors

for(j=0;j<8;j++) {

n=e_get_ambient_light(j);

buffer[i++]=n&0xff;

buffer[i++]=n>>8;

}

break;

case 'Q': // read encoders

n=e_get_steps_left();

buffer[i++]=n&0xff;

buffer[i++]=n>>8;

n=e_get_steps_right();

buffer[i++]=n&0xff;

buffer[i++]=n>>8;

break;

case 'u': // get last micro volumes

n=e_get_micro_volume(0);

buffer[i++]=n&0xff;

buffer[i++]=n>>8;


n=e_get_micro_volume(1);

buffer[i++]=n&0xff;

buffer[i++]=n>>8;

n=e_get_micro_volume(2);

buffer[i++]=n&0xff;

buffer[i++]=n>>8;

break;

case 'U': // get micro buffer

address=(char *)e_mic_scan;

e_send_uart1_char(address,600);//send sound buffer

n=e_last_mic_scan_id;//send last scan

buffer[i++]=n&0xff;

break;

case 'W': // read Devantec ultrasonic range sensor or Sharp Ir sensor (optional)

#ifdef LIS_SENSORS_TURRET

while (e_getchar_uart1(&c1)==0);

while (e_getchar_uart1(&c2)==0);

sensext_param[0] = (unsigned char)c1+((unsigned int)c2<<8);

// sscanf(buffer,"W,%d\r",&sensext_param[0]);

if (sensext_pres) // If the TP_sensors turret is present

{

if(e_sensext_process(sensext_param, sensext_value))

{

switch (sensext_param[0])

{

case -1: //i2c SRFxxx

buffer[i++] =

sensext_value[0]&0xff;

buffer[i++] =

sensext_value[0]>>8;

buffer[i++] =

sensext_value[1]&0xff;

buffer[i++] =

sensext_value[1]>>8;

// sprintf(buffer,"w,%u,%u\r\n",

sensext_value[0], sensext_value[1]);

break;

case -2: // i2c cmps03

buffer[i++] =

sensext_value[0]&0xff;

buffer[i++] =

sensext_value[0]>>8;

buffer[i++] =

sensext_value[1]&0xff;

buffer[i++] =

sensext_value[1]>>8;

// sprintf(buffer,"w,%d,%d\r\n",

sensext_value[0], sensext_value[1]);

break;

default: //analog (sharp,...)

buffer[i++] =

(sensext_value[0]&0xff);

buffer[i++] =

(sensext_value[0]>>8);

// sprintf(buffer,"w,%d\r\n",

sensext_value[0]);

break;

}

// uart_send_text(buffer);

}

else

{

switch(sensext_param[0])

{

case -1:

case -2:

sensext_value[0] = -1;

sensext_value[1] = -1;

buffer[i++] =

sensext_value[0]&0xff;

buffer[i++] = sensext_value[0]>>8;

buffer[i++] =

sensext_value[1]&0xff;

buffer[i++] =

sensext_value[1]>>8;

break;

default:

sensext_value[0] = -1;

sensext_value[1] = -1;

buffer[i++] =

sensext_value[0]&0xff;

buffer[i++] =

sensext_value[0]>>8;

break;

}

// uart_send_static_text("wrong parameter\r\n");

}

}

else

{

uart_send_static_text("LIS sensors turret not present\r\n");

}

#endif

break;

default: // silently ignored

break;

}

while (e_getchar_uart1(&c)==0); // get next command

} while(c);

if (i!=0){

if (wait_cam) {

wait_cam=0;

while(!e_poxxxx_is_img_ready());

}

e_send_uart1_char(buffer,i); // send answer

while(e_uart1_sending());

}

} else if (c>0) { // ascii mode

if (bypass == 0) // bypass for wait command

{

while (c=='\n' || c=='\r')

e_getchar_uart1(&c);

buffer[0]=c;

i = 1;

do if (e_getchar_uart1(&c))

buffer[i++]=c;

while (c!='\n' && c!='\r');

buffer[i++]='\0';

buffer[0]=toupper(buffer[0]); // we also accept lowercase letters

}

switch (buffer[0]) {

case 'A': // read accelerometer

accx=e_get_acc(0);

accy=e_get_acc(1);

accz=e_get_acc(2);

sprintf(buffer,"a,%d,%d,%d\r\n",accx,accy,accz);

uart_send_text(buffer);

/* accelero=e_read_acc_spheric();

sprintf(buffer,"a,%f,%f,%f\r\n",accelero.acceleration,accelero.orientation,accelero.inclination);

uart_send_text(buffer);*/

break;

case 'B': // set body led

sscanf(buffer,"B,%d\r",&LED_action);

e_set_body_led(LED_action);

uart_send_static_text("b\r\n");

break;

case 'C': // read selector position


selector = SELECTOR0 + 2*SELECTOR1 + 4*SELECTOR2 + 8*SELECTOR3;

sprintf(buffer,"c,%d\r\n",selector);

uart_send_text(buffer);

break;

case 'D': // set motor speed

sscanf(buffer, "D,%d,%d\r", &speedl, &speedr);

e_set_speed_left(speedl);

e_set_speed_right(speedr);

uart_send_static_text("d\r\n");

break;

case 'E': // read motor speed

sprintf(buffer,"e,%d,%d\r\n",speedl,speedr);

uart_send_text(buffer);

break;

case 'F': // set front led

sscanf(buffer,"F,%d\r",&LED_action);

e_set_front_led(LED_action);

uart_send_static_text("f\r\n");

break;

#ifdef IR_RECIEVER

case 'G':

sprintf(buffer,"g IR check : 0x%x, address : 0x%x, data : 0x%x\r\n", e_get_check(), e_get_address(), e_get_data());

uart_send_text(buffer);

break;

#endif

case 'H': // ask for help

uart_send_static_text("\n");

uart_send_static_text("\"A\" Accelerometer\r\n");

uart_send_static_text("\"B,#\" Body led 0=off 1=on 2=inverse\r\n");

uart_send_static_text("\"C\" Selector position\r\n");

uart_send_static_text("\"D,#,#\" Set motor speed left,right\r\n");

uart_send_static_text("\"E\" Get motor speed left,right\r\n");

uart_send_static_text("\"F,#\" Front led 0=off 1=on 2=inverse\r\n");

#ifdef IR_RECIEVER

uart_send_static_text("\"G\" IR receiver\r\n");

#endif

uart_send_static_text("\"H\" Help\r\n");

uart_send_static_text("\"I\" Get camera parameter\r\n");

uart_send_static_text("\"J,#,#,#,#\" Set camera parameter mode,width,heigth,zoom(1,4 or 8)\r\n");

uart_send_static_text("\"K\" Calibrate proximity sensors\r\n");

uart_send_static_text("\"L,#,#\" Led number,0=off 1=on 2=inverse\r\n");

#ifdef FLOOR_SENSORS

uart_send_static_text("\"M\" Floor sensors\r\n");

#endif

uart_send_static_text("\"N\" Proximity\r\n");

uart_send_static_text("\"O\" Light sensors\r\n");

uart_send_static_text("\"P,#,#\" Set motor position left,right\r\n");

uart_send_static_text("\"Q\" Get motor position left,right\r\n");

uart_send_static_text("\"R\" Reset e-puck\r\n");

uart_send_static_text("\"S\" Stop e-puck and turn off leds\r\n");

uart_send_static_text("\"T,#\" Play sound 1-5 else stop sound\r\n");

uart_send_static_text("\"U\" Get microphone amplitude\r\n");

uart_send_static_text("\"V\" Version of SerCom\r\n");

#ifdef LIS_SENSORS_TURRET

if (sensext_pres) // If the TP_sensors turret is present
uart_send_static_text("\"W,#\" Sensor extension turret. Analog: W,0; analog 5 LEDs: W,0->31; i2c dist: W,-1; i2c comp: W,-2\r\n");
else
uart_send_static_text("\"W,#\" Sensor extension turret not detected. Function deactivated.");

#endif

break;

case 'I': // receive ready epuck (in waiting operation)

sscanf(buffer,"I,%d\r", &tempEpuck);

tempEpuck--;

if (stillwaiting == 0) // if we are not waiting yet

{

waitEpuck[tempEpuck] = 1; // just remember that this epuck is waiting

uart_send_text("m(E-Puck is not waiting yet)\r\n");

break;

}

stillwaiting = 0;

waitEpuck[tempEpuck] = 1;

for(count = 0; count<3; count++) // check if all epucks we are waiting for have arrived

{

if ( epucktowaitfor[count] == 1) // if we are waiting for this epuck

{

if (waitEpuck[count] == 0) // if it hasn't arrived
stillwaiting = 1; // we have to wait

}

}

if (stillwaiting == 0) // if all has arrived

{

uart_send_text("m(E-Puck finished waiting)\r\n");

for(count = 0; count<3; count++)

{

if ( epucktowaitfor[count] == 1) // if the epuck was waiting for us
{
waitEpuck[count] = 0; // reset its value to 'not waiting'
}
epucktowaitfor[count] = 0; // reset: we are not waiting for any epuck

}

nexttask();

break;

}

else

{

uart_send_text("m(E-Puck still waiting) \r\n");

break;

}

break;

case 'J':// start wait

bypass = 0;

sscanf(buffer,"J,%d,%d,%d\r",&epucktowaitfor[0],&epucktowaitfor[1],&epucktowaitfor[2]);

epucktowaitfor[0] = 1;

waitEpuck[0] = 1;

uart_send_text("m(Commencing waiting now)\r\n");

uart_send_text("C,2(I,1)\r\n"); // send to waiting epucks

uart_send_text("C,3(I,1)\r\n");

stillwaiting = 1;

break;

case 'K': // calibrate proximity sensors

uart_send_static_text("k, Starting calibration - Remove any object in

sensors range\r\n");

e_calibrate_ir();

uart_send_static_text("k, Calibration finished\r\n");

break;

case 'L': // set led

sscanf(buffer,"L,%d,%d\r",&LED_nbr,&LED_action);

e_set_led(LED_nbr,LED_action);

uart_send_static_text("l\r\n");

break;

case 'M': // read floor sensors (optional)

#ifdef FLOOR_SENSORS

e_i2cp_enable();

for (i=0; i<6; i++) buffer[i] = e_i2cp_read(0xC0,i);

e_i2cp_disable();

sprintf(buffer,"m,%d,%d,%d\r\n",


(unsigned int)buffer[1] | ((unsigned int)buffer[0] << 8),

(unsigned int)buffer[3] | ((unsigned int)buffer[2] << 8),

(unsigned int)buffer[5] | ((unsigned int)buffer[4] << 8));

uart_send_text(buffer);

#else

uart_send_static_text("m,0,0,0\r\n");

#endif

break;

case 'N': // read proximity sensors

sprintf(buffer,"n,%d,%d,%d,%d,%d,%d,%d,%d\r\n",

e_get_calibrated_prox(0),e_get_calibrated_prox(1),e_get_calibrated_prox(2),e_get_calibrated_prox(3),

e_get_calibrated_prox(4),e_get_calibrated_prox(5),e_get_calibrated_prox(6),e_get_calibrated_prox(7));

uart_send_text(buffer);

break;

case 'O': // read ambient light sensors

sprintf(buffer,"o,%d,%d,%d,%d,%d,%d,%d,%d\r\n",

e_get_ambient_light(0),e_get_ambient_light(1),e_get_ambient_light(2),e_get_ambient_light(3),

e_get_ambient_light(4),e_get_ambient_light(5),e_get_ambient_light(6),e_get_ambient_light(7));

uart_send_text(buffer);

break;

case 'P': // set motor position

sscanf(buffer,"P,%d,%d\r",&positionl,&positionr);

e_set_steps_left(positionl);

e_set_steps_right(positionr);

uart_send_static_text("p\r\n");

break;

case 'Q': // read motor position

sprintf(buffer,"q,%d,%d\r\n",e_get_steps_left(),e_get_steps_right());

uart_send_text(buffer);

break;

case 'R': // reset

uart_send_static_text("r\r\n");

RESET();

break;

case 'S': // stop

e_set_speed_left(0);

e_set_speed_right(0);

e_set_led(8,0);

uart_send_static_text("s\r\n");

break;

case 'T': // play sound

sscanf(buffer,"T,%d",&sound);

if(first==0){

e_init_sound();

first=1;

}

switch(sound)

{

case 1: e_play_sound(0,2112);break;

case 2: e_play_sound(2116,1760);break;

case 3: e_play_sound(3878,3412);break;

case 4: e_play_sound(7294,3732);break;

case 5: e_play_sound(11028,8016);break;

default:

e_close_sound();

first=0;

break;

}

uart_send_static_text("t\r\n");

break;

case 'U':

sprintf(buffer,"u,%d,%d,%d\r\n",e_get_micro_volume(0),e_get_micro_volume(1),e_get_micro_volume(2));

uart_send_text(buffer);

break;

case 'V': // goto position

sscanf(buffer,"V,%d,%d,%d\r",&positionX,&positionY, &orientation);

if((gotopositionX - positionX) < threshold && (gotopositionY - positionY) < threshold)


{

gotopositionX = 0;

gotopositionY = 0;

nexttask();

break;

}

distance = sqrt((float)((gotopositionX - positionX)*(gotopositionX - positionX) + (gotopositionY - positionY)*(gotopositionY - positionY)));

angle = (float)atan2(gotopositionY - positionY, gotopositionX - positionX);

turn_to_direction(angle-(orientation/100.0f));

move_cm(distance*10.0f, travelspeed);

uart_send_text("P\r\n");

break;

case 'W': // read Devantech ultrasonic range sensor or Sharp IR sensor (optional)

#ifdef LIS_SENSORS_TURRET

sscanf(buffer,"W,%d\r",&sensext_param[0]);

if (sensext_pres) // if the TP_sensors turret is present

{

if(e_sensext_process(sensext_param, sensext_value))

{

switch (sensext_param[0])

{

case -1: //i2c SRFxxx

sprintf(buffer,"w,%u,%u\r\n", sensext_value[0], sensext_value[1]);

break;

case -2: // i2c cmps03

sprintf(buffer,"w,%d,%d\r\n", sensext_value[0], sensext_value[1]);

break;

default: //analog (sharp,...)

sprintf(buffer,"w,%d\r\n", sensext_value[0]);

}

uart_send_text(buffer);

}

else

{

uart_send_static_text("wrong parameter\r\n");

}

}

else

{

uart_send_static_text("LIS sensors turret not present\r\n");

}

#endif

break;

case 'X': // load tasks from the command buffer

loadtask(buffer);

break;

default:

uart_send_static_text("m(z,Command not found)\r\n");

break;

}

}

}

}