Vision Aided Inertial Navigation using Single Video Camera Measurements

Vasko Sazdovski
Faculty of Electrical Engineering, "Goce Delcev" University, Shtip, R. Macedonia
Pilot Training Centre, ELMAK Elbit Systems of Macedonia, Airport Petrovec, Skopje, R. Macedonia
ROBOTILNICA Intelligent Systems and Robotics, Skopje, R. Macedonia

Dec 18, 2015

Transcript
Page 1: Vision Aided Inertial Navigation using Single Video Camera Measurements Faculty of Electrical Engineering, “Goce Delcev” University, Shtip, R.Macedonia.

Vision Aided Inertial Navigation using Single Video Camera Measurements

Faculty of Electrical Engineering, “Goce Delcev” University, Shtip, R.Macedonia

Pilot Training Centre, ELMAK Elbit Systems of Macedonia, Airport Petrovec, Skopje, R.Macedonia

ROBOTILNICA Intelligent Systems and Robotics, Skopje, R.Macedonia

Vasko Sazdovski

Page 2:


OUTLINE OF THE PRESENTATION

• INTRODUCTION

• THEORY OF SIMULTANEOUS LOCALIZATION AND MAPPING (SLAM). INERTIAL NAVIGATION AND AUGMENTED STATE VECTOR.

• VISION SENSORS

• INERTIAL NAVIGATION AIDED BY SIMULTANEOUS LOCALIZATION AND MAPPING. SLAM AS SENSOR FUSION ALGORITHM

• INERTIAL NAVIGATION AIDED BY BEARING ONLY SIMULTANEOUS LOCALIZATION AND MAPPING

• CONCLUSIONS

• FUTURE WORK POSSIBILITIES

Page 3:

INTRODUCTION

Future UAVs need high levels of autonomy and independence. They are going to become small, very agile and highly maneuverable.

Prof. Raffaello D'Andrea from ETH Zurich at ZURICH.MINDS 2012, showing the early quadrotor designs within his research group…

MeCam quadrotor

• 14 sensors, including side object detectors that enable "perfect and safe hovering";
• Voice control;
• Autonomous person following;
• Two Morpho autopilots and a video stabilization system;
• Autonomous panoramas;
• Real-time streaming to mobile devices;
• $50 MSRP, potential availability "by the beginning of 2014".

Page 4:


INTRODUCTION

Integrated navigation system design requires selection of:

• a set of sensors and
• computation power

that provide reliable and accurate navigation parameters (position, velocity and attitude) with high update rates and bandwidth, in a small and cost-effective manner.

Piccolo autopilot used on a UAV for collecting meteorological or map data over long distances. Cloud Cap Technology, USA.

Modern INS/GPS navigation system for UAVs. Size: 131 x 57 x 19 mm, weight: 80 grams.

Navigator SL, Cloud Cap Technology, USA

Page 5:


INTRODUCTION


MeCam quadrotor: 1 GB RAM, SD card, 2.4 GHz/5 GHz Wi-Fi and Bluetooth connectivity.

Always Innovating Cortex-A9 SoC module, which can run from 1.0 GHz to 1.5 GHz.

Page 6:

INTRODUCTION

Many of today’s operational navigation systems rely on inertial sensor measurements.

Single axis gyro

Three axis accelerometer

Atomic Inertial Measurement Unit (IMU) from SparkFun Inc.

Inertial navigation slowly diverges from the true values over time.

Page 7:


[Diagram: aiding sources for the navigation parameters. LORAN, OMEGA, NAVSAT, DME, area correlator, Doppler, air data and GPS aid position and speed; vertical gyro, AHRS, magnetic compass and directional gyro aid attitude and heading, alongside the inertial system.]

Ching-Fang Lin, "Modern Navigation, Guidance, and Control Processing", Prentice Hall Inc., 1991

Additional sources of aiding information for the navigation parameters (non-interactive aiding information)

INTRODUCTION

Page 8:

INTRODUCTION

Additional sources of aiding information for the navigation parameters (interactive aiding information): vision sensors are a must nowadays.

[Diagram: the inertial system outputs position, speed, attitude and heading. In a known environment the aiding comes from vision sensors; in an unknown environment, novel solutions such as vision-based SLAM are needed.]

Page 9:


THEORY OF SIMULTANEOUS LOCALIZATION AND MAPPING (SLAM). INERTIAL NAVIGATION AND AUGMENTED STATE VECTOR.

[Figure: a vehicle moving through positions $p_1 \ldots p_7$ in an unknown environment takes relative measurements $\hat{r}_1 \ldots \hat{r}_7$ of map points $m_1 \ldots m_7$.]

Simultaneous Localization and Mapping (SLAM) uses relative measurements (range and bearing) from the vehicle with respect to the environment to build a map, whilst simultaneously using the generated map to compute the vehicle position.

The augmented state vector contains the vehicle state vector and the map points:

$x_k = [x^{v\,T}_k \; m^T_1 \cdots m^T_n]^T$

Page 10:


THEORY OF SIMULTANEOUS LOCALIZATION AND MAPPING (SLAM). INERTIAL NAVIGATION AND AUGMENTED STATE VECTOR.

[Figure: repeated relative measurements $\hat{r}_0 \ldots \hat{r}_6$ of a single map point $m$, taken from vehicle positions $p_0 \ldots p_6$ while moving on a circular path.]

Repeated relative measurements of a map point (circular movement) provide valuable information about the spatial relationship between the map point and the vehicle.

Mapping of an unknown map point:

$x^{m_i}_k = p_k + \hat{r}_k$

Localization using the map point:

$p_k = x^{m_i}_k - \hat{r}_k$
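As a minimal sketch, the two uses of a relative measurement, mapping and localization, can be written as follows. The helper names are illustrative, not from the original work:

```python
import numpy as np

def map_point_from_measurement(p, r_hat):
    """Mapping: initialize a map point from the vehicle position p and a relative measurement r_hat."""
    return np.asarray(p) + np.asarray(r_hat)

def vehicle_position_from_map_point(x_m, r_hat):
    """Localization: recover the vehicle position from a known map point and a relative measurement."""
    return np.asarray(x_m) - np.asarray(r_hat)

p = np.array([400.0, 550.0, 3.0])    # vehicle position (m), illustrative values
r_hat = np.array([10.0, -5.0, 1.0])  # measured relative vector (m)
x_m = map_point_from_measurement(p, r_hat)
```

The two relations are exact inverses of each other, which is what lets SLAM alternate between building the map and localizing against it.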

Page 11:

VISION SENSORS

Observation models. A stereo video camera measures the full relative position of a map point (linear model):

$z_k = h[x^v_k, x^{m_i}_k, v_k] = \begin{bmatrix} x_i - x \\ y_i - y \\ z_i - z \end{bmatrix} + v_k$

A single video camera measures only the unit direction (bearing) to the map point (nonlinear model):

$z_k = h[x^v_k, x^{m_i}_k, v_k] = \frac{1}{\sqrt{(x_i - x)^2 + (y_i - y)^2 + (z_i - z)^2}} \begin{bmatrix} x_i - x \\ y_i - y \\ z_i - z \end{bmatrix} + v_k$

Here $(x, y, z)$ is the vehicle position, $(x_i, y_i, z_i)$ is the map point and $v_k$ is the measurement noise.
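The two observation models can be sketched as follows; the function names are illustrative assumptions, not from the original work:

```python
import numpy as np

def stereo_observation(x_v, x_m, noise=None):
    """Linear model: full relative position of the map point from a stereo camera."""
    z = np.asarray(x_m, dtype=float) - np.asarray(x_v, dtype=float)
    return z if noise is None else z + noise

def bearing_observation(x_v, x_m, noise=None):
    """Nonlinear model: unit direction (bearing) to the map point from a single camera."""
    d = np.asarray(x_m, dtype=float) - np.asarray(x_v, dtype=float)
    z = d / np.linalg.norm(d)  # normalization makes the model nonlinear in the states
    return z if noise is None else z + noise
```

The normalization in the bearing model is what removes range information and makes single-camera aiding a harder, nonlinear estimation problem.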

Page 12:

INERTIAL NAVIGATION AIDED BY SIMULTANEOUS LOCALIZATION AND MAPPING. SLAM AS SENSOR FUSION ALGORITHM

As proposed, the approach is to augment the vehicle state with a number of map point estimates:

$x^{m_i}_k = [x_i \; y_i \; z_i]^T, \quad i = 1, 2, \ldots, n$

The augmented state vector containing the vehicle state and the map point estimates is denoted as:

$x_k = [x^{v\,T}_k \; x^{m_1\,T}_k \cdots x^{m_n\,T}_k]^T$

When SLAM is performed on aerial vehicles, the vehicle state can be represented by the Inertial Navigation mechanization equations, which give position, velocity and attitude quaternion:

$x^v_k = \begin{bmatrix} p^n_k \\ v^n_k \\ q^{nb}_k \end{bmatrix} = \begin{bmatrix} p^n_{k-1} + v^n_{k-1}\,\Delta t \\ v^n_{k-1} + (C^n_b f^b_{k-1} + g^n)\,\Delta t \\ q^{nb}_{k-1} + \tfrac{1}{2}\,\Omega_{k-1}\, q^{nb}_{k-1}\,\Delta t \end{bmatrix}$

The augmented process model is of the form

$x_k = \begin{bmatrix} x^v_k \\ x^{m_1}_k \\ \vdots \\ x^{m_n}_k \end{bmatrix} = \begin{bmatrix} f^v(x^v_{k-1}, u_{k-1}, w_{k-1}) \\ x^{m_1}_{k-1} \\ \vdots \\ x^{m_n}_{k-1} \end{bmatrix}$

where $u_k = [f^b_x \; f^b_y \; f^b_z \; p \; q \; r]^T$ is the control input containing the accelerometer and gyro measurements, and $w_k$ is the process noise.
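A minimal sketch of one step of the augmented process model, assuming a fixed attitude (the quaternion update is omitted for brevity) and illustrative function names:

```python
import numpy as np

def propagate_augmented(p, v, C_nb, f_b, g_n, map_points, dt):
    """One step of the augmented process model: the vehicle state is propagated
    with the INS mechanization equations while the map point estimates stay constant.
    C_nb: body-to-navigation rotation matrix (held fixed here),
    f_b: specific force in the body frame, g_n: gravity in the navigation frame."""
    p_new = p + v * dt                        # position update
    v_new = v + (C_nb @ f_b + g_n) * dt       # velocity update
    return p_new, v_new, [m.copy() for m in map_points]  # map points are static

p, v = np.zeros(3), np.array([1.0, 0.0, 0.0])
C_nb = np.eye(3)
f_b = np.array([0.0, 0.0, 9.81])   # specific force cancelling gravity (hover)
g_n = np.array([0.0, 0.0, -9.81])
maps = [np.array([400.0, 550.0, 3.0])]
p2, v2, maps2 = propagate_augmented(p, v, C_nb, f_b, g_n, maps, 0.01)
```

The key structural point is visible in the return value: only the vehicle block of the augmented state evolves; the map blocks are copied through unchanged.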

Page 13:

INERTIAL NAVIGATION AIDED BY SIMULTANEOUS LOCALIZATION AND MAPPING. SLAM AS SENSOR FUSION ALGORITHM

The equation for calculation of the map point estimates, written in component form, is

$x^{m_i}_k = g[x^v_k, z_k, v_k] = \begin{bmatrix} x + \hat{r}_x \\ y + \hat{r}_y \\ z + \hat{r}_z \end{bmatrix}$

where $v_k$ is the measurement noise.

The method for augmenting the map point estimates and building the covariance matrix is as follows. The new map point estimate is added to the augmented state vector and a new row and column are added to the covariance matrix:

$x^{aug}_k = \begin{bmatrix} x_k \\ g[x_k, z_k] \end{bmatrix}, \qquad P^{aug}_k = \begin{bmatrix} P_k & P_k \nabla g^T_x \\ \nabla g_x P_k & \nabla g_x P_k \nabla g^T_x + \nabla g_z R_k \nabla g^T_z \end{bmatrix}$

where $\nabla g_x$ and $\nabla g_z$ are Jacobians of the function $g$ shown above with respect to the augmented state $x_k$ and the observation $z_k$, respectively.

These equations are repeated until the state vector is augmented with a "sufficient" number of map point estimates. Through simulations we realized that we need more than three map point estimates.
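The covariance augmentation step can be sketched as follows; `Gx` and `Gz` stand for the Jacobians of $g$ with respect to the state and the observation, and the function name is an illustrative assumption:

```python
import numpy as np

def augment_covariance(P, R, Gx, Gz):
    """Add a new map point estimate x_aug = [x; g(x, z)] to the filter:
    append a new row and column to the covariance matrix.
    P: covariance of the current augmented state,
    R: measurement noise covariance,
    Gx, Gz: Jacobians of g with respect to x and z."""
    top_right = P @ Gx.T
    bottom_right = Gx @ P @ Gx.T + Gz @ R @ Gz.T
    return np.block([[P, top_right],
                     [top_right.T, bottom_right]])
```

The off-diagonal blocks are what correlate the new map point with the vehicle state, so later observations of the point can correct the vehicle estimate.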

Page 14:

INERTIAL NAVIGATION AIDED BY SIMULTANEOUS LOCALIZATION AND MAPPING. SLAM AS SENSOR FUSION ALGORITHM

As the vehicle circles around the map point, repeated measurements $z_k = [x \; y \; z]^T$ are available from the stereo camera.

Using these measurements we can start updating the augmented state vector, using each of the map point estimates one at a time. An Extended Kalman Filter (EKF) can be implemented for the update.

The state covariance is propagated first:

$P_{k|k-1} = \nabla f_x P_{k-1|k-1} \nabla f^T_x + Q_k$

where $\nabla f_x$ is the Jacobian of the augmented process model with respect to the augmented state vector.

Next, using the measurements (observations), we update the augmented state vector and its covariance:

$\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k \nu_k$

$P_{k|k} = P_{k|k-1} - K_k \nabla h_x P_{k|k-1}$

where the innovation vector and the Kalman gain are calculated as:

$\nu_k = z_k - h(\hat{x}_{k|k-1})$

$K_k = P_{k|k-1} \nabla h^T_x [\nabla h_x P_{k|k-1} \nabla h^T_x + R_k]^{-1}$

$\nabla h_x$ is the Jacobian of the observation function $h$ with respect to the vehicle state and the map point estimate used in the update.
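A compact sketch of the EKF propagate/update cycle described above (function names are illustrative; `Fx` and `Hx` are the Jacobians of the process and observation models):

```python
import numpy as np

def ekf_predict(P, Fx, Q):
    """Propagate the augmented state covariance."""
    return Fx @ P @ Fx.T + Q

def ekf_update(x, P, z, h, Hx, R):
    """Innovation-based EKF update of the augmented state."""
    nu = z - h(x)                        # innovation vector
    S = Hx @ P @ Hx.T + R                # innovation covariance
    K = P @ Hx.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + K @ nu
    P_new = P - K @ Hx @ P
    return x_new, P_new
```

A one-dimensional run shows the expected behaviour: each update pulls the estimate toward the measurement and shrinks the covariance.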

Page 15:

PERFORMANCES OF INERTIAL NAVIGATION AIDED BY SIMULTANEOUS LOCALIZATION AND MAPPING

[Figures: map point estimates; true, estimated and divergent vehicle trajectories (aerial vehicle, 3D); position errors in x, y and z [m] versus t [s].]

Page 16:

[Figures: map point estimate components x, y and z [m] versus t [s]; determinant of the covariance matrix of the first map point estimate; contour ellipsoid of the covariance matrix of the first map point estimate.]

PERFORMANCES OF INERTIAL NAVIGATION AIDED BY SIMULTANEOUS LOCALIZATION AND MAPPING

Page 17:


[Figures: determinant of the covariance matrix of the vehicle position; contour ellipsoid of the covariance matrix of the vehicle position; relative distance between the first and second map point estimates versus t [s].]

PERFORMANCES OF INERTIAL NAVIGATION AIDED BY SIMULTANEOUS LOCALIZATION AND MAPPING

Page 18:


INERTIAL NAVIGATION AIDED BY BEARING ONLY SIMULTANEOUS LOCALIZATION AND MAPPING

Bearing Only Simultaneous Localization and Mapping (BOSLAM) is very attractive these days because it permits the use of a single camera as the sensor, measuring the bearing, i.e. the unit direction, to the map points.

The major drawback of this solution is the problem of map point initialization from a single measurement.

Page 19:


Much of the research in BOSLAM is focused on the problem of initializing map points from single camera measurements. In the literature, two techniques are proposed to address the problem of map point initialization.

1. The first technique delays map point initialization until a criterion is fulfilled and sufficient baseline is available from different vehicle positions.

2. The second technique avoids the delay and initializes the map point from a single measurement, using the fact that after the first observation the map point lies along the line from the vehicle to the map point (the projection ray).

Both techniques have their pros and cons.

The approach presented before, augmenting the state vector with more than one map point estimate, brings a new parameterization of the map point in BOSLAM. The novelty comes from using a certain number of map point estimates to update the whole augmented state vector, together with a combination of repeated measurements and motion in the vicinity of the map point. This approach amounts to delayed initialization of the map points.
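A possible delayed-initialization criterion can be sketched as follows: the parallax angle between two bearing observations must exceed a threshold before the map point is initialized. The function name and threshold are illustrative assumptions, not from the original work:

```python
import numpy as np

def sufficient_baseline(bearing1, bearing2, min_parallax_deg=5.0):
    """Delayed-initialization criterion sketch: return True once the parallax
    angle between two unit bearing vectors exceeds a threshold, meaning the
    vehicle has moved enough for the map point depth to be recoverable."""
    cos_a = np.clip(np.dot(bearing1, bearing2), -1.0, 1.0)
    parallax_deg = np.degrees(np.arccos(cos_a))
    return parallax_deg >= min_parallax_deg
```

With a criterion like this, the filter simply buffers bearings until the test passes, then triangulates and augments the state.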

INERTIAL NAVIGATION AIDED BY BEARING ONLY SIMULTANEOUS LOCALIZATION AND MAPPING

Page 20:

Discussion of the observability of a system provides insight into, and understanding of, the fundamental limits of the estimation process.

[Figures: information along the x, y and z axes for the vehicle position and for the map point estimate, versus iterations with time, for a stationary vehicle and for a vehicle moving on a circular path.]

INERTIAL NAVIGATION AIDED BY BEARING ONLY SIMULTANEOUS LOCALIZATION AND MAPPING

Since observability analysis can give the best achievable performance even before the system is built, it can be considered a tool for computer analysis of many complicated estimation processes.
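For a linearized system, such an observability check can be sketched numerically by stacking the observability matrix and computing its rank; the helper below is an illustrative assumption, not from the original work:

```python
import numpy as np

def observability_rank(F, H, n_steps=None):
    """Numerical observability check for a linearized system
    x_{k+1} = F x_k, z_k = H x_k: stack [H; HF; HF^2; ...] and take its rank.
    Full rank (equal to the state dimension) means the state is observable."""
    n = F.shape[0]
    steps = n if n_steps is None else n_steps
    rows = [H]
    for _ in range(steps - 1):
        rows.append(rows[-1] @ F)  # next block row of the observability matrix
    O = np.vstack(rows)
    return np.linalg.matrix_rank(O)
```

For a constant-velocity model with a position measurement the rank is full, while measuring velocity alone leaves position unobservable, mirroring how a stationary vehicle gains less information than one moving on a circular path.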

Page 21:


The EKF is a very commonly used algorithm and, because of its simplicity, is very often chosen as the "best" algorithm for implementation.

INERTIAL NAVIGATION AIDED BY BEARING ONLY SIMULTANEOUS LOCALIZATION AND MAPPING

Single video camera (nonlinear observation model):

$z_k = h[x^v_k, x^{m_i}_k, v_k] = \frac{1}{\sqrt{(x_i - x)^2 + (y_i - y)^2 + (z_i - z)^2}} \begin{bmatrix} x_i - x \\ y_i - y \\ z_i - z \end{bmatrix} + v_k$

Inertial Navigation aided by BOSLAM, as well as many other navigation problems, is nonlinear and must be linearized (approximated) before applying the popular Kalman-like filtering algorithms. The EKF presents one such approximation.

Page 22:


INERTIAL NAVIGATION AIDED BY BEARING ONLY SIMULTANEOUS LOCALIZATION AND MAPPING

Because of a number of significant problems that appear when implementing the EKF, other algorithms such as:

• Iterated Extended Kalman Filter (IEKF)
• Unscented Kalman Filter (UKF)
• Unscented Particle Filter (UPF)

were implemented, tested and compared with the EKF algorithm.


Page 23:

True, estimated and divergent vehicle trajectories

[Figures: position errors in x, y and z [m] versus t [s] for the EKF, IEKF, UKF and UPF; 3D plot of the true trajectory, the un-aided IN trajectory, the EKF, IEKF, UKF and UPF trajectories, and the map point.]

PERFORMANCES OF INERTIAL NAVIGATION AIDED BY BEARING ONLY SIMULTANEOUS LOCALIZATION AND MAPPING

Page 24:


Map point estimates

[Figures: map point estimate components x, y and z [m] versus t [s] for the EKF, IEKF, UKF and UPF.]

PERFORMANCES OF INERTIAL NAVIGATION AIDED BY BEARING ONLY SIMULTANEOUS LOCALIZATION AND MAPPING

Page 25:

Aiding Inertial Navigation (IN) by BOSLAM exhibits a high degree of nonlinearity, and typically in these applications an EKF introduces large estimation errors.

Other algorithms such as the UKF and UPF demonstrate better performance and appear to be efficient estimators for the concept of IN aided by BOSLAM.

The SLAM-aided IN and BOSLAM-aided IN sensor fusion algorithms present reliable solutions that provide aiding information to IN from vision sensors. These algorithms successfully integrate the inertial and vision sensors with no a priori knowledge of the environment.

CONCLUSIONS

Page 26:

QUADROTOR RESEARCH ARENA

GRASP Lab, University of Pennsylvania, extensive use of motion capture systems

Vicon T Series motion capture System

Page 27:

QUADROTOR RESEARCH ARENA

Flying Machine Arena, Institute for Dynamic Systems and Control, ETH Zurich

Page 28:

QUADROTOR RESEARCH ARENA

Localization and mapping done with two Hokuyo lidars and a servo motor. University of Pennsylvania

CityFlyer project, Robotics and Intelligent Systems Lab, City College of New York, City University of New York.

STARMAC Project in the Hybrid Systems Lab at UC Berkeley 

Page 29:

QUADROTOR RESEARCH ARENA

Robotics Group, CSAIL, MIT:

• Hokuyo laser range-finder (1),
• stereo cameras (2),
• monocular color camera (3),
• laser-deflecting mirrors for altitude (4),
• 1.6 GHz Intel Atom-based flight computer (5),
• Ascending Technologies internal processor and IMU (6).

Page 30:

QUADROTOR RESEARCH ARENA

sFly European Project: Visual-Inertial SLAM for a small hexacopter (IROS 2012 video screenshot)

Computer Vision Group at TUM Germany: Autonomous Camera-Based Navigation of a Low-Cost Quadrocopter

Page 31:

FUTURE WORK POSSIBILITIES

In our future research we are working on developing novel approaches, i.e. efficient estimators (sensor fusion algorithms), which will provide navigation performance and accuracy close to that of Differential GPS using only a single video camera and inertial sensors.

Our research work is very much focused on practical experimentation and validation of the problems defined above. A quadrotor UAV is the chosen platform for the practical experiments.

Quadrotor Flying Frog, ROBOTILNICA Intelligent Systems and Robotics

Custom modified quadrotor from KKMulticopter, South Korea

Page 32:

FUTURE WORK POSSIBILITIES

We are fascinated by insect-based navigation. Flying insects especially inspire much of our research on autonomous navigation systems.

We are very much interested in how flying insects are so adept at maneuvering in complex, unconstrained, hostile and hazardous environments.

Page 33:

FUTURE WORK POSSIBILITIES

Our practical experiments follow a low-cost approach. A low-cost Inertial Measurement Unit (IMU) from SparkFun Electronics provides the angular rate (gyro) and accelerometer measurements. These inertial sensor data, together with the video from a smartphone camera, are transferred over WiFi and/or 3G to a remote server/PC for processing. The video processing is performed on the remote PC using OpenCV libraries, together with the sensor fusion algorithms. The results are transferred back to the quadrotor for guidance and control of the vehicle, for performing the required maneuvers (motion) and the vehicle task itself.

Page 34:

FUTURE WORK POSSIBILITIES

As in the case of a single UAV, simply passing or flying by a map point with no manoeuvre will not help much for autonomous navigation of a swarm of UAVs. Coordination and cooperation between the UAVs, performing manoeuvres over the same environment features (map points), are needed. This investigation is expected to address the issues that arise when low-accuracy vision sensors are used on UAVs. In this scenario the UAVs can take measurements not just of the environment but also of each other. It is expected that these measurements can accelerate the convergence of the novel algorithms under experimentation.

Page 35:


QUESTIONS ARE WELCOME!

THANK YOU FOR YOUR ATTENTION