Autonomous Robot Dancing

Post on 30-Jun-2015


DESCRIPTION

Autonomous Robot Dancing Driven by Beats and Emotions of Music- Nao

Transcript

AUTONOMOUS ROBOT DANCING DRIVEN BY BEATS AND EMOTIONS OF MUSIC

((in the name of the best creator))

Distributed Artificial Intelligence Course

INTRODUCTION

Many robot dances are preprogrammed by choreographers.

choreography:


primitive emotions

key frame (static poses)

Our work is made up of two parts:

(1) The first algorithm plans a sequence of dance movements that is driven by the beats and the emotions detected through the preprocessing of selected dance music.

(2) We also contribute a real-time synchronizing algorithm to minimize the error between the execution of the motions and the plan.
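A minimal sketch of part (1), the offline planner: pick, for each beat, the library primitive whose activation-valence location best matches the music's detected emotion. The function names and data shapes here are illustrative assumptions, not the paper's code.

```python
# Hypothetical sketch of the offline dance planner: one primitive per beat,
# chosen by closeness in activation-valence (a, v) space. Names and data
# layout are assumptions for illustration.
def plan_dance(beat_times, emotion_at, library):
    """Return [(beat_time, primitive_name), ...] matching each beat's emotion.

    beat_times: detected beat times in seconds.
    emotion_at: function t -> (a, v) from the music preprocessing step.
    library:    list of {"name", "a", "v"} motion-primitive records.
    """
    schedule = []
    for t in beat_times:
        a, v = emotion_at(t)
        # nearest primitive in (a, v) space (squared Euclidean distance)
        best = min(library, key=lambda p: (p["a"] - a) ** 2 + (p["v"] - v) ** 2)
        schedule.append((t, best["name"]))
    return schedule
```

Part (2), the real-time synchronizer, would then adjust execution speed so motions finish on the planned beat times.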


WE CREATE A LARGE LIBRARY OF MOTION PRIMITIVES, BY DIVIDING THE JOINTS OF THE NAO HUMANOID ROBOT INTO 4 CATEGORIES, WHERE EACH CATEGORY OF JOINTS CAN ACTUATE INDEPENDENTLY.


PAUL EKMAN PROPOSED 6 PRIMARY EMOTIONS:

1. happy
2. sad
3. surprised
4. angry
5. fear
6. disgust


JOINTS:

1. Head (Head): HeadYaw, HeadPitch

2. Left Arm (LArm): LShoulderPitch, LShoulderRoll, LElbowYaw, LElbowRoll

3. Right Arm (RArm): RShoulderPitch, RShoulderRoll, RElbowYaw, RElbowRoll

4. Legs (Legs): LHipYawPitch, LHipRoll, LHipPitch, LKneePitch, LAnklePitch, LAnkleRoll, RHipRoll, RHipPitch, RKneePitch, RAnklePitch, RAnkleRoll
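The four categories above can be captured as a simple lookup structure; since each category actuates independently, per-category primitives can be merged into one joint command. The dict layout and `combine` helper are illustrative choices, not the paper's or NAOqi's API.

```python
# The four independently actuated joint categories listed above. Joint
# names follow NAO's naming; the dict layout itself is an assumption.
NAO_JOINT_CATEGORIES = {
    "Head": ["HeadYaw", "HeadPitch"],
    "LArm": ["LShoulderPitch", "LShoulderRoll", "LElbowYaw", "LElbowRoll"],
    "RArm": ["RShoulderPitch", "RShoulderRoll", "RElbowYaw", "RElbowRoll"],
    "Legs": ["LHipYawPitch", "LHipRoll", "LHipPitch", "LKneePitch",
             "LAnklePitch", "LAnkleRoll", "RHipRoll", "RHipPitch",
             "RKneePitch", "RAnklePitch", "RAnkleRoll"],
}

def combine(primitives):
    """Merge per-category primitives into one joint -> angle command.

    primitives: {"Head": [angles...], "LArm": [...], ...} - any subset
    of categories, since categories actuate independently.
    """
    command = {}
    for category, angles in primitives.items():
        for joint, angle in zip(NAO_JOINT_CATEGORIES[category], angles):
            command[joint] = angle
    return command
```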


emotion extraction:

emotion representation:

(a, v)

a and v are each between -1 and 1
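A tiny sketch of this representation: an emotion as a point (a, v) in activation-valence space, with both coordinates clamped to [-1, 1]. The class and the distance metric are illustrative assumptions.

```python
# Illustrative (a, v) emotion representation: activation and valence
# clamped to [-1, 1] as stated above. Not the paper's code.
def clamp(x, lo=-1.0, hi=1.0):
    return max(lo, min(hi, x))

class Emotion:
    def __init__(self, a, v):
        self.a = clamp(a)   # activation: -1 (calm) .. 1 (excited)
        self.v = clamp(v)   # valence:    -1 (negative) .. 1 (positive)

    def distance(self, other):
        # Euclidean distance in activation-valence space
        return ((self.a - other.a) ** 2 + (self.v - other.v) ** 2) ** 0.5
```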


emotion extraction: SMERS (SVR)

SVR: support vector regression

94% agreement

beat tracking:

autocorrelation analysis + neural network

amplitude (the best candidate)
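The autocorrelation idea behind beat tracking can be sketched as follows: the lag at which the amplitude envelope best correlates with itself is a strong candidate for the beat period. This toy version covers only the autocorrelation step (the full system also uses a neural network), and all names and data are illustrative.

```python
# Sketch of beat-period estimation by autocorrelation of an amplitude
# envelope. Assumption: envelope is a list of per-frame amplitudes.
def beat_period(envelope, min_lag=2, max_lag=None):
    """Return the lag (in frames) with the highest autocorrelation."""
    n = len(envelope)
    if max_lag is None:
        max_lag = n // 2
    mean = sum(envelope) / n
    x = [e - mean for e in envelope]          # remove DC offset
    best_lag, best_score = min_lag, float("-inf")
    for lag in range(min_lag, max_lag + 1):
        score = sum(x[i] * x[i - lag] for i in range(lag, n))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag
```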


MAPPING MOTION PRIMITIVE TO ACTIVATION-VALENCE SPACE FROM STATIC POSTURES DATA

We collected 4 static postures of the NAO humanoid robot for each of Ekman's 6 basic emotions: happy, sad, angry, surprised, fear, and disgust.

We have a total of 6 × 4 = 24 emotional static postures.
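One plausible way to place a motion primitive in activation-valence space from these labeled postures is to average the (a, v) labels of the postures it passes through. This is a hedged sketch of that idea; the per-emotion (a, v) coordinates below are illustrative guesses, not the paper's measured values.

```python
# Illustrative (a, v) coordinates per Ekman emotion - assumed values,
# not measured data from the paper.
EMOTION_AV = {
    "happy":     ( 0.5,  0.8),
    "sad":       (-0.6, -0.7),
    "angry":     ( 0.8, -0.6),
    "surprised": ( 0.7,  0.3),
    "fear":      ( 0.4, -0.8),
    "disgust":   (-0.2, -0.5),
}

def primitive_location(posture_labels):
    """Mean (a, v) of the emotion labels of a primitive's key postures."""
    points = [EMOTION_AV[label] for label in posture_labels]
    a = sum(p[0] for p in points) / len(points)
    v = sum(p[1] for p in points) / len(points)
    return (a, v)
```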



EMOTION FOR NEXT MOTION PRIMITIVE

Motion primitives are selected sequentially and stretched to fit a whole number of beat times. To choose the next motion primitive, we need the emotion at the end of the previous motion primitive. We simply estimate the emotion at each beat time by linearly interpolating the (a, v) values.
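The interpolation step above can be sketched directly: given (a, v) values at known sample times, estimate the emotion at any beat time by linear interpolation. Names and data shapes are illustrative.

```python
# Sketch of linearly interpolating (a, v) at time t from sorted sample
# times. Assumption: times is sorted ascending with matching avs pairs.
def interpolate_av(times, avs, t):
    if t <= times[0]:
        return avs[0]                       # clamp before the first sample
    if t >= times[-1]:
        return avs[-1]                      # clamp after the last sample
    for i in range(1, len(times)):
        if t <= times[i]:
            w = (t - times[i - 1]) / (times[i] - times[i - 1])
            a = avs[i - 1][0] + w * (avs[i][0] - avs[i - 1][0])
            v = avs[i - 1][1] + w * (avs[i][1] - avs[i - 1][1])
            return (a, v)
```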


THE MARKOV DANCER MODEL


In execution, we use an adaptive real-time synchronizing algorithm.

primitive motions should:

(i) be continuous

(ii) reflect the musical emotion

(iii) be interestingly non-deterministic
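A hedged sketch of a Markov-style dancer satisfying the three properties above: draw the next primitive at random, weighted so that primitives whose (a, v) location is close to the current musical emotion are more likely. This is one plausible weighting scheme, not the paper's exact model.

```python
import random

# Markov-style next-primitive selection: emotion match drives the weights,
# random draw keeps it non-deterministic. Weighting formula is assumed.
def next_primitive(primitives, emotion_av, rng=random):
    """primitives: list of (name, (a, v)); emotion_av: target (a, v)."""
    def weight(av):
        d = ((av[0] - emotion_av[0]) ** 2 + (av[1] - emotion_av[1]) ** 2) ** 0.5
        return 1.0 / (0.1 + d)   # closer in (a, v) space -> larger weight
    names = [name for name, _ in primitives]
    weights = [weight(av) for _, av in primitives]
    return rng.choices(names, weights=weights, k=1)[0]
```

Because the draw is weighted rather than greedy, repeated runs on the same music produce different but emotionally consistent dances.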


CONCLUSION

We show that we can automate robot dancing by forming schedules of motion primitives that are driven by the emotions and the beats of any music on a NAO humanoid robot. The algorithms are general and can be used on any robot. From emotion labels given for static postures, we can estimate the activation-valence space locations of the motion primitives and select the appropriate motion primitives for emotions detected in music.


THE END

