3D Upper Limb Motion Modeling and Estimation using Wearable Micro-sensors

Zhiqiang Zhang 1,2, Lawrence WC Wong 3 and Jian-Kang Wu 1,2

1 Graduate University, Chinese Academy of Sciences, Beijing, China
2 China-Singapore Institute of Digital Media, Singapore
3 National University of Singapore, Singapore

    Email: [email protected]

Abstract—Human motion capture technologies are widely used in interactive games and learning, animation, film special effects, health-care and navigation. Because of its agility, the upper limb is the most difficult part of the body to track in human motion capture. Traditional methods assume that the movements of the upper arm and forearm are independent and estimate them separately; as a result, the estimated motion often suffers from serious distortion. In this paper, we propose a novel ubiquitous upper limb motion estimation method using wearable micro-sensors, which concentrates on modeling the relationship between the movements of the upper arm and forearm. The skeleton structure of the upper limb is first modeled as a link structure with 5 degrees of freedom. Parameters are then defined according to the Denavit-Hartenberg convention, forward kinematic equations of the upper limb are derived, and an Unscented Kalman filter is invoked to estimate the defined parameters. Experimental results have shown the feasibility and effectiveness of the proposed upper limb motion capture and analysis algorithm.

Keywords-Ubiquitous Motion Modeling and Estimation, Forward Kinematic Equation, Wearable Micro-sensors

    I. INTRODUCTION

Motion capture serves as a key technology in a wide spectrum of applications, such as interactive games and learning, animation, film special effects, health-care and navigation. Existing motion capture systems are mainly based on video processing techniques, referred to as Mocap, such as Vicon [1] and Qualisys [2]. Such systems are widely used in the film industry. However, Mocap has certain limitations: 1) it needs multiple (usually more than 10) high-speed cameras set up and calibrated in a dedicated studio; 2) subjects have to wear a tailor-made suit with retro-reflective or light-emitting markers; 3) there is a huge amount of data to be processed, making real-time motion estimation very difficult; and 4) the system is extremely expensive.

With the rapid development of MEMS technology and miniature wireless solutions, human motion capture using micro-sensors has attracted a lot of interest. Giansanti et al. [4] proposed to estimate upper limb motion and position using a nine-accelerometer system for each body segment. Dejnabadi et al. [5] developed a method for measuring a joint angle (the knee) using a combination of accelerometers and gyroscopes, by placing a pair of virtual sensors on the adjacent segments at the center of rotation. Luinge et al. [6] presented an improved limb orientation estimation method that uses a Kalman filter to estimate sensor drift and thereby improve estimation accuracy. Yun et al. [7] proposed an extended Kalman filter to estimate the orientation of a human limb using a combination of inertial and magnetic sensors; they focused on the design of the Kalman filter for real-time tracking of human body motion. Xsens [8] and InterSense [9] have done similar work using hybrid inertial sensors with magnetic or ultrasonic sensors. However, the above-mentioned work implicitly assumed that the movements of different body segments are independent and estimated them separately; therefore, the estimated motion often suffers from serious distortion.

Exploring the use of geometrical constraints in the estimation algorithm has shown potential to address the distortion problem. The basic idea is to use geometrical constraints as virtual measurements to revise the estimation errors. Hu et al. [10] proposed a combination of inertial and visual sensors for upper limb movement measurement: inertial measurements were used for kinematic analysis, while visual sensor measurements and the constraint of upper limb lengths were regarded as compensating factors to update the kinematic results. The system can be used to estimate simple motion patterns. Veltink et al. [11] proposed a method for estimating the orientations of the two arm segments using inertial sensors, in which the anatomical elbow constraint of no adduction at the elbow joint was used to revise the orientation estimation. In our previous work [12], we extended the usage of constraints further, explored the geometrical constraints systematically, and proposed a novel motion estimation algorithm based on hierarchical fusion of sensor data and constraints; three geometrical constraints at the elbow joint were modeled and used as virtual measurements to improve the estimation accuracy. However, the use of geometrical constraints can only alleviate the distortion problem; it cannot solve the problem fundamentally, mainly because the estimation is still based on the assumption of independent body segment movements.

To solve the distortion problem fundamentally, the skeleton structure of the human body should be taken into consideration. The degrees of freedom of each segment, and the parameters used to represent its movement, should be selected according to the skeleton structure. Hyde et al. [13] proposed a solution for the estimation of upper-limb orientation using miniature accelerometers and gyroscopes. They defined a well-structured upper limb rotation model, but focused on a design that minimizes the number of sensors while delivering the desired motion estimates; in addition, they did not consider the inherent drift problem of inertial sensors. Mihelj [14] proposed a method for computing the inverse kinematic model of the human arm. The approach was based on measurements of the hand position and orientation as well as the acceleration and angular rate of the upper arm segment. Unfortunately, acquiring hand position and orientation is very difficult and expensive. Vlasic et al. [15] proposed a human motion capture system using a combination of ultrasonic and inertial measurements. Based on kinematic and kinetic analysis, they used a Kalman filter to fuse the information from all three types of sensors. However, ultrasonic sensors always suffer from serious signal interference.

Considering the state of the art of micro-sensors for human motion capture, we propose a novel ubiquitous upper limb motion estimation method using three types of micro-sensors, namely 3D accelerometers, 3D micro-gyroscopes and 3D magnetometers, for distortion-free motion estimation. Our method concentrates on modeling the relationship between the movements of the upper arm and forearm. The contributions of the paper are: 1) modeling and representation: a human arm can be represented by a skeleton structure with two rigid segments (upper arm and forearm) and two joints (shoulder joint and elbow joint); the use of a link structure with 5 degrees of freedom is first proposed to model this skeleton structure, after which the link parameters and mechanisms are defined according to the Denavit-Hartenberg convention; 2) forward kinematic analysis: a method to compute the orientation difference between any two links in the link structure as a function of the link parameters, which also establishes the relationship between the link parameters and the sensor measurements. The experimental results have shown the feasibility and effectiveness of the proposed upper limb motion estimation and analysis algorithm.

The rest of the paper is organized as follows. Section II describes the rigid body modeling and link structure representation of the upper limb. The link parameter process model and the forward kinematic equations are given in Section III. Experimental results are presented in Section IV. Finally, conclusions and future work are given in Section V.

II. UPPER LIMB LINK STRUCTURE MODELING

A. Parameters definition

The hand motion has a negligible effect on the large-scale motion dynamics of the upper limb, so the hand can be taken as a rigid extension of the forearm [16].

Figure 1. The link structure of the human upper limb. Each black circle represents one joint with a single degree of freedom, while each black rectangle represents a sensor unit consisting of a three-axis accelerometer, gyroscope and magnetometer. Joints connected by dashed lines together form one joint with n degrees of freedom, where n is the number of joints connected. Axes 3 and 5 are actually superposed with links 3 and 5, respectively; they are drawn apart here for clarity and readability.

Therefore, a human upper limb can be represented by a chain composed of two rigid segments (upper arm and forearm) and two articulated joints (shoulder joint and elbow joint). A general articulated joint has 3 degrees of freedom, which would allow the forearm to move arbitrarily relative to the upper arm. However, for a healthy subject the elbow only admits 2 degrees of freedom, namely flexion/extension and pronation/supination; abduction/adduction of the elbow is strictly forbidden. Therefore, the articulated chain model should be revised to incorporate the degree-of-freedom constraints of the joints: the shoulder joint admits 3 degrees of freedom while the elbow joint admits only 2. In the rare case that a mechanism is built with a joint having n degrees of freedom, it can be modeled as n joints of one degree of freedom connected by n-1 links of zero length [17]. Therefore, as shown in Fig. 1, the human upper limb can be modeled by a link structure with 5 joints of one degree of freedom. In Fig. 1, the links are numbered starting from the immobile base of the arm (here the shoulder), which is called link 0. The first moving body is link 1, and so on, out to the free end of the arm, which is link 5. According to the Denavit-Hartenberg convention, joint axes are defined by lines in space: joint axis i is defined by a line, or a vector direction, about which link i rotates relative to link i-1. The joint angle θ_i is defined as the amount of rotation about joint axis i between link i and link i-1. Once the joint angles θ_i, i = 1, 2, ..., 5, are known, the motion of the upper limb is known.
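To make the joint numbering concrete, the 5 degrees of freedom can be listed explicitly; the split of the shoulder DOFs and the assignment of the two elbow DOFs below follow the description above, with the exact anatomical labels of joints 4 and 5 being our illustrative assumption.

```python
# The five 1-DOF joints of the link structure in Fig. 1 (Python sketch).
# Joints 1-3 realize the 3-DOF shoulder; joints 4-5 realize the 2-DOF elbow.
# The anatomical labels for joints 4 and 5 are illustrative assumptions.
UPPER_LIMB_JOINTS = {
    1: "shoulder, 1st DOF",
    2: "shoulder, 2nd DOF",
    3: "shoulder, 3rd DOF",
    4: "elbow flexion/extension (assumed label)",
    5: "forearm pronation/supination (assumed label)",
}
N_JOINTS = len(UPPER_LIMB_JOINTS)   # five joint angles theta_1..theta_5 define the posture
```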

B. Frame affixing

In order to describe the motion of each link relative to its neighbors, we define a frame attached to each link. The link frames are numbered according to the link to which they are attached, i.e., frame i is attached rigidly to link i. For link frames i, i = 1, 2, ..., 4, the frame can be constructed step by step as follows:



1. Identify joint axes i and i+1 and draw infinite lines along them.

2. Identify the common perpendicular between axes i and i+1, or their point of intersection. At the point of intersection, or at the point where the common perpendicular meets axis i, assign the link frame origin. If axes i and i+1 are parallel, any point on axis i can be taken as the link frame origin.

3. Assign the Z_i axis pointing along axis i.

4. Assign the X_i axis pointing along the common perpendicular, or, if axes i and i+1 intersect or are parallel, assign X_i to be normal to the plane containing the two axes.

5. Assign the Y_i axis to complete a right-handed coordinate system, i.e. Y_i = (Z_i × X_i)/‖Z_i × X_i‖ (a minimal numerical sketch follows this list).
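A minimal numerical sketch of steps 3-5 for one frame, assuming the joint axis direction and the common perpendicular are already known as vectors; the function name is ours, not from the paper.

```python
import numpy as np

def complete_link_frame(z_axis, x_axis):
    """Steps 3-5: given the Z_i direction (along joint axis i) and the X_i
    direction (along the common perpendicular), return the orthonormal
    right-handed frame (X_i, Y_i, Z_i) with Y_i = Z_i x X_i."""
    z = z_axis / np.linalg.norm(z_axis)
    x = x_axis / np.linalg.norm(x_axis)
    y = np.cross(z, x)                  # right-hand rule: Y completes the frame
    return x, y / np.linalg.norm(y), z

# Example: joint axis along the global z direction, common perpendicular along x.
X_i, Y_i, Z_i = complete_link_frame(np.array([0., 0., 1.]), np.array([1., 0., 0.]))
# Y_i is then [0, 1, 0], completing a right-handed frame.
```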

We select link frame 0 as our global coordinate frame, which is assigned to match frame 1 when θ_1 is zero. For link frame 5, we set the axes as follows: 1) assign the joint point as the frame origin; 2) assign the Z_5 axis pointing along axis 5; 3) assign X_5 to be normal to the plane containing Z_4 and Z_5; 4) assign the Y_5 axis to complete a right-handed coordinate system.

After frame affixing according to our convention, the joint angle θ_i can be redefined as the angle between X_{i-1} and X_i measured about Z_i. Defining the joint twist angle α_i as the angle between Z_i and Z_{i+1} measured about X_i, the rotation matrix between frame i and frame i-1 can be written as:

$$R^i_{i-1} = R(Z_i, \theta_i)\, R(X_{i-1}, \alpha_{i-1}) \qquad (1)$$

where

$$R(Z_i, \theta_i) = \begin{bmatrix} \cos\theta_i & -\sin\theta_i & 0 \\ \sin\theta_i & \cos\theta_i & 0 \\ 0 & 0 & 1 \end{bmatrix}$$

and

$$R(X_{i-1}, \alpha_{i-1}) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\alpha_{i-1} & -\sin\alpha_{i-1} \\ 0 & \sin\alpha_{i-1} & \cos\alpha_{i-1} \end{bmatrix}.$$
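For reference, equation (1) can be coded directly; the sketch below (Python/NumPy, with helper names of our own choosing) composes the two elementary rotations for a given joint angle θ_i and twist angle α_{i-1}.

```python
import numpy as np

def rot_z(theta):
    """Rotation by theta about the Z axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def rot_x(alpha):
    """Rotation by alpha about the X axis."""
    c, s = np.cos(alpha), np.sin(alpha)
    return np.array([[1.0, 0.0, 0.0],
                     [0.0,   c,  -s],
                     [0.0,   s,   c]])

def link_rotation(theta_i, alpha_prev):
    """R^i_{i-1} = R(Z_i, theta_i) R(X_{i-1}, alpha_{i-1}), as in equation (1)."""
    return rot_z(theta_i) @ rot_x(alpha_prev)

# Example: the chain R^1_0 ... R^5_4 for joint angles theta[0..4] and twist
# angles alpha[0..4] (the alpha values are fixed by the frame assignment):
# R_chain = [link_rotation(theta[i], alpha[i]) for i in range(5)]
```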

Two sensor units, each consisting of a three-axis accelerometer, gyroscope and magnetometer, are placed on link 3 and link 5. Here we assume that the soft tissue of the upper limb moves as a rigid body, so the position and orientation of each sensor unit remain fixed relative to the respective link. In other words, the rotation matrices between the sensor units and their links are constant. Denote by R^U_3 the rotation matrix between the upper arm sensor unit and link 3, and by R^F_5 the rotation matrix between the forearm sensor unit and link 5. R^U_3 and R^F_5 are known once the link frames are assigned.

    III. PROCESS MODEL AND MEASUREMENT MODEL

As discussed in the last section, the upper limb motion is determined by the joint angles θ_i, i = 1, 2, ..., 5. Our purpose is to recover the upper limb motion from two wearable sensor units, each consisting of a three-axis accelerometer, gyroscope and magnetometer, placed on the upper arm and forearm respectively, as shown in Fig. 1. We employ a Kalman filter to fuse these three types of sensors because the Kalman filter provides a convenient, efficient, and elegant framework for combining different types of measurements to recover the state of a system. In this section, we design the process model and the measurement model for the Kalman filter.

    A. Process model

We define the state vector x using two types of parameters, namely the joint angles Θ = (θ_1, θ_2, θ_3, θ_4, θ_5) and the joint angular velocities Ω = (ω_1, ω_2, ω_3, ω_4, ω_5). The process model is used to predict the evolution of the state at time t+1 from time t, so the state vector should combine the joint angles and the joint angular velocities as:

$$x = (\Theta, \Omega)^T = (\theta_1, \theta_2, \theta_3, \theta_4, \theta_5, \omega_1, \omega_2, \omega_3, \omega_4, \omega_5)^T \qquad (2)$$

As the sampling interval Δt is very short (here we set Δt = 0.02 s), the process model is assumed to follow a constant angular velocity model:

$$x_t = F x_{t-1} + G w_t \qquad (3)$$

where

$$F = \begin{bmatrix} I_5 & I_5\,\Delta t \\ 0 & I_5 \end{bmatrix}, \qquad G = \begin{bmatrix} I_5\,\Delta t^2/2 \\ I_5\,\Delta t \end{bmatrix},$$

I_5 is the 5 × 5 identity matrix, and w_t = (w_1, w_2, w_3, w_4, w_5)^T is Gaussian white acceleration noise with zero mean and covariance matrix Q.
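A minimal NumPy sketch of the constant angular velocity process model of equation (3); Δt and the value of Q follow the paper (Sec. IV-B), while the Δt²/2 factor in G is the standard constant-velocity form assumed here.

```python
import numpy as np

n = 5                    # number of joint angles
dt = 0.02                # sampling interval (s), as in the paper
I5 = np.eye(n)

# State transition and noise-input matrices of equation (3),
# with state x = (theta_1..theta_5, omega_1..omega_5)^T:
F = np.block([[I5, dt * I5],
              [np.zeros((n, n)), I5]])
G = np.vstack([0.5 * dt**2 * I5,    # angle response to an acceleration impulse
               dt * I5])            # angular-velocity response

Q = 2.0 * I5                        # covariance of w_t (diagonal set to 2, Sec. IV-B)

def predict(x, w=None):
    """One step of x_t = F x_{t-1} + G w_t (w omitted for the mean prediction)."""
    return F @ x if w is None else F @ x + G @ w
```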

    B. Forward kinematic equations and measurement model

The measurement model relates the measurement vector z to the state vector x. Each wearable sensor unit provides three types of measurements: acceleration, magnetic field and angular rate. The generalized form of the measurement equation is

$$z_t = \begin{bmatrix} z^g_t \\ z^a_t \\ z^m_t \end{bmatrix} = h(x_t) + v_t = h(x_t) + \begin{bmatrix} v^g_t \\ v^a_t \\ v^m_t \end{bmatrix} \qquad (4)$$

where v_t is assumed to be zero-mean additive Gaussian white noise with covariance matrix V, and z^a_t, z^g_t and z^m_t are the acceleration, angular rate and magnetic field measurements from the two sensor units, respectively.

As the movement of the human upper limb segments is relatively stable, its own acceleration is small compared to gravity, and the tri-axis accelerometers mainly measure the gravity field vector, expressed with respect to the global coordinate system but resolved in the sensor's local coordinate system. Define g = [g_x, g_y, g_z]^T as the gravitational field vector resolved in the global coordinate system; the expected measurements are then given by the transformation of g into the local sensor coordinate systems, which can be represented as:

$$z^a_t = \begin{pmatrix} z^a_{t,1} \\ z^a_{t,2} \end{pmatrix} = Rot(x_t, g) + v^a_t \qquad (5)$$

where z^a_{t,j} = (z^{a,x}_{t,j}, z^{a,y}_{t,j}, z^{a,z}_{t,j})^T is the acceleration measurement of sensor unit j, j = 1, 2. Rot is a rotation operator, which can be specialized according to the forward kinematic equations as:

$$z^a_{t,1} = R^1_0 R^2_1 R^3_2 R^U_3\, g + v^a_{t,1}, \qquad z^a_{t,2} = R^1_0 R^2_1 R^3_2 R^4_3 R^5_4 R^F_5\, g + v^a_{t,2} \qquad (6)$$

where R^i_{i-1} is given in equation (1), and v^a_t = (v^a_{t,1}, v^a_{t,2})^T is the acceleration measurement noise.
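Given the joint angles, the accelerometer prediction of equation (6) is just a chain of the rotation matrices from equation (1). The sketch below is a minimal illustration: the identity defaults for R^U_3 and R^F_5 follow the simplification made in Section IV-A, and the gravity vector's sign and units are placeholders. The magnetometer prediction of equation (8) below is identical with the global field m in place of g.

```python
import numpy as np

def predicted_accel(R_chain, R_U3=np.eye(3), R_F5=np.eye(3),
                    g=np.array([0.0, 0.0, -9.81])):
    """Noise-free accelerometer prediction of equation (6).

    R_chain : list of the five link rotations [R^1_0, R^2_1, R^3_2, R^4_3, R^5_4],
              each built with link_rotation() from the sketch after equation (1).
    R_U3, R_F5 : constant sensor-to-link rotations (identity after the calibration
                 of Sec. IV-A). g is the gravity vector in the global frame.
    """
    R10, R21, R32, R43, R54 = R_chain
    z1 = R10 @ R21 @ R32 @ R_U3 @ g                    # upper-arm sensor unit
    z2 = R10 @ R21 @ R32 @ R43 @ R54 @ R_F5 @ g        # forearm sensor unit
    return z1, z2
```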

The magnetometers measure the magnetic field. The expected measurements of this field are given by the transformation of the global magnetic field into the local sensor coordinate systems. Similar to the accelerometer measurements, define m = [m_x, m_y, m_z]^T as the magnetic field vector resolved in the global coordinate system; the magnetic measurement can then be written as:

$$z^m_t = \begin{pmatrix} z^m_{t,1} \\ z^m_{t,2} \end{pmatrix} = Rot(x_t, m) + v^m_t \qquad (7)$$

where z^m_{t,j} = (z^{m,x}_{t,j}, z^{m,y}_{t,j}, z^{m,z}_{t,j})^T is the magnetic field measurement of sensor unit j, j = 1, 2. Rot is the same rotation operator, which can be specified as:

$$z^m_{t,1} = R^1_0 R^2_1 R^3_2 R^U_3\, m + v^m_{t,1}, \qquad z^m_{t,2} = R^1_0 R^2_1 R^3_2 R^4_3 R^5_4 R^F_5\, m + v^m_{t,2} \qquad (8)$$

where v^m_t = (v^m_{t,1}, v^m_{t,2})^T is the magnetic field measurement noise.

Gyroscopes measure angular velocity in the local frame of each sensor. To derive the angular velocity of each sensor as a function of the body joints, imagine virtual gyroscopes placed at each joint, measuring the joint angular velocities ω_i exactly. The angular velocity of link i+1 is that of link i plus a new component caused by the rotational velocity at joint i+1. This can be written in terms of frame i as:

$$\omega^i_{i+1} = \omega^i_i + R^i_{i+1}\, \dot{\Theta}^{i+1}_{i+1} \qquad (9)$$

where \dot{\Theta}^{i+1}_{i+1} = (0, 0, \omega_{i+1})^T. By pre-multiplying both sides of equation (9) by R^{i+1}_i, we obtain the angular velocity of link i+1 with respect to frame i+1:

$$\omega^{i+1}_{i+1} = R^{i+1}_i\, \omega^i_i + \dot{\Theta}^{i+1}_{i+1} \qquad (10)$$

with initialization \omega^0_0 = 0.

The real gyroscopes measure the angular velocities of links 3 and 5, resolved in the sensor local coordinate systems. Therefore, the angular rate measurement equation can be written as:

$$z^g_t = \begin{pmatrix} z^g_{t,1} \\ z^g_{t,2} \end{pmatrix} = rot(\omega^3_{3,t}, \omega^5_{5,t}) + v^g_t \qquad (11)$$

where z^g_{t,j} = (z^{g,x}_{t,j}, z^{g,y}_{t,j}, z^{g,z}_{t,j})^T is the angular rate measurement of sensor unit j, j = 1, 2. rot is also a rotation operator, which can be specified as:

$$z^g_{t,1} = R^U_3\, \omega^3_{3,t} + v^g_{t,1}, \qquad z^g_{t,2} = R^F_5\, \omega^5_{5,t} + v^g_{t,2} \qquad (12)$$

where \omega^i_{i,t} is the value of \omega^i_i at time t, and v^g_t = (v^g_{t,1}, v^g_{t,2})^T is the angular rate measurement noise.

Once the process model and measurement model are determined, a Kalman filter can be invoked to estimate the joint angles. Because of the nonlinearity of the measurement equations, we select the Unscented Kalman filter for its high accuracy [18].
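The recursion in equations (9)-(12) can be sketched in a few lines; the helper below follows our reading of the paper's notation (R^{i+1}_i formed by the equation (1) pattern with i shifted by one), and the UKF library mentioned at the end is an assumption for illustration, not something named in the paper.

```python
import numpy as np

def predicted_gyro(R_chain, omega, R_U3=np.eye(3), R_F5=np.eye(3)):
    """Noise-free gyroscope prediction of equation (12).

    R_chain : [R^1_0, ..., R^5_4] as in the accelerometer sketch; omega holds the
              five joint angular velocities omega_1..omega_5.
    Equation (10) is applied link by link, starting from omega^0_0 = 0.
    """
    w = np.zeros(3)                                   # omega^0_0 = 0
    w_links = []
    for R, wi in zip(R_chain, omega):
        w = R @ w + np.array([0.0, 0.0, wi])          # eq. (10)
        w_links.append(w)
    z1 = R_U3 @ w_links[2]                            # omega^3_3 at the upper-arm unit
    z2 = R_F5 @ w_links[4]                            # omega^5_5 at the forearm unit
    return z1, z2

# Stacking equations (12), (6) and (8) yields the measurement function h(x); with
# the linear process model (3) it can be handed to an off-the-shelf UKF (for
# example filterpy's UnscentedKalmanFilter -- an assumed third-party choice, the
# paper does not name an implementation) to run the 50 Hz predict/update loop.
```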

    IV. EXPERIMENTS AND DISCUSSION

Experiments are conducted to evaluate the performance of the proposed method. After a brief description of the experimental settings, we present the upper limb motion estimation results. A comparative discussion between the proposed method and other methods is also given.

    A. Experimental setup

Before using the sensor units to sample inertial data, each unit must be properly attached to the corresponding body segment to simplify the determination of the rotation matrices R^U_3 and R^F_5. To reduce the effect of muscle bulging during upper limb movement, one sensor unit is placed on the lateral side of the upper arm, while the other is placed on the ventral (palm) side of the forearm near the wrist. For simplicity, we nominally set R^U_3 and R^F_5 to identity matrices. To reduce the orientation difference between the sensor coordinate systems and the respective link coordinate systems as much as possible, a sensor coordinate system calibration is performed: we keep the upper limb in certain predefined postures and ensure that 1) the x-axis of the sensor attached to the upper arm is orthogonal to the lateral side of the upper arm; 2) the z-axis of the upper arm sensor is along the direction of the upper arm segment; 3) the x-axis of the sensor attached to the forearm is orthogonal to the palm side of the forearm; and 4) the z-axis of the forearm sensor is along the direction of the forearm segment.

After the sensor coordinate system calibration, a subject was asked to move the upper limb. A video camera recorded the actions for comparison with our upper limb motion estimation results, which were visualized using a 3D human model and animations.

    B. Upper limb motion estimation results

The initial configuration of the state variables x_0 and the covariance matrix Σ_0 is based on the initial posture of the upper limb. Here it is assumed that the upper limb hangs down as shown in Fig. 2(a), so we set all elements of x_0 to 0, and empirically set the diagonal elements of Σ_0 to 1 and the other elements to 0. The system noise is the white Gaussian acceleration of the joint angles; to accommodate strenuous upper limb movement, we set the diagonal elements of Q to 2 and the other elements to 0. The measurement noise covariance matrix V represents the level of confidence placed in the accuracy of the measurements. In principle, V is not necessarily diagonal, but here we assume it is for simplicity. The variance of each measurement axis can be determined directly from the sensor readings. For each axis of the gyroscopes and magnetometers, we set the respective elements of V to 0.1 and 0.5. However, since the upper limb's own acceleration is not accounted for in the measurement model, we treat it as additional accelerometer measurement noise and set the corresponding elements of V to 1. Note that once these parameters are set, they do not change over time.
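For illustration, the tuning described above maps directly onto a few matrices; the stacking order of the measurement axes below is our own choice, while the values are those stated in the text.

```python
import numpy as np

n_x = 10                          # 5 joint angles + 5 joint angular velocities
x0 = np.zeros(n_x)                # initial posture: upper limb hanging down, Fig. 2(a)
Sigma0 = np.eye(n_x)              # initial error covariance (diagonal elements 1)

Q = 2.0 * np.eye(5)               # covariance of the white acceleration noise w_t

# Per-axis measurement noise variances, stacked here as
# [gyro (2 units x 3 axes), accelerometer (2 x 3), magnetometer (2 x 3)];
# the stacking order is an illustrative choice.
V = np.diag([0.1] * 6 + [1.0] * 6 + [0.5] * 6)
```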

To illustrate the feasibility of the proposed method, a subject was asked to perform the following task: raise the upper limb, wave the hand several times, and then return to the initial posture. According to the estimated angles, a 3D human motion animation was constructed, as shown in Fig. 2.


Figure 2. The 3D human upper limb motion animation, constructed according to the estimated joint angles.

Fig. 2(a) shows the initial posture of the subject; Fig. 2(b) and Fig. 2(c) show the subject raising his forearm while reversing the hand orientation; Fig. 2(d) and Fig. 2(e) show the process of waving the hand. As can be seen from the figures, the 3D animation moves in step with the subject. Although no ground truth is available in our experiments, the reconstructed upper limb motion is very smooth, without jitter or spikes, which shows that the proposed algorithm can capture the upper limb motion with reasonable estimation accuracy.

    Figure 3. Trace of the error covariance matrix.

Since the Kalman gain is determined such that the sum of squared errors is minimized, one way to assess the convergence of the Kalman filter is to examine the trace of the error covariance matrix Σ_t. Fig. 3 shows the trace of Σ_t for the first 2 seconds of estimation. As can be seen from the figure, the sum of squared errors reaches a steady state after approximately 0.1 second, which illustrates the convergence of our algorithm.

    C. Comparison and discussion

To our knowledge, there are mainly three types of estimation methods: 1) independent estimation of the upper arm and forearm; 2) independent estimation plus geometrical constraints as virtual measurements; and 3) link structure estimation. Our method belongs to the third type, so we compare it with a method of the first type [7] and a method of the second type [12].

We applied the methods of [7] and [12] to the same measurement data set to illustrate the performance of the different methods. Fig. 4 and Fig. 5 show the estimated joint angles of the upper limb. In the figures, the solid lines represent the results of the method proposed in this paper, while the dotted lines represent the results of the methods in [7] and [12] respectively, as distinguished by sub-figures (a) and (b). From the figures we can observe the following.

Comparing the method in [7] with our proposed method, there are only small differences between the joint angles, except for the forearm pitch. The maximum value of the estimated forearm pitch angle reaches 0.5 rad, which is not acceptable because the forearm pitch should be nearly 0. Therefore, serious distortion exists in the reconstructed upper limb motion.

Incorporating the geometrical constraints of [12] reduces the maximum forearm pitch from 0.5 rad to less than 0.2 rad, which means that the use of geometrical constraints can only alleviate the distortion problem, not solve it fundamentally. This is mainly because the estimation process is still based on the assumption of independent segment movements. On the other hand, the other 5 estimated joint angles of our proposed method and of the method in [12] are nearly the same; therefore, although some distortion exists, the reconstructed upper limb motion is still very similar to the subject's movement.

Figure 4. The Euler angle representation of the upper arm orientation: (a) comparison between our method and the no-constraint method; (b) comparison between our method and the constraint method. Roll, pitch and yaw represent the angles of joints 1, 2 and 3, respectively.

Figure 5. The Euler angle representation of the forearm orientation with respect to the upper arm: (a) comparison between our method and the no-constraint method; (b) comparison between our method and the constraint method. Roll and yaw represent the angles of joints 5 and 4 respectively, while pitch represents the abduction/adduction of the elbow.

In summary, the movements of the upper arm and forearm are not independent, and assuming independent movement introduces large estimation errors. Exploiting geometrical constraints can only reduce these errors. To solve the distortion problem fundamentally, the skeleton structure of the human body should be taken into consideration, and the parameters representing the orientation of each segment should be selected according to that skeleton structure.

    V. CONCLUSION AND FUTURE WORK

We have developed a novel ubiquitous upper limb motion estimation method by fusing measurements from small inertial/magnetic sensor modules containing triaxial angular rate sensors, accelerometers, and magnetometers. The method concentrates on modeling the relationship between the movements of the upper arm and forearm. A link structure with 5 degrees of freedom was first proposed to model human upper limb motion. After that, parameters were defined according to the Denavit-Hartenberg convention, forward kinematic equations of the upper limb were derived, and an Unscented Kalman filter was invoked to estimate the defined parameters. The experimental results have shown the feasibility of the proposed motion capture and analysis algorithm.

Our future work will incorporate the upper limb acceleration into the measurement equations to further improve estimation accuracy. Comprehensive experiments to evaluate the performance and accuracy of the motion capture system will also be conducted. The extension of our method to whole body motion estimation is also under consideration.

ACKNOWLEDGMENT

The authors would like to thank Mr. CHEN Jiang and Mr. LI Gang for their contribution to the paper. This research is done for CSIDM Project No. CSIDM-200802, partially funded by a grant from the National Research Foundation (NRF) administered by the Media Development Authority (MDA) of Singapore. It is also supported by the National Natural Science Foundation of China (NSFC) under grant No. 60932001.

    REFERENCES

[1] A. Kapur, A. Kapur, N. Virji-Babul, G. Tzanetakis, and P. Driessen. Gesture-Based Affective Computing on Motion Capture Data. Lecture Notes in Computer Science, 3784:1, 2005.

[2] L. Hamilton, R. Franklin, and N. Jeffery. Development of a universal measure of quadrupedal forelimb-hindlimb coordination using digital motion capture and computerised analysis. BMC Neuroscience, 8:77, 2007.

[3] J.H. Grill and P.H. Peckham. Functional neuromuscular stimulation for combined control of elbow extension and hand grasp in C5 and C6 quadriplegics. IEEE Transactions on Rehabilitation Engineering, 6(2):190-199, 1998.

[4] D. Giansanti, V. Macellari, G. Maccioni, and A. Cappozzo. Is it feasible to reconstruct body segment 3-D position and orientation using accelerometric data? IEEE Transactions on Biomedical Engineering, 50(4):476-483, 2003.

[5] H. Dejnabadi, B. Jolles, and K. Aminian. A new approach to accurate measurement of uniaxial joint angles based on a combination of accelerometers and gyroscopes. IEEE Transactions on Biomedical Engineering, 52(8):1478-1484, 2005.

[6] H.J. Luinge and P.H. Veltink. Measuring orientation of human body segments using miniature gyroscopes and accelerometers. Medical and Biological Engineering and Computing, 43(2):273-282, 2005.

[7] X. Yun and E. Bachmann. Design, Implementation, and Experimental Results of a Quaternion-Based Kalman Filter for Human Body Motion Tracking. IEEE Transactions on Robotics and Automation, 22(6):1216-1227, 2006.

[8] Xsens Technologies B.V. Moven: inertial motion capturing. http://www.xsens.com/.

[9] E. Foxlin and M. Harrington. WearTrack: a self-referenced head and hand tracker for wearable computers and portable VR. The Fourth International Symposium on Wearable Computers, pages 155-162, 2000.

[10] Y. Tao and H. Hu. A novel sensing and data fusion system for 3-D arm motion tracking in telerehabilitation. IEEE Transactions on Instrumentation and Measurement, May 2008.

[11] H. Luinge, P. Veltink, and C. Baten. Ambulatory measurement of arm orientation. Journal of Biomechanics, 40(1):78-85, 2007.

[12] Z.Q. Zhang, Z.P. Huang, and J.K. Wu. Hierarchical Information Fusion for Human Upper Limb Motion Capture. Proceedings of the 12th International Conference on Information Fusion, 2009.

[13] R. Hyde, L. Ketteringham, S. Neild, and R. Jones. Estimation of Upper-Limb Orientation Based on Accelerometer and Gyroscope Measurements. IEEE Transactions on Biomedical Engineering, 55(2):746-754, 2008.

[14] M. Mihelj. Inverse Kinematics of Human Arm Based on Multisensor Data Integration. Journal of Intelligent and Robotic Systems, 47(2):139-153, 2006.

[15] D. Vlasic, R. Adelsberger, G. Vannucci, J. Barnwell, M. Gross, W. Matusik, and J. Popovic. Practical motion capture in everyday surroundings. ACM Transactions on Graphics (TOG), 26(3), 2007.

[16] W. Maurel. 3D modeling of the human upper limb including the biomechanics of joints, muscles and soft tissues. Ph.D. thesis, Ecole Polytechnique Federale de Lausanne, 1999.

[17] J. Craig. Introduction to Robotics: Mechanics and Control, 3rd edition. Pearson Education, NJ, 2005.

[18] B. Ristic and S. Arulampalam. Beyond the Kalman Filter: Particle Filters for Tracking Applications. Artech House, 2004.
