Haptic Modeling Based on Contact and Motion Types

Jing Xiao, Song You

Computer Science Department

University of North Carolina - Charlotte

Charlotte, NC 28223, USA
[email protected], [email protected]

Abstract

When a held object interacts with another object (in a task such as assembly), the haptic force and moment felt by the operator at any instant depend not only on the contact region but also on the type of the contact state and the type of motion of the held object prior to reaching the current contact configuration, especially in the presence of friction and gravity. In this paper, we address the influence of contact state and motion types on haptic force and moment and present an efficient method to model such haptic effects resulting from the interaction of two convex polyhedral solids, taking into account friction and gravity.

Keywords: Haptic modeling, contact forces and moments, principal contacts, motion types, convex polyhedra, assembly.

1. Introduction

Certain applications of haptic interaction, such as virtual assembly, virtual prototyping, and teleoperation, require the simulated haptic force and moment to be physically accurate, which in turn requires physically accurate modeling of contact forces and moments [1].

In dynamic simulation, there is a rich literature on computing physically accurate contact forces and moments. Baraff developed methods for general rigid bodies in contact without and with friction [2,3], given a known external force. Ruspini and Khatib further presented a general framework for computing contact forces involving articulated systems [4] but without considering friction. The computation costs of contact forces are usually too high to yield real-time results especially in the presence of friction, as that is shown to be NP-hard in general [5].

In haptic rendering, on the other hand, the external force and moment exerted on the virtually held object/tool at any time by the human operator are unknown and part of what needs to be modeled, and the contact forces/moments have to be computed in real-time. This real-time requirement is made more stringent by the fact that the force/moment computation has to be performed on top of collision detection, which usually takes a substantial amount of time. Therefore, work on haptic rendering is often based on simplified models such as a point contact [6] or modeling only the contact normal force [7]. However, to achieve physically accurate emulation of haptic force and moment, the effects of friction, gravity, and the type of contact (which dictates how the contact force distributes to produce a moment) have to be taken into account.

In this paper, we present an efficient method to model and render in real-time the haptic effects caused by the interaction between two convex polyhedral solids, taking into account friction, gravity, the type of contact state, and the prior motion of the held object. The key idea is to solve for the contact friction force and contact moment analytically, based on both the type of the contact configuration and the type of motion of the "held" object prior to reaching the current contact configuration, after obtaining a contact normal force from the common spring force model.

We have also applied our method and tested its efficiency in a haptic manipulation environment that we developed, where a human operator can "pick up" a virtual object via a PHANTOM Desktop device, make arbitrary contact with another virtual object freely, and make arbitrary compliant or guarded motions. The haptic force and moment are calculated instantly by the method at the high rate of 2 kHz during the haptic interaction, and the rendered force is stable, smooth, and feels realistic.

2. Principal Contacts and Related Frames

To characterize the different types of contact between two convex polyhedra A and B, we use the notion of Principal Contacts (PCs) [8]. Here we first review the notion of PCs and then introduce a task frame with respect to each category of PC, which will be used later to facilitate calculation of contact forces and moments.

2.1. Categories of Principal Contacts

A contact state between A and B is characterized by a principal contact, which, as defined in [8], is in terms of a pair of contacting elements (i.e., edges, vertices, and faces) that are not the boundary elements of other contacting elements. The types of non-degenerate PCs are shown in Fig. 1. Each non-degenerate PC has a contact plane determined by the face involved or by the two edges of an edge-edge-cross type of PC.

Fig. 1: Non-degenerate principal contacts: face-face, face-edge/edge-face, face-vertex/vertex-face, edge-edge-cross

Based on the contact region of the PCs, we can further classify the non-degenerate PCs into the following three categories:

Plane PC: face-face, where the contact happens at a planar region.

Line PC: edge-face and face-edge, where the contact happens on an edge.

Point PC: vertex-face, face-vertex, edge-edge-cross, where the contact happens at a point.

2.2. PC-based Task Frames

Suppose that A is the virtually held object and B is static in the virtual environment. When A and B are in contact, we can establish a task frame according to the type of PC as follows: the origin of the frame is on the contact plane of the PC, the y-axis is along the normal of the contact plane pointing to the held object A, and the x- and z-axes are along the tangent of the contact plane and orthogonal to each other, following the right-hand rule (Fig. 2).

Fig. 2: An example task frame with respect to a line PC
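The frame construction above can be sketched in a few lines of Python. This is only an illustration: the choice of seed vector for the tangent direction is our own assumption (any vector not parallel to the normal works), since the paper leaves the tangent orientation open.

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def task_frame(contact_normal):
    """Build a right-handed task frame: y is the contact-plane normal
    (pointing toward the held object A); x and z lie in the contact plane."""
    y = normalize(contact_normal)
    # Seed the tangent with any vector not parallel to y (an assumption
    # of this sketch; the paper does not prescribe the tangent choice).
    seed = (1.0, 0.0, 0.0) if abs(y[0]) < 0.9 else (0.0, 0.0, 1.0)
    x = normalize(cross(seed, y))   # tangent to the contact plane
    z = cross(x, y)                 # x cross y = z (right-hand rule)
    return x, y, z
```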

3. Haptic Modeling

In this section we describe our method to model the haptic force and moment felt by a human operator when the "held" virtual object A (via a haptic device) interacts with the virtual object B. We will first describe how to model a contact force exerted on A by B and subsequently a haptic force. Next we will describe how to model a haptic moment. For convenience, we use the task frame to describe the contact force Fc, which can be decomposed into a normal force Fcy along the +y axis and a friction force Fcf along the xz plane (i.e., the contact plane).

3.1. Basic Assumptions

Currently we limit our considerations to polyhedral solids with evenly distributed mass so that the center of gravity is at the object's centroid pc, whose coordinates can be simply calculated as the average coordinates of all the vertices of the object. We also assume that the objects have evenly distributed stiffness with a constant stiffness coefficient K. We use a Coulomb friction model with the static friction coefficient μ and kinetic friction coefficient μD.
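The vertex-averaging approximation of the centroid used above can be written directly; a minimal sketch (the function name is ours):

```python
def centroid(vertices):
    """Approximate the center of gravity of a uniformly dense polyhedron
    by averaging its vertex coordinates, as assumed in Section 3.1."""
    n = len(vertices)
    return tuple(sum(v[i] for v in vertices) / n for i in range(3))
```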

3.2. Contact Normal Force

First we say that A is in contact with B if the minimum distance d between them is within a small threshold e > 0 [9]. We model the magnitude of the contact normal force Fcy based on Hooke's law as:

Fcy = K(e − d)   (1)

where K is the stiffness coefficient. To prevent penetration, the constraint-based idea of a virtual proxy can also be used here [6].
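Equation (1) together with the contact threshold amounts to a one-sided spring. A minimal sketch follows; the default values for K and e are the experimental parameters from Table 2, used here purely for illustration:

```python
def contact_normal_force(d, K=2.3, e=1.0):
    """Contact normal force magnitude Fcy = K * (e - d), Eq. (1),
    active only while the minimum distance d between the objects is
    within the threshold e. Defaults (N/mm, mm) follow Table 2."""
    if d >= e:
        return 0.0  # A and B are not in contact
    return K * (e - d)
```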

3.3. Contact Friction Force

At any instant when the held A is contacting B, the friction force exerted on A depends on the prior motion of A before reaching the current contact configuration. Therefore, to solve for the unknown friction force Fcf = [Fcx, 0, Fcz]^T with physical accuracy, we consider the possible types of prior motion of A to extract additional information.

Let t- and t+ indicate the times before and after A reaches the current contact configuration respectively. We call A's motion during [t-, t+] the motion of A prior to the current contact configuration, or the prior motion, which is of one of the following types:

MT1: guarded move from no contact at t- to contact at t+
MT2: compliant translation to maintain the PC


MT3: compliant rotation
MT4: compliant translation and rotation

These types of prior motions can be easily determined based on the prior and the current configurations of A at t- and t+ respectively, as well as the current PC of A and the prior state of A, as described below:

• Prior motion is MT1 if A was not in contact at t- (Fig. 3).

Fig. 3: An example of MT1

• Prior motion is MT2 if A was in contact at t- and maintains the same PC at t+. In addition, A's position changes from t- to t+, but its orientation changes little (Fig. 4).

Fig. 4: An example of MT2

• Prior motion is MT3 if A was in a PC, PC1, at t-; A's orientation changes from t- to t+; either A remains in PC1 or is in a neighboring PC [10] of PC1 at t+ (see footnote 1); and at least one contact point of A changes little from t- to t+ (see Fig. 5 for examples).

Fig. 5: Examples of MT3: (a) different PC; (b) same PC

• Prior motion is MT4 if all conditions of MT3 except the last one hold, i.e., all of A's contact points translate from t- to t+ while A's orientation changes. Fig. 6 shows some examples.

Fig. 6: Examples of MT4: (a) different PC; (b) same PC

Note that to differentiate an MT3 motion (i.e., pure rotation) from an MT4 motion, we place an upper bound on the displacement of the contact point(s) on the rotation axis, as described in detail below.
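The four rules above can be condensed into a small decision procedure. This is only a sketch: the tolerance arguments (eps_rot, eps_disp) and the boolean input summarizing the PC comparison are our own assumptions about how the geometric tests would be exposed, not values or interfaces from the paper.

```python
def classify_prior_motion(in_contact_before, keeps_same_or_neighbor_pc,
                          orientation_change, contact_pt_displacement,
                          eps_rot=1e-3, eps_disp=1e-3):
    """Classify the held object's motion over [t-, t+] as MT1..MT4.
    Tolerances are illustrative, not values from the paper."""
    if not in_contact_before:
        return "MT1"   # guarded move from free space into contact
    assert keeps_same_or_neighbor_pc, "an instant motion cannot jump PCs"
    if orientation_change < eps_rot:
        return "MT2"   # compliant translation, orientation barely changes
    if contact_pt_displacement < eps_disp:
        return "MT3"   # compliant rotation: a contact point stays put
    return "MT4"       # combined compliant translation and rotation
```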

We now describe how to determine the friction forces considering each of these prior motion types in turn.

Prior motion is MT1: We consider the impulse force of collision to be dominant in this case. Let Fi denote the impulse force applied to B from A at the point(s) of contact. Let the unit vector u = [a b c]^T denote the direction of the linear velocity of the contact point (see footnote 2) of A prior to making the contact (as shown in Fig. 3), represented in the task frame of the current PC, which is known.

Fig. 7: A can either stick or slip after MT1 (static friction cone with μ = tan θ)

Now, if u is inside the static friction cone as shown in Fig. 7, i.e.,

(a² + c²) / b² ≤ μ²   (2)

then A is static after making the contact, and the following holds: Fi is along u, Fc = −Fi, or

1 Note that since the motion is considered instantaneous, it is not possible to change the PC to a non-neighboring PC.

2 In case the contact state is a line PC or a plane PC, use the center point of the contact region to obtain the linear velocity.


[Fcx, Fcy, Fcz]^T + ||Fi|| [a, b, c]^T = 0   (3)

From (3) above, the friction force can be solved as:

Fcf = [Fcx, 0, Fcz]^T = −Fcy [a/b, 0, c/b]^T   (4)

On the other hand, if (2) is not satisfied, A will slide (along the projection of u on the xz-plane) after making the contact, and the friction force can be determined as:

Fcf = [Fcx, 0, Fcz]^T = −μD Fcy [a, 0, c]^T   (5)
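Equations (2)-(5) translate directly into a stick/slip branch. The sketch below follows the paper's expressions literally (note that Eq. (5) uses the unnormalized xz projection [a, 0, c]); the default μ and μD are the Table 2 values, used here only for illustration.

```python
def mt1_friction(u, F_cy, mu=0.7, mu_d=0.08):
    """Friction force (task-frame components) after a guarded move (MT1).
    u = (a, b, c): unit direction of the contact point's velocity just
    before impact; F_cy: normal force magnitude. Implements Eqs. (2)-(5)."""
    a, b, c = u
    if a * a + c * c <= mu * mu * b * b:        # Eq. (2): u inside the cone
        # Stick: Fc = -Fi with Fi along u, solved via Eq. (3) -> Eq. (4)
        return (-F_cy * a / b, 0.0, -F_cy * c / b)
    # Slip: kinetic friction along the xz projection of u, Eq. (5)
    return (-mu_d * F_cy * a, 0.0, -mu_d * F_cy * c)
```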

Prior motion is MT2: In this case A can be considered kinetic, and the kinetic friction force applied to A is along the contact plane of the PC, opposing the direction of the prior compliant translation. Such a direction is again known (Fig. 4) and can be denoted by a unit vector uxz. Thus, the kinetic friction force is Fcf = −μD Fcy uxz.

Prior motion is MT3 or MT4: To determine the friction force Fcf, we need to determine whether the prior motion type of A is MT3 or MT4. First, the compliant rotation axis can be obtained from the PC types as well as A's orientations at t- and t+. Either (i) a single contact point is on the rotation axis or (ii) a contact edge is on it. Now let p be the contact point in case (i) or the center point of the contact edge in case (ii), and let sp be the displacement of p from t- to t+ along the contact plane. Using the static friction model ||Fcf|| = Kp sp, where Kp is the proportional control gain [6,11], if

sp < μ Fcy / Kp,

then p is stuck, i.e., the prior motion is MT3, and

||Fcf|| = Kp sp < μ Fcy.

Otherwise, the prior motion is MT4, and ||Fcf|| = μD Fcy.

If the prior motion is MT3, the direction of Fcf can be obtained as normal to the rotation axis, along the contact plane of the current PC that A is in, and against the rotation (see Fig. 5 for an example). If the prior motion is MT4, the direction of Fcf can be modeled as opposing the direction of translation of p from t- to t+.
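The stuck/slipping test above can be sketched as follows; the default μ, μD, and Kp values are the Table 2 experimental parameters, used for illustration only:

```python
def rotation_friction(s_p, F_cy, mu=0.7, mu_d=0.08, K_p=1.6):
    """Decide MT3 vs. MT4 from the displacement s_p of the contact
    point p on the rotation axis, and return the friction magnitude:
    MT3 (p stuck):  ||Fcf|| = K_p * s_p < mu * F_cy
    MT4 (p slips):  ||Fcf|| = mu_d * F_cy."""
    if s_p < mu * F_cy / K_p:
        return "MT3", K_p * s_p
    return "MT4", mu_d * F_cy
```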

3.4. Haptic Force

With the contact force Fc determined, the haptic force Fh that the human operator should feel can be readily obtained as the sum of the contact force and the gravity force, Fh = Fc + Fg, and can be rendered accordingly. To enable smooth and stable rendering, we set Fcy(t+) = Fcy(t-) if the prior motion is compliant and the difference in virtual deformation is small enough, and we also perform force interpolation or shading [6, 7] on the general Fc.

3.5. Haptic Moment

The haptic moment Mh felt by the human operator at any instant equals the combined moment created by the contact force upon A and the gravity of A. Such a moment depends on how the contact force is distributed (or the equivalent points of action), which in turn depends on the type of the contact state. In the subsections below, we describe Mh with respect to the task frame for each PC category.

We use the following common notations. Let pc denote the centroid of A, let Pc be the position vector of pc with respect to the task frame, and let Pcxz, Pcxy, Pcyz be the projections of Pc on the xz, xy, and yz planes of the task frame. Let Fgxz, Fgyz, Fgxy be the projections of the gravity force Fg on the xz, yz, and xy planes of the task frame respectively. Again Fc denotes the contact force, consisting of a normal force Fcy and a friction force Fcf. Moreover, let RAB denote the contact region (i.e., the set of contact points) between A and B when a PC is formed, v be the centroid of RAB, and Pv be its position vector in the task frame, which can be computed by averaging the coordinates of the vertices of RAB.

3.5.1. Haptic Moment for Point PC

Here Fc applies to the contact point v. For convenience, we further specify the origin of the task frame at v. Then the haptic moment Mh with respect to the task frame is:

Mh = Pc × Fg   (6)

Note that Fc does not produce a moment with respect to the task frame.

3.5.2. Haptic Moment for Line PC

Here Fc applies to the contact edge segment RAB of object A, which is either a part of or a whole edge of A. Let p1 and p2 be the two end points of the contact edge segment. For convenience, we set the origin of the task frame at point p1 and the x-axis of the task frame to be along the contact edge pointing to p2. The haptic moment Mh with respect to the task frame can be expressed as the vector sum Mh = Mhx + Mhy + Mhz, where

Mhx = Pcyz × Fgyz   (7)

Mhy = Pcxz × Fgxz + Pv × Fcf   (8)

Mhz = Pcxy × Fgxy + Pv × Fcy   (9)

Note that equation (7) captures the fact that the contact force does not produce a moment about the contacting edge (the x-axis) in a line PC. Equation (8) captures the fact that the contact normal force does not contribute to Mhy, and equation (9) captures the fact that the contact friction force does not contribute to Mhz (both follow from all contact points lying in the contact plane y = 0).
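A sketch of Eqs. (7)-(9) in code, with the plane projections implemented by zeroing one coordinate. The vector names are ours, and passing the contact force as separate normal and friction vectors in the task frame is an assumption of this sketch:

```python
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def vadd(*vs):
    return tuple(sum(c) for c in zip(*vs))

def proj(v, axis):
    """Project v onto the coordinate plane orthogonal to `axis` (0=x, 1=y, 2=z)."""
    return tuple(0.0 if i == axis else v[i] for i in range(3))

def line_pc_moment(P_c, F_g, P_v, F_cf, F_cy):
    """Haptic moment for a line PC (task frame: origin at p1, x-axis
    along the contact edge), Mh = Mhx + Mhy + Mhz per Eqs. (7)-(9)."""
    M_hx = cross(proj(P_c, 0), proj(F_g, 0))                    # Eq. (7)
    M_hy = vadd(cross(proj(P_c, 1), proj(F_g, 1)),
                cross(P_v, F_cf))                               # Eq. (8)
    M_hz = vadd(cross(proj(P_c, 2), proj(F_g, 2)),
                cross(P_v, F_cy))                               # Eq. (9)
    return vadd(M_hx, M_hy, M_hz)
```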

3.5.3. Haptic Moment for Plane PC

Here Fc applies to the contact surface region RAB of object A, which is either a part of or a whole face of A. For convenience, we further set the origin of the task frame at one of the vertices of polygon RAB (note that a vertex of RAB is not necessarily a vertex of object A). Moreover, let Pvx and Pvz be the projection vectors of Pv on the x-axis and z-axis respectively. Now the haptic moment Mh with respect to the task frame can be expressed as Mh = Mhx + Mhy + Mhz, where

Mhx = Pcyz × Fgyz + Pvz × Fcy   (10)

Mhy = Pcxz × Fgxz + Pv × Fcf   (11)

Mhz = Pcxy × Fgxy + Pvx × Fcy   (12)

4. Experiments

We have developed a program for real-time haptic rendering based on the introduced haptic modeling method. We have further applied a PHANTOM™ Desktop device from SensAble Technologies (Fig. 8) to display the computed haptic force in real-time; the device is connected to a personal computer with an Intel Pentium III processor (700 MHz) and 256 MB of system RAM.

A simple 3-D haptic scene was set up (see Fig. 9). It was a simulated room with a moveable object A (the red cube), a static object B (the other object), and four static walls. A user can "pick up" object A by virtually attaching the PHANTOM device to it, move A freely to make arbitrary principal contacts with the static environment, and feel the haptic force.

Fig. 8: Haptic interaction via a PHANTOM Desktop

In this implementation, we used the algorithm by Zhang and Xiao [9], which builds on the efficient polytope distance algorithm by Gilbert et al. [12], to identify the principal contact and the contact plane normal in real-time. The update rate of haptic force and moment computation and force rendering is 2 kHz. Fig. 10 shows a sequence of states during a motion of the object A (moved by the human operator) in our experimental scene.

Fig. 9: Experimental scene

Fig. 10: Motion sequence of object A (snapshots 0-5)

Table 1 shows the haptic force and moment generated in real-time for each contact configuration of the "held" cube captured in each snapshot of Fig. 10. Both force and moment are described with respect to the task frame at each contact state. Note that the y-axis (i.e., the contact normal) did not change in these states, and Fhy is the largest in the first state, which shows the effect of impact after MT1. The x- and z-axes, on the other hand, change according to each different PC (as described in Section 3.5).

Table 2 provides the values of the parameters used in our experiments, where m is the mass of A.


Table 1: Some experimental data from the real-time computation

State  PC   Prior move  Fh (N)                 Mh (N·mm)
1      V-F  MT1         [0.27, 3.92, -0.16]    [38.74, 3.19, -30.21]
2      E-F  MT4         [-0.19, 2.57, -0.08]   [-52.93, 2.38, 0]
3      F-F  MT4         [0.11, 3.10, 0.23]     [0, 3.45, 0]
4      V-F  MT3         [0.35, 1.49, -0.44]    [23.45, -9.62, 34.03]
5      F-F  MT2         [-0.20, 2.81, 0.17]    [0, 0, 0]

Table 2: Parameters used in the experiments

Parameter  Value         Parameter  Value
m          0.5 (kg)      K          2.3 (N/mm)
g          -9.8 (N/kg)   Kp         1.6 (N/mm)
μ          0.7           e          1.0 (mm)
μD         0.08

5. Conclusions

We have introduced an efficient method to model the haptic effect felt by a human operator with respect to two interacting convex polyhedra (of which one is held by the operator), taking into account friction and gravity, by taking advantage of the information about the type of contact and the type of prior motion. A natural next step is to extend the approach to modeling the haptic effect with respect to two arbitrary polyhedra in contact, where a contact state, called a contact formation [8], generally consists of a number of principal contacts. Such an extension is feasible with the real-time identification of contact formations based on the automatically generated contact state space [10].

Acknowledgment

This work is funded by the U.S. National Science Foundation grant IIS-9700412.

References

[1] Hollerbach, J.M., "Some Current Issues in Haptics Research," Proc. of the IEEE Int. Conf. on Robotics and Automation, pp. 757-762, San Francisco, USA, 2000.

[2] Baraff, D., "Analytical Methods for Dynamic Simulation of Non-penetrating Rigid Bodies," Computer Graphics, 23(3): 223-232, August 1989.

[3] Baraff, D., "Fast Contact Force Computation for Non-penetrating Rigid Bodies," SIGGRAPH 94 Proceedings, pp. 23-34, August 1994.

[4] Ruspini, D. and Khatib, O., "Collision/Contact Models for Dynamic Simulation and Haptic Interaction," Proc. of the Ninth Int. Symposium on Robotics Research, pp. 185-194, Oct. 1999.

[5] Baraff, D., "Coping with Friction for Non-penetrating Rigid Body Simulation," Computer Graphics, 25(4): 31-40, August 1991.

[6] Ruspini, D., Kolarov, K., Khatib, O., "Haptic Interaction in Virtual Environments," Proc. of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, Grenoble, France, Sept. 1997.

[7] Gregory, A. et al., "Six Degree-of-Freedom Haptic Display of Polygonal Models," Proc. of IEEE Visualization, Oct. 2000.

[8] Xiao, J., "Automatic Determination of Topological Contacts in the Presence of Sensing Uncertainties," Proc. of the IEEE Int. Conf. on Robotics and Automation, pp. 65-70, Apr. 1993.

[9] Zhang, L. and Xiao, J., "Derivation of Contact States from Geometric Models of Objects," Proc. of the IEEE Int. Conf. Assembly and Task Planning, Pittsburgh, pp. 375-380, Aug. 1995.

[10] Xiao, J. and Ji, X., "On Automatic Generation of High-level Contact State Space," International Journal of Robotics Research (and its first multimedia extension issue, eIJRR), 20(7): 584-606, July 2001.

[11] Salcudean, S.E., Vlaar, T.D., "On the Emulation of Stiff Walls and Static Friction with a Magnetically Levitated Input/Output Device," ASME Haptic Interfaces for Virtual Environment and Teleoperator Systems, Dynamic Systems and Control, pp. 123-130, April 1995.

[12] Gilbert, E., Johnson, D., Keerthi, S., "A Fast Procedure for Computing the Distance between Complex Objects in Three-dimensional Space," IEEE Journal of Robotics and Automation, 4(2): 193-203, April 1988.