
Real–time haptic and visual simulation of bone dissection

Marco Agus, Andrea Giachetti, Enrico Gobbetti, Gianluigi Zanetti, Antonio Zorcolo

CRS4, VI Strada Ovest, Z. I. Macchiareddu, I-09010 Uta (CA), Italy
{magus,giach,gobbetti,zag,zarco}@crs4.it – http://www.crs4.it

October 4, 2002

Abstract

Bone dissection is an important component of many surgical procedures. In this paper, we discuss a haptic and visual simulation of a bone cutting burr that is being developed as a component of a training system for temporal bone surgery. We use a physically motivated model to describe the burr–bone interaction, which includes haptic force evaluation, the bone erosion process and the resulting debris. The current implementation, directly operating on a voxel discretization of patient-specific 3D CT and MR imaging data, is efficient enough to provide real–time feedback on a low–end multi–processing PC platform.

1 Introduction

Bone dissection is an important component of many surgical procedures. In this paper, we discuss a real–time haptic and visual implementation of a bone cutting burr that is being developed as a component of a training simulator for temporal bone surgery. The specific target of the simulator is mastoidectomy, a very common operative procedure that consists in the removal, by use of the burring tool, of the mastoid portion of the temporal bone. The importance of computerized tools to support surgical training for this kind of intervention has been recognized by a number of groups, which are currently developing virtual reality simulators for temporal bone surgery (e.g. [26, 22]). Our work is characterized by the use of patient-specific volumetric object models directly derived from 3D CT and MRI images, and by a design that provides realistic visual and haptic feedback, including secondary effects such as the obscuring of the operational site due to the accumulation of bone dust and other burring debris. The need to provide real–time feedback to users, while simulating burring and related secondary effects, imposes stringent performance constraints. Our solution is based on a volumetric representation of the scene, and it harnesses the locality of the physical system evolution to model the system as a collection of loosely coupled components running in parallel on a multi-processor PC platform. Previous work has demonstrated the effectiveness of voxel–based representations for the generation of force feedback in the case of rigid body environments (e.g., [21]), virtual clay models (e.g., [7, 13, 25, 15]), or deformable bodies (e.g., [9, 14, 12, 16]).

This article, an extended version of our IEEE Virtual Reality 2002 contribution ([6]), focuses on the modeling of the haptic and visual effects of bone burring. We refer the reader to [5] for details on the other system components.


(a) The Visible Human skull (b) The mastoid region

Figure 1: Surgical site. Mastoidectomy is performed in the region indicated by the rectangle in Fig. (a) and zoomed in Fig. (b). The images are taken directly from the volumetric renderer used in the simulator. The volumetric dataset has a resolution of 256x256x219 voxels and is derived from The Visible Human Male CT Dataset made available by The National Library of Medicine.

In our model, the burr bit is represented by a region of space that samples the volumetric bone data to construct the elastic reaction and friction forces that the bone opposes to the burring. The sampling algorithm is similar in spirit to the Voxmap PointShell approach [21], even though here we use a volumetric region around the burr to select the bone voxels relevant to force calculation. Our algorithm for computing forces, loosely patterned on Hertz contact theory [18], is robust and a smooth function of the burr position. The computed forces are transferred to the haptic device via a sample–estimate–hold [11] interface to stabilize the system. Bone erosion is modeled by postulating an energy balance between the mechanical work performed by the burr motor and the energy needed to cut the bone, which is assumed to be proportional to the bone mass removed. The actual bone erosion is implemented by decreasing the density of the voxels that are in contact with the burr in a manner that is consistent with the predicted local mass flows. The accumulation of bone dust and other burring debris is then handled using a particle system simulation based on simple, localized, sand-pile models. The resulting bone dissection simulator provides haptic and visual renderings that are considered sufficient for training purposes.

The rest of the paper is structured as follows. Section 2 provides a brief description of the application area, while the following section is dedicated to the bone–burr interaction model. Section 4 describes how bone dust, debris, and water are simulated. Section 5 is devoted to the techniques used to provide real–time visual rendering in parallel with the simulation.


(a) Mud formation (b) Obscuring effects

Figure 2: Operation scene. These two images are typical examples of what is seen by the surgeon while performing a mastoidectomy. In (a), the paste created by the mixing of bone dust with water is clearly visible. If the paste and the water are not removed, they can obscure the field of view (b). Photos courtesy of Prof. Stefano Sellari Franceschini, ENT Surgery, Dept. of Neuroscience, University of Pisa.

Section 6 outlines how rendering and simulation are integrated in the training system. Implementation details and results are reported in Section 7. Finally, the last section reports on conclusions and future work.

2 Application area: mastoidectomy

Mastoidectomy consists of the removal of the air cavities just under the skin behind the ear (see figure 1). It is the most superficial and common surgery of the temporal bone, and it is performed for chronic infection of the mastoid air cells (mastoiditis). The mastoid air cells are highly variable in their anatomy, and the main risks of the procedure are related to the detection and avoidance of the facial nerve, the venous sinuses and the "dura mater".

In the typical mastoidectomy surgical setup, the surgeon looks at the region affected by the procedure via a stereoscopic microscope and holds in her hands a high speed burr and a sucker, which she uses, respectively, to cut the bone and to remove the water (used to cool the burr bit) and the bone paste produced by the mixing of bone dust with water, see fig. 2(a). Subjective analysis of video records, together with in-situ observations [4], highlighted a correlation between burring behaviors and the type and depth of bone. In the case of initial cortex burring, burr tip motions of around 0.8 cm together with sweeps over 2–4 cm were evident. Shorter (1–2 cm) motions with rapid lateral strokes characterized the post-cortex mastoidectomy. For deeper burring, 1 cm strokes down to 1 mm were evident, with more of a polishing motion quality, guided using the contours from prior burring procedures. The typical sweeping movement speed is about 1 mm/s. Static burr handling was also noted, eroding bone tissue whilst maintaining minimal surface pressure.

The procedure requires bi-manual input, with high-quality force feedback for the dominant hand (controlling the burr/irrigator), and only collision detection for the non-dominant one (controlling the sucker).


Visual feedback requires a microscope-like device with at least 4 DOFs.

The capability of replicating the effects caused by the intertwining of the different physical processes is of primary importance for training [17, 4]. Although the water/bone paste mixture is essentially irrelevant with respect to the interaction between the burr and the bone, its presence cannot be neglected in the creation of the visual feedback, because its "obscuring" effects constitute the principal cue prompting the user to use the suction device (see figure 2).

3 Bone–burr interaction model

A detailed mechanical description of a rotating burr cutting bone is complicated because it involves tracking the continuously changing free surface of the material being cut; the impact of the burr blades on the surface; the resulting stress distribution in the material; and the consequent plastic deformation and break–up.

To circumvent these complications, we have divided the cutting process into two successive steps. The first estimates the bone material deformation and the resulting elastic forces, given the relative position of the burr with respect to the bone. For efficiency reasons, we currently do not simulate in this first step the high frequencies due to the high speed contact between the burr bit blades and the bone. This is, in our opinion, a minor limitation of the model, since human tactile sensing is limited, except for very fine feature recognition tasks, to a 400 Hz bandwidth [23]. The second step estimates the local rate of cutting of the bone by using an energy balance between the mechanical work performed by the burr motor and the energy needed to cut the bone, which is assumed to be proportional to the bone mass removed.

We will first describe this approach on a continuum model and then specialize the results to a discretized voxel grid.

3.1 Continuum description

3.1.1 Forces evaluation

Figure 3 illustrates an idealized version of the impact of the burr on the bone. The burr has a spherical bit, of radius R, that is rotating with angular velocity $\vec{\omega}$. At time step t the burr is just outside the bone material, while at the next time step it is intersecting the bone surface. In the following, we will refer to the sphere representing the burr bit as B, and to the "contact surface" between the burr and the bone as S.

All the relevant geometrical information is contained in the volumetric distribution of the bone material. We use a characteristic function $\chi(\vec{r})$ to indicate the presence/absence of bone, where $\vec{r}$ is measured from the center of B. The first two moments of $\chi$, restricted to the region contained in B, are, respectively,

$$M = \int_{r<R} d^3r\, \chi(\vec{r}), \qquad (1)$$

$$\vec{M}_1 = \int_{r<R} d^3r\, \chi(\vec{r})\, \vec{r}. \qquad (2)$$

We can now estimate the normal direction $\hat{n}$ to S as $\hat{n} = -\vec{M}_1/|\vec{M}_1|$, and the "thickness" h of B immersed in the bone by solving $M = \pi h^2 (R - h/3)$. Assuming that $h/R \ll 1$, and using Hertz's contact theory [18], we can now derive an expression for the total force $\vec{F}_e$ exerted on the burr by the elastic deformation of the bone:

$$\vec{F}_e = C_1 R^2 \left(\frac{h}{R}\right)^{3/2} \hat{n}, \qquad (3)$$

where $C_1$ is a dimensional constant that describes the elastic properties of the material. Moreover, we can give an expression for the pressure $\vec{P}(\vec{\xi})$ exerted by the burr on the point $\vec{\xi}$ of S:

$$\vec{P}(\vec{\xi}) = -\frac{3}{2\pi a^2} \sqrt{1 - \frac{|\vec{\xi}|^2}{a^2}}\; \vec{F}_e, \qquad (4)$$


(a) Time=t (b) Time=t+1

Figure 3: The impact of the burr on the bone. Here we represent two successive instants, at times t and t + 1, of an idealized version of a surgeon's burr. The burr has a spherical bit, of radius R, that is rotating with angular velocity $\vec{\omega}$. The surface S is the effective "contact surface" between the burr and the bone.

where $\vec{\xi}$ is measured from the center of S (see fig. 3(b)), and a is the radius of the contact region. In Hertz's contact theory, a can be estimated as

$$a = (C_1 R)^{1/3} F_e^{1/3}. \qquad (5)$$

From equation (4), we can estimate the frictional force $\vec{F}_\mu$ that the bone opposes to the burr rotation:

$$\vec{F}_\mu = \mu \int_{\xi<a} d\sigma\, P(\vec{\xi})\, \frac{\vec{r}(\vec{\xi}) \times \vec{\omega}}{|\vec{r}(\vec{\xi})|\,|\vec{\omega}|}, \qquad (6)$$

where $\mu$ is a friction coefficient that links the frictional force per unit area to the locally exerted pressure.

The total force that should be returned by the haptic feedback device is, therefore, $\vec{F}_T = \vec{F}_e + \vec{F}_\mu$.
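As a worked illustration of how these quantities combine (this small-penetration expansion is ours, offered for clarity, not stated in the original text), the cubic relation $M = \pi h^2 (R - h/3)$ admits a simple closed-form first approximation when $h \ll R$:

```latex
% Small-penetration approximation (illustrative; the full cubic can also be
% solved numerically). Neglecting the h/3 term when h << R:
M \approx \pi h^{2} R
\quad\Longrightarrow\quad
h \approx \sqrt{\frac{M}{\pi R}},
\qquad
\vec{F}_e \approx C_1 R^{2} \left(\frac{M}{\pi R^{3}}\right)^{3/4} \hat{n}.
```

This makes explicit that, in the small-$h$ regime, the elastic response grows as the 3/4 power of the intersected bone volume.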

3.1.2 Erosion modeling

We assume that all the power spent working against the frictional forces goes toward the erosion of the bone material. In other words, for each "contact surface" element $d\sigma$ we equate

$$\mu P(\vec{\xi})\, \omega\, r(\vec{\xi}) \left(1 - \left(\frac{\vec{r}(\vec{\xi}) \cdot \vec{\omega}}{|\vec{r}(\vec{\xi})|\,|\vec{\omega}|}\right)^2\right) d\sigma = \alpha\, \phi(\vec{\xi})\, d\sigma, \qquad (7)$$

where $\alpha$ is a dimensional constant and $\phi(\vec{\xi})$ is the mass flux at the contact surface point $\vec{\xi}$. Using the mass flux $\phi$, one can update the position of the bone surface.

The formulas above have been written with the implicit assumption that the burr blades are very small with respect to the burr bit radius, and that their effect can be absorbed in the friction constant $\mu$ and in the "erosion constant" $\alpha$. Even though this is, in general, false, and Hertz's theory is, strictly speaking, only valid for small elastic deformations, this formulation provides a computationally tractable, robust expression for the response forces that, at least in the limit of small h, is physically reasonable.


Figure 4: Voxel approximation. In order to simplify computations, voxels are approximated with spheres of the same volume. In this way, simple formulas for volume and surface intersection can be derived.

3.2 Discretized description

3.2.1 Forces evaluation

In the simulator, the bone distribution is only known at the level of a volumetric grid discretized in cubic voxels. Eqs. (1,2,6) need, therefore, to be translated and re–interpreted.

A direct translation transforms the integrals into sums over the voxels that have non–null intersection with B. The evaluation of each voxel contribution is computationally complex, since it requires finding the intersections between B and the cube defining the voxel. To simplify matters, we approximate the voxels with spheres of the same volume, centered at the voxel centers $\vec{c}_i$, with the origin at the center of B. The radius of the voxel spheres, $\eta$, is therefore defined by $\frac{4}{3}\pi\eta^3 = \ell^3$, where $\ell$ is the length of the voxel side.

Using this approximation, it is trivial to derive simple formulas that express, in terms of the distance $d = |\vec{c}_i|$: the volume, $\Delta V$, of the intersection region; the area, $\Delta\sigma$, of the "intersection surface"; and the actual distance, r, from the center of the intersection surface to the center of B (see fig. 4).

$$\Delta V(d) = \frac{\pi}{12} \left( d^3 - 6(R^2 + \eta^2)\, d + 8(R^3 + \eta^3) - 3(\eta^2 - R^2)^2 \frac{1}{d} \right) \qquad (8)$$

$$\Delta\sigma(d) = \frac{\pi}{4} \left( 2(\eta^2 + R^2) - d^2 - (\eta^2 - R^2)^2 \frac{1}{d^2} \right) \qquad (9)$$

$$r(d) = \frac{1}{2} d + \frac{R^2 - \eta^2}{2d} \qquad (10)$$
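For illustration, the following Python sketch (ours, not the authors' code) evaluates Eqs. (8)-(10) for a voxel sphere of radius $\eta$ whose center lies at distance d from the burr center; the function name is hypothetical and the fully-contained case assumes $\eta < R$.

```python
import math

def intersection_quantities(d, R, eta):
    """Volume, area and offset of the intersection between the burr sphere
    (radius R) and a voxel sphere (radius eta) whose center is at distance d.
    Illustrative helper for Eqs. (8)-(10); not the authors' code."""
    if d >= R + eta:                  # spheres do not overlap
        return 0.0, 0.0, 0.0
    if d <= abs(R - eta):             # voxel sphere fully inside the burr (eta < R)
        return (4.0 / 3.0) * math.pi * eta**3, 0.0, d
    dV = (math.pi / 12.0) * (d**3 - 6.0 * (R**2 + eta**2) * d
                             + 8.0 * (R**3 + eta**3)
                             - 3.0 * (eta**2 - R**2)**2 / d)       # Eq. (8)
    dSigma = (math.pi / 4.0) * (2.0 * (eta**2 + R**2) - d**2
                                - (eta**2 - R**2)**2 / d**2)       # Eq. (9)
    r = 0.5 * d + (R**2 - eta**2) / (2.0 * d)                      # Eq. (10)
    return dV, dSigma, r
```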

The required integrals then become

$$M^* = \sum_i \Delta V(|\vec{c}_i|)\, \chi_i \qquad (11)$$

and

$$\vec{M}^*_1 = \sum_i \Delta V(|\vec{c}_i|)\, \chi_i\, \frac{r_i}{d_i}\, \vec{c}_i. \qquad (12)$$

To estimate the friction force $\vec{F}_\mu$, we convert the area integral (6) into

$$\vec{F}_\mu = \mu \sum_i \Delta\sigma(|\vec{c}_i|)\, P(\vec{\xi}_i)\, \frac{\vec{c}_i \times \vec{\omega}}{|\vec{c}_i|\,|\vec{\omega}|}, \qquad (13)$$

with

$$\vec{\xi}_i = \frac{r_i}{d_i} \left( \vec{c}_i - \frac{(\vec{\omega} \cdot \vec{c}_i)}{\omega^2}\, \vec{\omega} \right). \qquad (14)$$

The power spent by the frictional forces on a voxel is then

$$\mu P(\vec{\xi}_i)\, \omega\, r_i \left(1 - \left(\frac{\vec{c}_i \cdot \vec{\omega}}{|\vec{c}_i|\,|\vec{\omega}|}\right)^2\right) \Delta\sigma_i = \alpha\, \phi_i\, \Delta\sigma_i, \qquad (15)$$

where $\phi_i$ is the mass flux per unit surface coming out of voxel i through the surface $\Delta\sigma_i$. To evaluate P we use formula (4), where for a we use the "effective" radius of the contact surface, $a^* = \sqrt{2Rh - h^2}$.
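A hedged sketch of how Eqs. (11)-(15) might be combined in a single force-evaluation pass is given below; it reuses the `intersection_quantities` helper sketched above, and the function and parameter names (e.g. `burr_forces`, `chi`) are ours, not taken from the simulator.

```python
import math
import numpy as np

def burr_forces(voxel_centers, chi, R, eta, omega_vec, C1, mu):
    """One evaluation of the elastic and frictional burr forces, Eqs. (11)-(15).
    voxel_centers: (N, 3) voxel centers relative to the burr center.
    chi: (N,) bone occupancy in [0, 1] for the same voxels.
    Illustrative sketch only; not the authors' implementation."""
    M = 0.0
    M1 = np.zeros(3)
    F_mu = np.zeros(3)
    omega = max(np.linalg.norm(omega_vec), 1e-12)
    contacts = []
    # First pass: zeroth and first moments of the bone distribution, Eqs. (11)-(12).
    for c, chi_i in zip(voxel_centers, chi):
        d = float(np.linalg.norm(c))
        dV, dSigma, r = intersection_quantities(d, R, eta)
        if dV <= 0.0 or chi_i <= 0.0:
            continue
        M += dV * chi_i
        M1 += dV * chi_i * (r / d) * c
        contacts.append((c, d, r, dSigma))
    if M <= 0.0:
        return np.zeros(3), np.zeros(3)
    n_hat = -M1 / np.linalg.norm(M1)
    # Immersion depth h from M = pi*h^2*(R - h/3); a few Newton steps suffice.
    h = 0.5 * R
    for _ in range(20):
        f = math.pi * h * h * (R - h / 3.0) - M
        df = math.pi * (2.0 * R * h - h * h)
        h = min(max(h - f / df, 1e-9), 2.0 * R)
    F_e = C1 * R * R * (h / R) ** 1.5 * n_hat                   # Eq. (3)
    a_eff = math.sqrt(max(2.0 * R * h - h * h, 0.0))            # effective contact radius
    if a_eff <= 0.0:
        return F_e, F_mu
    Fe_norm = float(np.linalg.norm(F_e))
    # Second pass: frictional force from the contact pressure, Eqs. (4), (13), (14).
    for c, d, r, dSigma in contacts:
        if dSigma <= 0.0:
            continue
        xi = (r / d) * (c - (np.dot(omega_vec, c) / omega**2) * omega_vec)    # Eq. (14)
        s = 1.0 - float(np.dot(xi, xi)) / (a_eff * a_eff)
        if s <= 0.0:
            continue
        P = 3.0 / (2.0 * math.pi * a_eff * a_eff) * math.sqrt(s) * Fe_norm    # Eq. (4)
        F_mu += mu * dSigma * P * np.cross(c, omega_vec) / (d * omega)        # Eq. (13)
    return F_e, F_mu
```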

3.2.2 Erosion modeling

Using the fluxes $\phi_i$ we can now erode the voxels in the intersection region. In our current implementation, we associate an 8-bit counter with each voxel, representing the voxel density, and decrease it by a value proportional to the "assumed" amount of removed mass, $\Delta M_i = \Delta t\, \Delta\sigma\, \phi_i$, where $\Delta t$ is the time step of the simulation, and the mass, $M_i$, contained in the voxel i. The bone material in the temporal bone area has a morphological structure that ranges from compact bone, e.g., close to the outer skull surface, to a porous, "trabecular", consistency. The porous scale ranges from a few millimeters down to scales well beyond the resolution of the medical imaging devices. In our model, the subscale modeling of the trabecular structures is absorbed in a voxel-dependent erosion constant $\alpha$.
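A minimal sketch of the corresponding erosion update follows, under the assumption that the 8-bit counter is decreased by a fraction proportional to the removed mass $\Delta M_i$; the names `k_erosion` and `mass_per_full_voxel` are hypothetical tuning parameters, not values from the paper.

```python
import numpy as np

def erode_voxels(density, contacts, dt, k_erosion, mass_per_full_voxel):
    """Decrease the 8-bit voxel densities in proportion to the removed mass.
    density: uint8 volume of voxel densities.
    contacts: iterable of (voxel_index, dSigma, phi) from the force pass.
    Illustrative sketch; not the authors' exact update rule."""
    for idx, dSigma, phi in contacts:
        dM = dt * dSigma * phi                                     # removed mass, Delta M_i
        step = int(round(255.0 * k_erosion * dM / mass_per_full_voxel))
        density[idx] = max(int(density[idx]) - step, 0)
    return density
```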

3.3 Sample–estimate–hold interface

A direct transmission of the computed forces to the haptic device is, in the case of "almost rigid" contacts, usually plagued by mechanical instabilities. The typical solution for this problem is the introduction of an artificial, "virtual" coupling between the haptic device and the virtual environment [8, 2].

In our system, we use a sample–estimate–hold approach [11] to remove the excess energy injected by the standard zero–order hold of force employed by the haptic device drivers. With this technique, we compute the force that is sent to the haptic device based on the previous zero–order representations produced at regular intervals by our burr–bone interaction model. This new value of force, when held over the corresponding sampling interval, approximates the force–time integral more closely than the usual zero–order hold [11].
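The exact estimator of [11] is not reproduced here; the sketch below only illustrates the idea under a simple linear-extrapolation assumption, in which the held value is chosen so that its integral over the next interval matches that of the extrapolated force signal.

```python
class SampleEstimateHold:
    """Minimal illustration of a sample-estimate-hold force filter.
    This is NOT the estimator of [11]; it only assumes that the force varies
    roughly linearly between samples and holds the value whose integral over
    the next interval matches that of the extrapolated signal."""

    def __init__(self):
        self.prev = None

    def filter(self, force):
        if self.prev is None:
            held = force
        else:
            # Linear extrapolation: over the next interval the force is expected
            # to go from `force` to `2*force - prev`; the mean of that ramp is
            # the value to hold so that the force-time integral is preserved.
            held = 1.5 * force - 0.5 * self.prev
        self.prev = force
        return held
```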

4 Bone dust, debris and water simulation

A direct, "physically correct" simulation of bone dust and water behavior would require a very fine spatial resolution in order to capture all the dynamically relevant length scales. This would conflict with the real–time requirements of the simulation. Therefore, we model the dust/fluid dynamics using what essentially amounts to a hybrid particle–volumetric model, inspired by previous work on particle systems and sandpiles [24, 19].

Bone dust, water and blood are modeled with a single particle system. Each particle has a mass, a position, a velocity and a dynamic behavior. Water particles are introduced by the irrigator with an initial velocity directed along the irrigator axis. Dust particles are generated by the burr performing the surgical bone drilling, with an initial velocity depending on the rotation of the burr itself and a creation rate depending on the mass flux. Blood particles are generated by tissues with negligible initial speed. All particles move according to Newton's law when free, and interact with the other materials according to a set of rules that ensure that only a single particle may occupy a given voxel at a given time. Basically, when a particle enters a non-empty voxel, it is reflected backwards to the first free voxel. Its state is then modified as a function of the colliding materials and the particle velocity (see figure 5).

When a particle collides with the environment, we choose between elastic scattering and sliding along the bone surface based on the particle velocity. The random choice is made according to a probability distribution that favors scattering for high impact velocities. Different materials are modeled by shaping the probability distribution and by defining different particle masses and reflection coefficients. In particular, bone particles have a behavior similar to water, but a higher mass and a higher probability of being scattered by hard bone.

We model bone paste formation by changing the material of bone and water particles to "bone paste" when they collide.

We also consider the interaction of particles with the burr, by scattering away the particles that come into contact with the burr bit with a velocity depending on the rotational axis and speed of the burr.

Particles are deleted when they exit fromthe operation site.
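The following Python sketch illustrates one possible realization of a single particle sub-step (move, push back to the first free voxel, then scatter or slide); the helper arguments such as `normal_at` and `scatter_prob` are placeholders for the simulator's per-material rules, and none of the constants are taken from the paper.

```python
import numpy as np

def reflect(v, n):
    """Mirror a velocity about a surface normal (elastic scattering)."""
    return v - 2.0 * np.dot(v, n) * n

def particle_step(pos, vel, material, occupancy, voxel_size, normal_at,
                  scatter_prob, restitution, gravity, dt, rng):
    """One update of a dust/water/blood particle against the voxel grid.
    occupancy[i, j, k] == 0 marks an empty voxel; normal_at(idx) and
    scatter_prob(speed, material) stand in for the simulator's per-material
    rules.  Returns the new (position, velocity).  Illustrative sketch."""
    vel = vel + gravity * dt
    target = pos + vel * dt
    # Sub-step A: march toward the target, stopping at the first occupied voxel.
    for q in np.linspace(pos, target, num=8)[1:]:
        idx = tuple((q / voxel_size).astype(int))
        if occupancy[idx] != 0:
            # Sub-steps B/C: stay in the last free voxel and pick a new velocity,
            # scattering with a probability that grows with the impact speed.
            n = normal_at(idx)
            speed = np.linalg.norm(vel)
            if rng.random() < scatter_prob(speed, material):
                vel = restitution * reflect(vel, n)
            else:
                vel = vel - np.dot(vel, n) * n          # slide along the surface
            return pos, vel
        pos = q
    return target, vel
```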

5 Real–time visual rendering

The state of the simulation is entirely described by the contents of the rectilinear grid that contains the material labels used in the simulation. This also includes the particles modeling bone dust, debris and water. We provide real–time visual feedback in parallel with the simulation of the physical system with a direct volume rendering approach. Rendering such a dynamic volume under real-time constraints is particularly challenging. In our approach, a fast approximation of the diffuse shading equation [20] is computed on the fly by the graphics pipeline directly from the scalar data. We do this by exploiting the possibilities offered by multi-texturing with the register combiner OpenGL extension, which provides a configurable means to determine per-pixel fragment coloring [1]. The extension is available on commodity graphics boards (e.g., the NVIDIA GeForce series).

Object-aligned volume slices are composited back-to-front. The Lambert shading equation is implemented in the graphics hardware by programming the register combiners, using multi-texturing to compute intermediate slices and approximate opacity gradients with forward differences. Gradient norms, which provide the "surface strength" [10], are computed using a second order approximation of the square root programmed with the register combiners. We refer the reader to [5] for more detail on the rendering technique.

This procedure is extremely efficient, since all the computation is performed in parallel in the graphics hardware and no particular synchronization is needed between the renderer and the process that is modifying the dataset. Only a single sweep through the volume is needed, and volume slices are sequentially loaded into texture memory on current standard PC graphics platforms using AGP 4X transfers, which provide a peak bandwidth of 1054 MB/s.
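The register-combiner configuration itself is hardware-specific and is not reproduced here; the following numpy sketch only spells out, on the CPU, the per-fragment arithmetic that the hardware approximates (forward-difference gradients, Lambert shading weighted against the gradient norm, back-to-front compositing). The transfer functions `color_of` and `alpha_of` are assumed helpers, not part of the original system.

```python
import numpy as np

def render_slices(volume, color_of, alpha_of, light_dir):
    """CPU reference of the per-fragment arithmetic approximated in hardware:
    forward-difference gradients, Lambert shading and back-to-front
    compositing of object-aligned slices.  `color_of(s)` -> (ny, nx, 3) RGB,
    `alpha_of(s)` -> (ny, nx) opacity; both are assumed transfer functions."""
    nz, ny, nx = volume.shape
    image = np.zeros((ny, nx, 3))
    light = np.asarray(light_dir, dtype=float)
    light /= np.linalg.norm(light)
    for z in range(nz - 1):                         # back to front; last slice skipped
        s = volume[z].astype(float)
        # Forward differences between and within adjacent slices.
        gz = volume[z + 1].astype(float) - s
        gy = np.zeros_like(s); gy[:-1, :] = s[1:, :] - s[:-1, :]
        gx = np.zeros_like(s); gx[:, :-1] = s[:, 1:] - s[:, :-1]
        norm = np.sqrt(gx**2 + gy**2 + gz**2) + 1e-6
        # Lambert term from the normalized gradient ("surface strength" via norm).
        lambert = np.clip((gx * light[0] + gy * light[1] + gz * light[2]) / norm,
                          0.0, 1.0)
        rgb = color_of(s) * lambert[..., None]
        alpha = alpha_of(s)[..., None]
        image = rgb * alpha + image * (1.0 - alpha)  # over operator, back to front
    return image
```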

6 System integration

Our technique for bone dissection simulation has been integrated in a prototype training system for mastoidectomy. We have exploited the difference in complexity and frequency requirements of the visual and haptic simulations by modeling the system as a collection of loosely coupled concurrent components. Logically, the system is divided into a "fast" subsystem, responsible for the high frequency tasks (surgical instrument tracking, force feedback computation, bone erosion), and a "slow" one, essentially dedicated to the production of data for visual feedback. The "slow" subsystem is responsible for the global evolution of the water, bone dust and bone paste. The algorithms used to control the simulations are local in character, and they are structured so that they communicate only via changes in the relevant, local, substance densities. This arrangement leads naturally to a further break-up of the slow subsystem into components, each dedicated to the generation of a specific visual effect, and thus to a parallel implementation on a multiprocessor architecture. The system runs on two interconnected multiprocessor machines. The data is initially replicated on the two machines. The first is dedicated to the high-frequency tasks: haptic device handling and bone removal simulation, which run at 1 kHz. The second concurrently runs, at about 15–20 Hz, the low-frequency tasks: bone removal, fluid evolution and visual feedback. Since the low-frequency tasks do not influence the high-frequency ones, the two machines are synchronized using one-way message passing, with a dead reckoning protocol to reduce communication bandwidth.
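A schematic sketch of this decoupling is given below, with the fast and slow subsystems reduced to two loops exchanging voxel updates over a one-way queue; machine boundaries, the dead reckoning protocol and the real scheduling are omitted, and all callback names are hypothetical.

```python
import queue
import threading
import time

updates = queue.Queue()        # one-way channel: fast subsystem -> slow subsystem
stop = threading.Event()       # shared shutdown flag

def haptic_loop(compute_force_and_erosion, send_to_device):
    """'Fast' subsystem (~1 kHz): force feedback and bone erosion.
    compute_force_and_erosion() is assumed to return (force, dirty_voxels)."""
    while not stop.is_set():
        force, dirty_voxels = compute_force_and_erosion()
        send_to_device(force)
        if dirty_voxels:
            updates.put(dirty_voxels)   # publish changes, never wait for the renderer
        time.sleep(0.001)

def visual_loop(apply_voxel_updates, simulate_fluids, render_frame):
    """'Slow' subsystem (~15-20 Hz): fluid evolution and volume rendering."""
    while not stop.is_set():
        while not updates.empty():
            apply_voxel_updates(updates.get_nowait())
        simulate_fluids()
        render_frame()

# Each loop would run in its own thread (or, as in the paper, on its own machine):
#   threading.Thread(target=haptic_loop, args=(f, g)).start()
#   threading.Thread(target=visual_loop, args=(a, s, r)).start()
```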


Figure 5: The voxel-based particle collision detection. Each particle simulation step is subdivided into three sub-steps. First (subfigure A), particles are moved to target points according to their velocities; then (subfigure B), collisions with voxels containing bone or other particles are handled by moving colliding particles back to the first empty voxel; finally (subfigure C), scattering rules determine the new velocities assigned to the particles.

Figure 8: The virtual surgical setup

7 Implementation and results

Our current configuration is the following (seefig. 8):

• a single-processor PII/600 MHz with 256 MB PC133 RAM for the high-frequency tasks; two threads run in parallel: one for the haptic loop (1 kHz), and one for sending volume and instrument position updates to the other machine;

• a dual-processor PIII/600 MHz with 512 MB PC800 RAM and an NVIDIA GeForce 3 Ti 500, running a 2.4 Linux kernel, for the low-frequency tasks; three threads are continuously running on this machine: one to receive volume and position updates, one to simulate bone removal and fluid evolution, and one for visual rendering;

• a Phantom Desktop haptic device for the dominant hand; the device is connected to the single-processor PC. It provides 6DOF tracking and 3DOF force feedback for the burr/irrigator;

• a Phantom 1.0 haptic device for the non-dominant hand; the device is connected to the single-processor PC. It provides 6DOF tracking and 3DOF force feedback for the sucker;

• an n-vision VB30 binocular display for presenting images to the user; the binoculars are connected to the S-VGA output of the dual-processor PC.

The performance of the prototype is sufficient to meet the timing constraints for display and force-feedback, even though the computational and visualization platform is constructed from affordable and widely accessible components.


(a) Fe (b) Fµ

Figure 6: Virtual bone reaction against burr penetration. The computations are done in the absence of erosion ($\alpha = \infty$), using the actual force evaluation kernel of the force–feedback loop. In (a) we show the "elastic" response of the material, measured in units of $C_1 R^2$, as a function of the burr tip penetration depth in units of the burr bit radius R. Fig. (b) illustrates the "frictional" response of the material, with $\mu = 1/2$ and for different angles $\theta = 30^\circ, 60^\circ, 90^\circ$ between the surface normal and $\hat\omega$. The strength of $F_\mu$ increases with increasing $\sin(\theta)$. The knees in the $F_\mu$ curves correspond to the intersection of the burr bit with a deeper bone voxel layer.

We are currently using a volume of 256x256x128 cubical voxels (0.3 mm side) to represent the region where the operation takes place. The force–feedback loop is running at 1 kHz using a 5x5x5 grid around the tip of the instruments for force computations. The computation needed for force evaluation and bone erosion typically takes 20 µs, and less than 200 µs in the worst case configuration.

In the following we will report on a series of experiments done using the prototype described above.

7.1 Force evaluation

Figure 6 shows the reaction of the virtual bone against burr penetration. The computations are done in the absence of erosion ($\alpha = \infty$), using the actual force evaluation kernel of the force–feedback loop.

Figure 6(a) illustrates the "elastic" response of the material, measured in units of $C_1 R^2$, as a function of the burr tip penetration depth measured in units of the burr bit radius R. Figure 6(b) illustrates the "frictional" response of the material, with $\mu = 1/2$ and for different angles $\theta = 30^\circ, 60^\circ, 90^\circ$ between the surface normal and $\hat\omega$. The strength of $F_\mu$ increases with increasing $\sin(\theta)$. The knees in the $F_\mu$ curves correspond to the intersection of the burr bit with a deeper bone voxel layer.

Figure 7 shows the reaction of the virtual bone, again in runs with $\alpha = \infty$, to a sliding motion of the burr bit, immersed at a depth of R/4, over a flat bone surface. Figs. 7(a,b) show, respectively, the "elastic" and the "frictional" force response of the material, measured in units of $C_1 R^2$, as a function of the distance traveled along the plane, measured in units of R. The pair of curves in each figure corresponds to a sliding motion over a bone surface aligned along, respectively, one of the voxel discretization axes, and a plane with normal $[0, \frac{1}{\sqrt{2}}, \frac{1}{\sqrt{2}}]$. The fluctuations in the force values are due to the "voxel sphere" approximation used to compute F. The difference in the wavelength of the fluctuations is a factor of $\sqrt{2}$, as expected.


(a) Fe (b) Fµ

Figure 7: Sliding motion, constrained experiment. The reaction of the flat surface of virtual bone to the sliding motion of a burr bit immersed at a depth of R/4. Figs. (a,b) show, respectively, the "elastic" and the "frictional" force response of the material, measured in units of $C_1 R^2$, as a function of the distance traveled along the plane, measured in units of R. The pair of curves in each figure corresponds to a sliding motion over a bone surface aligned along, respectively, one of the voxel discretization axes, and a plane with normal $[0, \frac{1}{\sqrt{2}}, \frac{1}{\sqrt{2}}]$. The fluctuations in the force values are due to the "voxel sphere" approximation used to compute F. The difference in the wavelength of the fluctuations is a factor of $\sqrt{2}$, as expected.

7.2 Bone erosion

Figure 9 illustrates a "free–hand" experiment where bone is eroded by a polishing movement. The movement is similar to the one described in the previous subsection, with a sliding speed of about 10 mm/s and $\alpha = 3.1 \times 10^6\ \mathrm{mm^2/s^2}$. Figure 9(a) shows the depth of the burr below the surface level as a function of time, while fig. 9(b) reports the components of the force contributions and the total force applied to the haptic display during the movement.

We have gathered initial feedback about the prototype system from specialist surgeons from the University of Pisa who are collaborating with us in this research. Subjective input is being used to tune the parameters that control force feedback. The overall realism of the simulation is considered sufficient for training purposes. Fig. 10 shows a typical erosion sequence. A demonstration movie is available on the IERAPSI project web site [3].

8 Conclusions and future work

We have presented a physically motivated haptic and visual implementation of a bone cutting burr, which is being developed as a component of a training system for temporal bone surgery. The current implementation, directly operating on a voxel discretization of patient-specific 3D CT and MR imaging data, is efficient enough to provide real–time multimodal feedback on a low–end multi–processing PC platform. In order to further improve the efficiency of the simulation, we are currently working on evaluating interaction forces using hierarchical techniques.

While subjective input from selected end users is encouraging, it would be of extreme interest to compare our results with direct force measurements obtained by drilling actual samples. Since, to our knowledge, there are no available data on the subject in the literature, we are currently defining an experimental setup and measurement procedures.


(a) Depth (b) Forces

Figure 9: Bone erosion, polishing movement. A "free–hand" experiment where bone is eroded by a polishing movement. The sliding speed is about 10 mm/s, and $\alpha = 3.1 \times 10^6\ \mathrm{mm^2/s^2}$. Fig. (a) shows the depth of the burr below the surface level as a function of time. Fig. (b) reports the components of the force contributions and the total force applied to the haptic display during the movement. The lower line is the friction force $\vec{F}_\mu$, the middle line is the elastic force $\vec{F}_e$, and the upper line is the total force $\vec{F}_T$.

In our simulator, we are currently using datasets that have the same resolution as the original medical imaging data, and we are not differentiating between compact and trabecular bone. It is our intention to explore the possibility of running the simulator on synthetically refined datasets obtained by using sub–voxel trabecular bone modeling.

9 Acknowledgments

We thank Pietro Ghironi, CRS4 technical services, for his support in setting up the surgical simulator hardware platform, and Alan Scheinine for reviewing the manuscript.

These results were obtained within the framework of the European Union IERAPSI project (EU-IST-1999-12175).

References

[1] OpenGL extensions registry. Available from http://oss.sgi.com/projects/ogl-sample/registry/.

[2] R. Adams and B. Hannaford. Stable haptic interaction with virtual environments. IEEE Transactions on Robotics and Automation, 15(3):465–474, 1999.

[3] M. Agus, F. Bettio, A. Giachetti, E. Gobbetti, G. Zanetti, and A. Zorcolo. Real–time haptic and visual simulation of bone dissection. Video demonstration, http://www.crs4.it/ierapsi, 2001.

[4] M. Agus, A. Giachetti, E. Gobbetti, G. Zanetti, N. W. John, and R. J. Stone. Mastoidectomy simulation with combined visual and haptic feedback. In J. D. Westwood, H. M. Hoffmann, G. T. Mogel, and D. Stredney, editors, Medicine Meets Virtual Reality 2002, pages 17–23, Amsterdam, The Netherlands, January 2002. IOS Press.

[5] M. Agus, A. Giachetti, E. Gobbetti, G. Zanetti, and A. Zorcolo. A multiprocessor decoupled system for the simulation of temporal bone surgery. Computing and Visualization in Science, 5(1), 2002.



Figure 10: A virtual burring sequence. Here we show a typical bone cutting sequence performed in the mastoid region. The accumulation of debris and its masking effects are clearly visible.

[6] M. Agus, A. Giachetti, E. Gobbetti, G. Zanetti, and A. Zorcolo. Real-time haptic and visual simulation of bone dissection. In IEEE Virtual Reality Conference, pages 209–216, Orlando, FL, USA, March 24–28, 2002.

[7] R. S. Avila and L. M. Sobierajski. A haptic interaction method for volume visualization. In Proceedings of the conference on Visualization '96, pages 197–204. IEEE Computer Society Press, 1996.

[8] J. Colgate. Issues in the haptic display of tool use. In Proceedings of ASME Haptic Interfaces for Virtual Environment and Teleoperator Systems, pages 140–144, 1994.

[9] S. Cotin, H. Delingette, and N. Ayache. Real time volumetric deformable models for surgery simulation. In VBC, pages 535–540, 1996.

[10] R. A. Drebin, L. Carpenter, and P. Hanrahan. Volume rendering. Computer Graphics, 22(4):51–58, August 1988.

[11] R. Ellis, N. Sarkar, and M. Jenkins. Numerical methods for the force reflection of contact. ASME Transactions on Dynamic Systems, Modeling, and Control, 119(4):768–774, 1997.

[12] S. F. Frisken-Gibson. Using linked volumes to model object collisions, deformation, cutting, carving, and joining. IEEE Transactions on Visualization and Computer Graphics, 5(4):333–348, Oct./Dec. 1999.

[13] T. A. Galyean and J. F. Hughes. Sculpting: an interactive volumetric modeling technique. In Proceedings of the 18th annual conference on Computer graphics and interactive techniques, pages 267–274. ACM Press, 1991.

[14] S. Gibson. Volumetric object modeling for surgical simulation, 1998.

[15] T. He and A. Kaufman. Collision detection for volumetric objects. In Proceedings of the conference on Visualization '97, pages 27–ff. ACM Press, 1997.

[16] D. James and D. Pai. A unified treatment of elastostatic contact simulation for real time haptics. Haptics-e, The Electronic Journal of Haptics Research (www.haptics-e.org), 2(1), September 2001.

[17] N. W. John, N. Thacker, M. Pokric, A. Jackson, G. Zanetti, E. Gobbetti, A. Giachetti, R. J. Stone, J. Campos, A. Emmen, A. Schwerdtner, E. Neri, S. S. Franceschini, and F. Rubio. An integrated simulator for surgery of the petrous bone. In J. D. Westwood, editor, Medicine Meets Virtual Reality 2001, pages 218–224, Amsterdam, The Netherlands, January 2001. IOS Press.

[18] L. Landau and E. Lifshitz. Theory of elasticity. Pergamon Press, 1986.

[19] X. Li and J. Moshell. Modeling soil: Realtime dynamic models for soil slippage and manipulation. In Computer Graphics Proceedings, Annual Conference Series, pages 361–368, 1993.

[20] N. Max. Optical models for direct volume rendering. IEEE Transactions on Visualization and Computer Graphics, 1(2):99–108, June 1995.

[21] W. A. McNeely, K. D. Puterbaugh, and J. J. Troy. Six degrees-of-freedom haptic rendering using voxel sampling. In A. Rockwood, editor, Siggraph 1999, Annual Conference Series, pages 401–408, Los Angeles, 1999. ACM Siggraph, Addison Wesley Longman.

[22] B. Pflesser, A. Petersik, U. Tiede, K. H. Hohne, and R. Leuwer. Volume based planning and rehearsal of surgical interventions. In H. U. L. et al., editor, Computer Assisted Radiology and Surgery, Proc. CARS 2000, Excerpta Medica International Congress 1214, pages 607–612, Elsevier, Amsterdam, 2000.

[23] K. Shimoga. Finger force and touch feedback issues in dextrous telemanipulation. In Proceedings of the NASA-CIRSSE International Conference on Intelligent Robotic Systems for Space Exploration, NASA, Greenbelt, MD, September 1992.

[24] R. Sumner, J. O'Brien, and J. Hodgins. Animating sand, mud and snow. Computer Graphics Forum, 18(1), 1999.

[25] S. W. Wang and A. E. Kaufman. Volume sculpting. In Proceedings of the 1995 symposium on Interactive 3D graphics, pages 151–ff. ACM Press, 1995.

[26] G. Wiet, J. Bryan, D. Sessanna, D. Stredney, P. Schmalbrock, and B. Welling. Virtual temporal bone dissection simulation. In J. D. Westwood, editor, Medicine Meets Virtual Reality 2000, pages 378–384, Amsterdam, The Netherlands, January 2000. IOS Press.