Transcript

A Nursing Simulation: Emergency Department Pain Management

Authors

Enilda Romero-Hall, Ph.D., Assistant Professor, University of Tampa
Ginger Watson, Ph.D., Associate Professor, Old Dominion University
Yiannis Papelis, Ph.D., Hector Garcia, and Menion Croll, Virginia Modeling, Analysis and Simulation Center

Purpose

In this 3D simulation environment, trainees take on the role of an emergency department nurse assessing and managing patients’ pain. Trainees interact with three emotionally expressive animated patients, who vary in ethnicity, age, and expressed emotion.

Patients

Arthur Smith
Jose Rodriguez
Lin Chan

Objectives

Correctly confirm a patient’s identity.

Apply proper nursing actions to initiate care of a patient.

Evaluate a patient, including vital signs and pain assessment.

Determine the nursing care for a patient based on the assessed patient condition as well as reported and observed pain.

Implement appropriate and safe pain management care of a patient.

Learners

Nursing Students

Professional Nurses

Scenario

The scenario takes place during the day shift in the emergency department of a local hospital. The nurse provides care to patients who came to the facility after their golf cart flipped over earlier in the day.

Design Decisions

Performance measures from the evidence-based pain management critical path were translated into simulation actions in the 3D interactive environment.

Specifications were established for the 3D simulation environment (see the sketch after this list):

– Location settings (emergency room and patient room)
– User’s viewpoint and camera position
– Menu selection options (clickable pop-up menu)
– Users’ narrative communication (pop-up narratives)
– Clickable critical path objects (thermometer, stethoscope)
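
As a rough illustration only, the Python sketch below shows one way these specifications might be organized as data: locations, camera settings, pop-up menu options, pop-up narratives, and clickable critical path objects tied to simulation actions. All class and field names are hypothetical assumptions, not the authors’ implementation; the actual simulation was built with the Unity 3D game engine (see Development).

# Hypothetical sketch; names are illustrative, not the authors' Unity implementation.
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class CameraSetting:
    # User's viewpoint and camera position for a location
    position: Tuple[float, float, float]   # (x, y, z) in world coordinates
    look_at: Tuple[float, float, float]    # point the camera faces


@dataclass
class CriticalPathObject:
    # Clickable object linked to a step of the pain management critical path
    name: str                # e.g., "thermometer", "stethoscope"
    simulation_action: str   # action recorded when the trainee clicks the object


@dataclass
class LocationSetting:
    # A location in the 3D environment (emergency room, patient room)
    name: str
    camera: CameraSetting
    menu_options: List[str] = field(default_factory=list)            # clickable pop-up menu
    narratives: List[str] = field(default_factory=list)              # pop-up narrative text
    objects: List[CriticalPathObject] = field(default_factory=list)  # clickable objects


# Example configuration drawn from the specifications above (values are made up)
patient_room = LocationSetting(
    name="patient room",
    camera=CameraSetting(position=(0.0, 1.7, -2.5), look_at=(0.0, 1.0, 0.0)),
    menu_options=["Confirm patient identity", "Assess pain", "Take vital signs"],
    narratives=["Hello, I am your nurse today. Can you tell me your full name?"],
    objects=[
        CriticalPathObject("thermometer", simulation_action="measure temperature"),
        CriticalPathObject("stethoscope", simulation_action="auscultate breath sounds"),
    ],
)

if __name__ == "__main__":
    print(patient_room.name, [obj.name for obj in patient_room.objects])

Pairing each clickable object with the critical path action it represents is one way to keep the evidence-based performance measures traceable to concrete trainee interactions, as described under Design Decisions.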

Development

The emergency department environment was built using Autodesk Maya. This software was used to model and render the 3D virtual environment and objects such as the patient room, the emergency department hallway, and all medical equipment.

Motion capture was conducted using the OptiTrack Optical Motion Capture System and processed with Autodesk MotionBuilder. This setup supported 3D object motion tracking, real-time streaming, and a multi-camera configuration.

Facial expressions of emotion were captured using the FaceShift software and an Xbox Kinect. Audio was recorded using a simple iPhone audio recording application.

The Unity 3D game engine was used to integrate the 3D virtual environment, the virtual objects, and the animated agents.

Implementation

Funding

The design and development of this 3D simulation were funded through a research program at the Virginia Modeling, Analysis and Simulation Center (VMASC).
