Transcript
Page 1:

Improving The Process of Cancer Care

Session 1 of a 5-part series

Process of Care Research Branch

Division of Cancer Control and Population Sciences/Behavioral Research Program

Stephen Taplin MD, MPH; Veronica Chollette RN; Erica Breslau PhD; Sarah Kobrin PhD; Carly Parry PhD; Heather Edwards PhD; Miho Tanaka PhD; Bryan Leyva Vengoechea, BS; Toccara Chamberlain, MA

Page 2:

Series Purpose – for NCI

• Solicit opinions from three sectors of the community regarding problems in the quality of cancer care
  ◦ Providers, researchers, health care purchasers

• Identify potential research topics that might address those problems

• Focus the research agenda of PCRB upon major underlying factors affecting the processes of cancer care.

Page 3:

For Participants

• Understand the perspectives of three communities with respect to problems in cancer care delivery

• Learn conceptual, analytic, and practical approaches to understanding and addressing problems in cancer care delivery

• Contribute to the development of NCI’s research agenda

Page 4:

Ms F: Unmarried and Desirous of Children

[Timeline diagram: Last MD visit with PE/Pap at age 18. Six years pass as she does not have an MD. She gets insurance and decides she needs a PE. An MD visit with PE/Pap leads to referral to a gynecologist for cervical carcinoma in situ and a cone biopsy. GYN visit two weeks later; she needs a hysterectomy. Second opinions and a one-month delay follow. Hysterectomy performed June 14, 2001; radiation and chemotherapy, October 2001; oncology follow-up by MD visit and phone call.]

Page 5:

[Diagram, adapted from Taplin et al. 2009 (16): the patient and family, supported by community services, move across the types of care (primary prevention, detection, diagnosis, treatment, survivorship) and interact with multiple providers (Providers 1-4), each with their own nurses and staff; these processes feed long-term outcomes of mortality and quality of care.]

Is this case a problem of individual failures? Groups, not individuals, saw this woman. How could we measure the functioning of those groups to evaluate whether they were creating the conditions for success?

Page 6:

Objectives

• Describe a decision-based framework for designing team performance measurement systems.

• Illustrate tradeoffs between components of this framework.

• Discuss applied and research-based examples of measuring team performance with observation and social sensors.

Page 7:

A FRAMEWORK FOR DEVELOPING MEASURES OF TEAMWORK

Understanding the tradeoffs

Page 8:

Decision Point Design Framework for Team Performance Measures

Why? • Evaluation • Feedback • Research • Certification • Needs analysis

How? • Observation • Self-report • Scoring methodology

What? • Teamwork competencies • Multi-level evaluation

Where? • Learning environment • 'On the job' • Hybrid approach

When? • Frequency • Timing relative to interventions

Who? • Selecting, training, and supporting raters

Rosen, Scheibel, Salas, Wu, Silvestri, & King, 2013

• What are the key decisions?
• What are the main options?
• What are the tradeoffs?
• What are the interdependencies?

Page 9:

How do you measure?


Method: Self-report (surveys)
• Strengths: familiarity, flexibility, established validity
• Challenges: temporal resolution, respondent burden

Method: Observation (behavioral markers)
• Strengths: objectivity, established validity
• Challenges: maintaining reliability, cost / logistics

Method: Social sensors (automated collection of social interaction data)
• Strengths: continuous / dynamic, low-cost
• Challenges: privacy / trust, complexity of data

Method: Activity traces (enduring data produced through task completion: email, ping, e-white boards, EMR use)
• Strengths: 'free data' (sort of), can characterize distributed interaction
• Challenges: privacy / trust, complexity of data

Page 10:

What do you measure?

Inputs
• Task characteristics: interdependence
• Org. context: culture

Mediators
• Action processes: communication, leadership, performance monitoring, back-up behavior, adaptation & learning
• Transition processes: planning, goal specification
• Interpersonal processes: conflict management
• Knowledge stock: shared mental models, transactive memory systems

Outputs
• Effectiveness: task outcomes, member satisfaction, viability
• Team learning outcomes: Δ knowledge, Δ skill, Δ attitudes

Page 11:

What do you measure?

[Diagram: meta-analytic correlations linking team inputs, mediators, and outputs to team effectiveness.
• Team cognition (shared mental models & transactive memory): 1ρ = .38
• Team behavior (action, transition, and interpersonal processes): 2ρ = .29; 1ρ = .43
• Team affect (cohesion, efficacy/potency): 3ρ efficacy = .35; 4ρ cohesion = .17/.31
Sources: 1 DeChurch & Mesmer-Magnus, 2010; 2 LePine et al., 2008; 3 Gully et al., 2002; 4 Beal et al., 2003]

Page 12:

Abstraction Hierarchy for Behavioral Markers

Rosen, Scheibel, Salas, Wu, Silvestri, & King, 2013

[Diagram: behavioral markers sit on an abstraction hierarchy from team/task generic to team/task specific. A generic teamwork competency (e.g., leadership) is specified within clinical domains (e.g., surgery, trauma) and then for a particular team and task or situation. Examples of resulting tools: OTAS; Mayo Trauma Team Performance Scale.]

Page 13:

Measurement Systems: From Markers to Metrics

[Diagram: as the behavioral specificity of the measurement content moves from abstract/generic to specific/concrete, the expectations for performance shift, and the knowledge burden moves between the rater/observer and the tool/protocol. The key question: where is the knowledge burden?]

Page 14:

SOCIAL SENSORS: An Example

Page 15:

Developing methods to measure healthcare team performance in acute and chronic care settings

Michael A. Rosen, PhD Assistant Professor, Armstrong Institute for Patient Safety and Quality

November 6, 2013

Page 16:

Capabilities

• IR and Bluetooth sensors
  – Proximity
  – Location
• Microphones
  – Speaking (yes/no) and conversational analysis
  – Pitch / volume
  – Actual audio
• Accelerometer
  – Activity
  – Posture

www.sociometricsolutions.com
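To make these capabilities concrete, below is a minimal sketch of how a single badge reading could be represented in code. This is an assumed, illustrative schema (the `BadgeSample` class and its field names are hypothetical, not the vendor's actual data format).

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class BadgeSample:
    """One time-stamped reading from a wearable badge (hypothetical schema)."""
    badge_id: str
    timestamp: datetime
    detected_badges: List[str] = field(default_factory=list)  # IR/Bluetooth proximity detections
    is_speaking: bool = False        # microphone: speaking yes/no
    mean_pitch_hz: float = 0.0       # microphone: pitch
    mean_volume_db: float = 0.0      # microphone: volume
    activity_level: float = 0.0      # accelerometer: movement magnitude
    posture_angle_deg: float = 0.0   # accelerometer: posture

sample = BadgeSample(
    badge_id="badge_07",
    timestamp=datetime(2013, 11, 6, 9, 30, 0),
    detected_badges=["badge_03", "badge_12"],
    is_speaking=True,
    mean_volume_db=62.5,
    activity_level=0.8,
)
print(sample.detected_badges)  # -> ['badge_03', 'badge_12']
```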

Page 17:

Emerging validity evidence

• Team inputs
  – Personality traits (~ r = .3 to .4)1,2

• Team mediators
  – High reliability with observational measures in the ED (r = .96, p < .001)3
  – Classification of trauma team tasks (87.5% accuracy)4

• Team mediators → outcomes
  – Face-to-face interaction time predicted LOS in the PACU (r = .53, p < .01)1


1. Olguin Olguin et al., 2009; 2. Mehl, Gosling & Pennebaker, 2006; 3. Kannampallil et al., 2011; 4. Vankipuram et al., 2011

Page 18:

Pilot work

• 'Micro' validity evidence generated
  – Sensor data covaries reasonably well with perceptions of interaction (r = .59, p < .01); a sketch of this type of check follows below.

• Data visualization
  – We can't analyze all of the complexity yet (more on this later), but we can see it.
  – Basis of the 'interaction mirror' intervention
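As a loose illustration of how sensor-derived interaction could be checked against perceived interaction, here is a small sketch using scipy; the dyad-level numbers below are toy values, not the pilot data.

```python
from scipy.stats import pearsonr

# Toy dyad-level data: badge-derived face-to-face minutes vs. survey-rated
# interaction frequency (1-5) for the same pairs of people.
sensor_minutes = [12.0, 3.5, 25.0, 8.0, 15.5, 1.0, 19.0, 6.5]
survey_ratings = [3, 2, 5, 3, 4, 1, 4, 2]

r, p = pearsonr(sensor_minutes, survey_ratings)
print(f"r = {r:.2f}, p = {p:.3f}")
```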

Page 19:

Bluetooth detections. Each blue segment is a person; each line is a connection between people over time.
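To show how pairwise Bluetooth detections can become the person-to-person connections in a plot like this, here is a minimal sketch using networkx; the detection log format and badge names are assumptions, not the badges' actual export format.

```python
import networkx as nx

# Hypothetical detection log: (timestamp_seconds, detecting_badge, detected_badge)
detections = [
    (0,   "badge_A", "badge_B"),
    (10,  "badge_A", "badge_B"),
    (10,  "badge_B", "badge_C"),
    (300, "badge_A", "badge_C"),
]

# Undirected graph; edge weight = number of co-detections between two people.
G = nx.Graph()
for _, src, dst in detections:
    if G.has_edge(src, dst):
        G[src][dst]["weight"] += 1
    else:
        G.add_edge(src, dst, weight=1)

for u, v, data in G.edges(data=True):
    print(f"{u} -- {v}: {data['weight']} detections")
```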

Page 20:

This data was collected over 6½ hours.

Page 21:

Time speaking. In addition to detections, we can extract a variety of information from the microphone data. Circos lets us visualize these metrics as histograms, heat maps, line graphs, etc.

Page 22:

Total number of seconds speaking over a 10-minute period.

Speech analysis. For each badge, the raw microphone data are parsed and each second of speech is categorized as: speaking, listening (the badge is silent in the proximity of another badge that is speaking), overlap (two badges in proximity are speaking), or silent (all badges in proximity are not speaking). This can be broken down to a 60-second time scale (or lower if needed).
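A minimal sketch of this per-second categorization, assuming each badge reports a speaking flag per second and a list of badges currently in proximity (the function and variable names here are illustrative):

```python
from typing import Dict, List

def categorize_second(badge_id: str,
                      speaking: Dict[str, bool],
                      proximity: Dict[str, List[str]]) -> str:
    """Label one second for one badge as speaking / overlap / listening / silent."""
    me = speaking.get(badge_id, False)
    neighbor_speaking = any(speaking.get(n, False) for n in proximity.get(badge_id, []))

    if me and neighbor_speaking:
        return "overlap"    # two badges in proximity speaking at once
    if me:
        return "speaking"
    if neighbor_speaking:
        return "listening"  # silent while a nearby badge is speaking
    return "silent"         # nobody in proximity is speaking

# Example: badge_A and badge_B are near each other and both speaking
speaking = {"badge_A": True, "badge_B": True, "badge_C": False}
proximity = {"badge_A": ["badge_B"], "badge_B": ["badge_A"], "badge_C": []}
print(categorize_second("badge_A", speaking, proximity))  # -> overlap
print(categorize_second("badge_C", speaking, proximity))  # -> silent
```

Per-second labels like these can then be summed into the 60-second or 10-minute totals shown on the slide.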

Page 23:

Additional data streams. While the badges collect a number of other metrics, the most immediately useful are activity level captured through an accelerometer, and various forms of voice analysis (the badges provide a form of spectral analysis).
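The badges' actual on-board processing is not described here, but as a generic illustration of what spectral voice features look like in code, the following sketch computes a crude pitch and energy estimate for one audio frame with numpy (the frame length, sample rate, and feature names are assumptions).

```python
import numpy as np

def spectral_features(frame: np.ndarray, sample_rate_hz: int = 8000) -> dict:
    """Crude per-frame voice features via an FFT (illustrative, not the badge firmware)."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate_hz)
    return {
        "dominant_freq_hz": float(freqs[np.argmax(spectrum)]),  # rough pitch proxy
        "energy": float(np.sum(spectrum ** 2)),                 # rough volume proxy
    }

# Example: a 25 ms frame of a synthetic 220 Hz tone
sr = 8000
t = np.arange(int(0.025 * sr)) / sr
frame = np.sin(2 * np.pi * 220 * t)
print(spectral_features(frame, sr))
```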

Page 24:


Heat map of mean volume level per 10-minute interval, standardized within person.

Heat map of mean activity level per 10-minute interval (across people).
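A sketch of how these 10-minute summaries could be computed with pandas, assuming a long-format table of per-second badge features (the column names and random data are illustrative):

```python
import numpy as np
import pandas as pd

# Hypothetical long-format data: one row per badge per second.
times = pd.date_range("2013-11-06 09:00", periods=3600, freq="s")
df = pd.DataFrame({
    "timestamp": np.tile(times, 2),
    "badge_id": ["badge_A"] * 3600 + ["badge_B"] * 3600,
    "volume": np.random.rand(7200),
    "activity": np.random.rand(7200),
})

# Mean volume per 10-minute interval, then standardized within person (z-score per badge).
vol_10min = (
    df.groupby(["badge_id", pd.Grouper(key="timestamp", freq="10min")])["volume"]
      .mean()
      .groupby(level="badge_id")
      .transform(lambda s: (s - s.mean()) / s.std())
)

# Mean activity level per 10-minute interval, pooled across people.
act_10min = df.groupby(pd.Grouper(key="timestamp", freq="10min"))["activity"].mean()

print(vol_10min.head())
print(act_10min.head())
```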

Page 25:

Needed research in this area

• Technical – improving sensor properties and performance

• Analytic – developing real-time predictive algorithms

• Psychometric – establishing validity and generalizability evidence

• Socio-cultural – building a culture of trust in sensor-based systems

• Interventional – feedback and alerting displays


Page 26:

Our next steps

• Iterative participatory design of feedback displays for different roles and levels: individual, team, and unit views for immediate feedback and analyzing trends over time.

• Validation of sensor-based measures against traditional gold standards for teamwork and workflow (self-report and observational methods).

• Development of predictive analytics for dynamic network data: advancing the methods of tensor decomposition of networked sensors (a rough sketch of the idea follows below).

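As a rough illustration of the idea in the last bullet, here is a sketch that builds a person x person x time interaction tensor from proximity counts and factors it with a CP (PARAFAC) decomposition using the open-source tensorly library. This is an assumed toy setup, not the project's actual analytic pipeline.

```python
import numpy as np
from tensorly.decomposition import parafac

# Toy interaction tensor: co-detection counts between 6 badges
# across 40 ten-minute windows (person x person x time).
rng = np.random.default_rng(0)
n_people, n_windows, rank = 6, 40, 3
tensor = rng.poisson(lam=2.0, size=(n_people, n_people, n_windows)).astype(float)
tensor = (tensor + tensor.transpose(1, 0, 2)) / 2  # symmetrize: interaction is mutual

# CP / PARAFAC decomposition into a few latent interaction patterns.
weights, factors = parafac(tensor, rank=rank)
people_factors, partner_factors, time_factors = factors

# Each component couples a set of people with a temporal activation profile.
for k in range(rank):
    top_people = np.argsort(people_factors[:, k])[::-1][:3]
    print(f"component {k}: most involved badges {top_people.tolist()}")
```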

Page 27:

Questions?

• Mike Rosen – [email protected]

Armstrong Institute for Patient Safety and Quality


Page 28:

Continuing the Discussion

• We invite you to join us in the upcoming Cyber Discussions. Remember, your participation is essential to shaping this research agenda.
• Save-the-dates:
  – Wednesday, March 19, 2014, 2:00 PM - 3:00 PM EST: Cooperation, Competition and Team Performance: Towards a Contingency Report, Dr. Stephen Humphrey
  – July 9, 2014, 2:00 PM - 3:00 PM EST: Team Based Measures in Primary Care, Dr. Richard Ricciardi
  – November 5, 2014, 2:00 PM - 3:00 PM EST: Research Priorities in Cancer Care Teams Research, Dr. Eduardo Salas
  – July 1, 2015, 2:00 PM - 3:00 PM EST: Team Cognition: Understanding the Factors That Drive Process and Performance, Dr. Steve Fiore
• To register, go to: http://dccps.nci.nih.gov/brp/pcrb/cyberseminars.html
• If you have questions, contact Veronica Chollette ([email protected])