Page 1: belanger_aea_2007

Aligning Evaluation of Instructional Technology Programs to the Innovation Cycle

Yvonne Belanger

Head, Program Evaluation
Duke Center for Instructional Technology
[email protected]

Page 2

Key challenges for IT in higher education

Engaging the full range of stakeholders
Promoting well-informed decision-making
Helping technology initiatives move from pilot to full support

Page 3

Presentation Overview

Contextual factors and background

Evolution of the Duke Digital Initiative

Evaluation Process and Findings

Linking evaluation to the innovation cycle

Page 4

Program Context

Duke University
Institutional characteristics
Student characteristics
Technology environment

Page 5

Evaluator context

Center for Instructional Technology
“Supports the academic mission of Duke University by helping instructors find innovative ways to use technology to achieve their teaching goals”
Part of Perkins Library

Page 6

Evolution of the Duke Digital Initiative

2004-05: First Year iPod Experience

Key precursor events:
2001 - Strategic plan, Building on Excellence
2002 - New CIO

Page 7

First Year iPod Program: Evaluation goals

Documenting innovative technology uses
Detecting unanticipated outcomes
Engaging stakeholders in program review

Page 8

Contextual factors of iPod evaluation

Intense media scrutiny and interest from other educational institutions
Internal skepticism from faculty and students about program value
Senior administrators' need for decision-making information about program continuation

Page 9

Evaluation methodology

Mixed method (primarily qualitative)
Participatory

Key data sources:

Classroom observations
Faculty project reports
Survey of first-year students and faculty
Faculty and student focus groups

Page 10

First Year iPod Program Evaluation Findings

For faculty…
Spurred faculty to incorporate course content via digital media
Engaged new groups of faculty
Improved faculty responsiveness to individual needs
Fostered flexible and efficient course delivery mechanisms (time saving)

75% of first-year students used their iPod in a class or for independent support of their studies

Page 11

First Year iPod Program Evaluation Findings

For students…
Promoted the transition to interactive, project-based, & collaborative learning
Increased student engagement in class activities
Freedom from place-based resources

Most popular feature of the iPod for academic use? Recording digital audio

Page 12

First Year iPod Program Evaluation Findings

For the university…
Supported research and writing in the undergraduate curriculum
Heightened interest in digital audio recording, especially classroom audio capture
Copyright and IP issues became substantially more difficult
“Stress testing” of technology infrastructure
Program sparked many substantive collaborations with universities from around the world, government agencies, and corporations
Significant publicity

Page 13

Transition to “Duke Digital Initiative”

Innovative and effective teaching
Curriculum enhancement
Infrastructure development
Knowledge sharing within and outside Duke

Change in scope (new technologies)

Articulation of phased program model…

Shift to more formal goals and academic focus

Page 14

DDI phased program model

Phase 1 - Experimentation: pilot projects and introduction of new technology
Phase 2 - Extension and Transition: development of infrastructure and support models for promising technologies
Phase 3 - Standard Support and Integration: service/technology widely supported for valuable technologies

Page 15

Aligning evaluation to DDI program model

Phase 1 - Experimentation: Documenting innovative use; Detecting unanticipated outcomes; Measuring short-term impact; “Comprehensive scanning” (Chen, 2005)

Phase 2 - Extension and Transition: Identifying and documenting stable use patterns; Defining use cases for institutional support; Assessing the impact of proposed changes on key stakeholders

Phase 3 - Standard Support and Integration: Confirming sustainability of support models; Measuring long-term impact

Page 16

How our evaluation strategies evolve with the innovation cycle

Goal seeking → Goal driven
Discovery oriented → Confirmatory
Implementation focused → Outcomes focused
Narrow project focus → Broad program perspective
Formative → Summative

Page 17

Aligning evaluation strategies to project maturity: iPod program example

Data collection methods by phase:

Phase 1 (AY2004-2005)
Program level: Faculty feedback sessions; Student surveys
Project level: Focus groups; Observations; Interviews; Tailored surveys

Phase 2 (AY2005-2006)
Program level: Faculty feedback sessions; Student surveys
Project level: Student surveys; Faculty reports

Phase 3 (AY2006-2007)
Program level: Faculty feedback sessions; Student & faculty surveys; Cross-unit staff feedback sessions
Project level: Faculty reports; More generic surveys

Page 18

DDI: Where are we now?

Increased use of multimedia and demand for multimedia support services
Successful and increasing use of tablet PCs and iPods for enhanced classroom presentation and multimedia display
Broader integration of student multimedia projects into coursework

Source: 2006-07 DDI Program Annual Report

Page 19

Growth of iPod Use 2004-2007

[Chart: Number of iPod Course Sections by Subject Area (Sciences, Social Sciences, Humanities, Languages), Fall 2004 - Spring 2007]

Page 20

So what have we really learned?

Evaluation has a key role to play in planning
Developing systematic evaluation and monitoring processes that are sustainable, given our evaluation capacity
Reporting failures as well as successes enhances your credibility

Page 21

Unresolved challenges

Hand-offs from evaluation staff to program staff for ongoing monitoring
Dealing with blurry edges of the program
Finding resources to investigate long-term impacts
Financial sustainability

Page 22

Credits

The Duke Digital Initiative is a joint project of

Office of the Provost

Office of the Executive Vice President

Office of Information Technology

Division of Student Affairs

Center for Instructional Technology

Duke Libraries

Duke Computer Store

And many individuals, both internal and external to Duke

Photographs: Mark Zupan, Perkins Library

Page 23

Related Websites

Duke Center for Instructional Technology: http://cit.duke.edu/

Information about the Duke Digital Initiative: http://www.duke.edu/ddi/

DDI Faculty Projects: http://cit.duke.edu/help/ddi/archive.html

CIT Evaluation Reports (including DDI): http://cit.duke.edu/reports