Transcript

Seeking Evidence of Impact: Answering "How Do We Know?"

Veronica Diaz, PhD, Associate Director, EDUCAUSE Learning Initiative, EDUCAUSE
League for Innovation Learning College Summit, Phoenix, AZ

Today’s Talk

• Review what it means to seek evidence of impact of teaching and learning innovations

• Consider some strategies for using evaluation tools effectively

• Determine ways to use evidence to influence teaching practices

• Review ways to report results

Who are we?

Academic instruction
Faculty development
Instructional technology
Instructional design
Library
Information technology
Senior administration
Other

Why are we here?

I am currently working on evaluating T&L innovations.

Evaluation of T&L is part of my formal job description.

My campus unit's director or VP mandates our gathering evidence of impact.

A senior member of the administration (dean, president, senior vice president) mandates gathering evidence of impact in T&L.

I am working as part of a team to gather evidence.

Accreditation processes are prompting us to begin measurement work.

Why evidence of impact?

Technologies to Measure

• Web conferencing
• LMS and individual features
• Lecture capture
• Mobile learning tools (laptops, ebooks, tablets)
• Clickers
• Collaborative tools
• Student-generated content
• Web 2.0 and social networking technologies
• Learning spaces
• OER
• Personal learning environments
• Online learning: hyflex course design, blended learning programs, synchronous/asynchronous delivery modes, fully online programs
• Eportfolios
• Multimedia projects and tools: pod/vodcasts
• Simulations
• Early alert systems
• Cross-curricular information literacy programs
• Large course redesigns

Technologies and their connection/relationship to…

• Student engagement
• Learning-related interactions
• Shrinking the large class
• Improving student-to-faculty interaction
• Student retention and success
• Specific learning outcomes


3 most important indicators you use to measure the evidence of impact of technology-based innovations in T&L

What is “evidence”?

• Grades (frequently mentioned)
• Learning outcomes (frequently mentioned)
• Satisfaction
• Skills
• Improved course evaluations
• Measures of engagement and participation
• Retention/enrollment rates
• Graduation rate
• Direct measures of student performance (at the course level and cumulative)
• Interview data
• Institutional data
• Faculty/student technology utilization rates
• Data on student/faculty facility and satisfaction with using technology
• Successfully implementing technology
• Job placement
• Student artifacts
• Better faculty reviews by students
• Course redesign to integrate changes; impact on the ability to implement best pedagogical practice
• Rates of admission to graduate schools
• Success in more advanced courses

Methods/techniques you ROUTINELY USE for gathering evidence of impact of technology-based innovations in T&L

The most difficult tasks associated with measurement were ranked as follows:

1. Knowing where to begin to measure the impact of technology-based innovations in T&L

2. Knowing which measurement and evaluation techniques are most appropriate

3. Conducting the process of gathering evidence

4. Knowing the most effective way to analyze our evidence

5. Communicating to stakeholders the results of our analysis

I have worked with evaluative data (Yes/No):

• Course level (in my own course)
• Course level (across several course sections)
• At the program level (math, English)
• At the degree level
• Across institution or several programs
• Other


USING EVALUATION TOOLS EFFECTIVELY

Technologies and their connection/relationship to…

• Student engagement
• Learning-related interactions
• Shrinking the large class
• Improving student-to-faculty interaction
• Student retention and success
• Specific learning outcomes

Remember

Triangulate to tell the full story. The impact of a curricular innovation should be “visible” from a variety of perspectives and measurement techniques.


Three most commonly used evaluation tools: 1. questionnaires (paper or online), 2. interviews (individual or focus group), and 3. observations (classroom or online).

5 Steps

1. Establish the goals of the evaluation: What do you want to learn?

2. Determine your sample: Whom will you ask?
3. Choose methodology: How will you ask?
4. Create your instrument: What will you ask?
5. Pre-test the instrument: Are you getting what you need? (PILOT YOUR TOOLS/STRATEGIES)

What is a good question?

• Significance: It addresses a question or issue that is seen as important and relevant to the community

• Specificity: The question focuses on specific objectives
• Answerability: The question can be answered by data collection and analysis
• Connectedness: It’s linked to relevant research/theory
• Coherency: It provides coherent explanations that rule out counter-interpretations
• Objectivity: The question is free of bias

• Whom does your evidence need to persuade?

• Quantitative. This approach starts with a hypothesis (or theory or strong idea), and seeks to confirm it.

• Qualitative. These studies start with data and look to discover the strong idea or hypothesis through data analysis.

• Mixed. This approach combines the two above, pairing the confirmation of a hypothesis with data analysis, and provides multiple perspectives on complex topics.

– Example: starting with a qualitative study to gather data and identify a hypothesis, then following up with a quantitative study to confirm it.

The Double Loop

1. Implementing innovations and/or improvements
2. Conducting evaluation of those innovations
3. Adjusting the innovation based on the evaluation
4. Implementing the improvements

Methods? Support in data collection? Double loop?

USING EVIDENCE TO INFLUENCE TEACHING PRACTICES

“higher education institutions seem to have a good understanding of the assessment process through the use of rubrics, e-portfolios, and other mechanisms, but the difficulty seems to be in improving the yield of the assessment processes, which is more of a political or institutional culture issue”

Why We Measure

• Inward (course level, inform teaching, evaluate technology use, reflective)

• Outward
  – Share results with students
  – Share results with potential students
  – Share results with other faculty (in/out of discipline)
  – Share results at the institutional or departmental level (info literacy, writing, cross-course projects)
  – Results can be a strategic advantage

Lessons from Wabash National Study

• A 3-year research and assessment project
• Provides participating institutions extensive evidence about the teaching practices, student experiences, and institutional conditions that promote student growth across multiple outcomes

1. Inputs – the attitudes and values that students bring into college
2. Experiences – the experiences that impact students once they are in college
3. Outcomes – the impact that college has on student ability and knowledge

http://www.liberalarts.wabash.edu/wabash-study-2010-overview/

Measuring student learning and experience is the easiest step in the assessment process. The real challenge begins once faculty, staff, administrators, and students at institutions try to use the evidence to improve student learning.

www.learningoutcomesassessment.org/documents/Wabash_000.pdf

Lessons from Wabash National Study

Faulty assumptions about using evidence to improve:

• Lack of high-quality data is the primary obstacle to using assessment evidence to promote improvements
• Providing detailed reports of findings is the key mechanism for kicking off a sequence of events culminating in evidence-based improvements
• The intellectual approach that faculty and staff use in their scholarship facilitates assessment projects

Lessons from Wabash National Study

• Perform audits of your institution's information about student learning and experience

• Set aside resources for faculty, student & staff responses before assessment evidence is shared

• Develop communication plans to engage a range of campus representatives in data discussions

• Use conversations to identify 1-2 outcomes on which to focus improvement efforts

• Engage students in helping to make sense of and form responses to assessment evidence.

http://www.qmprogram.org/research

Research Grants: 2010 & 2011

Organizational Level Data

Learner Satisfaction

• Student satisfaction was higher in QM-reviewed courses & non-reviewed courses than in courses at non-QM institutions. (Aman dissertation, Oregon State, 2009)

• Course evaluation data showed student satisfaction increased in redesigned courses. (Prince George’s Community College, MD, 2005)

• Currently conducting a mixed-methods study of student & faculty perceptions of QM-reviewed courses. (University of the Rockies)

Student Learning

• Grades improved with improvements in learner-content interaction (result of review). (Community College of Southern Maryland, 2005)

• Differences approaching significance on outcome measures. (Swan, Matthews, Bogle, Boles, & Day, University of Illinois/Springfield, 2010+)

• QM Rubric implementation had a positive effect on student higher-order cognitive presence & discussion forum grades via higher teaching presence. (Hall, Delgado Community College, LA, 2010)

Organizational Level Data

Teacher Learning

• Use of QM design standards led to “development of a quality product, as defined by faculty, course designers, administrators, and students, primarily through faculty professional development and exposure to instructional design principles” (p. 214). (Greenberg dissertation, Ohio State, 2010)

• Currently utilizing the TPACK framework to explain the process by which new online teachers use the QM rubric and process when designing an online course. (University of Akron)

Organizational Learning

• There may be a carryover effect to non-reviewed courses when institution commits to the QM standards. (Aman dissertation, Oregon State, 2009)

• Faculty/design teams respond differently when QM is presented as a rule rather than a guideline. (Greenberg dissertation, Ohio State, 2010)

• Extended positive impact on faculty developers & on members of review teams. (Preliminary analysis 2009; comprehensive summer 2011)

• Alignment in the curriculum between course objectives, goals, and assessments

• Faculty members identify which assignments they have aligned with learning objectives

• Design rubrics or instructions to prompt them at various data-collection points

• Departments or colleges are asked to report data online at the end of each term with prompts for comparison and reflection

• Doing so makes the data ready for larger-scale assessment efforts

Session Recording and Resources:
http://net.educause.edu/Program/1027812?PRODUCT_CODE=ELI113/GS12

What organizational mechanisms do you have in place to measure outcomes?

REPORTING RESULTS

• Match your research design to the type of information in which your anticipated consumers are interested or to which they will best respond.

• Match your data-collection method to the type of data to which your information consumer will respond or is likely to respect.

• Keep it simple, to the point, and brief. Know who is consuming your data or research report, who the decision makers are, and how your data is being used to make which decisions, if any.

• Although time-consuming, it might be worthwhile to tailor your reports or analysis to the audience so as to emphasize certain findings or provide a deeper analysis on certain sections of interest.

GOOD RESEARCH: TIPS AND TRICKS

• Be careful of collecting too much data

– Be aware of reaching the point at which you are no longer learning anything from the data

• Write up and analyze your data as soon as possible

– Consider recording the interviews or your own observations/notes

• Record interviews or focus groups--even your own observations or impressions immediately following the interaction

• Besides all the usual good reasons for not reinventing the wheel, using others’ tested surveys, tools, or methods gives you a point of comparison for your own data
  – http://www.educause.edu/Resources/ECARStudyofUndergraduateStuden/217333

• When collecting data, talk to the right people
• Don’t overschedule

– Be sure to space out interviews, focus sessions, observations or other tactics so that you can get the most from your interactions

Guiding Questions or Next Steps

• Who are the key stakeholders for the innovative teaching and learning projects in which I am involved?

• How can I help faculty members communicate the results of their instructional innovations to a) students, b) administrators, and c) their professional communities?

• What “evidence” indicators do my key stakeholders value most (i.e., grades, satisfaction, retention, others)?

• Which research professionals or institutional research collection units can assist me in my data collection, analysis and reporting efforts?

Collecting Cases

• Project Overview
  – Project goals, context, and design
  – Data collection methods
  – Data analysis methods
  – Findings
  – Communication of results
  – Influence on campus practices
• Reflection on Design, Methodology, and Effectiveness
  – Project setup and design
  – Project data collection and analysis
  – Effectiveness and influence on campus practices
• Project Contacts
• Supporting Materials

Online Spring Focus Session, April 2011: http://net.educause.edu/eli113

Read about the initiative: http://www.educause.edu/ELI/SEI

Get involved: http://www.educause.edu/ELI/SeekingEvidenceofImpact/OpportunitiesforEngagement/206626

Join the ELI Evidence of Impact Constituent Group: http://www.educause.edu/cg/EVIDENCE-IMPACT

SEI Focus Session Content

• These items for the 2011 Online Spring Focus Session on seeking evidence of impact can be found at http://net.educause.edu/eli113.
• ELI Seeking Evidence of Impact Resource List, including websites, reports, articles, and research: http://net.educause.edu/section_params/conf/ELI113/SFSResourceListAllFINAL.pdf.

• ELI Seeking Evidence of Impact Discussion Questions: http://net.educause.edu/section_params/conf/ELI113/discussion_prompts_team-indiv2011.doc.

• ELI Seeking Evidence of Impact Activity Workbook, Day 1 and 2: http://net.educause.edu/section_params/conf/ELI113/activity_prompts_team-indiv2011.doc.

• ELI Seeking Evidence of Impact Reflection Worksheet: http://net.educause.edu/section_params/conf/eli103/reflection_worksheet.doc.

• Presentation slides and resources for all sessions can be found at http://net.educause.edu/eli113/2011ELIOnlineSpringFocusSessionRecordings/1028384.

Other Related Resources

• Focus Session Learning Commons: http://elifocus.ning.com/
• Full focus session online program: http://net.educause.edu/Program/1027810
• ELI Seeking Evidence of Impact initiative site: http://www.educause.edu/ELI/SEI
  – Resource site: http://www.educause.edu/ELI/SeekingEvidenceofImpact/Resources/206625
  – Suggest an additional resource: http://tinyurl.com/resourceidea
  – Get involved: http://www.educause.edu/ELI/SeekingEvidenceofImpact/OpportunitiesforEngagement/206626
  – Contribute: http://tinyurl.com/elisei

CONTACT INFORMATION

Veronica M. Diaz, PhD
Associate Director, EDUCAUSE Learning Initiative
vdiaz@educause.edu

Copyright Veronica Diaz, 2011. This work is the intellectual property of the author. Permission is granted for this material to be shared for non-commercial, educational purposes, provided that this copyright statement appears on the reproduced materials and notice is given that the copying is by permission of the author. To disseminate otherwise or to republish requires written permission from the author.
