John Sutton Principal Investigators Meeting – MSP FY 12 Washington, DC December 16, 2013
Transcript
Page 1: John Sutton Principal Investigators Meeting – MSP FY 12 Washington, DC December 16, 2013.

John Sutton

Principal Investigators Meeting – MSP FY 12
Washington, DC

December 16, 2013

Page 2

Any opinions, suggestions, and conclusions or recommendations expressed in this presentation are those of the presenter and do not necessarily reflect the views of the National Science Foundation; NSF has not approved or endorsed its content. This project is funded through the NSF Research and Technical Assistance (RETA) program (DRL 1238120).

Page 3

Professional Learning Network for

Mathematics and Science Partnership Projects

• Learn and Share: Challenges and Successes

• Improve Skills

• Engage in Reflective Evaluation

Page 4

The Goal of the TEAMS project is to:

Strengthen the quality of MSP project evaluations and build the capacity of the evaluators by strengthening their skills related to evaluation design, methodology, analysis, and reporting.

Page 5

Promoting MSP Effectiveness Through Evaluation

MSP projects represent a major federal effort to support advancements in science, technology, engineering, and mathematics (STEM) disciplines and careers. Recognizing the vital role of evaluation in this national effort to promote STEM disciplines and careers, NSF MSP projects have an obligation to ensure their project evaluation is designed and conducted in a rigorous manner.

Page 6

Promoting MSP Effectiveness Through Evaluation

Regardless of funding sources, project evaluation plays a vital role in every Mathematics and Science Partnership (MSP) project by:

• Assessing the degree to which projects attain their goals and objectives;

• Advancing the field by sharing lessons learned and evaluation findings; and

• Improving the overall effectiveness of the project through formative evaluation.

Page 7

Promoting MSP Effectiveness Through Evaluation

Technical Evaluation Assistance in Mathematics and Science (TEAMS):

• Fosters increased understanding of evaluation design and implementation, in particular new and innovative methodologies.

• Promotes the use of longitudinal data systems in MSP evaluations.

• Strengthens the role of evaluation as a means of improving project effectiveness and contributing to the knowledge of the field.

Page 8

Meeting the Needs of MSP Evaluation

Technical Evaluation Assistance in Mathematics and Science (TEAMS):

• Works closely with the NSF staff to develop and implement strategies to encourage innovation and increased rigor in MSP evaluations.

• Conducts ongoing needs assessment to identify issues that pose challenges for the work of evaluators of MSP projects.

• Offers no-cost technical assistance to address these issues and challenges.

• Provides venues for MSP evaluators and project staff to share strategies and findings from MSP evaluations.

Page 9

Meeting the Needs of MSP Evaluation
Evaluation Approaches

Often, external evaluations provide:

– Formative feedback to improve projects and suggest mid-course corrections

– Summative reporting of project outcomes and impacts

– Project monitoring for accountability

Page 10

Resources to Inform Evaluation

Institute of Education Sciences, U.S. Department of Education, and National Science Foundation. (2013). Common Guidelines for Education Research and Development. Washington, DC: IES and NSF.

Frechtling, J. (2010). The 2010 User-Friendly Handbook for Project Evaluation. REC 99-12175. Arlington, VA: National Science Foundation.

Page 11

Resources to Inform Evaluation

Heck, D.J. & Minner, D.D. (2010). Technical report: Standards of evidence for empirical research, math and science partnership knowledge management and dissemination. Chapel Hill, NC: Horizon Research, Inc.

Guthrie, Wamae, Diepeveen, Wooding, & Grant. (2013). Measuring research: A guide to research evaluation frameworks and tools. RAND Europe.

Page 12

Meeting the Needs of MSP Evaluation
Research Types

• Foundational Research
• Early-Stage or Exploratory Research
• Design and Development Research
• Efficacy Research
• Effectiveness Research
• Scale-Up Research

Each of these types of research has different evaluation purposes and requires different types of evaluation approaches.

Page 13

Meeting the Needs of MSP Evaluation
Measuring Research: Key Rationales

Advocacy – Demonstrate the benefits of supporting research, enhance understanding of research and its processes among policymakers and the public, and make the case for policy and practice change.

Accountability – Show that money and other resources have been used efficiently and effectively, and hold researchers accountable.

Analysis – Understand how and why research is effective and how it can be better supported, feeding into research strategy and decision-making by providing a stronger evidence base.

Allocation – Determine where best to allocate funds in the future, making the best possible use of limited funding.

Page 14

Meeting the Needs of MSP Evaluation
Standards of Evidence

Specify indicators for empirical evidence in six domains:

1. Adequate documentation
2. Internal validity
3. Analytic precision
4. Generalizability/external validity
5. Overall fit
6. Warrants for claims

Page 15

Meeting the Needs of MSP Evaluation
Results of Needs Assessment Survey 11/2013

Experienced with MSP Evaluations

[Pie chart: respondents' experience with MSP evaluations. Categories: no prior experience, a little experience, somewhat experienced, very experienced; segment values 46%, 22%, 17%, and 15% (the category-to-segment pairing is not recoverable from the extracted text).]

Page 16

Meeting the Needs of MSP Evaluation
Results of Needs Assessment Survey 11/2013

Challenge Posed for Each Aspect of Evaluation

• Instrumentation (38%)
• Theory of Action and Logic Model (27%)
• Establishing Comparison Groups (24%)
• Evaluation Design (24%)
• Sampling (19%)
• Measurable Outcomes and Evaluation Questions (19%)
• Data Analysis Methodology (16%)
• Data Collection (16%)
• Reporting (14%)

Page 17

Meeting the Needs of MSP Evaluation
Results of Needs Assessment Survey 11/2013

Other Evaluation Challenges

• Instruments
  o Instruments for Science and Engineering
  o Instruments Aligned to State Standards
  o Instruments Aligned to Content of MSP
• Valid and Reliable Performance Tasks
• Classroom Observation Protocols

Page 18

Meeting the Needs of MSP Evaluation
Results of Needs Assessment Survey 11/2013

Where Additional Assistance Needed

• Comparison Groups in Rural Settings
• Random Groups/Comparison Groups
• Large Enough Sample Size/Strategies for Random Selection
• Evaluation Design and Measurable Outcomes for New Projects
• Data Collection/Statewide Task
• Excessive Evaluation of Students and Teachers
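Since "large enough sample size" recurs as a challenge, a minimal sketch of the standard back-of-the-envelope calculation may help: per-group sample size for a two-group comparison via the two-sided normal approximation, n ≈ 2((z₁₋α/₂ + z₁₋β)/d)². This is a generic statistical sketch (standard library only), not a TEAMS tool, and the effect-size values are illustrative.

```python
from math import ceil
from statistics import NormalDist

def n_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate per-group sample size for detecting a standardized
    mean difference (Cohen's d) with a two-sided test, normal approximation."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for power = 0.80
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# A medium effect (d = 0.5) needs far fewer participants than a small one (d = 0.2)
print(n_per_group(0.5))  # ≈ 63 per group
print(n_per_group(0.2))
```

The approximation slightly understates the exact t-test requirement for small groups, but it conveys why detecting small effects in rural or small-district settings demands large samples.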

Page 19

Meeting the Needs of MSP Evaluation
Strategic Plan

Tasks

Task 1: Intranet – Project Internal Storage and Retrieval Structure
Task 2: Website – teams.mspnet.org
Task 3: Outreach – Ongoing Communications
Task 4: National Advisory Board – Guidance and Review
Task 5: Help Desk – Quick Response to Queries
Task 6: Document Review – Identify Commonalities, Develop Resources
Task 7: Webinars – Topics to Inform

Page 20

Meeting the Needs of MSP Evaluation
Strategic Plan

Tasks

Task 8: Communities of Practice – Guided Discussions Around Evaluation Topics
Task 9: Direct Technical Assistance – Strategies and Activities at the Project Level
Task 10: National Conferences – Presentations to Inform Others' Work
Task 11: Annual Meeting – Focus on Evaluation
Task 12: Data Sources – Information About Data Sets and Utility
Task 13: Instrument Review – Share Information About What Is Being Used, by Whom, and for What

Page 21

Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance

Task 3: Outreach

• Principal Investigators receive TEAMS communications so they know what resources and technical assistance are available.

• Identify additional resources, templates, processes, and measures being used by the project for sharing with other MSP project PIs and evaluators.

• Communicate with TEAMS regarding specific project needs for information and technical assistance.

Task 5: Help Desk

• Encourage project staff and evaluators to pose queries for TEAMS to answer.

Task 6: Document Review

• Based on PI review of reports, especially challenges identified by evaluator, contact TEAMS staff for follow-up resources or technical assistance.

Page 22

Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance

Task 7: Webinars

• Invitations sent to PIs and evaluators to participate in webinars.

• Identify topics for which webinars can be prepared and provided and communicate that to TEAMS.

• Encourage your evaluator and project staff to present/participate in offered webinars.

Task 8: Communities of Practice

• Based on PI review of reports, especially challenges and needs identified by individual project, recommend possible topics to TEAMS staff.

• Consider participation and encourage project staff and evaluator to participate in discussions.


Page 23

Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance

Task 9: Direct Technical Assistance

• Based on insights and familiarity with individual project, including review of reports, contact TEAMS staff for follow-up with specific technical assistance and resources.

• Identify Evaluation topics for which technical assistance could be provided to project staff and evaluators.

Task 10: National Conferences

• Share information with TEAMS about upcoming presentations from your project, especially if related to evaluation.

• TEAMS staff could help post presentations to share interesting findings from your project.


Page 24

Tier Definitions

Tier 1
Group: Evaluators and researchers of projects other than NSF- and ED-funded MSP projects
Services: Access to website that provides links to available evaluation research and resources, research briefs, and other TEAMS publications

Tier 2
Group: Evaluators of NSF- and ED-funded MSP projects and external evaluators of other projects
Services: Help Desk services (Task 5); Webinars (Task 7); Communities of practice (Task 8)

Tier 3
Group: Evaluators of NSF-funded MSP projects
Services: Annual Conference (Task 11)

Tier 4
Group: Evaluators of NSF-funded MSP projects that are confronting specific challenges
Services: Communities of practice specifically for Tier 4 projects with common needs (Tasks 8 & 9); Direct technical assistance (Task 9)

Page 25

Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance

Task 11: TEAMS Annual Meeting

• Help identify changes in project staff
• Help identify specific projects to highlight and participate
• Help promote participation in meetings (allow resources to be used for this purpose)

Task 12: Data Sources

• Identify projects that are using public databases in their reporting
• Share information about projects asking about use of public databases

Page 26

Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance

Task 13: Instrument Review

• Contact TEAMS with queries regarding specific instruments for specific uses.
• Share information with TEAMS about challenges encountered with instruments.
• Identify and share unique instruments being used in the project.
• Consider using instruments from other projects as appropriate.

Page 27

Meeting the Needs of MSP Evaluation
Principal Investigator Needs and Assistance

In summary, Principal Investigators can:

• Identify needs;
• Share information between projects and TEAMS;
• Encourage involvement;
• Facilitate communication; and
• Promote high-quality evaluation approaches.

Page 28

Meeting the Needs of MSP Evaluation
Website (http://teams.mspnet.org) and Help Desk

Page 29

Meeting the Needs of MSP Evaluation
Website (http://teams.mspnet.org) and Help Desk

Page 30

Meeting the Needs of MSP Evaluation
Instruments

Considerations

• Using measures of established quality vs. alignment to the specific goals/approaches of the project
  o Internally developed & piloted instruments
  o Externally developed & validated instruments
  o Collection & analysis of teacher work from the PD

Page 31

Meeting the Needs of MSP Evaluation
Instruments

Benefits

• Internally developed instruments can help demonstrate that results were what was intended and promised.
• Externally validated instruments can help demonstrate that findings are credible and more broadly important.
• Use of multiple instruments provides triangulation of data for findings.
• Use of internally developed instruments and teacher work samples can help in refining the program and informing providers about participants' learning.

Page 32

Meeting the Needs of MSP Evaluation
Instruments

Lessons Learned

• As evaluation informs the project and the project evolves, instrument changes are sometimes required.
• Modify instruments (adding and/or removing items over time) and align data sets after modifications to keep up with evolving project needs.
• Add new instruments or remove instruments when initial instrumentation isn't providing appropriate data (e.g., on teacher knowledge).
• Verify instrument validity and reliability after modifications and include that information in reports.

Page 33

Meeting the Needs of MSP Evaluation
Develop a Conceptual Model of the Project and Identify Key Evaluation Points

Theory of Action

• Why This/Hypothesis
  o Based on interpretation of current research
• Describes the experience of the intended audience
  o Cognitively or behaviorally
• Expected Outcome
  o If This/Then This

Page 34

Meeting the Needs of MSP Evaluation
Develop a Conceptual Model of the Project and Identify Key Evaluation Points

Model Components

• Inputs
• Activities
• Outputs
• Short-term Outcomes
• Long-term Outcomes
• Contextual Factors

Page 35

Meeting the Needs of MSP Evaluation
Example of Logic Model

Page 36

Meeting the Needs of MSP Evaluation
Develop an Evaluation Plan

Steps

• Determining what type of design is required to answer the questions posed
• Selecting a methodological approach and data collection instruments
• Selecting a comparison group
• Timing, sequencing, and frequency of data collection

Page 37

Meeting the Needs of MSP Evaluation
Develop Evaluation Questions and Define Measurable Outcomes

Steps

• Identify key stakeholders and audiences
• Formulate potential evaluation questions of interest to the stakeholders and audiences
• Define outcomes in measurable terms
• Prioritize and eliminate questions

Page 38

Meeting the Needs of MSP Evaluation
Conducting the Data Collection

Considerations

• Obtain necessary clearances and permissions.
• Consider the needs and sensitivities of the respondents.
• Make sure your data collectors are adequately trained and will operate in an objective, unbiased manner.
• Obtain data from as many members of your sample as possible.
• Cause as little disruption as possible to the ongoing effort.

Page 39

Meeting the Needs of MSP Evaluation
Analyzing the Data

Considerations

• Check the raw data and prepare them for analysis.
• Conduct initial analysis based on the evaluation plan.
• Conduct additional analyses based on the initial results.
• Integrate and synthesize findings.

Page 40

Meeting the Needs of MSP Evaluation
Standards of Evidence and Brief Descriptions

Analytic Precision

Indicators:
• Measurement Validity/Logic of Research Process
• Reliable Measures/Trustworthy Techniques
• Appropriate and Systematic Analysis

Description: The extent to which the findings of a study were generated from systematic, transparent, accurate, and thorough analyses.

Page 41

Meeting the Needs of MSP Evaluation
Standards of Evidence and Brief Descriptions

Analytic Precision

Indicators:
• Unit of Analysis Issues
• Power
• Effect Size
• Multiple Instruments
• Multiple Respondents
• All Results

Description: The extent to which the findings of a study were generated from systematic, transparent, accurate, and thorough analyses.
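As a brief illustration of the effect-size indicator above, a minimal sketch (standard library only) computing Cohen's d, the standardized mean difference between a treatment and a comparison group. The score lists are hypothetical values for illustration, not MSP data.

```python
from statistics import mean, variance
from math import sqrt

def cohens_d(treatment, comparison):
    """Cohen's d: difference in group means divided by the pooled sample SD."""
    nt, nc = len(treatment), len(comparison)
    pooled_var = ((nt - 1) * variance(treatment)
                  + (nc - 1) * variance(comparison)) / (nt + nc - 2)
    return (mean(treatment) - mean(comparison)) / sqrt(pooled_var)

# Hypothetical post-test scores (illustration only)
treated = [78, 82, 88, 74, 90, 85]
control = [70, 75, 80, 72, 77, 74]
print(round(cohens_d(treated, control), 2))
```

Reporting a standardized effect size alongside significance tests lets findings be compared across instruments and projects, which is why it appears among the analytic-precision indicators.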

Page 42

Meeting the Needs of MSP Evaluation
Reporting the Findings

Considerations

• Background (context, sites, intervention, etc.)
• Evaluation study questions
• Evaluation procedures (description of measures used and purposes)
• Study sites and sample demographics
• Data collection (administration, participant counts, timelines for acquiring data, etc.)
• Data analyses (what methods for what measures, limitations, missing data, etc.)
• Findings
• Conclusions (and recommendations)

Page 43

Meeting the Needs of MSP Evaluation
Standards of Evidence and Brief Descriptions

Generalizability/External Validity

Indicators:
• Findings for Whom
• Generalizable to Population or Theory
• Generalizable to Different Contexts

Description: The extent to which you can come to conclusions about one thing (e.g., a population) based on information about another (e.g., a sample).

Page 44

Meeting the Needs of MSP Evaluation
Disseminate the Information

Considerations

• The funding source(s)
• Potential funding sources
• Others involved with similar projects or areas of research
• Community members, especially those who are directly involved with the project or might be involved
• Members of the business or political community, etc.

Page 45

Meeting the Needs of MSP Evaluation
Standards of Evidence and Brief Descriptions

Warrants for Claims

Indicators:
• Limitations
• Decay and Delay of the Effect
• Efficacy
• Conclusions/Implications Logically Drawn from Findings

Description: The extent to which the data interpretation, conclusions, and recommendations are justifiable based on the evidence presented.

Page 46

Meeting the Needs of MSP Evaluation
Evaluation Topics and Components to Consider

Evaluation Design Component: Development of a conceptual model (logic model) of the program
Evaluation Topics:
• Develop logic model
• Identify contextual conditions

Evaluation Design Component: Development of evaluation questions and measurable outcomes
Evaluation Topics:
• Articulate goals clearly
• Define multiple achievement outcomes

Page 47

Meeting the Needs of MSP Evaluation
Evaluation Topics and Components to Consider

Evaluation Design Component: Development of the evaluation design
Evaluation Topics:
• Address shifting project and evaluation priorities

Evaluation Design Component: Collection of data
Evaluation Topics:
• Format measures (hard-copy, electronic, etc.) and schedule administration
• Display data effectively
• Data management

Page 48

Meeting the Needs of MSP Evaluation
Evaluation Topics and Components to Consider

Evaluation Design Component: Analysis of data
Evaluation Topics:
• Conduct appropriate data analyses to respond to evaluation questions

Evaluation Design Component: Provision of information to interested audiences
Evaluation Topics:
• Report intended impact on various populations
• Report findings to different audiences

Page 49

Meeting the Needs of MSP Evaluation
Ongoing Needs Assessment

• At your tables, please write down one or two anticipated evaluation challenges and/or needs for which your project may want assistance with project/program evaluation.

Page 50

Meeting the Needs of MSP Evaluation

What Questions Do You Have Regarding TEAMS?

TEAMS contact information: teams.mspnet.org

Page 51

TEAMS Contacts

John T. Sutton, PI
RMC Research Corporation
633 17th Street, Suite 2100
Denver, CO 80202-1620
Phone: 303-825-3636
Toll Free: 800-922-3636
Fax: 303-825-1626
Email: [email protected]

Dave Weaver, Co-PI
RMC Research Corporation
111 SW Columbia Street, Suite 1030
Portland, OR 97201-5883
Phone: 503-223-8248
Toll Free: 800-788-1887
Fax: 503-223-8399
Email: [email protected]