RAE Review: Interim feedback from facilitated workshops
Jonathan Grant and Steven Wooding
Page 1

RAE Review:

Interim feedback from facilitated workshops

Jonathan Grant and Steven Wooding

Page 2

Plan

Introduction
  Who we are and what we do
  Where we went and who we spoke to
  What we did and why we did it

Results from 6/9 workshops
  Task 1: What is quality research? How should it be assessed?
  Task 2: Assessing model systems
  Task 3: Building a better system
  Task 4: Implementation and implications

Some emerging observations
  Preference for expert review, but needs refining
  Increased transparency and clarity of process
  Tension between comparability and appropriateness
  Need structures to support change

Page 3

Who we are and what we do

RAND Europe

Independent not-for-profit public policy think-tank

‘Cousin’ of RAND, a US-based independent think-tank employing 1,600 researchers

RAND Europe: established in Leiden (NL) in 1992; in Cambridge in 2001

In the UK, programmes include Transport, Information Society, Health and R&D Policy

Current projects include a value-for-money (VFM) study of government department research for the NAO, and a study of scientific mobility for the WHO

Page 4

Where we went and who we spoke to

Based on 6/9 workshops*

93 people

Positions
  Administrators: 26
  Senior academics: 38
  Academics: 13
  Research fellows: 8
  Unclassified: 8

Fields
  Medicine, science & engineering: 32
  Social science: 16
  Arts and humanities: 15
  Not research active: 29
  Unclassified: 1

*Excludes Cambridge, Reading & Belfast (n=c50)

Page 5

What we did and why we did it

Facilitated workshops

Provide a framework for structured thinking

Capture outputs in a standard and comparable form

Allow comparison between mixed and like groups of people (e.g., administrators only vs. a mix of HoDs, fellows, and research officers)

Purpose is to listen, not evaluate

Page 6

Agenda

Task 1: What is quality research? How should it be assessed?

Task 2: Assessing model systems

Task 3: Building a better system

Task 4: Implementation and implications

Page 7

Task 1: What is high quality research? How should it be assessed?

Purpose

Stimulate wide-ranging thinking on the most important aspects marking out high-quality research and research assessment systems

Task 1

Introductions

Identify 5 characteristics of high quality research

Identify 5 characteristics of a research assessment system

Vote (5 votes for each; allocated as seen fit; see the tally sketch below)
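A minimal sketch of how such cumulative votes might be tallied; the ballots and characteristic names here are hypothetical illustrations, not workshop data:

```python
from collections import Counter

# Each participant has 5 votes and may allocate them across
# characteristics as they see fit (cumulative voting).
ballots = [
    {"Original": 3, "Rigour": 2},
    {"International": 5},
    {"Rigour": 1, "Innovative": 4},
]

tally = Counter()
for ballot in ballots:
    assert sum(ballot.values()) == 5  # every participant casts exactly 5 votes
    tally.update(ballot)              # Counter adds the vote counts

# Rank characteristics by total votes, as in the 'Top 10' charts that follow
for characteristic, votes in tally.most_common(10):
    print(f"{characteristic}: {votes}")
```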

Page 8

‘Top 10’ characteristics of high quality research

[Bar chart of vote totals, scale 0-40, for each characteristic]

Innovative
Longevity
Value for Money
High Quality Publication
Academic Impact
Rigour
Peer Recognition
Defining Research Agenda
Original
International

Page 9

‘Top 10’ characteristics of research assessment systems

[Bar chart of vote totals, scale 0-70, for each characteristic]

Resists Game Playing
Cost Effective
Advance
Not Burdensome
Rigour
Quality not Quantity
Comparable
Explicit
Flexible
Transparent

Page 10

Task 2: Assessing model systems

Purpose

Evaluate the strengths and weaknesses of 4 ‘model systems’: expert review; algorithms/metrics; self-assessment; and historical ratings

Task 2

Split into 4 groups of c4-5

2 groups look at ‘good’ aspects of 2 systems each

2 groups look at ‘bad’ aspects of 2 systems each

Also identify questions that need answering

Page 11

Algorithms / metrics

Selected questions
  How to recognise novelty?
  What do you count?
  How to ensure comparability?

‘Good features’
  Transparent (4)
  Objective (3)
  Cheap (3)
  Simple (2)

‘Bad features’
  Not suitable for all subjects (5)
  Spurious objectivity (3)
  Open to ‘game playing’ (2)
  Metrics are proxy measures (2)

Page 12

Expert review

Questions
  Who are the experts and how are they selected?
  How do you build in an appeals mechanism?
  How do you recognise innovation?

‘Good features’
  Acceptable to community (5)
  Based on specialised knowledge (3)

‘Bad features’
  Not comprehensive (3)
  Not transparent (3)
  Perceived bias (3)
  Inconsistent (2)
  Expensive (2)

Page 13

Historical ratings

Questions
  How do you take account of changing performance?
  Who makes the judgement? (HEI or individual)
  How far back?

‘Good features’
  Light touch (3)
  Cheap (2)
  Ability to plan (2)

‘Bad features’
  Inhibits change (3)
  Low credibility (3)
  Perpetuates silos (2)

Page 14

Self-assessment

Questions
  How would you police it?
  Who sets the goals?
  How do you penalise inflated results?

‘Good features’
  Sensitive to discipline (2)
  Formative – considers self (2)
  Ownership & trust (2)

‘Bad features’
  No cross-discipline comparability (3)
  Open to game playing (3)
  No confidence in system (2)
  Effort could be large (2)

Page 15

Task 3: Building a better system

Purpose

Design ideal research assessment system

Task 3

Split into 3 or 4 groups of c. 4-7

Select ‘seed’ system from Task 2

Build on this using aspects of the other systems

Present back to plenary

Page 16

Identifying a base system

Starting point

Expert review (16/18 breakout groups)

Self-assessment (1/18 breakout groups)

1 group ‘failed’ to reach a decision

(remember: based on 6/9 workshops)

Page 17

Refining the expert system

Transparency and clarity of process

Establish rules at outset

Early (i.e., up to 5 years before assessment)

Don’t change rules during process

Provide feedback

Legal contract between funding councils and HEIs (rules won’t change, in return for a guarantee of no challenge)

Clarity of funding outcome

Page 18

Refining the expert system

Longer time period between assessments

Review every 8-10 years

‘Triggering’ mechanisms for interim review at 4-5 years (sketched below):

Self declaration for new or emerging areas

Metrics for declining areas

Sampling (selective or random) with other departments
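The triggering logic above could be expressed as a simple decision rule; the field names, trend threshold, and sampling rate below are illustrative assumptions, not anything proposed in the workshops:

```python
import random

def needs_interim_review(dept: dict) -> bool:
    """Decide whether a department is reviewed at the 4-5 year point."""
    # Self-declaration for new or emerging areas
    if dept.get("self_declared_emerging", False):
        return True
    # Metrics flag declining areas (hypothetical trend threshold)
    if dept.get("metric_trend", 0.0) < -0.1:
        return True
    # Sampling (random here; could be selective) alongside other departments
    return random.random() < 0.05  # e.g., sample 5% of departments

print(needs_interim_review({"metric_trend": -0.2}))  # True: declining area
```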

Page 19

Refining the expert system

Broader panels

Mirror Research Councils/AHRB

Broad or supra-panels have wide freedom to establish discipline-specific rules

Sub-panels operate to those rules, reporting to the supra-panel

Aims to resolve the tension between comparability and appropriateness

Page 20

Refining the expert system

Continuous rating scale

More grades

Summation of individuals’ ‘scores’

Possibly based on ranking

Continuous funding scale

No (or reduced) step changes in funding (see the sketch below)
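A minimal sketch of the contrast between stepped and continuous funding; the grade boundaries and funding weights are hypothetical, not actual RAE figures:

```python
def stepped_funding(score: float) -> float:
    """Step function: funding jumps at grade boundaries (cliff edges)."""
    if score < 4.0:
        return 0.0   # below threshold: no funding
    elif score < 5.0:
        return 1.0   # middle grade
    else:
        return 2.7   # top grades

def continuous_funding(score: float) -> float:
    """Continuous alternative: funding rises smoothly with the score."""
    return max(0.0, (score - 3.0) * 1.35)

for s in (3.9, 4.0, 4.9, 5.0):
    print(f"score {s}: stepped {stepped_funding(s):.2f}, "
          f"continuous {continuous_funding(s):.2f}")
```

A department just below a boundary (e.g., 3.9 vs. 4.0) faces a large jump under the step function but only a small change under the continuous one, which is the ‘no step changes’ point above.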

Page 21

Refining the expert system

Conceptual (non-subject) AoAs

Based on research outputs

Formal sciences: theorems
Explanatory sciences: laws
Design sciences: technological rules
Human sciences: artifacts and knowledge

‘Users’ from relevant AoA

Page 22

Refining the expert system

‘Lay’ panel members

Possibly experienced lay members (such as judges) as panel chairs

Rolling review

One discipline every year

Transfer fees

Reward departments that nurture future high flyers

Return all staff

Page 23

Task 4: Implementation and implications

Purpose

Examine how the system would be put into practice and evaluate the repercussions

Task 4

Working in same groups with ‘devil’s advocate’

Identify 5 steps for implementation

Identify 5 things that could go wrong with the system

Identify 5 changes to UK research as a result of the new research assessment system

Page 24

Implications of change

Some changes to UK research:

Ensure research capacity is activated wherever it is found

Better support and funding for younger researchers

Good support for emerging areas of research

Equal opportunities are improved

Better recognition of interdisciplinary research

Funding directly follows excellence

Less obsession with the RAE

Page 25

Emerging observations

Preference for expert review, but needs refining

Increased transparency and clarity of process

Tension between comparability and appropriateness

Need structures to support change

But …

Excludes Cambridge, Reading and Belfast

Not analysed by discipline and/or profession