Page 1:

Delivering change: demonstrating success

Peter Andrews and Margaret L Ruwoldt

Service Improvement and Innovation in Universities, 11-12 August 2016

Page 2:

The student satisfaction gap

When performance exceeds expectations, the student is satisfied.

If our performance fails to meet expectations, the student is disappointed and, consequently, dissatisfied.

Page 3:

American Customer Satisfaction Index (ACSI)

[Diagram: a service-gaps model. The student's expected service, influenced by peers, needs & circumstances, other experiences and service communication, is compared with the perceived service; the difference is the student satisfaction gap. On the provider side, management's perception of student need/expectation feeds service specifications and front-line service delivery, with a policy gap and a delivery gap along the way.]

Page 4:

Flashback to 2011: many potential data points…

FTE (HEW) by function and activity/process (measure inputs):
• Uniforum

Student satisfaction assessments (measure outputs/outcomes):
• International Student Barometer (ISB)
• InSync Student Services Survey
• Best practice learnings
• Australian Graduate Survey
• Student Service Support Survey (Insync)
• Customer Service Benchmarking Australia (CSBA)
• Melbourne Experience Survey (MES)
• Service Performance Matrix discussion
• Institutional review processes
• Student focus groups

Page 5:

… but no systematic evaluative approach

[Diagram: assessments, benchmarking and surveys as the baseline for targets]

Page 6:

UoM Student Service Evaluation Framework 2011

Page 7:

SSEF objectives

• Inform decision making and enable planning and prioritisation of programs and services that need to be improved
• Provide tangible evidence to ensure we are appropriately investing resources into programs and services that will benefit students
• Help improve the effectiveness of our programs and services
• Help identify efficiency gains and opportunities in the delivery of our programs and services
• Provide an opportunity to benchmark our programs and services against other service providers

Page 8:

How SSEF worked

[Cycle diagram: use ratings to set standards in service scorecards, on targeted functions/activities, to improve service delivery, to meet student expectations, to increase satisfaction ratings, and around again.]

Page 9:

How SSEF worked: setting targets

• Test assumptions about the assessments
• Set targets against student assessments, prioritising low-performing yet high-impact areas to minimise the gap
• Use historical data to baseline targets
• Review, refine and agree on the targets with stakeholders
• Represent targets in scorecards: set high if doing well, low (but higher than baseline) when doing poorly
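This target-setting logic can be sketched in code. The snippet below is a minimal illustration, not the presenters' implementation: the service names, scores, impact weights and the "doing well" threshold are all hypothetical, and the rule "set high if doing well, low but above baseline when doing poorly" is reduced to two illustrative increments.

```python
# Sketch: baseline targets from historical survey scores and prioritise
# low-performing, high-impact areas. All data and thresholds are hypothetical.
from statistics import mean

history = {
    "course advice":        {"scores": [62, 64, 61], "impact": 0.9},
    "careers workshops":    {"scores": [78, 80, 79], "impact": 0.5},
    "enrolment variations": {"scores": [55, 57, 58], "impact": 0.8},
}

def baseline(scores):
    # Baseline each target on the mean of historical scores.
    return mean(scores)

def target(base, doing_well=75, stretch=5, modest=2):
    # Set high if already doing well, otherwise modest but still above baseline.
    return base + (stretch if base >= doing_well else modest)

# Prioritise areas where performance is low and impact on students is high.
priority = sorted(
    history,
    key=lambda s: (100 - baseline(history[s]["scores"])) * history[s]["impact"],
    reverse=True,
)

for service in priority:
    base = baseline(history[service]["scores"])
    print(f"{service}: baseline {base:.1f}, target {target(base):.1f}")
```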

Page 10:

How SSEF worked: scorecards

Align scorecard targets to the values and vision of the services.

Dimensions:
1. Customer service
2. Service delivery
3. Business practices
4. Quality assurance
5. Compliance
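For illustration only, a scorecard built around these five dimensions could be represented as a simple nested structure; the indicators and targets below are hypothetical placeholders, not the University's actual scorecard.

```python
# Sketch: a service scorecard grouped by the five dimensions above.
# Indicator names and targets are hypothetical placeholders.
scorecard = {
    "service": "Course advice",
    "dimensions": {
        "Customer service":   {"indicator": "user satisfaction (%)",          "target": 85},
        "Service delivery":   {"indicator": "queries resolved in 2 days (%)", "target": 90},
        "Business practices": {"indicator": "requests lodged digitally (%)",  "target": 80},
        "Quality assurance":  {"indicator": "advice audited as accurate (%)", "target": 95},
        "Compliance":         {"indicator": "policy exceptions recorded",     "target": 0},
    },
}

for dimension, kpi in scorecard["dimensions"].items():
    print(f"{dimension}: {kpi['indicator']} -> target {kpi['target']}")
```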

Page 11:

How SSEF worked: align and evaluate measures

Internal Survey A (user): indicators 1, 2, 3
External Survey B (user): indicators 1, 2, 3
Survey C (stakeholder): indicators 1, 2, 3

• Align common survey indicators across different survey types: indicators 1, 2 and 3 are common to all three surveys
• Evaluate performance trends and patterns (e.g. always poor, variable, always good) across time and surveys, and across all or some service providers
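A minimal sketch of what "align and evaluate" might look like in practice: the common indicators are lined up across the three surveys and each indicator's pattern is classified as always good, always poor, or variable. The survey scores and the 70-point "good" threshold are assumptions for illustration only.

```python
# Sketch: align common indicators across surveys and classify their patterns.
# Scores (0-100) and the "good" threshold are hypothetical.
GOOD = 70

surveys = {
    "Internal Survey A (user)": {"indicator 1": 82, "indicator 2": 55, "indicator 3": 68},
    "External Survey B (user)": {"indicator 1": 79, "indicator 2": 58, "indicator 3": 74},
    "Survey C (stakeholder)":   {"indicator 1": 85, "indicator 2": 52, "indicator 3": 61},
}

for indicator in ("indicator 1", "indicator 2", "indicator 3"):
    scores = [results[indicator] for results in surveys.values()]
    if all(score >= GOOD for score in scores):
        pattern = "always good"
    elif all(score < GOOD for score in scores):
        pattern = "always poor"
    else:
        pattern = "variable"
    print(f"{indicator}: {scores} -> {pattern}")
```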

Page 12:

Flash forward: Melbourne University in 2015-16

• 45,000 EFT student enrolments
• 33% international, 50% postgraduate, 55% female
• Broad UG degrees, professional quals at grad level
• Ranked #1 in Australia, top 50 international
• $2bn income: 40% govt grants, 50% student fees, 10% other
• 7,500 FTE staff in 2016, split 50-50 academic/professional

Page 13:

Strategy: teaching, learning, student experience

• International mobility
• Work-integrated learning
• Blended learning, flexibility, innovative teaching
• Expanded scholarships program
• Increase Indigenous enrolments
• Research-led curriculum
• Student precinct & housing

Page 14:

The Melbourne Operating Model

Page 15:

[Diagram: the Melbourne Operating Model. Elements: systems, processes, policy, shared services; changing student profile, investment in research and teaching, e-learning; funding mix, staffing limits, cost savings; student experience, academic achievement, business improvement.]

Page 16:

Revisiting SSEF: design into practice

Design vs reality:
• Compile and evaluate data annually → labour-intensive, manual data processing
• Prioritise areas for improvement → organisational change
• Form and test hypotheses about root causes → pressure to deliver quick wins
• Verify opportunities and feasibility → complexity of organisational structure and systems
• Stakeholders agree on preferred solution → competing business priorities
• Plan and implement → resource constraints
• Benefits profiling (measure, track, report) → lag outcomes; baseline measures often not available

Page 17:

Updating SSEF: finding appropriate indicators

Lead indicators measure service-improvement action: what actions are to be done to improve performance
• more likely to be measures of actions taken to improve processes / user capacity / capability

Lag indicators measure performance that has already happened

Examples:
• Lead: % of staff undertaking training; % of students reading help guides
• Lag: % of students who successfully complete the task unaided; $ savings in staff time
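To make the lead/lag split concrete, here is a small sketch that tags each indicator with its type and groups them for reporting; the indicator names follow the examples above, while the values are hypothetical.

```python
# Sketch: tag indicators as lead (actions taken) or lag (results already realised).
# Values are hypothetical; names follow the slide's examples.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    kind: str      # "lead" or "lag"
    value: float
    unit: str

indicators = [
    Indicator("% of staff undertaking training",           "lead", 65.0, "%"),
    Indicator("% of students reading help guides",         "lead", 40.0, "%"),
    Indicator("% of students completing the task unaided", "lag",  72.0, "%"),
    Indicator("savings in staff time",                     "lag",  120000.0, "$"),
]

for kind in ("lead", "lag"):
    print(f"{kind.upper()} indicators:")
    for indicator in indicators:
        if indicator.kind == kind:
            print(f"  {indicator.name}: {indicator.value} {indicator.unit}")
```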

Page 18:

Updating SSEF: the problem of benefit attribution

[Diagram: improvement outcomes such as servicing more students, increased staff skills, faster turnaround time and increased student information all feeding into improved student satisfaction.]

Accountability for the change: is the benefit due to the improvement effort or some other factor?
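One common way to probe that question is to compare the improved service's change against a comparable service (or period) that received no improvement effort, so that background factors are netted out. The sketch below is an illustration of that idea with invented satisfaction scores; it is not the presenters' method.

```python
# Sketch: naive before/after uplift vs uplift adjusted for a background trend,
# using a comparison service with no intervention. All scores are invented.
improved   = {"before": 68.0, "after": 75.0}   # service that received the improvement
comparison = {"before": 70.0, "after": 73.0}   # similar service, no intervention

naive_uplift = improved["after"] - improved["before"]       # 7.0 points
background   = comparison["after"] - comparison["before"]   # 3.0 points
attributable = naive_uplift - background                    # 4.0 points

print(f"Naive uplift: {naive_uplift:.1f} points")
print(f"Background change (other factors): {background:.1f} points")
print(f"Uplift plausibly attributable to the improvement effort: {attributable:.1f} points")
```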

Page 19:

Updating SSEF: other design challenges

• Data integrity, access, maintenance & availability
• Understanding student needs and expectations:
  • consultation, representation, feedback
  • PLUS co-creation and engagement in decision-making
• Shared Services model introduces complexity
• Multiple stakeholders in every process & project

Page 20:

The UK Gov example

Page 21:

A new way of thinking about performance

A simplified framework to enable consistent and regular reporting of performance:
• against the benefits outlined by the Melbourne Operating Model
• to clients who fund and/or use the services
• based on stakeholder needs, and when stakeholders need it
• aimed at continual improvement.

Page 22:

A new way of thinking about performance

• Evaluate only what is within our control
• Use modern tools to allow stakeholders to explore data from different perspectives
• Align reporting with operational drivers:
  • improve student experience
  • enable academic performance
  • improve professional excellence
  • growth and reinvestment

Page 23:

Academic Services Performance Framework 2016: mandatory KPIs, with examples for four services

Cost per transaction (or user):
• Inter-library loans: $2.30 per item
• Enrolment variations: EVs per student (head count): Engineering 1.8, Science 0.7
• Course advice: 2 x HEW7 FTE advisers (small faculty)
• Careers workshops: external facilitator $1,500/day, up to 50 students

Completion rate:
• Inter-library loans: 75% of requests fulfilled
• Enrolment variations: all EVs completed within 2 business days
• Course advice: number of drop-in sessions per week
• Careers workshops: attendance at workshops

Digital engagement:
• Inter-library loans: 10% of requests lodged on paper or via email
• Enrolment variations: 80% of students able to self-resolve EVs online
• Course advice: usage of self-help info on website
• Careers workshops: attendance at webinars, esp. Southbank & other small campuses

User satisfaction:
• Inter-library loans: >96% in Melb Research Exp Survey (MRES)
• Enrolment variations: increased satisfaction in Student Experience Survey (SES)
• Course advice: Student Experience Survey (SES), thumbs-up in Stop1 queue system
• Careers workshops: 10% increase in LinkedIn connections
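As a purely illustrative sketch, KPIs like those above could be monitored by comparing measured values against their targets; the measured values below are invented, and the lower-is-better/higher-is-better rules are assumptions.

```python
# Sketch: check hypothetical measured values against two of the framework's example targets.
# Measured values are invented; only the targets come from the table above.
kpis = [
    # (service, KPI, target, measured, lower_is_better)
    ("Inter-library loans", "cost per item ($)",      2.30, 2.10, True),
    ("Inter-library loans", "requests fulfilled (%)", 75.0, 78.0, False),
]

for service, name, target, measured, lower_is_better in kpis:
    met = measured <= target if lower_is_better else measured >= target
    status = "met" if met else "not met"
    print(f"{service}, {name}: target {target}, measured {measured} -> {status}")
```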

Page 24:

Making it sustainable

Page 25:

Making it sustainable

Page 26:

Questions and discussion

Page 27: