Delivering change: demonstrating success
Peter Andrews and Margaret L Ruwoldt
Service Improvement and Innovation in Universities, 11-12 August 2016
The student satisfaction gap
When performance exceeds expectation, the student is satisfied.
If our performance fails to meet expectations, the student is disappointed and, consequently, dissatisfied.
[Diagram: service gaps model, after the American Customer Satisfaction Index (ACSI). Management's perception of student need/expectation shapes service specifications (the policy gap) and front-line service delivery (the delivery gap). The student's expected service, influenced by peers, needs & circumstances, other experiences and service communication, is compared with the perceived service; the difference is the student satisfaction gap.]
Flashback to 2011: many potential data points…

Measure inputs: FTE (HEW) by function and activity/process
• Uniforum

Measure outputs / outcomes: student satisfaction assessments, benchmarking and surveys
• International Student Barometer (ISB)
• InSync Student Services Survey
• Best practice learnings
• Australian Graduate Survey
• Student Service Support Survey (Insync)
• Customer Service Benchmarking Australia (CSBA)
• Melbourne Experience Survey (MES)
• Service Performance Matrix discussion
• Institutional review processes
• Student focus groups

… but no systematic approach to evaluating these assessments, benchmarking exercises and surveys, which would later provide the baseline for targets.
UoM Student Service Evaluation Framework 2011
SSEF objectives
• Inform decision-making and enable planning and prioritisation of programs and services that need to be improved
• Provide tangible evidence to ensure we are appropriately investing resources in programs and services that will benefit students
• Help improve the effectiveness of our programs and services
• Help identify efficiency gains and opportunities in the delivery of our programs and services
• Provide an opportunity to benchmark our programs and services against other service providers
Use ratings to set standards in service scorecards on targeted functions/activities, to improve service delivery, meet student expectations and increase satisfaction ratings.
How SSEF worked
How SSEF worked: setting targets
• Test assumptions about the assessments
• Set targets against student assessments, prioritising low-performing yet high-impact areas to minimise the gap
• Use historical data to baseline targets
• Review, refine and agree on the targets with stakeholders
• Represent targets in scorecards: set high if doing well, low (but higher than baseline) when doing poorly
• Align scorecard targets to the values and vision of the services
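One way to read these target-setting rules as logic, sketched in Python (the thresholds, uplifts and example scores are illustrative assumptions; the slides do not state SSEF's actual parameters):

    # Prioritise low-performing yet high-impact areas, and anchor targets to the
    # historical baseline: set high if doing well, low (but above baseline) if not.
    # Thresholds, uplifts and scores are illustrative only.

    def priority(score: float, impact: float) -> float:
        # Bigger performance gap x bigger impact = higher priority.
        return impact * (100 - score)

    def set_target(baseline: float) -> float:
        # Stretch target when doing well; modest but above-baseline target when not.
        uplift = 5 if baseline >= 75 else 2
        return min(baseline + uplift, 100)

    services = {"course advice": (82, 0.6), "enrolment variations": (58, 0.9)}
    for name, (score, impact) in sorted(services.items(),
                                        key=lambda s: -priority(*s[1])):
        print(f"{name}: priority={priority(score, impact):.0f}, "
              f"target={set_target(score)}")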
How SSEF worked: scorecards
Dimensions:
1. Customer service
2. Service delivery
3. Business practices
4. Quality assurance
5. Compliance
How SSEF worked: align and evaluate measures

Common indicators across survey types:
• Internal Survey A (User): indicators 1, 2, 3
• External Survey B (User): indicators 1, 2, 3
• Survey C (Stakeholder): indicators 1, 2, 3

• Align common survey indicators across different survey types: indicators 1, 2 and 3 are common across the three surveys
• Evaluate performance trends/patterns (e.g. always poor, variable, always good) across time and surveys, and across all or some service providers
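A sketch of what aligning and evaluating might look like in code (survey names match the slide; the scores and the "poor" threshold are hypothetical):

    # Align the common indicators (1, 2, 3) across the three surveys, then
    # classify the pattern: always poor, always good, or variable.
    # Scores (0-100) and the threshold of 65 are hypothetical.

    scores = {
        "Internal Survey A (User)": {1: 62, 2: 81, 3: 77},
        "External Survey B (User)": {1: 58, 2: 84, 3: 61},
        "Survey C (Stakeholder)":   {1: 60, 2: 79, 3: 88},
    }

    def pattern(values, poor=65):
        if all(v < poor for v in values):
            return "always poor"
        if all(v >= poor for v in values):
            return "always good"
        return "variable"

    for indicator in (1, 2, 3):
        values = [s[indicator] for s in scores.values()]
        print(f"indicator {indicator}: {pattern(values)} across surveys")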
Flash forward: Melbourne University in 2015-16
• 45,000 EFT student enrolments
• 33% international, 50% postgraduate, 55% female
• Broad UG degrees, professional quals at grad level
• Ranked #1 in Australia, top 50 international
• $2bn income: 40% govt grants, 50% student fees, 10% other
• 7,500 FTE staff in 2016, split 50-50 academic/professional
Strategy: teaching, learning, student experience
• International mobility
• Work-integrated learning
• Blended learning, flexibility, innovative teaching
• Expanded scholarships program
• Increase Indigenous enrolments
• Research-led curriculum
• Student precinct & housing
The Melbourne Operating Model
[Diagram: the Melbourne Operating Model connects policy, processes, systems and shared services. Drivers: changing student profile, investment in research and teaching, e-learning, funding mix, staffing limits, cost savings. Intended outcomes: student experience, academic achievement, business improvement.]
Revisiting SSEF: design into practice

Design vs reality:
• Compile and evaluate data annually → labour-intensive, manual data processing
• Prioritise areas for improvement → organisational change
• Form & test hypotheses about root causes → pressure to deliver quick wins
• Verify opportunities & feasibility → complexity of organisational structure and systems
• Stakeholders agree on preferred solution → competing business priorities
• Plan and implement → resource constraints
• Benefits profiling: measure, track, report → lag outcomes; baseline measures often not available
Updating SSEF: finding appropriate indicators

Lead indicators measure service improvement action: what actions are to be done to improve performance
• more likely to be measures of actions taken to improve processes, user capacity or capability
Lag indicators measure performance that has already happened
[Diagram: examples of lead indicators flowing into lag indicators. Lead: % of staff undertaking training; % of students reading help guides; increased student information. Lag: increased staff skills; % of students successfully completing tasks unaided; faster turnaround time; serving more students; $ savings in staff time.]
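To make the lead-to-lag relationship explicit, each lead indicator can be paired with the lag outcome it is expected to drive. A small sketch (indicator names come from the diagram above; the data structure is illustrative, not part of SSEF):

    # Pair each lead indicator (action taken now) with the lag indicator
    # (performance observed later) it is expected to drive.
    from dataclasses import dataclass

    @dataclass
    class IndicatorPair:
        lead: str   # measure of improvement action
        lag: str    # measure of resulting performance

    pairs = [
        IndicatorPair("% of staff undertaking training", "increased staff skills"),
        IndicatorPair("% of students reading help guides",
                      "% of students completing tasks unaided"),
        IndicatorPair("increased student information", "faster turnaround time"),
    ]

    for p in pairs:
        print(f"LEAD: {p.lead:40s} -> LAG: {p.lag}")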
Updating SSEF: the problem of benefit attribution
Improved student satisfaction sits at the end of the lead-lag chain, which raises the question of accountability for the change: is the benefit due to the improvement effort or some other factor?
Updating SSEF: other design challenges
• Data integrity, access, maintenance & availability
• Understanding student needs and expectations:
  • consultation, representation, feedback
  • PLUS co-creation and engagement in decision-making
• Shared Services model introduces complexity: multiple stakeholders in every process & project
The UK Gov example
A new way of thinking about performance
Simplified framework to enable consistent and regular reporting of performance:
• against the benefits outlined by the Melbourne Operating Model
• to clients who fund and/or use the services
• based on stakeholder needs, and when stakeholders need it
• aimed at continual improvement.
A new way of thinking about performance (continued)
• Evaluate only what is within our control
• Use modern tools to allow stakeholders to explore data from different perspectives
• Align reporting with operational drivers:
  • improve student experience
  • enable academic performance
  • improve professional excellence
  • growth and reinvestment
Academic Services Performance Framework 2016: mandatory KPIs

Cost per transaction (or user):
• Inter-library loans: $2.30 per item
• Enrolment variations: EVs per student (head count): Engineering 1.8, Science 0.7
• Course advice: 2 x HEW7 FTE advisers (small faculty)
• Careers workshops: external facilitator $1500/day, up to 50 students

Completion rate:
• Inter-library loans: 75% of requests fulfilled
• Enrolment variations: all EVs completed within 2 business days
• Course advice: number of drop-in sessions per week
• Careers workshops: attendance at workshops

Digital engagement:
• Inter-library loans: 10% of requests lodged on paper or via email
• Enrolment variations: 80% of students able to self-resolve EVs online
• Course advice: usage of self-help info on website
• Careers workshops: attendance at webinars, especially Southbank and other small campuses

User satisfaction:
• Inter-library loans: >96% in Melbourne Research Experience Survey (MRES)
• Enrolment variations: increased satisfaction in Student Experience Survey (SES)
• Course advice: Student Experience Survey (SES), thumbs-up in Stop1 queue system
• Careers workshops: 10% increase in LinkedIn connections
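A sketch of the scorecard as data, with a simple target check (service names and KPI dimensions come from the table above; the numeric targets and actuals are illustrative placeholders):

    # Mandatory KPIs per service, with a target check. Numbers are placeholders.
    targets = {
        "inter-library loans": {
            "completion rate": 0.75,     # 75% of requests fulfilled
            "digital engagement": 0.90,  # at most 10% lodged on paper/email
            "user satisfaction": 0.96,   # >96% in MRES
        },
    }
    actuals = {
        "inter-library loans": {
            "completion rate": 0.78,
            "digital engagement": 0.88,
            "user satisfaction": 0.97,
        },
    }

    for service, kpis in targets.items():
        for kpi, target in kpis.items():
            met = actuals[service][kpi] >= target
            print(f"{service} / {kpi}: {'MET' if met else 'MISSED'}")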
Making it sustainable
Questions and discussion