Keynote address: The power of learning analytics: a need to move towards new methodologies in education?
VU Amsterdam, 13 October 2016
@DrBartRienties, Reader in Learning Analytics
A special thanks to Avinash Boroowa, Aida Azadegan, Shi-Min Chua, Simon Cross, Rebecca Ferguson, Lee Farrington-Flint, Christothea Herodotou, Martin Hlosta, Wayne Holmes, Garron Hillaire, Simon Knight, Nai Li, Vicky Marsh, Kevin Mayles, Jenna Mittelmeier, Vicky Murphy, Quan Nguygen, Tom Olney, Lynda Prescott, John Richardson, Jekaterina Rogaten, Matt Schencks, Mike Sharples, Dirk Tempelaar, Lisette Toetenel, Thomas Ullmann, Denise Whitelock, John Woodthorpe, Zdenek Zdrahal, and others… A special thanks to Prof Belinda Tynan for her continuous support on analytics at the OU UK.
What is learning analytics?
http://bcomposes.wordpress.com/
(Social) Learning Analytics
“LA is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs” (LAK 2011)
Social LA “focuses on how learners build knowledge together in their cultural and social settings” (Ferguson & Buckingham Shum, 2012)
Dyckhoff, A. L., Zielke, D., Bültmann, M., Chatti, M. A., & Schroeder, U. (2012). Design and Implementation of a Learning Analytics Toolkit for Teachers. Journal of Educational Technology & Society, 15(3), 58-76.
Agenda? You choose:
1. The power of 151 Learning Designs on 113K+ students at the OU?
2. Analytics4Action: evidence-based interventions?
3. OU Analyse: predictive analytics with automated student recommender?
4. Key drivers for 100K+ student satisfaction?
5. Opportunities of learning analytics/e-learning for educational research, teaching practice, and wider policy implications.
Learning activity types (type of activity, with examples where given):
• Assimilative – attending to information. Examples of activity: read, watch, listen, think about, access, observe, review, study.
• Finding and handling information – searching for and processing information.
• Communication – discussing module-related content with at least one other person (student or tutor).
• Productive – actively constructing an artefact.
• Experiential – applying learning in a real-world setting.
• Interactive/Adaptive – applying learning in a simulated setting.
• Assessment – all forms of assessment, whether continuous, end of module, or formative (assessment for learning).
• Student feedback data (140 modules)
• VLE data (141 modules)
• Academic performance (151 modules)
• Data sets merged and cleaned
• 111,256 students undertook these modules
Toetenel, L. & Rienties, B. (2016). Analysing 157 Learning Designs using Learning Analytic approaches as a means to evaluate the impact of pedagogical decision-making. British Journal of Educational Technology.
[Figure: path model – Learning Design across 151 modules (constructivist, assessment, productive, and socio-constructivist designs) linked to VLE engagement (Week 1, Week 2, … Week 30+), student satisfaction, and student retention]
Rienties, B., Toetenel, L., Bryan, A. (2015). “Scaling up” learning design: impact of learning design activities on LMS behavior and performance. Learning Analytics Knowledge conference.
Table 4: Regression model of learner satisfaction predicted by institutional and learning design analytics (n = 150 for Models 1-2, 140 for Model 3; * p < .05, ** p < .01)
Table 5: Regression model of learning performance predicted by institutional, satisfaction and learning design analytics (n = 150 for Models 1-2, 140 for Model 3; * p < .05, ** p < .01)
• Size of module and discipline predict completion
Rienties, B., Toetenel, L., (2016). The impact of learning design on student behaviour, satisfaction and performance: a cross-institutional comparison across 151 modules. Computers in Human Behavior, 60 (2016), 333-341
Toetenel, L., Rienties, B. (2016) Learning Design – creative design to visualise learning activities. Open Learning.
So what does the OU do in terms of interventions on learning analytics?
The OU is developing its capabilities in 10 key areas that build the underpinning strengths required for the effective deployment of analytics
Strategic approach
Analytics4Action framework
Implementation/testing methodologies:
• Randomised control trials
• A/B testing
• Quasi-experimental
• Apply to all
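As a minimal sketch of the A/B-testing option listed above (the group names, seed, and 50/50 split are illustrative assumptions, not OU practice), random assignment of a cohort to an in-module intervention might look like:

```python
import random

# Sketch of A/B assignment for an in-module intervention.
# Group names, seed, and the 50/50 split are illustrative assumptions.
def assign(student_ids, seed=42):
    rng = random.Random(seed)  # fixed seed so the assignment is reproducible
    groups = {"intervention": [], "control": []}
    for sid in student_ids:
        groups[rng.choice(["intervention", "control"])].append(sid)
    return groups

groups = assign(range(100))
print(len(groups["intervention"]) + len(groups["control"]))  # 100 students assigned
```

Seeding the generator means the same cohort always receives the same split, which makes the experiment auditable when retention is compared between groups later.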
• Community of Inquiry framework: underpinning typology
• Menu of response actions
• Methods of gathering data
• Evaluation plans
• Evidence hub
• Key metrics and drill-downs
• Deep dive analysis and strategic insight
Menu of actions
Cognitive presence:
• Learning design (before start): redesign learning materials; redesign assignments
• In-action interventions (during module): audio feedback on assignments; bootcamp before exam
Social presence:
• Learning design (before start): introduce graded discussion forum activities; screencasts of “how to survive the first two weeks”
• In-action interventions (during module): organise additional videoconference sessions; call/text/Skype students-at-risk; organise catch-up sessions on specific topics that students struggle with
Problem specification – the OU model
• Given:
– Demographic data at the start (may include information about the student’s previous modules studied at the OU and his/her objectives)
– Assessments (TMAs) as they become available during the module
– VLE activities between TMAs
– Conditions the student must satisfy to pass the module
• Goal:
– Identify students at risk of failing the module as early as possible
• Forum (F), Subpage (S), Resource (R), OU_content (O), No activity (N)
• Possible activity combinations in a week: F, FS, N, O, OF, OFS, OR, ORF, ORFS, ORS, OS, R, RF, RFS, RS, S
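The 16 weekly states above are exactly the non-empty combinations of the four activity types plus “N” for no activity; they can be generated mechanically, as in this short sketch (pure Python, using the letter ordering O, R, F, S from the slide):

```python
from itertools import combinations

# The four VLE activity types, in the ordering used above:
# OU_content (O), Resource (R), Forum (F), Subpage (S)
letters = "ORFS"

# All non-empty combinations of the four activities, plus "N" for no activity
states = ["N"] + ["".join(c) for r in range(1, 5)
                  for c in combinations(letters, r)]

print(len(states))      # 16 possible weekly states
print(sorted(states))   # matches the list on the slide: F, FS, N, O, OF, ...
```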
[Figure: Activity space – possible weekly activity states from VLE opening and module start up to TMA-1, with outcomes Pass / Fail / No submit]
[Figure: VLE trail of a successful student through the activity space]
[Figure: VLE trail of a student who did not submit TMA-1]
[Figure: Probabilistic model across all students over time – the Module VLE Fingerprint of weekly state transitions from start to TMA-1]
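One hedged way to read the probabilistic model: treat each student’s weekly states as a trail and estimate transition probabilities by counting. The trails below are invented for illustration only, not real OU data:

```python
from collections import Counter, defaultdict

# Hypothetical weekly state sequences (invented for illustration)
trails = [
    ["O", "OF", "OFS", "ORFS"],  # an engaged student
    ["O", "N", "N", "N"],        # a student who drifts away
    ["O", "OF", "N", "OS"],
]

# Estimate P(next state | current state) by counting transitions
counts = defaultdict(Counter)
for trail in trails:
    for cur, nxt in zip(trail, trail[1:]):
        counts[cur][nxt] += 1

probs = {cur: {nxt: n / sum(c.values()) for nxt, n in c.items()}
         for cur, c in counts.items()}

print(probs["O"])  # from state "O": 2/3 of transitions go to "OF", 1/3 to "N"
```

Aggregated over a whole cohort, such a transition table is one plausible reading of a module’s “VLE fingerprint”.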
Four predictive models built from legacy data by Machine Learning
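The slide does not spell out the four models, so as a loose sketch of the general idea only (the rules, thresholds, and field names below are invented and are not the OU Analyse models), simple models built on different data sources can be combined by voting:

```python
# Illustrative sketch: three toy models on different data sources,
# combined by majority vote. All rules and thresholds are invented.

def demographic_model(student):
    # hypothetical rule: students on their first OU module are higher risk
    return student["previous_modules"] == 0

def vle_model(student):
    # hypothetical rule: little VLE activity before the first TMA
    return student["vle_clicks_per_week"] < 5

def tma_model(student):
    # hypothetical rule: low or missing first assignment score
    return student["tma1_score"] is None or student["tma1_score"] < 40

def at_risk(student):
    votes = [demographic_model(student), vle_model(student), tma_model(student)]
    return sum(votes) >= 2  # flag when a majority of models agree

student = {"previous_modules": 0, "vle_clicks_per_week": 2, "tma1_score": 55}
print(at_risk(student))  # True: two of the three models flag this student
```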
Prediction sheet: example
Dashboard: Module view – time machine and VLE overview with notifications
Dashboard: Module view – prediction table
Dashboard: Module view – filter
Dashboard: Student view – VLE activities, TMA results, time machine
Dashboard: Student view – nearest neighbours, predictions with real scores, personalised recommender
Dashboard: Tutor view – feedback from tutors
Background of QAA Study
• HE is an increasingly competitive market: student satisfaction has become an important component of Quality Assurance (QA) and Quality Enhancement (QE) (Kember & Ginns, 2012; Rienties, 2014).
• Measurement of student satisfaction is important to pinpoint strengths and identify areas for improvement (Coffey & Gibbs, 2001; Zerihun, Beishuizen, & Os, 2012).
• Potential benefits and drawbacks of student evaluations have been well-documented in the literature (see for example Bennett & De Bellis, 2010; Crews & Curtis, 2011),
o Recent research continues to suggest strong resistance amongst academic staff (Crews & Curtis, 2011; Rienties, 2014).
o Most student survey instruments lack focus on key elements of rich learning, such as interaction, assessment and feedback.
• With the increased importance of NSS and institutional surveys on academic and educational practice, there is a need for a critical review of how these data are used for QA and QE.
Key Questions of the Project
1. To what extent are institutions using insights from NSS and institutional surveys to transform their students’ experience?
2. What are the key enablers and barriers for integrating student satisfaction data with QA and QE?
3. How are student experiences influencing quality enhancements?
a) What influences students’ perceptions of overall satisfaction the most? Are student characteristics or module/presentation-related factors more predictive than satisfaction with other aspects of their learning experience?
b) Is the student cohort homogeneous when considering key satisfaction drivers? For example, are there systematic differences depending on the level or programme of study?
Methodology (Logistic Regression) & Validation
Step 1: A descriptive analysis was conducted to discount variables that were unsuitable for satisfaction modelling.
Step 1 also identified highly correlated predictors and methodically selected the most appropriate.
[Figure: variable subsets – Module, Presentation, Student, Concurrency, Study history – modelled against Overall Satisfaction from the SEaM survey]
Step 2: Each subset of variables was modelled in groups. The variables that were statistically significant within each subset were then combined and modelled to identify the final list of key drivers. UG new, UG continuing, PG new and PG continuing students were modelled separately at Step 2.
Step 3 (validation): all models were verified using subsets of the whole data to ensure the solutions are robust, and a variety of model fit statistics were used to identify the optimum solutions. We found at Step 3 that the combined scale provided the simplest and most interpretable solution for PG students, and the whole scale for UG students. The solution without the KPIs included was much easier to use in terms of identifying clear priorities for action.
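As a toy illustration of the logistic-regression step (the two predictors and their coefficients below are invented; the real OU models were fitted on many more variables), the fitted model’s prediction step looks like:

```python
import math

# Hypothetical coefficients for two key drivers (illustrative only)
coef = {"intercept": -1.0, "teaching_materials": 0.9, "assessment": 0.6}

def p_satisfied(teaching_materials, assessment):
    """Logistic model: probability a student reports overall satisfaction."""
    z = (coef["intercept"]
         + coef["teaching_materials"] * teaching_materials
         + coef["assessment"] * assessment)
    return 1 / (1 + math.exp(-z))

print(round(p_satisfied(1, 1), 3))  # 0.622: both drivers rated positively
print(round(p_satisfied(0, 0), 3))  # 0.269: neither rated positively
```

The ranking of “key drivers” on the following slides corresponds to how strongly each such coefficient shifts the predicted probability of overall satisfaction.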
Satisfaction Modelling: Undergraduate Continuing Students
Importance to overall satisfaction (ranked, most important first):
01 KPI-05 Teaching materials
02 Q36 Assessment
03 Q13 Qualification aim
04 Q5 Integration of materials
05 Q3 Advice & guidance
06 Q14 Career relevance
07 Q23 Tutor knowledge
08 Q9 Assignment instructions
09 Q11 Assignment completion
10 KPI-06 Workload
11 Q6 Method of delivery
12 Module: Credits
13 Module: Level of study
14 Module: Examinable component
15 % planned life cycle
Li, N., Marsh, V., & Rienties, B. (2016). Modeling and managing learner satisfaction: use of learner feedback to enhance blended and online learning experience. Decision Sciences Journal of Innovative Education, 14 (2), 216-242.
Satisfaction Modelling: Undergraduate New Students
Importance to overall satisfaction (ranked, most important first):
01 KPI-05 Teaching materials
02 Q36 Assessment
03 Q3 Advice & guidance
04 Q5 Integration of materials
05 Q14 Career relevance
06 Q13 Qualification aim
07 Age
Li, N., Marsh, V., Rienties, B., Whitelock, D. (2016). Online learning experiences of new versus continuing learners: a large scale replication study. Assessment & Evaluation in Higher Education. DOI: 10.1080/02602938.2016.1176989.
Conclusions (Part I)
1. Learning design strongly influences student engagement, satisfaction and performance
2. Visualising learning design decisions by teachers leads to more interactive/communicative designs
Conclusions (Part II)
1. 10 out of 11 modules improved retention
2. Visualising learning analytics data can encourage teachers to intervene in-presentation and redesign afterwards
Conclusions (Part III)
1. Need for educational theory to unpack learning analytics
2. Need for educationalists to embrace learning analytics ….
3. Let’s work together