Measuring the Impact of Vizzle on Student Learning Outcomes
Research Report
Vizzle Visual Learning Software
By: Kristine Turko, PhD
July 30, 2018
Contents
Overview
Teacher and Student Participant Profiles
Method
Data Analysis
Summary and Conclusions
Research Phase Summary Reports
  Phase I: Preliminary Work, Teacher and Participant Selection
  Phase II: Teacher Training
  Phase III: Data Collection
References
Overview

Vizzle's potential role in the classroom is misunderstood and undervalued.

Assumptions
- Teachers are embracing technology in the classroom.
- Students are eager to engage with on-line learning resources.
- Teachers regularly dedicate classroom time to on-line learning opportunities.

Research questions
- Is Vizzle an effective learning tool?
- How much student interaction with Vizzle is needed in order to measure learning?
- Is Vizzle equally effective in neurotypical and neurodiverse students?
- Are teachers using Vizzle appropriately to measure learning improvement?
Teacher and Student Participant Profiles
Seven teacher participants were selected
on a voluntary basis with the help of Jane
Stoner, Director of Special Education at
Alliance City Schools. All teachers agreed
to participate throughout the Spring 2018
semester in collecting data from students
using visual learning technology in the
classroom. Participation involved student
recruitment and consent, visual learning
software instruction to ensure data was
collected in a useable format, selection
of comparable lessons across software
platforms, and regular meetings with Kristine
Turko for instruction and review.
Student participants were chosen based
on diagnostic evaluations and teacher
recommendations. It is important to note that
student transition is frequent in Alliance City
School District, which contributes to attrition
and emphasizes the need for visual learning
software that accommodates students who
often move between teachers.
Method

Procedure
Teachers were instructed to collect data for a period of 4 weeks per condition (i.e., Vizzle vs. Other). All classrooms had dedicated technology time prior to the start of the research, and teachers were instructed to use the software during this allotted time. Weekly use varied greatly from one classroom to the next, from a minimum of M = 20 minutes to a maximum of M = 90 minutes.

Quick Stats
Teachers trained to use Vizzle prior to the start of the study (i.e., established users) dedicated significantly more time per week to the use of technology in their classroom than new users.

Data reports provide measures of use that are not standard across platforms. For example, Vizzle reports time per lesson, while MobyMax reports weekly use. This makes comparison across platforms more challenging and limits comparison to percent correct. However, the amount of time spent engaged in on-line learning was standardized across platforms.
Variables were chosen prior to the start of the research. While additional variables could be useful (e.g., student diagnosis, focus (# of problems solved in less than 2 minutes divided by the total number of problems), number of problems completed, etc.), conservative standards were selected for meaningful comparison with a small sample size. Data for incomplete lessons was excluded (175 of 598).

Independent Variables
- Software (Vizzle vs. Other)
- Student (IEP vs. TD*)

Dependent Variables
- Time per lesson
- Percent correct

(*Typically Developing)
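The "focus" measure described above is a simple ratio of quickly solved problems to total problems. A minimal Python sketch for illustration only (the function name and input format are assumptions, not part of the study's tooling):

```python
def focus_score(solve_times_min):
    """Fraction of problems solved in under 2 minutes.

    solve_times_min: list of per-problem solve times, in minutes.
    Returns 0.0 for an empty lesson to avoid division by zero.
    """
    if not solve_times_min:
        return 0.0
    fast = sum(1 for t in solve_times_min if t < 2)
    return fast / len(solve_times_min)

# Example: 3 of 4 problems solved in under 2 minutes
print(focus_score([0.5, 1.2, 1.9, 3.4]))  # 0.75
```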
Data Analysis

STUDY 1: One-way between groups experimental design examining the difference between IEP and TD students on PERCENT CORRECT USING VIZZLE.

STUDY 2: One-way between groups experimental design examining the difference between IEP and TD students on TIME PER LESSON USING VIZZLE.

Group Statistics

Measure           Group   N    Mean               Std. Deviation   Std. Error Mean
Percent correct   IEP     8    88.138% correct    11.69164         4.13362
Percent correct   TD      11   81.2316% correct   8.52838          2.57140
Time per lesson   IEP     8    3.0398 min.        1.33795          .47304
Time per lesson   TD      11   5.8632 min.        .77255           .23293

Independent Samples Test: t-test for Equality of Means

Measure                        t
Percent correct using Vizzle   1.493
Time per lesson using Vizzle   -5.825
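The t statistics for Studies 1 and 2 follow from the group summaries via the standard pooled-variance (equal variances assumed) formula. A minimal Python sketch, reconstructed from the reported summary values for illustration; this is not the study's actual analysis code:

```python
from math import sqrt

def pooled_t(m1, sd1, n1, m2, sd2, n2):
    """Independent-samples t statistic with equal variances assumed."""
    df = n1 + n2 - 2
    sp2 = ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / df  # pooled variance
    se = sqrt(sp2 * (1 / n1 + 1 / n2))  # standard error of the mean difference
    return (m1 - m2) / se, df

# Study 1: percent correct using Vizzle, IEP (n=8) vs. TD (n=11)
t1, df1 = pooled_t(88.138, 11.69164, 8, 81.2316, 8.52838, 11)
# Study 2: time per lesson using Vizzle, IEP vs. TD
t2, df2 = pooled_t(3.0398, 1.33795, 8, 5.8632, 0.77255, 11)
print(round(t1, 3), round(t2, 3), df1)  # 1.493 -5.825 17
```

With df = 17, the time-per-lesson difference is large while the percent-correct difference is not, consistent with the report's conclusion that IEP and TD students completed lessons at the same level of mastery.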
STUDY 3: One-way repeated measures experimental design examining the difference between VIZZLE AND MOBYMAX amongst STUDENTS WITH IEPs on PERCENT CORRECT, with lessons matched for time on task and subject.

Paired Samples Statistics

Software   N   Mean              Std. Deviation   Std. Error Mean
Vizzle     8   88.138% correct   11.69164         4.13362
MobyMax    8   69.5% correct     24.65766         8.7178

Paired Samples Test: Paired Differences (percent correct using Vizzle and MobyMax)

Mean       Std. Deviation   Std. Error Mean   t       df   Sig. (2-tailed)
18.63802   18.422           6.51316           2.862   7    .024
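The paired-samples result in Study 3 can be checked from the paired-difference summaries using the standard paired t formula. An illustrative reconstruction, not the study's analysis code:

```python
from math import sqrt

# Paired-difference summaries (Vizzle minus MobyMax percent correct), n = 8 students
n = 8
mean_diff = 88.138 - 69.5  # 18.638, matching the reported mean difference
sd_diff = 18.422           # standard deviation of the paired differences
se = sd_diff / sqrt(n)     # standard error of the mean difference
t = mean_diff / se
df = n - 1
print(round(se, 5), round(t, 3), df)  # 6.51316 2.862 7
```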
STUDY 4: Group comparison amongst students in the NEUROTYPICAL CLASSROOM examining the average PERCENTAGE CORRECT on lessons in VIZZLE VERSUS IREADY. iReady data for 8 of the 11 students in that classroom was available for this analysis.

Group Statistics

Software   N   Mean              Std. Deviation   Std. Error Mean
Vizzle     8   78.845% correct   8.62318          3.04875
iReady     8   78.875% correct   12.73283         4.50174

Independent Samples Test: t-test for Equality of Means (equal variances assumed)

Measure           t      df   Sig. (2-tailed)   Mean Difference   Std. Error Difference
Percent correct   -.006  14   .996              -.03000           5.43696

The student outcomes in Vizzle versus iReady are statistically equivalent: students in both Vizzle and iReady averaged approximately 79% correct across lessons.
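The near-zero t for Vizzle versus iReady can likewise be recovered from the group summaries with the pooled-variance formula. A sketch for illustration, assuming the summary statistics as reported (the pairing of platform labels to rows is inferred from the reported mean difference); this is not the study's analysis code:

```python
from math import sqrt

# Group summaries: mean and SD of percent correct for each platform, n = 8 each
m1, sd1 = 78.845, 8.62318    # Vizzle (label inferred)
m2, sd2 = 78.875, 12.73283   # iReady (label inferred)
n = 8
sp2 = ((n - 1) * sd1 ** 2 + (n - 1) * sd2 ** 2) / (2 * n - 2)  # pooled variance
se = sqrt(sp2 * (2 / n))  # standard error of the mean difference
t = (m1 - m2) / se
print(round(t, 3), round(se, 5))  # -0.006 5.43696
```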
Summary and Conclusions

Use of technology in education has skyrocketed since the birth of the Apple iPad in 2010, particularly in the form of visual learning software (Allen, Hartley, & Cain, 2016). There is very little research investigating the efficacy of visual learning technology within the population of children diagnosed with ASD. However, the pace of software production continues to increase. Data is needed to help determine (1) if the technology is effective, and (2) how it can better serve its intended population.

While the current research suggests that students are completing lessons at a mastery level, we do not know if the practiced skills generalize beyond the on-line learning environment (Fletcher-Watson et al., 2016). Real-world application is difficult, if not impossible, to measure in a controlled manner. However, comparing mastery in practice against performance on standardized testing could lead to further insight. Regardless of generalization, the current research suggests that on-line practice increases levels of engagement.
• While Vizzle is effective in educating students with IEPs and those without, the value of the product lies within the features that benefit students with special needs. Administrators, teachers, and parents are all stakeholders who need to know about the attributes of Vizzle that make it a standout relative to competitors.

• There was a significant difference in the level of mastery between Vizzle and MobyMax among students with an IEP, such that more mastery was demonstrated when using Vizzle (88% versus 70%, respectively). Note that teachers matched lessons across software (e.g., Vizzle: Long or Short Vowel Words; MobyMax: Find and Say Long Vowel Sounds) to control for difficulty across platforms.

• Interestingly, students with IEPs and typically developing students completed lessons at the same level of mastery. This suggests that the level of difficulty was appropriate within each group.

• One distinguishing feature present in Vizzle but not MobyMax is the ability to personalize feedback. This feature was utilized by the participating teachers. It is possible that this personalization promoted more engagement, which then led to better outcomes, as has been found in previous research (Kucirkova, 2014).
Research Phase Summary Reports
Phase I: Preliminary Work, Teacher and Participant Selection

In Phase 1 of the research study I met with the following administrators in Alliance City Schools (ACS) to discuss the Vizzle research project and its timeline: Jane Stoner (Special Education Coordinator), Corey Muller (Principal, Parkway Elementary), and Michelle Balderson (Principal, Rockhill Elementary).

I identified teachers to participate in the data collection phase of the study (Phase 3). Six teachers in Alliance City School District have agreed to participate. Five of the teachers work with students with special needs, and one is a teacher in a neurotypical 2nd grade classroom. I met with each teacher individually during the week of February 19th, and we met as a group on February 28th. During those meetings the project objectives, expectations, and timelines were discussed.

ACS Student Participants

Students in each special needs classroom have been chosen for assessment purposes (12 students total, ranging from preschool to high school). All of the students in the neurotypical 2nd grade classroom will be included in the assessment (approximately 30 students).

Learning goals have been selected for all student participants, and parent consent forms are being collected. The participating teachers are currently familiar with Vizzle, iReady, and MobyMax. However, no one visual learning tool is used consistently in the classrooms. The teachers will systematically compare these tools. There will be 3 weeks of data collection each, for Vizzle and one comparison tool, for all student participants.
Phase II: Teacher Training

Teacher training for data collection is set for the week of March 12th. I am piloting data collection with each teacher the week of March 19th. Data collection will occur over the span of 6 weeks, March 24 - May 4 (this is 2 weeks longer than originally planned, as it was determined that additional time was needed to accumulate the data required for analysis).
ACS Teacher Participants

Name                School                  Role                                   Email
Brian Bader         Rockhill Elementary     Intervention Specialist                [email protected]
Stephanie Barr      Alliance High School    Intervention Specialist                [email protected]
Jason Dotson        Parkway Elementary      Intervention Specialist                [email protected]
Katherine Elliott   Early Learning Center   Kindergarten Intervention Specialist   [email protected]
Lucinda Owens       Alliance High School    Intervention Specialist                [email protected]
Becky Sivula        Rockhill Elementary     2nd Grade Classroom                    [email protected]
Lesli Waller        Rockhill Elementary     2nd Grade Classroom                    [email protected]
Phase II: Teacher Training
In Phase 2 I trained the teachers that are participating in the data collection phase of the study (Phase 3). Seven teachers in Alliance City School District are collecting data. Five of the teachers work with students with special needs, and two are teachers in a neurotypical 2nd grade classroom (one additional teacher in the neurotypical classroom joined the group to help with the data collection; I have added her to the teacher participant list). There are a total of five different classroom environments from which participants have been chosen, ranging from kindergarten to high school.

- Jason Dotson (3 students)
- Kath Elliott (2 students)
- Brian Bader (3 students)
- Stephanie Barr and Lucinda Owens (2 students)
- Becky Sivula and Lesli Waller (11 students)

I met with each teacher twice in their individual classrooms to plan the baseline and testing phases of the project. In addition to training, student consent forms were collected during this phase and IEP/ERTs were reviewed (i.e., intake review) for each student with an autism diagnosis. The intake review will be summarized in the final report.

The manipulated independent variable in all classrooms is exposure to Vizzle. The dependent variables are the amount of time it takes to meet learning goal criteria and the percent correct for each goal.
Phase III: Data Collection
The data collection phase is in progress. Baseline data collection will conclude on April 20th.
Phase III: Data Collection
Seven teachers in Alliance City School District collected data. Five of the teachers work with students with
special needs, and two are teachers in a neurotypical 2nd grade classroom. Data was collected in five classroom
environments, ranging from kindergarten to high school.
- Jason Dotson (3 students)
- Kath Elliott (2 students)
- Brian Bader (3 students)
- Stephanie Barr and Lucinda Owens (2 students)
- Becky Sivula and Lesli Waller (11 students)
The manipulated independent variable in all classrooms was exposure to Vizzle. The dependent variables were the amount of time it takes to meet learning goal criteria and the percent correct for each goal.
Phase IV: Data Analysis
Data is currently being aggregated for analysis.
References

Allen, M. L., Hartley, C., & Cain, K. (2016). iPads and the use of "apps" by children with autism spectrum disorder: Do they promote learning? Frontiers in Psychology, 7, 1305.

Dweck, C. S. (1986). Motivational processes affecting learning. American Psychologist, 41(10), 1040-1048.

Fletcher-Watson, S., Petrou, A., Scott-Barrett, J., Dicks, P., Graham, C., O'Hare, A., … McConachie, H. (2016). A trial of an iPad intervention targeting social communication skills in children with autism. Autism, 20(7), 771-782.

Kucirkova, N. (2014). iPads in early education: Separating assumptions and evidence. Frontiers in Psychology, 5, 715.

Warmington, M., Hitch, G. J., & Gathercole, S. E. (2013). Improving word learning in children using an errorless technique. Journal of Experimental Child Psychology, 114(3), 456-465.