An analysis of generative dialogue patterns across interactive learning environments: Explanation, elaboration, and co-construction
Robert G.M. Hausmann
Pittsburgh Science of Learning Center (PSLC)
Learning Research and Development Center
University of Pittsburgh
Participants
– University of Pittsburgh undergraduates (N = 136)
– Course credit
Research Questions
– Can undergraduate dyads be trained to interact effectively (i.e., co-construct)?
– What effect do certain dialog types have on problem solving and learning?
[Screenshots of the problem-solving interface: Fabrication Cost, Modify Properties, and Member List panels, with color-coded feedback]
Results: Manipulation Check
Mean frequency per dyad (SD in parentheses):

                          Control Dyads    Elaborative Dyads
Clarification Questions   26.37 (11.32)    18.00 (9.71)
Elaborative Statements     8.94 (6.46)     13.38 (5.76)

Source: Hausmann (2006)
Results: Problem Solving
[Bar chart of Optimization Score: Individuals 0.56, Control Dyads 0.55, Elaborative Dyads 0.63]
Source: Hausmann (2006)
Results: Learning
[Bar chart of Deep Knowledge Gain: Individuals 5%, Control Dyads 3%, Elaborative Dyads 11%]
Source: Hausmann (2006)
Summary
Individual Learning (Studies 1 & 2)
– Paraphrasing => shallow learning
– Self-explaining => deep learning
Human Tutoring (Studies 3 & 4)
– Listening to tutor explain => shallow learning
– Receiving scaffolding => deep learning
– Reflective comments => deep learning
Peer Collaboration (Studies 4 & 5)
– Listening to peer explain => shallow learning
– Giving an explanation to peer => deep learning
– Co-constructing knowledge => deep learning
Observing Tutoring Collaboratively (Study 4)
– Observing tutor explain is not correlated with deep learning
– Observing student receive scaffolding => deep learning
Integration with Serious Games
What is the implication of these results for the design of serious games?
– How can a game inspire explanation, elaboration, or even co-construction?
Acknowledgements
Funding Agencies
Support @ LRDC
– Gary Wild
– Shari Kubitz
– Eric Fussenegger
Advisors
– Michelene T.H. Chi
– Kurt VanLehn
Physics Instructors
– Donald J. Treacy, USNA
– Robert N. Shelby, USNA
Other Influences
– Mark McGregor
– Marguerite Roy
– Rod Roscoe
Study 1: Hausmann, R.G.M., & Chi, M.T.H. (2002). Can a computer interface support self-explaining? Cognitive Technology, 7(1), 4-15.
Study 2: Hausmann, R.G.M., & VanLehn, K. (in prep). The effect of generation on robust learning.
Study 3: Chi, M.T.H., Siler, S., Jeong, H., Yamauchi, T., & Hausmann, R.G. (2001). Learning from human tutoring. Cognitive Science, 25(4), 471-533.
Study 4a: Chi, M.T.H., Roy, M., & Hausmann, R.G.M. (accepted). Observing tutorial dialogues collaboratively: Insights about tutoring effectiveness from vicarious learning. Cognitive Science.
Study 4b: Hausmann, R.G.M., Chi, M.T.H., & Roy, M. (2004). Learning from collaborative problem solving: An analysis of three hypothesized mechanisms. 26th Annual Meeting of the Cognitive Science Conference, Chicago, IL.
Study 5: Hausmann, R.G.M. (2006). Why do elaborative dialogs lead to effective problem solving and deep learning? Poster presented at the 28th Annual Meeting of the Cognitive Science Conference, Vancouver, Canada.
Inferential Mechanisms
– Simulation of a mental model (Norman, 1983)
– Category membership (Chi, Hutchinson, & Robin, 1989)
– Analogical reasoning (Markman, 1997)
– Integration of the situation and text model