CRESST Conference
Los Angeles, CA, September 15, 2000
CRESST Conference 9/15/00 v.3
COMPUTER-BASED ASSESSMENT OF COLLABORATIVE PROBLEM SOLVING
Harry O'Neil, University of Southern California &
National Center for Research on Evaluation, Standards, and Student Testing (CRESST)
Gloria Hsieh University of Southern California
Gregory K. W. K. Chung, UCLA/CRESST
CRESST MODEL OF LEARNING
Learning, supported by five families of cognitive demands:
• Content Understanding
• Problem Solving
• Collaboration
• Communication
• Self-Regulation
JUSTIFICATION: WORLD OF WORK
• The justification for collaborative problem solving as a core demand can be found in analyses of both the workplace and academic learning
– O’Neil, Allred, and Baker (1997) reviewed five major studies from the workplace readiness literature. Each of these studies identified the need for (a) higher order thinking skills, (b) teamwork, and (c) some form of technology fluency. In four of the studies, problem-solving skills were specifically identified as essential.
JUSTIFICATION: NATIONAL STANDARDS
• New standards (e.g., National Science Education Standards) suggest new assessment approaches rather than multiple-choice exams
– Deeper or higher order learning
– More robust knowledge representations
– Integration of mathematics and science
– Integration of scientific information that students can apply to new problems in varied settings (i.e., transfer)
– Integration of content knowledge and problem solving
– More challenging science problems
– Learning conducted in groups
MODELS

PROBLEM SOLVING DEFINITION
• Problem solving is cognitive processing directed at achieving a goal when no solution method is obvious to the problem solver (Mayer & Wittrock, 1996)
• Metacognition and motivation are assessed by a paper-and-pencil survey instrument (self-regulation)
• Create a knowledge map on environmental science (content understanding)
• Receive feedback on it
• Using a simulated Web site, search for information to improve it (problem-solving strategy)
– Relevance, searches, browsing
• Construct a final knowledge map
– Serves as the outcome content understanding measure
CRESST’S CONCEPT MAPPER
CORRELATION COEFFICIENTS: OUTCOME AND PROCESS VARIABLES (N = 38)
Process variable             Knowledge map score
Relevant information found   .08
Browsing                     .41**
Searching                    .29*
Focused browsing             .52***
Feedback                     .51***

*p < .07. **p < .01. ***p < .001.
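The coefficients above are ordinary Pearson correlations between each process variable and the final knowledge map score. A minimal sketch of the computation; the per-student data below are invented for illustration, and the study's raw scores are not reproduced here:

```python
from statistics import mean, stdev

def pearson_r(x, y):
    """Pearson correlation between two equal-length samples."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) - 1)
    return cov / (stdev(x) * stdev(y))

# Hypothetical per-student measures: one process variable vs. map score.
focused_browsing = [2, 5, 1, 7, 4, 6, 3, 8]
map_score = [10, 14, 9, 18, 13, 16, 11, 19]
print(f"r = {pearson_r(focused_browsing, map_score):.2f}")
```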
CONCLUSIONS
• Computer-based problem-solving assessment is feasible
– Process/product validity evidence is promising
• Allows real-time scoring/reporting to students and teachers
• Useful for program evaluation and diagnostic functions of testing
• What’s next?
– Generalizability study
– Collaborative problem solving with group task
TEAMWORK MODEL
• Taskwork Skills
Taskwork team skills influence how well a team performs on a particular task (e.g., whether or not a team of negotiators reaches an agreement with the other party).
• Teamwork Skills
Teamwork skills, or team process skills, influence how effective an individual member will be as part of a team:
– Adaptability: Recognizing problems and responding appropriately
– Coordination: Organizing team activities to complete a task on time
– Decision making: Using available information to make decisions
– Interpersonal: Interacting cooperatively with other team members
– Leadership: Providing direction for the team
– Communication: Clear and accurate exchange of information
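In the CRESST approach, team members communicate only through predefined messages, so team process measures can be obtained by tallying how often each member's messages fall into the six dimensions above. A minimal sketch, assuming a hypothetical mapping from message IDs to dimensions; the study's actual message taxonomy is not reproduced here:

```python
from collections import Counter

# Hypothetical mapping from predefined message IDs to the six
# teamwork dimensions (illustrative, not the real taxonomy).
MESSAGE_DIMENSION = {
    "M01": "adaptability",
    "M02": "coordination",
    "M03": "decision making",
    "M04": "interpersonal",
    "M05": "leadership",
    "M06": "communication",
}

def tally_team_processes(message_log):
    """Count how often each teamwork dimension occurs in a log of
    predefined-message IDs sent during the task."""
    return Counter(MESSAGE_DIMENSION[m] for m in message_log
                   if m in MESSAGE_DIMENSION)

log = ["M02", "M06", "M02", "M05", "M06", "M06"]
print(tally_team_processes(log))
```

Per-dimension frequencies of this kind are the process scores that get correlated with outcome measures.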
CRESST Conference 9/15/00 v.3 p.12
C R E S S T / U S C
CRESST ASSESSMENT MODEL OF TEAMWORK
Components (diagram): simulation tasks (union-management negotiation, networked concept map); pre-defined process taxonomy; pre-defined messages; networked computers; real-time assessment and reporting.
CORRELATION BETWEEN TEAM PROCESSES AND OUTCOME MEASURES1 (N = 26)

Team process      Performance score   Agreement2   Agreement type   Time in negotiations
Adaptability      .31                 -.24         .24              .72***
Coordination      .29                 -.08         .15              .65***
Decision making   .41*                -.04         .41*             .69***
Leadership        .31                 -.02         .19              .65***
Interpersonal     .03                 -.52**       -.05             .56***
Communication     .35                 -.20         .24              .80***
Note. *p < .05. **p < .01. ***p < .001.
1 High school students.
2 0 = No agreement, 1 = Agreement.
Nonparametric (Spearman) Correlations Between Team Processes and Post Outcome Measures for Concept Map (N = 14)

Team process      Semantic content score   Organizational structure score
Adaptability      -.65**                   -.49*
Coordination      -.31                     -.31
Decision making   -.09                     -.01
Interpersonal     -.33                     -.25
Leadership        .23                      .37
Communication     -.36                     -.24

*p < .05. **p < .01 (two-tailed).

Chung, G. K. W. K., O’Neil, H. F., Jr., & Herl, H. E. (1999). The use of computer-based collaborative knowledge mapping to measure team processes and team outcomes. Computers in Human Behavior, 15, 463-493.
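Spearman's rho, used for the concept map correlations, is the Pearson correlation computed on ranks; with a small sample (N = 14) it is more robust to outliers than Pearson's r. A self-contained sketch:

```python
def rankdata(values):
    """Assign 1-based ranks, averaging ranks across ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rankdata(x), rankdata(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```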
PUZZLE
• Unfortunately, the concept mapping study (Chung et al., 1999) found that the team process did not predict team outcomes, unlike the union management negotiation task.
• We hypothesized that the lack of useful feedback in the concept mapping task and low prior knowledge may have influenced the results.
ONGOING RESEARCH
• We changed the nature of the task to provide more extensive feedback and to create a real “group” task
• Feedback will be knowledge of response feedback versus adaptive knowledge of response feedback
• A group task is a task where
– no single individual possesses all the resources;
– no single individual is likely to solve the problem or accomplish the task objective without at least some input from others (Cohen & Arechevala-Vargas, 1987)
• One student creates the concept map, the other student does the searches
KNOWLEDGE OF RESPONSE FEEDBACK (Schacter et al. study)
Your map has been scored against an expert’s map in environmental science. The feedback tells you:
• How much you need to improve each concept in your map (i.e., A lot, Some, A little).
Use this feedback to help you search to improve your map.
A lot    Some    A little
Adaptive Knowledge of Response Feedback (the above + the following)
Improvement: You have improved the “food chain” concept from needing “A lot of improvement” to the “Some improvement” category.
Strategy: It is most useful to search for information for the “A lot” and “Some” categories rather than the “A little” category. For example, search for information on “atmosphere” or “climate” first, rather than “evaporation.”
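The two feedback conditions can be viewed as a binning of per-concept scores into improvement categories, with the adaptive version adding improvement tracking and a search-strategy hint. A minimal sketch; the 0.33/0.67 thresholds and function names are illustrative assumptions, not the study's actual cutoffs:

```python
def feedback_category(score):
    """Map a 0..1 concept score (agreement with the expert map)
    to an improvement category. Thresholds are assumed."""
    if score < 0.33:
        return "A lot"
    elif score < 0.67:
        return "Some"
    return "A little"

def adaptive_feedback(prev_scores, curr_scores):
    """Adaptive version: also report category improvements and
    suggest searching on the weakest concepts first."""
    messages = []
    for concept, curr in curr_scores.items():
        prev_cat = feedback_category(prev_scores.get(concept, 0.0))
        curr_cat = feedback_category(curr)
        if curr_cat != prev_cat:
            messages.append(
                f'You improved "{concept}" from "{prev_cat}" to "{curr_cat}".')
    # Point the student toward the concepts that still need the most work.
    weakest = sorted(curr_scores, key=curr_scores.get)[:2]
    messages.append("Search first for: " + ", ".join(weakest))
    return messages
```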
GENERAL LESSONS LEARNED
• Need model of cognitive learning (the Big 5)
– Need submodels of process
• Problem solving: content understanding, problem-solving strategies, self-regulation
• Teamwork: adaptability, coordination, decision making, interpersonal skill, leadership, communication
• For diagnostic, low-stakes environments, real-time administration, scoring, and reporting are needed
• Role of type of task and feedback may be critical for assessment of collaborative problem solving
BACK-UP SLIDES
Problem Solving (domain-specific): augmented concept mapping with search task; self-regulation strategies; transfer tasks; motivation (effort, self-efficacy,
Domain Specifications Embedded in the Knowledge Mapping Software

General domain specification   This software
Scenario                       Create a knowledge map on environmental science by exchanging messages in a collaborative context
Participants                   Student team (3 members)
Knowledge map terms            Predefined. 18 important ideas identified by content experts: atmosphere, bacteria, carbon dioxide, climate, consumer, decomposition, evaporation, food chain, greenhouse gases, nutrients, oceans, oxygen, photosynthesis, producer, respiration, sunlight, waste, and water cycle.
Knowledge map links            Predefined. 7 important relationships identified by content experts: causes, influences, part of, produces, requires, used for, and uses.
Type of learning               Content understanding, collaboration
Outcome measures               Semantic content scores, organizational structure score, number of terms used, number of links used.
Teamwork processes             Adaptability, coordination, decision making, interpersonal, leadership, communication
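With terms and links predefined, a knowledge map reduces to a set of (term, link, term) propositions, and a simple semantic content score is the count of propositions shared with an expert map. The class below is an illustrative sketch, not CRESST's implementation; only the term and link vocabularies are taken from the specification above:

```python
# Vocabulary from the domain specification (18 terms, 7 link types).
TERMS = {
    "atmosphere", "bacteria", "carbon dioxide", "climate", "consumer",
    "decomposition", "evaporation", "food chain", "greenhouse gases",
    "nutrients", "oceans", "oxygen", "photosynthesis", "producer",
    "respiration", "sunlight", "waste", "water cycle",
}
LINKS = {"causes", "influences", "part of", "produces", "requires",
         "used for", "uses"}

class KnowledgeMap:
    """A knowledge map as a set of (term, link, term) propositions."""

    def __init__(self):
        self.propositions = set()

    def add(self, source, link, target):
        """Add a proposition, enforcing the predefined vocabulary."""
        if source not in TERMS or target not in TERMS or link not in LINKS:
            raise ValueError("term or link not in the predefined vocabulary")
        self.propositions.add((source, link, target))

    def semantic_score(self, expert):
        """Number of propositions shared with an expert map."""
        return len(self.propositions & expert.propositions)
```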
BOOKMARKING APPLET
SAMPLE METACOGNITIVE ITEMS
The following questions refer to the ways people have used to describe themselves. Read each statement below and indicate how you generally think or feel. There are no right or wrong answers. Do not spend too much time on any one statement. Remember, give the answer that seems to describe how you generally think or feel.

Response scale: Almost Never / Sometimes / Often / Almost Always

a. I figure out my goals and what I need to do to accomplish them.
b. I almost always know how much of a task I have to complete.
Note. Formatted as in Section E, Background Questionnaire, Canadian version of the International Adult Literacy Survey (1994). Item a is a planning item; item b is a self-checking item. Kosmicki (1993) reported alpha reliabilities of .86 and .78 for 6-item versions of these scales, respectively.
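The reliabilities Kosmicki reported are Cronbach's alpha, computed from item and total-score variances: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch of the computation:

```python
def cronbach_alpha(item_scores):
    """Cronbach's alpha for a list of per-item score lists (each inner
    list holds one item's scores across all respondents)."""
    k = len(item_scores)          # number of items
    n = len(item_scores[0])       # number of respondents

    def variance(xs):
        # Sample variance (n - 1 denominator).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    item_var = sum(variance(item) for item in item_scores)
    totals = [sum(item[i] for item in item_scores) for i in range(n)]
    return k / (k - 1) * (1 - item_var / variance(totals))
```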