Investigating the use of short answer free-text questions in online interactive assessment
Sally Jordan
17th January 2008
Centre for Open Learning of Mathematics, Science, Computing and Technology (COLMSCT)
My plan
• History – how and why did I get involved
• What we are trying to achieve
• Our questions
• Evaluation
• How we write the questions – have a go!
• Discussion
The S151: Maths for Science experience
• This course does not have TMAs (tutor-marked assignments);
• But we wanted to be able to provide students with targeted and relatively detailed feedback on their answers;
• And we wanted to be able to provide this feedback rapidly;
• We wanted more than multiple-choice questions;
• And this was for summative assessment.
Since S151…
• The eAssessment system has become OpenMark, being used on several courses in the Science Faculty and elsewhere in the University, for diagnostic, formative and summative purposes;
• OpenMark has been incorporated within Moodle at both the assessment and the question level;
• Around 10 COLMSCT projects are investigating ways in which the use of this type of assessment can be extended;
• My project is a joint one with Barbara Brockbank, supported by Phil Butcher and Tom Mitchell (Intelligent Assessment Technologies Ltd).
S104: Exploring science
• A new course from February 2008;
• Students will spend 9 months studying the course, and we want to keep them up to date and engaged with it;
• So we are using regular iCMAs (interactive computer-marked assignments) with feedback, alongside conventional tutor-marked assignments;
• The iCMAs will be summative (but low stakes), so that students do them, but their primary function is to provide pacing and feedback.
Questions for S104: Exploring science
• We want to be able to ask questions that require slightly longer free-text answers;
• So we are working with a commercially provided, linguistically based authoring tool to write questions that require answers of about a sentence in length;
• Student responses are being used to refine the questions;
• We are providing targeted feedback on incorrect and incomplete answers (a rough sketch of this rule-plus-feedback idea follows below).
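The authoring tool itself is commercial and linguistically based, so the Python sketch below is only a loose, keyword-style illustration of how answer matching with targeted feedback might be organised; the rules, synonym groups and feedback wording are all invented, borrowing the rock question that appears later in this talk.

```python
# Illustrative sketch only: the IAT tool's matching is linguistically based,
# not the simple synonym-group rules shown here. All rule contents and
# feedback strings below are hypothetical.

# Each rule lists synonym groups that must ALL appear in the answer,
# plus the feedback to show and whether the answer earns credit.
RULES = [
    {"require": [["metamorphic"], ["align", "band", "foliat"]],
     "credit": True,
     "feedback": "Correct: aligned or banded crystals indicate a metamorphic rock."},
    {"require": [["metamorphic"]],
     "credit": False,
     "feedback": "You have named the rock type, but what feature of its appearance tells you that?"},
]

def mark(response):
    text = response.lower()
    for rule in RULES:
        # A synonym group matches if any of its variants occurs in the answer.
        if all(any(variant in text for variant in group) for group in rule["require"]):
            return rule["credit"], rule["feedback"]
    return False, "Your answer was not recognised; try rephrasing it as a short sentence."

print(mark("It is metamorphic because the crystals are aligned in bands."))
```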
Evaluation 1: IET research lab observations
• Six S103 students were observed in June 2007;
• They reacted to the free-text questions in interesting ways, e.g. because they thought we were just looking for keywords, some tried to 'help' the computer by giving answers in note form; one of the students appeared to 'brain dump';
• Use made of the feedback provided was also variable: one student said he found it useful but clearly didn't use it; others read the feedback carefully, checked things in their course book, and altered their responses successfully.
Evaluation 2: Human-computer marking comparison
• Computer marking (with and without 'flagging') was compared with that of six ALs (associate lecturers) and the question author;
• For most questions the computer's marking is indistinguishable from that of the ALs;
• Perhaps not surprisingly, the computer's marking is closer to that of the question author than that of some of the ALs;
• The computer is not always 'right', but neither are the ALs;
• The computer seems to behave better when credit is not given for flagged answers (the sketch after this list illustrates one simple way to quantify marker agreement).
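As an illustration of what such a comparison involves, the Python sketch below computes pairwise percentage agreement between markers. The marker names and marks are invented, and the actual study may well have used different measures; this only shows the shape of the analysis.

```python
# Hypothetical sketch of a human-computer marking comparison: pairwise
# agreement between each pair of markers on the same set of responses.
# The marks below are invented, not data from the study described above.
from itertools import combinations

marks = {  # 1 = credit given, 0 = no credit, per response
    "computer": [1, 0, 1, 1, 0, 1, 0, 1],
    "author":   [1, 0, 1, 1, 0, 1, 0, 1],
    "AL1":      [1, 0, 1, 0, 0, 1, 0, 1],
    "AL2":      [1, 1, 1, 1, 0, 1, 0, 0],
}

def agreement(a, b):
    """Fraction of responses on which two markers award the same mark."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

for m1, m2 in combinations(marks, 2):
    print(f"{m1} vs {m2}: {agreement(marks[m1], marks[m2]):.2f}")
```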
Writing questions and answer matching
• Have a go…
• We've written the question for you:
[Photograph of two people, not reproduced]
'In the photograph on the right, who is the taller?'
(One possible set of matching rules for this question is sketched below.)
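Since the photograph is not reproduced here, the Python sketch below simply assumes, hypothetically, that the woman is the taller person. It shows how accept/reject patterns with tailored feedback might be set up for this question; it is not the matching actually used in the workshop, and real student responses (e.g. 'the one on the left') show why such simple patterns quickly break down.

```python
# "Have a go" sketch with an assumed correct answer ("the woman");
# the patterns and feedback below are hypothetical.
import re

ACCEPT = re.compile(r"\b(woman|lady|female|she)\b", re.IGNORECASE)
REJECT = re.compile(r"\b(man|male|he|boy)\b", re.IGNORECASE)

def mark_taller(response):
    if ACCEPT.search(response) and not REJECT.search(response):
        return "Correct."
    if REJECT.search(response):
        return "No - look again at the photograph."
    # Responses like 'the one on the left' fall through to here,
    # illustrating why free-text matching is harder than it looks.
    return "Please say which person you mean."

for r in ["The woman is taller", "the man", "the one on the left"]:
    print(r, "->", mark_taller(r))
```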
Interesting issues
• Answers that are difficult to parse include those that are very long and those in note form;
• Questions have to be quite tightly constrained, e.g.
'You are handed a rock specimen that consists of interlocking crystals. How would you decide, from its appearance, whether it is an igneous or a metamorphic rock?'
has become
'You are handed a rock specimen that consists of interlocking crystals. How could you be sure, from its appearance, that this was a metamorphic rock?'
More interesting issues
• Writing questions and improving the answer matching is interesting, but time-consuming;
• I found it relatively easy to get used to writing appropriate questions and using the authoring tool, but I am used to writing assessment questions and am quite logical (I'm a physicist!);
• But one of our tasks is to investigate whether these sorts of questions could be written and used more widely across the University, and to compare with other systems;
• And who should be writing the questions?
Short answer questions
https://students.open.ac.uk/openmark/s103-07j.blocks123world/
https://students.open.ac.uk/openmark/s103-07j.block4world/
https://students.open.ac.uk/openmark/s103-07j.block5world
https://students.open.ac.uk/openmark/s103-07b.block7v1aworld/
https://students.open.ac.uk/openmark/s103-07b.block8world/
https://students.open.ac.uk/openmark/s103-07b.block9world/
https://students.open.ac.uk/openmark/s103-07b.block10world/
https://students.open.ac.uk/openmark/s103-07b.block11world/
Workshop questions (developmental server)
http://kestrel.open.ac.uk/om-tn/iat.demo/
OpenMark examples site
http://www.open.ac.uk/openmarkexamples/index.shtml
Centre for Open Learning of Mathematics, Science, Computing and Technology (COLMSCT)
The Open University
Walton Hall
Milton Keynes
MK7 6AA
http://cetl.open.ac.uk/colmsct