On-Line Student Assessment, Richard Hill, Center for Assessment, Nov. 5, 2001
Transcript
Page 1

On-Line Student Assessment

Richard Hill

Center for Assessment

Nov. 5, 2001

Page 2

Speaking Points

Current paper-and-pencil-based assessments

Image scoring
Computer administration
Computer scoring

Page 3

Typical Current Paper-and-Pencil Based Statewide Assessment

3 grades
Reading, writing, math, science, social studies
30 multiple-choice (MC) and 6 open-ended (OE) questions for each of four areas, one essay for writing
50,000 students per grade

Page 4

Materials Processed

150,000 28-page test booklets
  2 million sheets of paper
  10 tons of paper, a stack 700 feet high
150,000 20-page answer documents
  1.5 million sheets of special paper
  7.5 tons
  600 boxes to store (per year)
(a rough arithmetic check of these figures follows below)
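A minimal back-of-the-envelope check of these figures, assuming double-sided printing (so a 28-page booklet uses 14 sheets) and roughly 4.3 g per sheet; both assumptions are mine, not the presenter's:

```python
# Sketch only: reproduces the paper-volume figures under assumed
# double-sided printing and an assumed per-sheet weight.
students = 150_000

test_sheets = students * (28 // 2)        # 2,100,000 sheets (~2 million)
answer_sheets = students * (20 // 2)      # 1,500,000 sheets
test_paper_tons = test_sheets * 4.3 / 1000 / 907   # grams -> kg -> US tons

print(f"{test_sheets:,} test-booklet sheets, ~{test_paper_tons:.0f} tons")
print(f"{answer_sheets:,} answer-document sheets")
```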

Page 5

Process

Materials shipped to schools
Materials shipped back to contractor
Materials logged in
Count everything, resolve discrepancies
Note that one misplaced school's shipment can stop the entire process

Page 6

Process for Receiving Materials

Separate answer booklets from test booklets
Test booklets placed in temporary storage in original boxes, then destroyed after reporting is complete
Answer sheets guillotined
MC answer sheets scanned
OE sheets packaged for scoring

Page 7

Processing of OE Sheets

Separated by content area
Sorted by form, randomized across schools
Scanned to capture ID numbers
Scoring headers prepared, then merged with answer sheets

Page 8

Scoring

Hire, train, qualify
Score
On-going evaluation of the quality of scoring
Determine papers that need adjudication, then rescore as necessary (see the routing sketch below)
Scan scoring headers
Merge MC, OE, and writing scores
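The adjudication step can be pictured as a simple routing rule. This is a sketch of common practice only; the actual criteria are not given in the slides, and the one-point threshold is an assumption:

```python
# Hypothetical routing rule: a response goes to a third reader when its
# two independent scores disagree by more than one point.
def needs_adjudication(first: int, second: int, max_gap: int = 1) -> bool:
    return abs(first - second) > max_gap

print(needs_adjudication(2, 4))   # True  -> adjudicate, then rescore as needed
print(needs_adjudication(3, 4))   # False -> scores stand as given
```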

Page 9

Scoring Time

20 seconds per OE question
5 minutes per essay (2 scorings plus adjudication, if necessary)
13 minutes per student
32,500 hours
1,000 person-weeks, plus training, qualifying, quality control, and equating
(the arithmetic is reproduced in the sketch below)
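These figures follow from the earlier test design (6 OE questions in each of four areas). The only added assumption is roughly 32.5 productive scoring hours per person-week, which is what makes the 1,000 person-week total come out:

```python
# Reproduces the scoring-time arithmetic from this slide.
students = 150_000
oe_questions = 6 * 4                       # 6 OE items in each of 4 content areas
oe_minutes = oe_questions * 20 / 60        # 8 minutes of OE scoring per student
per_student = oe_minutes + 5               # plus 5 minutes per essay = 13 minutes

total_hours = students * per_student / 60  # 32,500 hours
person_weeks = total_hours / 32.5          # assumed productive hours per week

print(per_student, total_hours, person_weeks)   # 13.0 32500.0 1000.0
```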

Page 10

Equating to Previous Year

MC
OE
Difficulty of items
Changes in scoring
(a generic equating illustration follows below)
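As an illustration of what equating to a previous year involves, here is a generic linear (mean-sigma) equating function applied to hypothetical score distributions; the slides do not say which equating procedure was actually used:

```python
# Linear (mean-sigma) equating: map this year's score x onto last year's scale.
from statistics import mean, stdev

def linear_equate(x, this_year, last_year):
    slope = stdev(last_year) / stdev(this_year)
    return mean(last_year) + slope * (x - mean(this_year))

# Hypothetical score distributions on common (anchor) material.
last_year = [12, 15, 18, 20, 22, 25]
this_year = [10, 13, 17, 19, 21, 24]
print(round(linear_equate(19, this_year, last_year), 1))   # ~20.2
```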

Page 11

Count, Count, Count

Initial log-in counts
After packaging
Every time a box is opened or closed
Count boxes, too

Page 12

Final Steps

Ship reports back to schools
Resolve problems
  Missing or misplaced students
  Challenges to scoring (requires finding answer sheets—perhaps all for one student)
Destroy test materials
Long-term storage for answer documents

Page 13

Solution #1—Image Scoring

High-speed scanners capture images of documents

All processing is done on CRTs by looking at an electronic image of the original paper

Page 14

Advantages

Control
  Scoring
    Blind read-behinds
    Real-time tracking of the accuracy of every scorer (see the agreement sketch below)
    Multiple sites
  Equating
    Blind rescores from previous year
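The real-time tracking point is essentially an agreement calculation run continuously against blind read-behind (validity) papers. A minimal sketch with invented scores; the actual statistics and thresholds used are not stated in the slides:

```python
# Exact and adjacent agreement between one scorer and the expert scores
# on blind read-behind papers (all numbers invented for illustration).
def agreement(scorer, expert):
    pairs = list(zip(scorer, expert))
    exact = sum(s == e for s, e in pairs) / len(pairs)
    adjacent = sum(abs(s - e) <= 1 for s, e in pairs) / len(pairs)
    return exact, adjacent

exact, adjacent = agreement([3, 2, 4, 3, 1, 2], [3, 3, 4, 2, 1, 2])
print(f"exact: {exact:.0%}, within one point: {adjacent:.0%}")
```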

Page 15

Advantages (cont’d)

Scoring speed
  Next response is ready to be scored when the first is done
  Scoring stops when rates decline
  No fumbling for papers
  Up to 1/3 faster

Page 16

Advantages (cont’d)

Tracking
  No need for counting
  Nothing is lost
  Nothing is damaged
  Records automatically linked
  Special-request papers easy to obtain
    Prep for next year's scoring
    Challenged papers
    Adjudication

Page 17

Advantages (cont’d)

Reporting—send sample of work home to parents
Storage
  Permanent
  Compact

Page 18

Disadvantages

Hardware and software costs
  Costs have dropped dramatically (a server that sold for $150,000 two years ago now sells for $16,000)
Need to prove that scoring is the same
  Writing vs. OE
Connectivity
Power outages

Page 19

Computer Administered Tests

Web-based vs. CD
Comparability
  Standards—especially writing
  Students that write on paper and then just type it in
Full use of computer capabilities
Underestimation of (some) students' abilities

Page 20

Georgia’s Proposed System

Huge item bank, three levels
Teachers can create tests
Capacity concerns for Level III tests

Page 21

Advantages

Elimination of paper
Accommodations
Adaptive testing (a toy sketch follows below)
  Shorter tests
  Diagnostic tests
  Lower frustration levels
Real-time scoring
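A toy sketch of why adaptive testing shortens tests and lowers frustration: each item is picked near the current ability estimate, so students see fewer items that are far too hard or too easy. Real computer-adaptive tests use IRT-based selection and scoring; the item bank, responses, and the crude update rule below are all invented for illustration.

```python
# Toy adaptive loop: pick the unused item closest to the ability estimate,
# then nudge the estimate up or down depending on the response.
def next_item(bank, ability):
    return min(bank, key=lambda difficulty: abs(difficulty - ability))

bank = [-2.0, -1.0, -0.5, 0.0, 0.5, 1.0, 2.0]    # hypothetical item difficulties
ability = 0.0
for correct in (True, True, False, True):        # hypothetical responses
    item = next_item(bank, ability)
    bank.remove(item)
    ability += 0.5 if correct else -0.5
    print(f"gave item {item:+.1f}, estimate now {ability:+.1f}")
```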

Page 22

Issues

Administration time
All schools have some computers, but how many?
Transition
  Recommendation is to test all schools the same way
  Comparability
  Logistics of operating two programs at the same time

Page 23

Computer Scoring

Major vendors (NCME Session N1, April 12, 2001; a toy scoring sketch follows the list)
  ETS Technologies—E-rater (Princeton, NJ)
  Vantage Learning—Intellimetric (Yardley, PA)
  TruJudge—Project Essay Grade (PEG) (Purdue)
  Knowledge Analysis Technologies—Intelligent Essay Assessor (Boulder, CO)
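Early automated essay scorers such as PEG were built largely on regressions over surface features of the text (length, average word length, and so on). The toy function below shows that flavor only; the features and weights are invented and do not describe any of these vendors' actual engines.

```python
# Toy surface-feature "essay score"; invented weights, illustration only.
def toy_score(text: str) -> float:
    words = text.split()
    n_words = len(words)
    avg_word_len = sum(len(w) for w in words) / max(n_words, 1)
    n_sentences = max(sum(text.count(c) for c in ".!?"), 1)
    # A real engine fits weights like these to human scores by regression.
    return 0.01 * n_words + 0.3 * avg_word_len + 0.05 * n_sentences

print(round(toy_score("The test was long. It measured many skills."), 2))
```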

Page 24

Advantages

Time
Cost
Objective (or at least impersonal)

Page 25

Issues

Accuracy rates
  PA study—computers vs. humans
    Computer more accurate than one human
    Computer less accurate than two humans
  Bias vs. random error
Beating the system ("Stakes changes everything")
Capacity of contractors to deliver logistics

Page 26

Alternate Testing Modes

Listening
Special education adaptations—see Tindel
Virtual reality