ICE Evaluations: Some Suggestions for Improvement
Transcript
Page 1

ICE Evaluations

Some Suggestions for Improvement

Page 2

Outline

• Background information and assumptions

• Content of evaluation forms

• Logistical problems with processing ICE information

Page 3

Background Information

• Exchange of e-mails by professors last summer

• Arts and Sciences “Task Team” currently looking at various ways of evaluating teaching

• My points here are mostly compatible with both

Page 4

Background Assumptions

• Student Evaluations will continue to be used

• They will be used for two purposes:

– Instructors’ own improvement of courses and teaching

– Assessment of teachers by administrators

• We should make ICE evaluations as effective as possible for both purposes

Page 5

Suggestions About Content of ICE Forms

(go to evaluations file)

Page 6

Remove the “One Number” Overall Average At Bottom of Page

• It gives less information, not more

• It is all people will look at if it’s available

– Administrators assessing teachers

– Teachers planning future courses

• Not all the categories have to do with the instructor, so it’s unfair to assign these ratings to the instructor

• FAS Task Team unanimously agreed

Page 7

Keep the text of individual questions

• In some formats of ICE reports, the questions are missing

• This encourages looking only at numbers

• So include the actual questions

Page 8

Why Not Also Get Rid of the “Category” Average Numbers?

• All the same reasons apply

• But if this is too much, then really, please, please get rid of the “one number” average

Page 9

Some Specific Questions on ICE Form Need Revision

• #20 “The Material was not too difficult” means that the highest rating is for material that is far too easy

• Combine questions 18-20 into a single question: “The difficulty and pace of the course were appropriate”

Page 10

• Question #10 “Demonstrated Favorable Attitude toward students”

• Task Team recommendation: change to “treated students with proper respect”

• Reason: the old wording favors teachers who are lenient about, for example, plagiarism, arriving to class late, talking during class…

Page 11

Other questions to revise

• #7 “Was readily available for consultation outside of class”

• Question #12 “Evaluated Work Fairly”

Page 12

Too Many Questions

• Researchers seem to agree with the common-sense idea that too many questions on an evaluation form lead students to give up

• Some ICE questions seem repetitive or unnecessary

Page 13

How to include fewer questions

• Again, combine Questions 18-20 into a single question: “The difficulty and pace of the course were appropriate”

• Drop Questions #15 and #16 about stating and covering the objectives of the course, since #17 “Course organization was logical and adequate” covers these

Page 14

“Additional Items” on ICE form

• After the university-wide questions, a section of “additional items” is included

• Currently, each faculty (FAS, Engineering, etc.) can choose from an “item bank” of approved questions

• Instead, each department should choose any questions it wants, whether from the item bank or not

Page 15

Why let Departments Choose?

• Departments are in the best position to design questions that are appropriate for their discipline

• For example, why think that the same questions would be appropriate to a chemistry course, an education course, and an English literature course?

• Too much bureaucratic regulation is not beneficial to a university

Page 16

Logistical Problems with Processing ICE Information

Page 17

Course evaluations are often “lost” or assigned to the wrong course

• Instructors have students fill out evaluation forms, but then no ICE report appears for that course

• This has happened at least five times in the philosophy department in three years

• Other professors reported the same problem in last summer’s e-mail exchange

Page 18

The cause?

• If students fill in the wrong section number, or department number, or course number, then the evaluations are all automatically assigned to the wrong course (or to no course)

Page 19

The Solution

• Is not to assign blame (as in “Well, this is the department’s fault, because the graduate assistant who gave the evaluations must have told students the wrong numbers”)

• But instead is to try to redesign the system so that this mistake (which is easy to make) does not corrupt the data

Page 20

The Solution (part II)

• A simple but less effective solution: tell all instructors to give the course information to students themselves, e.g. by writing it on the board (this at least makes instructors responsible)

• A (slightly) more difficult but more effective solution: have some kind of “cover sheet” for each course, which the computer will read. If the individual ICE forms disagree with the information on the cover sheet, automatically assign them to the correct course (a rough sketch follows below)
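As a rough sketch of how such a cover-sheet check might work, assuming the scanner produces one record per student form with whatever department, course, and section numbers the student bubbled in (all names and structures below are hypothetical illustrations, not a description of the actual OIRA system):

# Hypothetical sketch; field and function names are invented for illustration.
from dataclasses import dataclass

@dataclass
class FormRecord:
    dept: str       # department number as bubbled in by the student
    course: str     # course number as bubbled in by the student
    section: str    # section number as bubbled in by the student
    ratings: dict   # question number -> rating

def assign_to_course(cover_sheet, forms):
    """Force every form scanned in one batch onto the course identified on the
    batch's cover sheet, regardless of what the student bubbled in."""
    for form in forms:
        if (form.dept, form.course, form.section) != cover_sheet:
            # The student miscoded the course; trust the cover sheet instead.
            form.dept, form.course, form.section = cover_sheet
    return forms

# Example: one student wrote section 2 instead of 1. With the cover-sheet rule,
# that form still counts toward the course that actually administered it,
# instead of becoming a "phantom" evaluation in some other section.
cover = ("PHIL", "201", "1")
batch = [FormRecord("PHIL", "201", "1", {}), FormRecord("PHIL", "201", "2", {})]
batch = assign_to_course(cover, batch)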

Page 21

A More Widespread Problem

• When the evaluations for a course are mysteriously absent, sometimes evaluations from one or two (or more) students appear anyway

• Or, when a teacher doesn’t administer evaluations, she still gets results from one or two students anyway

• And probably this “phantom evaluation” process occurs, undetected, in MOST courses

Page 22

Cause of Phantom Evaluations

• It’s the same cause as for the missing evaluations for a whole course

• If one or two (or more) students write the wrong course numbers, their evaluations will be assigned to the wrong course (even if all the rest of the student forms go to the right course)

• This probably happens VERY OFTEN

• So it’s all the more reason to fix the problem

Page 23

How to Avoid “Phantom Evaluation” Problem

• The same way as avoiding the larger-scale assignment of evaluations to wrong courses

• Have some kind of “cover sheet” for each course, which the computer will read. If the individual ICE forms disagree with the information on the cover sheet, automatically assign them to the correct course

Page 24

Another Logistical Problem

• The ICE report includes a “response rate” indicating the percentage of enrolled students who fill out an evaluation form

• But for at least two of the last four semesters, these figures are inaccurate

Page 25

Why is the “Response Rate” Often Inaccurate?

• The response rate is, of course, meant to be an indication of the percentage of students enrolled in the course who actually fill out the ICE form

• But the total number of “enrolled students” is not accurate

– The AUBsis site in fall 2003-2004 and fall 2004-2005 gave a total number of enrolled students at the BEGINNING of the term, not at the end

• So any students who dropped the class were still included in the “enrolled students” total

• So suppose 25 students were enrolled at the beginning of the term, but 5 dropped. And suppose 15 students filled out the ICE form. The official “response rate” would be 60%. But the real response rate, of students still enrolled, would be 75%.
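A rough illustration of this arithmetic (the function below is only for illustration; it is not OIRA's actual computation):

def response_rate(forms_submitted, students_enrolled):
    """Percentage of enrolled students who handed in an ICE form."""
    return 100.0 * forms_submitted / students_enrolled

# Beginning-of-term enrollment (25) understates the rate:
print(response_rate(15, 25))      # 60.0
# End-of-term enrollment (25 - 5 who dropped = 20) gives the real rate:
print(response_rate(15, 25 - 5))  # 75.0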

Page 26

Solution to the “response rate” problem

• If OIRA uses the AUBsis information for this, OIRA and the registrar should coordinate on how the data will be used, so that the “enrolled students” number reflects the number of students enrolled at the end of the term, not the beginning

Page 27

OIRA office responses to faculty

Page 28

OIRA Has Not Responded to Faculty Correspondence About Problems

• A delicate issue

• Numerous examples

• Why it matters

• Solution? I admit I don’t know. Maybe a full-time office manager?

Page 29

One final issue: Use of ICE Reports

• Literature on evaluations often mentions proper use by administrators

• A quick glance is worse than no information at all

• Items to focus on: percentage of students responding; type of course (graduate vs. undergrad, introductory vs. advanced); particular questions; and distribution of answers (are one or two terrible ratings dragging the average down? a toy example follows below)

• NOT ONE NUMBER
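As a toy illustration of the point about distributions (the ratings below are invented, not real ICE data): suppose eight students rate a course 5 and two rate it 1. The single average hides the fact that nearly everyone was very satisfied.

# Invented ratings, for illustration only
ratings = [5, 5, 5, 5, 5, 5, 5, 5, 1, 1]

mean = sum(ratings) / len(ratings)
distribution = {r: ratings.count(r) for r in sorted(set(ratings))}

print(mean)          # 4.2  (the "one number" view)
print(distribution)  # {1: 2, 5: 8}  (the distribution view)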