Presented by Kelley Engle, HACC York, April 19, 2011
A SEEQ analysis can:
› Allow for self-reflection on teaching methods
› Be used for hiring, tenure review, and promotions
› Show evidence of teaching effectiveness
› Become an important part of a teaching portfolio
A collection of materials that document teaching performance
Presents a professor’s most significant teaching accomplishments
It is not an exhaustive compilation
All claims made in the portfolio should be supported by empirical evidence.
Examples of teaching e-portfolios:
› Kelley Engle - Work in Progress: http://userpages.umbc.edu/~engke1
› Sample Teaching Portfolios: http://sunconference.utep.edu/CETaL/resources/portfolios/samples.htm#business
Texts:
› Seldin, P., & Miller, J. E. (2008). The academic portfolio: A practical guide to documenting teaching, research, and service. San Francisco, CA: Jossey-Bass.
› Seldin, P. (2004). The teaching portfolio: A practical guide to improved performance and promotion/tenure decisions. San Francisco, CA: Jossey-Bass.
Online:
› Mues, F., & Sorcinelli, M. D. (2000). Preparing a teaching portfolio. Amherst, MA: The Center for Teaching, University of Massachusetts. http://www.umass.edu/cft/publications/teachingportfolio.pdf
› Vanderbilt University. (2011). Teaching portfolios. http://cft.vanderbilt.edu/teaching-guides/reflecting/teaching-portfolios/
Collect data from the current (and past) semesters
Using Excel, create averages for groups based on sub-categories (a scripted alternative is sketched after this list)
Create line graphs for each sub-category
Self-reflect on teaching practices:
› What worked this semester and what did not?
› Compare with actual SEEQ data
› Brainstorm possible modifications
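Where Excel feels tedious, the averaging and charting can also be scripted. A minimal Python sketch, assuming hypothetical CSV exports (fall2009.csv, etc.) with one row per survey item and columns named subcategory and score; all file and column names here are illustrative:

    import pandas as pd
    import matplotlib.pyplot as plt

    # Hypothetical per-semester SEEQ exports: one row per survey item, with a
    # "subcategory" column (Learning, Enthusiasm, ...) and a numeric "score".
    semesters = ["fall2009.csv", "spring2010.csv", "fall2010.csv"]

    frames = []
    for path in semesters:
        df = pd.read_csv(path)
        means = df.groupby("subcategory")["score"].mean()  # average per sub-category
        means.name = path.replace(".csv", "")              # label column by semester
        frames.append(means)

    # Rows: SEEQ sub-categories; columns: semesters.
    summary = pd.concat(frames, axis=1)

    # One line graph per sub-category, tracking the average across semesters.
    for subcategory, row in summary.iterrows():
        plt.figure()
        row.plot(marker="o", title=str(subcategory))
        plt.ylabel("Mean SEEQ score")
        plt.savefig(f"{subcategory}.png")
        plt.close()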
To facilitate this analysis, it is necessary to “compress” the data
An average is computed for each of the sub-categories listed on the SEEQ:
1. Learning
2. Enthusiasm
3. Organization
4. Group Interaction
5. Individual Rapport
6. Breadth
7. Examinations
8. Assignments
9. Workload Difficulty
10. Student and Course Characteristics
11. Overall
Need to decide what scores to include for comparison:
› College Wide
› York All
› Division (i.e., TECH All)
› Discipline (i.e., CIS All)
› Specific course (i.e., CIS105)
For my analysis, I have chosen to include College Wide and York All
There are two perspectives from which to analyze this chart qualitatively:
› Longitudinally, looking at my own scores over time
› Comparatively, against the baseline scores for College Wide and York All
Looking at the scores over time, perceived learning has decreased. Why?
Possible reasons:
› Format and delivery changes to the course
› An anomaly based on class attendance on the day of SEEQ administration
Looking at the scores in comparison to the College Wide and York All averages, perceived learning is lower
This is more difficult to analyze without feedback from other professors (particularly those in the same discipline)
In my average for Fall 2009, the M/W class gave low scores across the board, which lowered the average
Looking at the attendance for that day, the students present were mostly A and high-B students
What does this tell me?
After looking at the M/W class in more detail, I discovered the following:
1. The students who completed the SEEQs were expecting the highest grades (mostly A's)
2. The students present on the day the SEEQs were administered were mainly A/high-B students
3. They noted a marked dissatisfaction with the pace of the course, stating that it was too slow
My assessment of these evaluations is that A/high-B students are frustrated with the course either because it is not challenging enough or because the pace is too slow.
How to best accommodate these students?
An inordinate amount of time is spent during class “hand-holding” students with no or limited PC skills
The majority of these “hand-held” students end up dropping and/or failing the class
This has been a recurring issue each semester
A pre-requisite of CIS100 would alleviate this problem:
› Would require a placement test to allow students to test out
› A pre-requisite needs to occur at the college-wide level
For this spring semester, I have developed a two-pronged remediation for this problem (a sketch of the heuristic follows this list):
1. Develop and administer a CIS105 Readiness Questionnaire at the beginning of the semester
2. Compare each student's initial CIS105 Readiness score with their final grade in the class; this will help in adjusting the heuristic for advising students in future semesters
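A minimal sketch of applying such a heuristic. The five-"No" threshold comes from these slides; the answer format, function names, and recommendation text are hypothetical:

    from typing import List

    NO_THRESHOLD = 5  # five or more "No" answers trips the recommendation

    def readiness_score(answers: List[str]) -> int:
        """Count the 'No' answers on the CIS105 Readiness Questionnaire."""
        return sum(1 for a in answers if a.strip().lower() == "no")

    def recommend(answers: List[str]) -> str:
        """Apply the advising heuristic to one student's responses."""
        if readiness_score(answers) >= NO_THRESHOLD:
            return "Consider CIS100 (or other remediation) before CIS105"
        return "Ready for CIS105"

    # Example: five "No" answers trips the heuristic.
    print(recommend(["No", "No", "Yes", "No", "No", "Yes", "No"]))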
The questionnaire was administered for Spring 2011
None of the students surveyed met the heuristic of five "No" answers
May need to remove the heuristic from the questionnaire in order to reduce bias
May need to lower the threshold for recommending dropping the course (one way to tune it against final grades is sketched below)
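One way to tune that threshold, as a minimal sketch: assume a hypothetical file readiness.csv pairing each student's "No" count with a final numeric grade (both column names are illustrative):

    import pandas as pd

    # Hypothetical columns: "no_count" (readiness score) and "final_grade" (0-100).
    df = pd.read_csv("readiness.csv")

    # A strongly negative correlation would support using a "No"-count cutoff at all.
    print("correlation:", df["no_count"].corr(df["final_grade"]))

    # Mean final grade at each "No" count, to eyeball where grades drop off
    # and set the threshold accordingly.
    print(df.groupby("no_count")["final_grade"].mean())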
The final results of this analysis need to be compiled and organized
This could be done through the use of an e-portfolio.
For more information on e-portfolios, see the resources listed earlier.