Using Formative Assessment and Self-Regulated Learning
to Help Developmental Mathematics Students Achieve:
A Multi-Campus Program
John Hudesman, Sara Crosby, Niesha Ziehmke, Howard Everson, Sharlene Isaac, Bert Flugman, and Barry Zimmerman
City University of New York
Adam Moylan
Rockman et al.
The authors describe an Enhanced Formative Assessment and Self-Regulated Learning (EFA-SRL) program designed to improve the achievement of community college students enrolled in developmental mathematics courses. Their model includes the use of specially formatted quizzes designed to assess both the students’ mathematics and metacognitive skill levels. When the corrected quizzes are returned, students are required to demonstrate how they used both the mathematics and metacognitive feedback to improve on their errors. Results indicate that program students earned higher mean grades and achieved higher pass rates than students enrolled in baseline classes.
Hudesman, J., Crosby, S., Ziehmke, N., Everson, H., Isaac, S., Flugman, B., Zimmerman, B., & Moylan, A. (2014). Using formative assessment and self-regulated learning to help developmental math students achieve. Journal on Excellence in College Teaching, 25(2), x-x.

More than 40% of high school graduates entering two-year colleges require mathematics remediation, at a cost of between $1.85 billion and $2.35 billion annually (Strong American Schools, 2008). As a result of this overwhelming demand, developmental mathematics courses comprise over half of the mathematics offerings at many of these colleges (see Lutzer, Rodi, Kirkman, and Maxwell, 2007). Even more disturbing is the finding by the Carnegie Foundation (2009) that between 60% and 70% of the students who enroll in developmental mathematics do not successfully complete these required courses. The situation continues to deteriorate when we consider what happens to those students who do pass developmental mathematics. For example, at a large urban comprehensive college, only between 45% and 67% of the students passed an introductory college-level mathematics course after successfully completing developmental mathematics (Cummings, personal communication, May 3, 2011).
The Contribution of Academic Content Formative Assessment
to Student Learning
The impact of formative assessment on learning has been well documented. In a series of landmark review articles, Black and Wiliam (1998a, 1998b, 2009) found that achievement gains generated by using formative assessment across a range of academic content areas were among the largest ever reported for education interventions, with the largest gains realized among low achievers.
Implementing an effective formative assessment program involves having teachers receive feedback from the classroom assessments and then using this information to make changes in their instructional practices. Instructors are also expected to provide feedback to students about how they can improve their own learning. Students, in turn, are expected to use this feedback to make constructive changes in how they learn (Sadler, 1989). In this way the formative assessment process is integrated into classroom instruction as an ongoing process, and can promote mastery learning and curriculum-based measurement (Fuchs, 1995; Zimmerman & DiBenedetto, 2008).
In conceptualizing the formative assessment process, feedback is a key element in how students develop their learning-how-to-learn skills. As such, formative assessment merges with theories of metacognition in general and self-regulation in particular (see, for example, Nicol and Macfarlane-Dick’s 2006 discussion of formative assessment and self-regulated learning).
The SRL Component of Formative Assessment
The majority of formative assessment interventions have emphasized content competency, for example, by providing feedback on incorrectly applied mathematics strategies, such as the steps needed to solve a quadratic equation by factoring. We believe that in order to optimize student learning, it is necessary to include metacognitive, self-regulatory strategies as part of classroom instruction and assessment.
The SRL approach guiding our work is based on models of self-regulated learning developed by Zimmerman (2000, 2002, 2006) and Grant (2003, 2008). It is a psycho-educational model characterized by continuous feedback cycles, with each cycle divided into three main phases. The first is a planning phase, in which students review their past efforts, conduct academic task analyses, choose content and metacognitive strategies that best address their specific learning challenges, set identifiable goals, and make self-efficacy and self-evaluation judgments to assess the accuracy of their level of understanding and content mastery. Next is a practice phase, in which students implement their plans, monitor their progress, and make real-time adjustments to their learning plans. This is followed by an evaluation phase, in which students assess each strategy’s effectiveness based on instructor feedback, for instance, classroom assessments. Students then build on the successful strategies and/or modify or replace less effective ones. The students’ responses from the evaluation phase become the basis for the planning phase of the next SRL cycle. This model has also been described in Hudesman, Millet, Niezgoda, Han, and Flugman (2013) and Hudesman, Crosby, Flugman, Isaac, Everson, and Clay (2013).
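For readers who think in code, the three-phase cycle can be sketched as a small data structure. This is only an illustration of the cycle's logic; the class and method names below (`SRLCycle`, `plan_next`, and so on) are hypothetical and not part of the published model.

```python
from dataclasses import dataclass, field

# Illustrative sketch of one SRL feedback cycle; all names are
# hypothetical -- the published model is conceptual, not computational.

@dataclass
class SRLCycle:
    # Planning phase: goal, chosen strategies, and a self-efficacy judgment
    goal: str
    strategies: list
    self_efficacy: float  # 0.0-1.0 confidence before practicing
    # Practice phase: what the student actually did while working
    observations: list = field(default_factory=list)
    # Evaluation phase: instructor feedback on each strategy
    feedback: dict = field(default_factory=dict)

    def plan_next(self) -> "SRLCycle":
        """Carry strategies judged effective into the next cycle; drop the rest."""
        kept = [s for s in self.strategies if self.feedback.get(s) == "effective"]
        return SRLCycle(goal=self.goal, strategies=kept,
                        self_efficacy=self.self_efficacy)

cycle = SRLCycle(goal="solve quadratics by factoring",
                 strategies=["practice problems", "reread notes"],
                 self_efficacy=0.8)
cycle.feedback = {"practice problems": "effective", "reread notes": "ineffective"}
next_cycle = cycle.plan_next()
print(next_cycle.strategies)  # ['practice problems']
```

The point of the sketch is the last step: the evaluation-phase feedback becomes the input to the next cycle's planning phase, which is what makes the model cyclical rather than a one-shot assessment.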
The SRL intervention derives much of its success from its cyclical nature; each time students complete a cycle, they acquire more feedback and, therefore, come closer to achieving their learning goals. Students begin to understand that learning is directly related to experimenting with different content and metacognitive strategies, a notable shift from the more common notion that achievement is simply a function of innate ability or some other external factor (Zimmerman, 2002).
The power of SRL competence is highlighted in a classic study in social learning theory by Zimmerman and Bandura (1994). They demonstrated that students’ SRL skill levels are more highly correlated with their college grade point average than are their scores on standardized tests such as the SAT. More recently, Dignath and Buettner (2010) reported a meta-analysis of 84 K-12 studies that found overall mean effect sizes of .69 for SRL interventions, with the largest effect sizes in mathematics achievement. In addition, a review of the efficacy of SRL strategies by the National Mathematics Advisory Panel (NMAP) (2008) concluded there was significant evidence to support the further development and research of this approach in mathematics instruction. The panel specifically suggested encouraging further research on the relationship of SRL strategies to a wide range of mathematics knowledge and skill areas.
The Application of SRL-Enhanced Formative Assessment Programs
The program described in this article represents a portion of our ongoing research on formative assessment and self-regulation. The focus of our work is to improve students’ academic performance by developing both their academic and metacognitive competencies. Because the EFA-SRL program is not reliant on any specific course content, it is possible to apply the SRL model in a wide variety of academic coursework, including developmental mathematics and other STEM-related courses (see Blank, Hudesman, Morton, Armstrong, Moylan, White, and Zimmerman, 2007; Hudesman, Zimmerman, and Flugman, 2005, 2010; Zimmerman, Moylan, Hudesman, White, and Flugman, 2011).
The Main Features of the EFA-SRL Program
In keeping with Heritage’s (2010) view that student assessments should be viewed as part of an ongoing instructional process, the EFA-SRL program incorporates mathematics content and metacognitive assessments as an integral part of classroom instruction. It is designed to change how instructors teach and how students learn. As such, the program cycle consists of the following five major components:
1. Instructors administer specially constructed quizzes that assess both the students’ mathematics and SRL competencies.
2. Instructors review and grade the quizzes to provide constructive feedback about both the content and SRL competencies with which students struggle. They also use quiz feedback to adjust their instruction.
3. Students complete a specially constructed self-reflection and mastery learning form for each incorrectly answered quiz question. This process affords students an opportunity to reflect on, and then improve, both the mathematics content and SRL processes that were incorrectly applied.
4. Instructors review the completed self-reflection and mastery learning forms to determine the degree to which students have mastered the appropriate mathematics and SRL skills. Based on the instructor’s evaluation of their work, students can earn up to the total value of the original quiz question. Based on the reflection and mastery learning form data, instructors also have an additional opportunity to make changes to the mathematics content and SRL topics to be covered in upcoming lessons.
5. Instructors use the feedback provided by the students’ quizzes and self-reflection/mastery learning forms as the basis for ongoing class discussions and exercises, during which students discuss the relationship between their mathematics content and SRL skills. The students use these discussions to develop their future plans.
The Research Question
The research question addressed in this article is this: Can an Enhanced Formative Assessment Program featuring SRL improve the mathematics achievement of students enrolled in developmental mathematics courses at two different two-year colleges?
Method
Participants and Design
The EFA-SRL developmental mathematics program was part of a larger three-year grant from the Fund for the Improvement of Post-Secondary Education (FIPSE) (Hudesman et al., 2010), which included almost 3,000 students in 14 different academic disciplines. It was implemented at one high school, two 2-year colleges, and two 4-year colleges in three different states. During the third year of the grant, we were able to study the impact of the EFA-SRL program on students enrolled in developmental mathematics courses at the two participating community colleges. One of the community colleges is located in a major northeastern city, and the majority of the students are either Black or Hispanic. The other community college, which is located in a rural Midwestern city, has a student population that is mostly White. Although somewhat different curricula were used at the different colleges, they all focused on pre-algebra and algebra topics. Furthermore, all classes were taught using a lecture/demonstration format.
This section reports on the work of three instructors at two 2-year colleges who carried out the program with 125 students enrolled in seven sections of developmental mathematics. The EFA-SRL procedure and materials described here have also been described in Zimmerman et al. (2011); Hudesman, Carson, Flugman, Clay, and Isaac (2011a, 2011b); and Hudesman et al. (2013).
Course Materials
Mathematics Quizzes
Students enrolled in the EFA-SRL course sections completed specially formatted quizzes that were administered at least once a week. Each quiz consisted of a maximum of five questions. The procedure for each quiz was as follows. When completing the top portion of the quiz, students were asked to predict their quiz grade and to enter the amount of time they spent preparing for it. Once they started the quiz, the students were asked to read each question, but before answering it, they were asked to make a self-efficacy judgment indicating how confident they were that they could correctly solve the problem. After attempting to solve the problem, students were asked to make a second self-evaluation judgment, indicating how confident they were that they had correctly solved the problem. A sample quiz formatted with the self-efficacy and self-evaluation judgments is illustrated in Appendix A and has also appeared in Zimmerman et al. (2011); Hudesman, Crosby, Flugman, Isaac, Everson, and Clay (2013); and Hudesman, Millet, Niezgoda, Han, and Flugman (2013).
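The metacognitive data collected on each quiz (predicted grade, preparation time, and per-question self-efficacy and self-evaluation judgments) can be pictured as a simple record. The sketch below is an assumption about how one might store and summarize such data; the field names and scales are illustrative, not the program's actual forms.

```python
from dataclasses import dataclass

# Hypothetical record of the metacognitive data gathered on each
# EFA-SRL quiz; names and scales are illustrative assumptions.

@dataclass
class QuizQuestion:
    self_efficacy: int    # confidence (0-100) BEFORE attempting the problem
    self_evaluation: int  # confidence (0-100) AFTER attempting the problem
    score: float          # points earned, as a fraction of the question's value

@dataclass
class QuizRecord:
    predicted_grade: float  # student's predicted quiz grade (0-100)
    prep_minutes: int       # self-reported preparation time
    questions: list

    def actual_grade(self) -> float:
        """Mean question score, scaled to a 0-100 quiz grade."""
        return 100 * sum(q.score for q in self.questions) / len(self.questions)

    def calibration_gap(self) -> float:
        """Positive when the student over-predicted (an overconfidence signal)."""
        return self.predicted_grade - self.actual_grade()

quiz = QuizRecord(predicted_grade=90, prep_minutes=20,
                  questions=[QuizQuestion(80, 90, 1.0),
                             QuizQuestion(70, 60, 0.5),
                             QuizQuestion(90, 40, 0.0)])
print(quiz.actual_grade())     # 50.0
print(quiz.calibration_gap())  # 40.0 -> large overconfidence gap
```

A large positive calibration gap is exactly the "what they think they know" versus "what they don't know" mismatch the next paragraph describes.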
Instructors used the students’ quiz information to provide feedback to students and to make changes in their instruction. The quiz data also allowed instructors to study the relationship between the students’ mathematics and metacognitive skill levels. This information is significant, because struggling students frequently make more optimistic predictions about their knowledge than are warranted by their actual quiz scores, indicating that they often do not recognize the difference between “what they think they know” and “what they don’t know” (Tobias & Everson, 2002). As a result of this false belief, these students do not feel any need to remedy the situation by changing their “learning how to learn” behaviors. Without proper feedback, students may continue a destructive cycle of poor planning and disappointing academic outcomes. Being able to provide students with ongoing feedback about the relationship between their actual performances (their quiz scores) and their metacognitive judgments (their predicted scores, and the relationship between their preparation time and their self-efficacy and self-evaluation judgments) is critical to improving students’ mathematics and SRL skill sets.
The SRL Mathematics Self-Reflection and Mastery Learning Form
For each incorrectly answered quiz question, students were expected to complete a separate self-reflection and mastery learning form. This form was designed to further assist students in assessing the relationship between their content knowledge and their ability to use critical SRL tools. In the first section of this form, students were asked to compare their predicted and actual quiz scores, explain any significant discrepancy, and evaluate the accuracy of their academic confidence judgments, that is, their self-efficacy and self-evaluation judgments. They were then required to discuss the relationship of these judgments to their actual quiz score. Based on the instructor’s written feedback and/or prior class discussions, students needed to indicate which of the mathematics strategies they incorrectly applied on the quiz. Implicit in scoring this section of the reflection/mastery learning form is that we are looking for indications that students are acting more strategically. A constructive response would indicate that students understand how their actions were directly related to their quiz score, for example, that their preparation time was inadequate. Furthermore, these students would need to indicate how they will remedy this situation in preparing for the next quiz, for example, by working on more practice problems or meeting with the tutor a specific number of times. By contrast, a less effective student response would be much more general and not include any significant description of appropriate metacognitive strategies, for instance, students who respond that they will “try harder” the next time.
In the second section of the EFA-SRL Reflection and Mastery Learning Form, students again had to solve the original mathematics problem and include a written description of the specific mathematics strategies and procedures involved in their work. Students were also required to use these same mathematics strategies to solve a similar problem. A sample EFA-SRL Reflection and Mastery Learning Form is shown in Appendix B and has also appeared in Zimmerman et al. (2011); Hudesman, Crosby, Flugman, Isaac, Everson, and Clay (2013); and Hudesman, Millet, Niezgoda, Han, and Flugman (2013).
Scoring the Self-Reflection and Mastery Learning Form
The EFA-SRL Reflection and Mastery Learning Form is based on a mastery learning approach in which students are given multiple opportunities to use feedback in order to improve their performance. By completing the form, students had an opportunity to demonstrate how well they could constructively use feedback to master both the mathematics and metacognitive competencies necessary to solve the problem. Students who demonstrated complete mastery on this reflection form could earn up to 100% of the original credit for a problem. Instructors again used the information from the self-reflection and mastery learning form to plan lessons that demonstrated the relationship between mathematics content and EFA-SRL competencies.
Classroom Discussions and Exercises
The quiz/self-reflection and mastery process is considered to be a major classroom priority. Instructors are expected to hold ongoing class discussions that focus on the relationship between how students effectively learn mathematics content and how they enhance their self-regulation skills. One example of such an activity involves having students create individual graphs illustrating the relationship between their SRL judgments and their quiz scores. In another exercise, instructors might ask students to compare the time they spent preparing for the quiz with their quiz grades. The students’ responses are then listed on the board. The results often reveal an obvious correlation between the students’ preparation time and their quiz scores. Students are then asked to use the feedback from this exercise to design a revised plan for improving their work.
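The prep-time exercise described above amounts to computing a correlation. A minimal sketch of that calculation, with invented numbers standing in for the students' self-reports:

```python
import math

# The board exercise as a calculation: does preparation time track quiz
# scores? The data below are invented for illustration only; in class
# the numbers come from the students themselves.

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from the definition."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

prep_minutes = [10, 15, 30, 45, 60, 90]  # hypothetical self-reports
quiz_scores = [55, 50, 70, 75, 85, 95]   # hypothetical quiz grades

r = pearson_r(prep_minutes, quiz_scores)
print(round(r, 2))  # 0.97 -- a strong positive relationship
```

Seeing a number like this attached to their own data is what prompts students to revise their preparation plans.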
Instructor Selection, Training, and Support
EFA-SRL instructors were chosen from among a group of faculty who volunteered to participate. Instructors initially attended a two-day on-site training program that involved discussions on the theory and practice of EFA-SRL. Throughout the semester, instructors provided ongoing electronic logs detailing their classroom experiences. A sample log, which is included in Appendix D, was completed over a three-week period by one of the most experienced SRL instructors and illustrates how she applied the SRL model to both her students and her instruction. The log form has four main sections:
1. What was your goal for your students and/or yourself this period?
2. What were the mathematics content and metacognitive strategies your students and/or you were planning to use to achieve this goal?
3. What actually happened?
4. How will the results of this cycle inform your instruction in the future? What worked that you will continue? What adjustments do you need to make?
Program staff responded to these logs by e-mail, telephone, and/or video conferencing sessions. In addition, classroom observations were conducted by either visiting program staff or on-site EFA-SRL coordinators, who were trained by program staff. These observations were recorded using the SRL Observation form (see Appendix C). The observers and instructors met after the class to review the observations, with the goal of assisting instructors to further improve their implementation of the program. These meetings also provided an important venue for gathering information on how to develop the program further.
Performance Measures for the Developmental Mathematics Course
Grade Distributions
During the third year of the grant, we were able to obtain baseline and post-intervention academic outcome data for three developmental mathematics instructors at two community colleges. The baseline data included grade distributions and pass rates from the semester before the instructor began using the EFA-SRL program. The grade distribution is based on a 13-point scale that ranges from 0 (for a grade of F) to 12 (for a grade of A+). These pre-program grade distributions were compared with grade distributions from students enrolled in the same courses taught by the same instructors after the implementation of the EFA-SRL program. Based on these grade distributions, we were able to compare the mean grades and pass rates earned by students before and after the implementation of the program.
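The text specifies only the endpoints of the 13-point scale (0 = F, 12 = A+). The sketch below shows how mean grades on such a scale are computed; the intermediate letter grades are an assumed standard thirteen-step ladder, not taken from the article.

```python
# Assumed 13-point grade scale. Only the endpoints (0 = F, 12 = A+)
# come from the article; the intermediate letters are a conventional
# thirteen-step ladder, shown only to make the scale concrete.

GRADE_SCALE = {
    "F": 0, "D-": 1, "D": 2, "D+": 3,
    "C-": 4, "C": 5, "C+": 6,
    "B-": 7, "B": 8, "B+": 9,
    "A-": 10, "A": 11, "A+": 12,
}

def mean_grade(letter_grades):
    """Mean grade on the 13-point scale, as in the pre/post comparison."""
    return sum(GRADE_SCALE[g] for g in letter_grades) / len(letter_grades)

print(mean_grade(["C", "B-", "D+", "C-", "B"]))  # 5.4 for this made-up class
```

On this scale, the reported baseline and program means of 4.1 and 5.2 sit roughly in the C- to C range.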
Instructor Satisfaction Survey
A brief three-question survey was conducted six months after the end of program funding. Instructors were asked if they continued to use the EFA-SRL program in their classrooms “not at all,” “somewhat,” or “a lot.”
Results
The data in Table 1 indicate that student achievement in developmental mathematics courses improved after the introduction of the EFA-SRL program. The percentage of students passing the course increased from 63.5% to 79.2% (χ² = 6.95; p = .008). Similarly, the mean grade for students in the EFA-SRL program courses was 5.2, compared to a mean grade of 4.1 for students enrolled in baseline developmental mathematics courses (F = 4.84; p = .029). As part of the overall FIPSE program evaluation, we also surveyed all of the instructors, regardless of their academic department, six months after funding for the program ended. As mentioned, all instructors were asked if they continued to use the EFA-SRL in their classrooms “not at all,” “somewhat,” or “a lot.” Only one instructor (not a mathematics instructor) out of the total group of 38 instructors was no longer using the program. About half of the instructors indicated that they continued to use EFA-SRL in their classrooms “somewhat,” and the rest reported that they were still using the program “a lot.” These findings support the portability of the EFA-SRL model to other colleges.
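The pass-rate comparison is a 2×2 chi-square test. The sketch below shows the computation with hypothetical cell counts (the article reports percentages and the program-group size of 125, not the full table), so the resulting statistic is illustrative and does not reproduce the published χ² = 6.95.

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square for a 2x2 table [[a, b], [c, d]], no continuity correction."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical counts: 99 of 125 program students passing (79.2%), and
# an ASSUMED baseline group of 126 students with 80 passing (63.5%).
# The article does not report the baseline n, so the statistic below
# will not match the published value of 6.95.
program_pass, program_fail = 99, 26
baseline_pass, baseline_fail = 80, 46

chi2 = chi_square_2x2(program_pass, program_fail, baseline_pass, baseline_fail)
print(round(chi2, 2))  # ~7.57 with these assumed counts
```

With 1 degree of freedom, any statistic above 3.84 is significant at the .05 level, which is the threshold the reported pass-rate difference clears.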
Discussion
This article describes the implementation of an EFA-SRL program in developmental mathematics at two 2-year colleges. Program students, some of whom were enrolled in classes that were 500 miles away, were more successful in their developmental mathematics coursework, as indicated by improved grades and pass rates. These findings are in keeping with those reported by Hudesman, Crosby, Flugman, Isaac, Everson, and Clay (2013) and Zimmerman et al. (2011). The authors found that students enrolled in EFA-SRL sections of developmental mathematics achieved higher pass rates in the course and higher pass rates on the COMPASS mathematics placement test (ACT, 1997, 2006). There was also some evidence that these students were able to carry over the skills they learned in the EFA-SRL class to a subsequent credit-level mathematics course.
Instructors also seemed pleased with their participation, as indicated by their continued use of the program six months after the conclusion of grant funding. It should also be noted that the positive academic outcomes reported in this paper are in keeping with the findings reported in several other studies carried out by the project team, for example, Zimmerman et al. (2011) and Blank et al. (2007).
Limitations and Directions for Future Research
Because the program staff had limited access to the two community college campuses, there were concerns about the fidelity of the program’s implementation. While faculty at both sites received their initial on-site training from program staff, most of the follow-up monitoring was done via electronic logs and video conferencing, with “surrogate” observers conducting the classroom observations and follow-up meetings.
As previously mentioned, the EFA-SRL program was implemented in 14 different academic areas at five different high school and college campuses. The developmental mathematics courses implemented at the two 2-year campuses accounted for only three of the instructor participants. Because of administrative and financial constraints, our pre/post evaluation design was limited to this (developmental mathematics) instructor group. Because no similar evaluation was carried out for other academic disciplines and campuses, we have no way of knowing if our results would carry over to these areas.
Another concern is that the EFA-SRL program is labor intensive for both students and instructors. It takes time for students to complete the various requirements of the program. This raises a question as to whether the students’ improvement is due to the additional time-on-task or to an increase in the students’ metacognitive skill level. Our belief is that program students were expected to take on more responsibility for their learning, which, in turn, would result in increased time-on-task.
The EFA-SRL program also expects that instructors will put in the extra time needed to administer more quizzes, provide ongoing feedback regarding mathematics content and SRL strategies, review self-reflection/mastery learning forms, and conduct classroom discussions relating SRL to the students’ learning and achievement. All of this must be included within the already crowded required mathematics curriculum. Under the circumstances, it is not surprising that some instructors expressed an initial concern about completing the semester’s work. However, these concerns usually disappeared by the end of the semester. In fact, we are unaware of any mathematics instructor who was unable to complete the course curriculum. In order to address this concern and make the program delivery more efficient, we have begun to explore ways to transition from a paper-and-pencil delivery system to a computerized system (see Hudesman et al., 2011a, 2011b).
Despite the program limitations discussed here, we believe that these results, when taken together with other similar work, suggest that instructors can consider the EFA-SRL program to be part of their “tool box” of approaches to be combined with other content-specific approaches and instructional methodologies.
References
ACT. (1997). Computer-adaptive placement assessment and support system in mathematics (COMPASS). American College Testing manual. Iowa City, IA: ACT.
ACT. (2006). COMPASS reference manual. Iowa City, IA: ACT.
Black, P., & Wiliam, D. (1998a). Assessment and classroom learning. Assessment in Education: Principles, Policy & Practice, 5, 7-71.
Black, P., & Wiliam, D. (1998b). Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan, 80, 139-148.
Black, P., & Wiliam, D. (2009). Developing the theory of formative assessment. Educational Assessment, Evaluation and Accountability, 21(1), 5-31.
Blank, S., Hudesman, J., Morton, E., Armstrong, R., Moylan, A., White, N., & Zimmerman, B. J. (2007, October). A self-regulated learning assessment system for electromechanical engineering technology students. In Proceedings of the National STEM Assessment Conference (pp. 37-45). Washington, DC: Drury University and the National Science Foundation.
The Carnegie Foundation for the Advancement of Teaching. (2009). Developmental math. Retrieved from http://www.carnegiefoundation.org/problem-solving/developmental-math
Dignath, C., & Buettner, G. (2010). Components of fostering self-regulated learning among students: A meta-analysis on intervention at primary and secondary school level. Metacognition and Learning, 3, 231-264.
Fuchs, L. S. (1995). Connecting performance assessment to instruction: A comparison of behavioral assessment, mastery learning, curriculum-based measurement, and performance assessment (ERIC Digest E530, ED 381984).
Grant, A. M. (2003). The impact of life coaching on goal attainment, metacognition, and mental health. Social Behavior and Personality, 31(3), 253-264.
Grant, A. M. (2008). Personal life coaching for coaches-in-training enhances goal attainment, insight, and learning. Coaching: An International Journal of Theory, Research, and Practice, 1(1), 54-70.
Heritage, M. (2010). Formative assessment and next-generation assessment systems: Are we losing an opportunity? (Report prepared for the Council of Chief State School Officers). Los Angeles, CA: National Center for Research on Evaluation, Standards, and Student Testing, University of California, Los Angeles.
Hudesman, J., Carson, M., Flugman, B., Clay, D., & Isaac, S. (2011a, July). The computerization of the self-regulated learning assessment system: A demonstration program in developmental mathematics. Paper presented at the 9th annual meeting of the Education and Information Systems, Technologies, and Applications (EISTA) Conference, Orlando, FL.
Hudesman, J., Carson, M., Flugman, B., Clay, D., & Isaac, S. (2011b). The computerization of the self-regulated learning assessment system: A demonstration program in developmental mathematics. The International Journal of Research and Review, 6, 1-18.
Hudesman, J., Crosby, S., Flugman, B., Isaac, S., Everson, H., & Clay, D. (2013). Using an enhanced formative assessment program and self-regulated learning to improve student achievement. Journal of Developmental Education, 13, 2-13.
Hudesman, J., Millet, N., Niezgoda, G., Han, H., & Flugman, B. (2013). The use of self-regulated learning, formative assessment, and mastery learning to assist students enrolled in developmental mathematics: A demonstration project. The International Journal of Research and Review, 10, 1-17.
Hudesman, J., Zimmerman, B., & Flugman, B. (2005). A comprehensive cognitive skills academy for associate degree students (FIPSE No. P116B010127). New York, NY: City University of New York.
Hudesman, J., Zimmerman, B., & Flugman, B. (2010). The replication and dissemination of the self-regulated learning model to improve student performance in high schools, two-year, and four-year colleges (FIPSE No. P116B060012). New York, NY: City University of New York.
Lutzer, D. J., Rodi, S. B., Kirkman, E. E., & Maxwell, J. W. (2007). Statistical abstract of undergraduate programs in the mathematical sciences in the United States. Providence, RI: American Mathematical Society. Retrieved from http://www.ams.org/cbms/cbms2005.html
National Mathematics Advisory Panel. (2008). Foundations for success: The final report of the National Mathematics Advisory Panel. Washington, DC: U.S. Department of Education.
Nicol, D. J., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31, 199-218.
Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18, 119-140.
Strong American Schools. (2008). Diploma to nowhere. Retrieved from http://www.strongamericanschools.org/files/SAS_Diploma_To_Nowhere_v11_FINAL.pdf
Tobias, S., & Everson, H. T. (2002). Knowing what you know and what you don't know (College Board Report, 2002-04). New York, NY: College Board.
Zimmerman, B. J. (1990). Self-regulating academic learning and achievement: The emergence of a social cognitive perspective. Educational Psychology Review, 2, 173-201.
Zimmerman, B. J. (2000). Attaining self-regulation: A social cognitive perspective. In M. Boekaerts, P. Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation (pp. 13-39). San Diego, CA: Academic Press.
Zimmerman, B. J. (2002). Achieving self-regulation: The trial and triumph of adolescence. In F. Pajares & T. Urdan (Eds.), Academic motivation of adolescents (pp. 1-27). Greenwich, CT: Information Age.
Zimmerman, B. J. (2006). Integrating classical theories of self-regulated learning: A cyclical phase approach to vocational education. In D. Euler, G. Patzold, & M. Lang (Eds.), Self-regulated learning in vocational education (pp. 7-48). Stuttgart, Germany: Franz Steiner Verlag.
Zimmerman, B. J., & Bandura, A. (1994). Impact of self-regulatory influences on writing course attainment. American Educational Research Journal, 31, 845-862.
Zimmerman, B. J., & DiBenedetto, M. K. (2008). Mastery learning and assessment: Implications for students and teachers in an era of high-stakes testing. Psychology in the Schools, 45(3), 206-216.
Zimmerman, B. J., Moylan, A. R., Hudesman, J., White, N., & Flugman, B. (2011). Enhancing self-reflection and mathematics achievement of at-risk urban technical college students. Psychological Test and Assessment Modeling, 53(1), 108-127.
Author Note
This work was supported by a grant from the Fund for the Improvement of Postsecondary Education (P116B060012). Correspondence concerning this article should be addressed to [email protected].
John Hudesman is a senior principal investigator at the Center for Advanced Study in Education at the CUNY Graduate School and an adjunct professor at the New York City College of Technology, CUNY. Sara Crosby is the director of First Year Programs at Brooklyn College, CUNY. Niesha Ziehmke is a first-year experience consultant. Howard Everson is the director of the Center for Advanced Study in Education at the CUNY Graduate School. Sharlene Isaac is an SRL staff researcher and administrator at New York City College of Technology, CUNY. Bert Flugman is a senior research fellow at the Center for Advanced Study in Education at the CUNY Graduate School. Barry Zimmerman is a distinguished professor emeritus of educational psychology at the CUNY Graduate School. Adam Moylan is a senior researcher at Rockman et al.