ED 289 939    UD 025 951

DOCUMENT RESUME

TITLE Connecticut Education Evaluation and Remedial Assistance. Grade 8 Mastery Test Results: Summary and Interpretations 1986-87.
INSTITUTION Connecticut State Dept. of Education, Hartford.
PUB DATE 87
NOTE 140p.; For other Mastery Test results, see UD 025 949-950.
PUB TYPE Reports - Evaluative/Feasibility (142)
EDRS PRICE MF01/PC06 Plus Postage.
DESCRIPTORS *Academic Achievement; *Academic Standards; Behavioral Objectives; *Grade 8; Junior High Schools; Language Arts; *Mastery Tests; Mathematics; Scoring; *Test Construction; Writing Instruction
IDENTIFIERS *Connecticut

ABSTRACT
The central aspect of Connecticut's agenda for educational equity and excellence is the implementation of statewide mastery testing in mathematics and language arts. The program, designed for grades four, six, and eight, assesses the skill levels of students by measuring their performance on learning objectives they should have mastered in lower grades. Student performance also indicates the effectiveness of remedial assistance programs and regular instruction. This report summarizes the development and implementation of the Grade Eight Mastery Test. These four steps in the program are discussed: (1) mastery test development; (2) setting mastery standards by objective; (3) test administration and scoring; and (4) school district test results reporting. Statewide mastery test results are given for Fall 1986. Four charts show the percentage of students who achieved mastery for each test objective. The learning objectives, sample score report, and information about the school districts are presented in 11 appendices. (VM)

Reproductions supplied by EDRS are the best that can be made from the original document.
GRADE 8 MASTERY TEST RESULTS
SUMMARY AND INTERPRETATIONS
1986-87
CONNECTICUT
U.S. DEPARTMENT OF EDUCATION
Office of Educational Research and Improvement
EDUCATIONAL RESOURCES INFORMATION CENTER (ERIC)
This document has been reproduced as received from the person or organization originating it.
Minor changes have been made to improve reproduction quality.
Points of view or opinions stated in this document do not necessarily represent official OERI position or policy.
"PEPMISSION TO REPRODUCE THISMATERIAL HAS BEEV GRANTED BY
TO THE EDUCATIONAL RESOURCES INFORMATION CENTER (ERIC)."
STATE OF CONNECTICUT DEPARTMENT OF EDUCATION
State of Connecticut
William A. O'Neill, Governor
Board of Education
Abraham Glassman, Chairman
James J. Szerejko, Vice Chairman
A. Walter Esdaile
Warren J. Foley
Dorothy C. Goodwin
Rita L. Hendel
John F. Mannix
Julia S. Rankin
Humberto Solano
Norma Foreman Glasgow (ex officio)
Commissioner of Higher Education
Gerald N. Tirozzi
Commissioner of Education
Frank A. Altieri
Deputy Commissioner
Finance and Operations
Lorraine M. Aronson
Deputy Commissioner
Program and Support Services
Connecticut Education Evaluation and Remedial Assistance
GRADE 8 MASTERY TEST RESULTS
SUMMARY AND INTERPRETATIONS: 1986-87
STATE OF CONNECTICUT DEPARTMENT OF EDUCATION
CONTENTS
Foreword v
Acknowledgements vii
LEGISLATIVE BACKGROUND 1
OVERVIEW OF THE MASTERY TEST DEVELOPMENT PROCESS 2
Test Construction 2
Pilot Tests 3
Survey 4
Mastery Test Content 4
SETTING MASTERY STANDARDS BY OBJECTIVE 5
Setting Remedial (Grant) Standards 6
TEST ADMINISTRATION AND SCORING 7
Testing Guidelines: Grade Eight Connecticut Mastery Test 8
Scoring of the Language Arts and Mathematics Test 8
Scoring of the Writing Sample 8
Analytic Scoring 11
Scoring of the Degrees of Reading Power (DRP) Test 11
SCHOOL DISTRICT TEST RESULTS REPORTING 11
FALL 1986 STATEWIDE MASTERY TEST RESULTS 12
Mathematics 12
Language Arts 12
Test Results by District 17
Participation Rate Results 17
Charts
Chart 1: Mathematics: Percent of Students Achieving Mastery for Each Objective 13
Chart 2: Language Arts: Percent of Students Achieving Mastery for Each Objective 14
Chart 3: Writing Sample: Percent of Students at Each Score Point 15
Chart 4: Degrees of Reading Power: Percent of Students at Selected Ranges of DRP Unit Scores 16
APPENDICES
Appendix A: Grade Eight Mathematics Objectives 19
Appendix B: Grade Eight Language Arts Objectives 23
Appendix C: Remedial (Grant) Standard-Setting Process 25
Appendix F: Sample Grade Eight Mastery Test Score Reports 47
Appendix G: Number of Objectives Mastered 61
Appendix H: Fall 1986 Grade 8 State by District Report: Mathematics 65
Appendix I: Fall 1986 Grade 8 State by District Report: Language Arts 81
Appendix J: Type of Community Classifications 89
Appendix K: Student Participation Rates 91
FOREWORD
One of my highest priorities, and a central aspect of Connecticut's Challenge: An Agenda for Educational Equity and Excellence, is the implementation of the statewide mastery testing program in mathematics and language arts, including listening, reading and writing, for grades four, six, and eight. The testing program is designed to assess specific skill levels of students by measuring performance on various learning objectives that students reasonably can be expected to have mastered by the end of grades three, five, and seven.
The results of the Connecticut Mastery Test are useful in evaluating:
o individual student performance in mathematics and language arts;
o the effectiveness of instructional programs in mathematics and language arts; and
o the effectiveness of the remedial assistance programs in mathematics and language arts.
The Grade Eight Connecticut Mastery Test, given for the first time in the fall of 1986, provides valuable educational information which can be used to improve instruction and the basic skills of Connecticut's students. The test results have helped local districts to re-examine curriculum and to identify students who have not mastered certain skills.
I encourage you to carefully review the mastery test results provided at the student, classroom and district levels. The Department is prepared to assist local school districts in the areas of curriculum and professional development.
Gerald N. Tirozzi
Commissioner of Education
MASTERY TEST IMPLEMENTATION ADVISORY COMMITTEE
Thomas Jokubaitis, Chair, Wolcott Public Schools
Gerry Brown-Springer, New Britain Public Schools
Benjamin Dixon, Bloomfield Public Schools
Timothy Doyle, Regional School District No. 4
Richard Dubow, Wilton, Connecticut
Charles Guinta, Walden Book Co., Inc.
Cosby Marable, Hamden Public Schools
Johanna Murphy, Hartford, Connecticut
Olive Niles, East Hartford, Connecticut
Philip Pelosi, Watertown Public Schools
Edward Reidy, West Hartford Public Schools
Louis Saloom, Meriden Public Schools
Mark Waxenberg, East Hartford Public Schools
Lauren Weisberg-Kaufman, CT Business & Industry Assoc.
MATHEMATICS ADVISORY COMMITTEE
Steve Leinwand, Chair, CT State Department of Education
Linda Ball, Glastonbury Public Schools
Pat Banning, Windham Public Schools
Betsy Carter, CT State Department of Education
Mitchell Chester, Suffield Public Schools
Walter Clearwaters, Naval Underwater Sys. Ctr., New London
Leroy Dupee, Bridgeport Public Schools
David Howell, New Haven Public Schools
Marcia Kenefick, CT State Department of Education
Hilda Negron, Hartford Public Schools
Mary Ann Papa, West Hartford Public Schools
Joanne Parr, Bloomfield Public Schools
Philip Pelosi, Watertown Public Schools
Helen Prescott, Ashford Public Schools
Joyce Reilly, Meriden Public Schools
Carolyn Rosenfield, Norwalk Public Schools
Sylvia Schmutzler, Middletown, Connecticut
Jan Siegel, Ridgefield Public Schools
Dolores Vecchiarelli, Westport Public Schools
BIAS ADVISORY COMMITTEE
Lillian Cruz, Chair, CT State Department of Education
Benjamin Dixon, Bloomfield Public Schools
M. Claudius Fabregas, Bridgeport Public Schools
Janet C. Huber, Windham Public Schools
Rita Jackson, Stamford Public Schools
Susan McCarthy-Miller, South Windsor Public Schools
Harriet McComb, Yale Child Study Center
Ronald S. McMullen, New Haven Public Schools
James F. Mitchell, Groton Public Schools
Angel Muniz, Bridgeport Public Schools
Lyn Nevins, Cooperative Educational Services
Robert Pitacco, Hartford Public Schools
Rosa Quezada, New Haven Public Schools
Nelson Quinby, Regional School District No. 9
ACKNOWLEDGEMENTS
LANGUAGE ARTS ADVISORY COMMITTEE
Robert Kinder, Chair, CT State Department of Education
Ruth Allen, Western CT State University
Evelyn Burnham, Colebrook Public Schools
Sue Deffenbaugh, West Hartford Public Schools
Martin Espinola, Granby Public Schools
Mary Fisher, Thompson Public Schools
Marguerite Fuller, Bridgeport Public Schools
John Hennelly, Old Saybrook Public Schools
Jane Jaaskela, Brooklyn Public Schools
Jean Klein, Newtown Public Schools
Olive Niles, East Hartford, Connecticut
Jacqueline Norcel, Trumbull Public Schools
Carol Parmelee, Middletown Public Schools
Lucille Rios, Hartford Public Schools
Ronald Rymash, North Stonington Public Schools
Geraldine Smith, Canton Public Schools
Mary Weinland, CT State Department of Education
PSYCHOMETRICS ADVISORY COMMITTEE
Robert Gable, Chair, University of Connecticut
Baxter Atkinson, Hartford Public Schools
Del Eberhardt, Greenwich Public Schools
Victor Ferry, Waterford Public Schools
Diane Klotz, New London Public Schools
Michael Muro, Norwalk Public Schools
Edward Reidy, West Hartford Public Schools
Judy Singer, Stamford Public Schools
James Snyder, Windsor Public Schools
William Streich, Farmington Public Schools
J. A. Camille Vautour, South Windsor Public Schools
Special thanks to:
John Whritner, former Chair, Mastery Test Implementation Advisory Committee and East Lyme Public Schools
Marsha Van Hise, language arts committee alternate, Trumbull Public Schools
LEGISLATIVE BACKGROUND
In June 1984, the General Assembly of the State of Connecticut amended Section 10-14 m-r of the Connecticut General Statutes, an act concerning Education Evaluation and Remedial Assistance (EERA). This law provides that:

o By May 1, 1985, each local or regional board of education shall develop and submit for State Board of Education approval a new plan of educational evaluation and remedial assistance. Each plan is to address the following:

o the use of student assessment results for instructional improvement;

o the identification of individual students in need of remedial assistance in language arts/reading, and mathematics;

o the provision of remedial assistance to students with identified needs; and

o the evaluation of the effectiveness of the instructional programs in language arts/reading, and mathematics.

o The State Board of Education shall administer an annual statewide mastery test in language arts/reading, and mathematics to all fourth-, sixth-, and eighth-grade students.

o Each student who scores below the statewide remedial standard on one or more parts of the eighth-grade mastery examination or the ninth-grade proficiency test shall be retested. Starting in October 1987, these students shall be retested annually, using the eighth-grade mastery test, only in the deficient area(s) until such students score at or above the statewide remedial standard(s).

o Biennially, each local or regional board of education shall submit to the State Board of Education a report which includes indicators of student achievement and instructional improvement.

o On a regularly scheduled basis, the State Board of Education shall complete field assessments of the implementation of local EERA plans.

o On an annual basis, test results and low-income data shall be used to determine the distribution of available state funds to support remedial assistance programs.
The purpose of this report is to summarize the development and implementation of the eighth-grade Connecticut Mastery Test. The mastery test assesses how well each student is performing on those skills identified by content experts and practicing educators as important for students entering eighth grade to have mastered.
OVERVIEW OF THE MASTERY TEST DEVELOPMENT PROCESS
In the spring of 1984, the Connecticut General Assembly amended the Education Evaluation and Remedial Assistance (EERA) legislation to authorize the creation of mastery tests in the basic skill areas of mathematics and language arts, including listening, reading and writing skills. The tests were to be established for grades 4, 6, and 8.
The goals of the mastery testing program are:
o earlier identification of students needing remedial education;
o testing a more comprehensive range of academic skills;
o setting high expectations and standards for student achievement;
o more useful test achievement information about students, schools and districts;
o improved assessment of suitable equal educational opportunities; and
o continual monitoring of students in grades 4, 6, and 8.
The type of test that best addresses these goals is a criterion-referenced test. Criterion-referenced tests are designed to assess the specific skill levels of students. Such tests usually cover relatively small units of content. Their scores have meaning in terms of what the student knows or can do. Test results are used to identify the areas of strengths and weaknesses of each student.
Test Construction
The development of the eighth-grade criterion-referenced mastery test required the formation of seven statewide advisory committees. These included the Mathematics and Language Arts Committees, the Psychometrics Committee, the Bias Committee, the Mastery Test Implementation Advisory Committee, and two standard-setting committees, one for mathematics and one for language arts. These committees were composed of representatives from throughout the state. Members were selected for their areas of expertise. Approximately 150 Connecticut educators participated on the mastery test committees, which met more than 80 times over an 18-month period (see Acknowledgements, p. vii).
Beginning in the spring of 1985, content committees in both language arts and mathematics participated in each stage of the test development process, including assisting the State Department of Education in the selection of The Psychological Corporation as its test contractor. First, the content committees reviewed the curriculum materials prevalent throughout the state and the scope of the national tests in use in Connecticut at the respective grade levels. Additional resources included the Connecticut curriculum guides in mathematics and language arts, developed in 1981, as well as the results of recent Connecticut Assessment of Educational Progress (CAEP) assessments in mathematics and language arts. Next, the committees identified sets of preliminary mathematics and language arts objectives which reflected existing curriculum materials and the goals of the mastery testing program. The content committees defined an objective as an operationalized learning outcome that was fairly narrow and clearly defined.
Four criteria were used in identifying the appropriate learning outcomes or test objectives and in selecting specific test items to be included on the Grade 8 Connecticut Mastery Test. To have been considered for use, test objectives and items must have been:
(1) significant and important;
(2) developmentally appropriate;
(3) reasonable for most students to achieve; and
(4) generally representative of what is taught in Connecticut schools.
Once the objectives were identified, item specifications and/or sample items were written. Item specifications are written descriptions of the types and forms of test items that assess an objective. They also prescribe the types of answer choices that can be used with each item.
After the test specifications were written and agreed upon, the test contractor wrote items and response choices for each of the objectives. The items were then reviewed by the content committees. Items which met the criteria of the test specifications and received the approval of the content committees were considered for the pilot test. Before testing, the Bias Committee reviewed each item for potential adverse discrimination of gender, race or ethnicity in the language or format of the question or response choices. After their review was completed, the pilot test forms were constructed. Over 1,600 customized Connecticut items were included in the October 1985 Grade 8 pilot test in language arts and mathematics.
The Psychometrics Committee provided advice concerning other aspects of the pilot test, including the sampling design, statistical bias analysis, the design of item specifications, and pilot test administration procedures. The recommendations proposed by the Psychometrics Committee were reviewed and endorsed by the Mastery Test Implementation Advisory Committee.
Pilot Tests
After the items had been reviewed, twelve test forms (six in mathematics and six in language arts) were piloted for the Grade 8 test. The purpose of several pilot test forms was to ensure that enough test items were included to construct three comparable test forms from the pilot test results.
Over 8,000 Grade 8 students participated in the October 1985 pilot test. In January 1986, the pilot test results were made available to Connecticut State Department of Education (CSDE) staff. The process of selecting items to construct three comparable test forms began with the Bias Committee examining the pilot test statistics of each item for potential bias. As a result, some items were eliminated from the item pool. From the remaining items, test forms were constructed to be equivalent in content and difficulty at both the objective and total test levels.
Once the items were sorted on this basis, the test contractor prepared three complete forms of the mathematics test and two complete forms of the language arts test. These forms were approved by the content committees. Each form was created to be equal in difficulty and test length. A third language arts test will be constructed after a few additional items are piloted as part of a future test administration. The psychometric procedures used to construct these test forms focused primarily on the use of the one-parameter latent trait model.
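The one-parameter latent trait model referred to here is commonly known as the Rasch model: the probability of a correct response depends only on the difference between a student's ability and an item's difficulty. The report does not publish its calibration details, so the sketch below is illustrative only (the function name and parameter values are not from the report):

```python
import math

def rasch_p_correct(theta, b):
    """One-parameter latent trait (Rasch) model: the probability of a
    correct response depends only on ability theta minus difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Illustrative values: when ability equals item difficulty, the
# probability of a correct answer is exactly one half.
print(round(rasch_p_correct(0.0, 0.0), 2))  # 0.5
print(round(rasch_p_correct(1.5, 0.0), 2))  # 0.82 (ability above difficulty)
```

Placing all piloted items on a common difficulty scale in this way is what makes it possible to assemble multiple test forms of equal difficulty.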
Survey
In October 1985, a survey of preliminary Grade 8 mastery test objectives was sent to over 4,000 Connecticut educators. The purpose of the survey was to determine (1) the importance of the proposed mathematics and reading/language arts objectives; and (2) whether the objectives were taught prior to the fall of grade 8. The survey achieved approximately a 45% response rate, with roughly one-third of the respondents representing urban school districts. Thirty-six of the thirty-seven original objectives were judged to be important learning skills or outcomes.
Mastery Test Content
Mathematics. The Mathematics Committee recommended a Grade 8 mathematics test that assessed thirty-six (36) specific objectives in four domains: (1) Conceptual Understanding; (2) Computational Skills; (3) Problem Solving/Applications; and (4) Measurement/Geometry. There are four test items per objective for a total of 144 items on the mathematics test. A detailed list of domains and objectives is given in Appendix A (p. 19).
Language Arts. The Language Arts Committee recommended a 111-item Grade 8 language arts test that covers two domains: Reading/Listening and Writing/Study Skills. The eleven (11) objectives recommended by the Language Arts Committee are presented in Appendix B (p. 23).
The general content area of Reading/Listening consisted of narrative, expository, and persuasive passages on a variety of topics measuring a student's ability in: (1) Literal Comprehension; (2) Inferential or Interpretive Comprehension; and (3) Critical or Evaluative Comprehension. Audiotapes were used to assess students' listening comprehension ability in: (1) Literal Comprehension and (2) Inferential and Evaluative Comprehension. The Degrees of Reading Power (DRP) test was also used to assess reading. The DRP test included eleven (11) passages and seventy-seven (77) test items. It was designed to measure a student's ability to understand nonfiction English prose at different levels of reading ability.
The general content area of Writing/Study Skills consisted of three components. First, there was a holistic writing sample in which writing skills were directly assessed. Each student was asked to write a composition on a designated topic. Writing was then judged on a student's demonstrated ability to convey information in a coherent and organized fashion. Second, the mechanics of good writing, defined as (1) Capitalization and Punctuation, (2) Spelling, (3) Agreement, and (4) Tone, were assessed in a multiple-choice format. Third, Study Skills were assessed through Locating Information and Notetaking/Outlining. Locating Information (Schedules, Maps, Index and Reference Use) measured a student's ability to find and use information from the sources listed. Notetaking and Outlining tested a student's ability to take notes and report information as well as complete missing outline information. A detailed list of objectives and the number of items per objective is given in Appendix B (p. 23).
SETTING MASTERY STANDARDS BY OBJECTIVE
The essence of the Connecticut Mastery Test (CMT) is the establishment of a specific mastery standard that accurately reflects students' knowledge and competency on each objective. The mastery test incorporates appropriate and challenging expectations for Connecticut public school students. The goal of the CMT Program is for each student to achieve mastery of all objectives. The objectives being tested were identified as appropriate and reasonable for students at each of the grades tested. These tests are designed to measure a student's performance against these specific objectives.
The process of establishing the mastery standards by objective used a statistical method that required two decisions to be operationalized. The first decision defined a student who mastered a particular skill as one who had a 95% chance of correctly answering each item within the objective. The second decision was that the specific standard for each objective would identify 99% of the students who mastered the skill. For example, literal reading comprehension is measured by 8 questions. By applying the two decision rules stated above to a binomial distribution table, a student is identified as mastering the skill if he/she gets at least 6 of the 8 items correct.
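The two decision rules can be checked with a short binomial calculation: find the highest number-correct cutoff that a true master, answering each item correctly with probability .95, would still reach at least 99% of the time. A minimal sketch (the function name is illustrative, not from the report):

```python
from math import comb

def mastery_cutoff(n_items, p_master=0.95, coverage=0.99):
    """Largest cutoff c such that a student who answers each item
    correctly with probability p_master scores at least c items correct
    with probability >= coverage, under a binomial model."""
    for c in range(n_items, -1, -1):
        # P(X >= c) for X ~ Binomial(n_items, p_master)
        tail = sum(comb(n_items, k) * p_master**k * (1 - p_master)**(n_items - k)
                   for k in range(c, n_items + 1))
        if tail >= coverage:
            return c
    return 0

print(mastery_cutoff(8))  # 6 -- matches the 6-of-8 standard worked out above
```

This reproduces the worked example in the text: for an 8-item objective, the standard is at least 6 of 8 items correct.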
The mastery standards are as follows:
o In mathematics, for each of the 36 objectives, a student must answer correctly at least 3 out of 4 items.

o In language arts, for the eleven multiple-choice objectives with varying numbers of items, a student must answer correctly the following number of items:
WRITING MECHANICS
(1) Capitalization & Punctuation
(2) Spelling
(3) Agreement
(4) Tone

STUDY SKILLS
(5) Locating Information
(6) Notetaking and Outlining
No mastery levels were set for the two holistic language arts measures, the Degrees of Reading Power (DRP) test and the Writing Sample, since these measures are not composed of objectives against which mastery could be assessed.
Setting Remedial (Grant) Standards
The Psychometrics Committee also considered alternative ways to set standards for grant and remedial purposes. Section 10-14 m-r of the CT General Statutes requires that the Connecticut State Board of Education establish statewide standards for remedial assistance in order to meet two responsibilities:

to identify and monitor the progress of students in need of remedial assistance in language arts/reading and mathematics as part of the EERA field assessments; and

to distribute EERA funds based on the number of needy students statewide, as well as for use in the Chapter 2 and Priority School District Grants.

The Psychometrics Committee advised setting the standards by the number of items correct because of important technical considerations in equating test forms. The committee conducted lengthy deliberations over the technical feasibility of establishing standards by the number of objectives passed but felt there were significant obstacles which could not be overcome. Standard-setting committees in mathematics and language arts/reading were convened in March 1986 to determine the grant/remedial standards. The standard-setting committees recommended the following remedial standards:
1. In mathematics, a student who answers fewer than 78 of the 144 items (54%) correctly is required to receive further diagnosis by the local school district and, if necessary, to be provided with remedial assistance.

2. In reading, a student whose Degrees of Reading Power (DRP) unit score is lower than 55 is required to receive further diagnosis and, if necessary, to be provided with remedial assistance.

3. In writing, a student receiving a total holistic score of less than 4 is required to receive further diagnosis by the local school district and, if necessary, to be provided with remedial assistance.
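Taken together, the three remedial standards amount to a simple per-area screening rule. A sketch using the thresholds stated above (the function and argument names are illustrative, not from the report):

```python
def remedial_flags(math_correct, drp_score, holistic_score):
    """Return the content areas in which a student falls below the
    Fall 1986 Grade 8 remedial standards described in this report."""
    flags = []
    if math_correct < 78:      # fewer than 78 of the 144 mathematics items (54%)
        flags.append("mathematics")
    if drp_score < 55:         # DRP unit score below 55
        flags.append("reading")
    if holistic_score < 4:     # total holistic writing score below 4
        flags.append("writing")
    return flags

print(remedial_flags(80, 52, 5))  # ['reading']
```

A student flagged in any area receives further diagnosis by the local district and, if necessary, remedial assistance.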
The recommendations of the Psychometrics Committee and the Standard-Setting Committees were reviewed by the Mastery Test Implementation Advisory Committee in March 1986. The Mastery Test Implementation Advisory Committee (MTIAC) endorsed the procedures used to establish the remedial standards with the clarification that the remedial standards should be considered broad indicators of student achievement and need. The criterion-referenced test is a valuable diagnostic tool used to help districts identify students in need of remedial assistance, to target State Department of Education resources to those students most in need, and to provide useful information to local school districts for improving their curriculum and instructional programs. The MTIAC felt strongly that the data generated by the State Department of Education should not be used to compare performance among districts.
The mastery and remedial standards were adopted, as recommended, by the State Board of Education on June 4, 1986. For a detailed explanation of the remedial standard-setting process, see Appendix C (p. 25).
TEST ADMINISTRATION AND SCORING
Test sessions were conducted by local school district staff under the supervision of local test coordinators who had been trained by staff of the Department and The Psychological Corporation. A student who took all subtests participated in approximately eight hours of testing.

The Grade 8 Mastery Test schedule allowed for three weeks of testing (including make-ups). This allowed local districts as much latitude as possible in adapting test administration to local conditions, in meeting students' needs, and in accommodating religious holidays that occur during testing. Local plans for administration of the Grade 8 Mastery Test were acceptable if the following guidelines were met for all students:
Testing Guidelines: Grade 8 Connecticut Mastery Test
a) The writing sample MUST occur on Tuesday, September 23, 1986.
b) Other testing must occur sometime between September 22 and October 3, 1986, with make-up testing during the week of October 6-10.
c) All eighth graders in a district must be tested on the same schedule.
d) Testing must occur during the regular school day in a regular classroom setting.
e) No more than two (2) testing sessions may be administered in one day, with at least a fifteen-minute break between testing sessions (e.g., two a.m. sessions or one a.m. session and one p.m. session).
f) Make-up sessions MUST conclude by Friday, October 10, 1986. Conditions "d" and "e" above must also hold for all make-up sessions.
The Grade 8 Connecticut Mastery Test had eight testing sessions.
Mathematics I (60 minutes)
Mathematics II (60 minutes)
Mathematics III (60 minutes)
Writing Sample (45 minutes)
Degrees of Reading Power (70 minutes)
Reading Comprehension (60 minutes)
Listening Comprehension (45 minutes)
Writing Mechanics/Study Skills (60 minutes)
At the conclusion of the make-up testing period, answer booklets were returned to National Computer Systems (NCS) of Iowa City, Iowa, for optical scanning and scoring, and then organized in preparation for holistic scoring workshops.
Scoring of the Language Arts and Mathematics Test
The mathematics and language arts multiple-choice tests were machine-scored by NCS. Mathematics scores were reported for the total test as well as for mastery by each objective. Likewise, language arts scores were reported for the total test as well as for mastery of each objective.
Scoring of the Writing Sample
The writing sample was scored by Connecticut elementary teachers using a technique known as the holistic scoring method. Holistic scoring is an impressionistic and quick scoring process that rates written products on the basis of their overall quality. It relies upon the scorers' trained understanding of the general features that determine distinct levels of achievement on a scale appropriate to the group of writing pieces being evaluated.
The major assumption upon which holistic scoring is based is that the quality of a piece of writing should be judged on its overall success as a whole presentation, rather than on the quality of its component parts. Contributing to the rationale underlying holistic scoring is evidence that: (1) no aspect of writing skill can really be judged independently; (2) teachers can recognize and agree upon good writing when they see it, regardless of how they describe writing ability; and (3) teachers will rate pieces of writing in much the same way regardless of any discrepant views they might hold about how particular components of writing should be weighed.
The procedure for holistic scoring is specific to the complete set of writing samples on a given topic that a group of scorers has been asked to evaluate. That is, the scoring scale is based on the range of ability reflected in the particular set of writing samples being assessed.
Preparation for scoring. Prior to the training/scoring sessions, a committee consisting of Connecticut State Department of Education (CSDE) consultants, representatives of the language arts committee and other language arts specialists, two Chief Readers and project staff from Measurement Inc. of Durham, North Carolina, met and read a substantial number of essays drawn from the total pool of essays to be scored. Approximately 60 essays were selected to serve as "range-finders" or "marker papers," representing the range of achievement demonstrated in the total set of papers. Copies of those range-finders served as training papers during the scoring workshops which followed. Each range-finder paper was assigned a score according to a four-point scale, where 1 represented a poor paper and 4 represented a superior paper.
Scoring workshops. During the month of November, eight holistic scoring workshops were held in two different locations in the state. Attendance at the grade eight scoring workshops totaled 210 teachers. A Chief Reader and two assistants were present at every workshop in addition to representatives of the CSDE. Each workshop consisted of a training session and a scoring session.
The general procedure for a training session is described below.
o Each training paper (range-finder) was studied in turn and trial-scored by all scorers. Scoring judgments were independent, quick, immediate, and were based on the scorer's overall impression of the paper. No fractional points on the score scale (1-4) were permissible.
o After all scorers had scored the first four training papers, their judgments were compared to the score assigned during the range-finding process. Any discrepancies were discussed. Through repeated discussions on succeeding training papers, scorers came to identify and internalize those features of written composition that distinguish the papers along the established range. This "holistic" process obviates the need to articulate explicitly the specific criteria that separate one score point from the next.
o Scorers were "calibrated" by ascertaining that they were making judgments consistent with one another and with the Chief Reader. Discussions about papers continued until agreement was reached on the scores of the training papers.
Once scorers were calibrated, actual scoring of the writing exercises occurred. Each paper was read independently by two different scorers; that is, the second reader did not see the score assigned by the first reader. The Chief Reader was responsible for adjudicating any disagreement of more than one point between the judgments of the two scorers, as well as any score in combination with a zero score. In other words, discrepancies of one point between scores (e.g., 4 and 3, 1 and 2, 2 and 3) were acceptable, but larger discrepancies (e.g., 2 and 4, 3 and 1, 1 and 4) had to be resolved by the Chief Reader. Once a paper was assigned two non-discrepant scores, the two scores were summed to produce the final score for each student. The possible scale of summed scores ranged from a low of 2 to a high of 8.
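The two-reader summing and adjudication rules can be sketched as a small function. This is an illustrative sketch only, not part of the original scoring program; the names `resolve_holistic` and `chief_reader_scores` are invented for the example.

```python
def resolve_holistic(first, second, chief_reader_scores=None):
    """Combine two independent holistic ratings into a final summed score.

    Ratings are on a 1-4 scale (0 = unscorable).  Discrepancies of more
    than one point, or any rating paired with a zero, must be resolved by
    the Chief Reader, whose reconciled pair of ratings is passed in as
    `chief_reader_scores`.  Returns the summed score (2-8), or None for
    papers both readers agreed were unscorable.
    """
    if first == 0 and second == 0:
        return None  # excluded from summary reports of test results
    if abs(first - second) > 1 or 0 in (first, second):
        if chief_reader_scores is None:
            raise ValueError("discrepant pair requires a Chief Reader ruling")
        first, second = chief_reader_scores
    return first + second
```

Under these rules a 4/3 pair sums directly to 7, while a 2/4 pair only produces a final score after the Chief Reader supplies a reconciled pair.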
Understanding the holistic scores. Examples of actual student papers which are representative of the scoring range will assist the reader in understanding the statewide standard set for writing and interpreting the test results. Sample papers representing four different holistic scores are presented in Appendix D (p. 31). Note that the process of summing the scores assigned by the two readers expands the scoring scale to account for "borderline" papers. A paper which receives a 4 from both scorers (for a total score of 8) is likely to be better than a paper to which one reader assigns a 4 and another reader assigns a 3 (for a total score of 7). In addition, it should be emphasized that each of the score points represents a range of student papers--some 4 papers are better than others.
A score of zero (0) was assigned to student papers in certain cases. A score of 0 indicates that a paper is not scorable and, therefore, that the student's writing skills remain to be assessed. The cases in which a score of 0 was assigned were as follows:
o responses merely repeated the assignment;
o illegible responses;
o blank responses;
o responses in languages other than English;
o responses that failed to address the assigned topic in any way; and/or
o responses that were too brief to score accurately, but which demonstrated no signs of serious writing problems (for example, a response by a student who wrote the essay first on scratch paper and who failed to get very much of it recopied).
Both readers had to agree that a paper deserved a zero before this score was assigned. If the two readers disagreed, the Chief Reader arbitrated the discrepancy. Papers which were assigned a score of zero were not included in summary reports of test results.
Analytic Scoring
All papers receiving holistic scores below the remedial standard also received analytic scoring in five categories (traits): focus, organization, support/elaboration, mechanics and sentence formation. Analytic scoring is a thorough, trait-by-trait analysis of those components of a writing sample that are considered important to any piece of writing in any context. This scoring procedure can provide a comprehensive picture of a student's writing performance if enough traits are analyzed. It can identify those traits that make a piece of writing effective or ineffective. However, the traits need to be explicit and well defined so that the raters understand and agree upon the basis for making judgments about the writing sample. The analytic rating guide and sample marker papers for the analytic scoring are presented in Appendix E (p. 41).
Scoring of the Degrees of Reading Power (DRP) Test
The scores reported are in DRP unit scores. These scores identify the difficulty or readability level of prose that a student can read with comprehension. This makes it possible to match the difficulty of written materials with student ability. These scores can be better interpreted by referring to the readability levels of some general reading materials as shown below:
o Elementary textbooks (grades 7-9) - 54-65 DRP Units
o Personality Section - teen magazines - 55 DRP Units
o Adult General Interest Magazines - fiction - 60 DRP Units
A much more extensive list of reading materials is contained and rated in the booklet Readability Report, Seventh Edition, published by The College Board.
The conversion between DRP unit scores and raw scores can be made from the tabled values in The College Board's Degrees of Reading Power Form PB Series Conversion Tables, effective March, 1985.
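As a rough illustration of how DRP unit scores line up with reading materials, the example readability levels listed above can be put into a small lookup. This is only a sketch: the dictionary keys, the function name, and the simple threshold comparison are simplifications invented for the example, not part of the DRP program itself.

```python
# Readability levels taken from the examples above (in DRP units).
MATERIALS = {
    "elementary textbooks (grades 7-9)": 54,   # low end of the 54-65 range
    "teen magazine personality section": 55,
    "adult general-interest magazine fiction": 60,
}

def readable_materials(student_drp):
    """List materials whose readability does not exceed the student's DRP
    unit score, i.e. prose the student can read with comprehension."""
    return sorted(name for name, level in MATERIALS.items()
                  if level <= student_drp)
```

A student scoring 61 DRP units, the statewide average reported later in this document, would match all three example materials; a student scoring 50 would match none.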
SCHOOL DISTRICT TEST RESULTS REPORTING
The CMT school district reports are designed to provide useful and comprehensive test achievement information about students, schools and districts. Four standard test reports are generated to assist teachers, principals, superintendents and parents to understand and use criterion-referenced test results. Appendix F (p. 47) presents samples of the school district and parent/student diagnostic score reports.
FALL 1986 STATEWIDE MASTERY TEST RESULTS
The Grade Eight Connecticut Mastery Test provides a comprehensive report card on how students perform on specific skills that Connecticut educators feel are important at the beginning of eighth grade. The mastery test is instructionally useful since it identifies areas of weakness, as well as areas of strength.
Mathematics
In mathematics, eighth graders mastered an average of 23.7 objectives of the 36 tested, or 65.8 percent. The state's goal is that all students master every objective, or 100 percent. Chart 1 (p. 13) illustrates that, statewide, students demonstrated strong scores in the areas of basic computational skills (such as multiplication/division with whole numbers and addition/subtraction with whole numbers and decimals); rounding of whole numbers; and computing with calculators. However, students did not perform as well on items that assess computational skills with fractions and mixed numbers; measurement; and solving process problems involving the organization of data.
A total of 35 percent of the students mastered 29 or more objectives on the mathematics test, and 4 percent mastered all 36 objectives (see Appendix G, p. 61).
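The per-student counts summarized above can be sketched as a simple tally over the 36 objectives. The 3-of-4 mastery criterion used in this sketch is purely illustrative; the actual mastery standards were set objective-by-objective by Connecticut educators, and the function name is invented for the example.

```python
def objectives_mastered(correct_per_objective, threshold=3):
    """Count the objectives a student mastered on the mathematics test.

    `correct_per_objective` holds, for each of the 36 objectives, how many
    of its four items the student answered correctly.  The 3-of-4
    `threshold` is a hypothetical stand-in for the per-objective mastery
    standards set by the committees.
    """
    return sum(1 for correct in correct_per_objective if correct >= threshold)
```

A student's count would then be compared against thresholds such as "29 or more objectives" when compiling the statewide summary.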
Students getting fewer than 78 questions correct on the 144-question mathematics section (17%) were identified as needing further diagnosis and possible remedial instruction.
Language Arts
In language arts, eighth grade students averaged 7.5 objectives of the eleven tested, or 68.2 percent. The state's goal is that all students master every objective, or 100 percent. Chart 2 (p. 14) illustrates that while students did reasonably well on writing mechanics and on study skills, significant weaknesses were found in higher order inferential and evaluative reading comprehension and literal listening comprehension. A total of 48 percent of the students mastered nine or more objectives on the language arts test, which includes writing and reading skills, and 21 percent of the students mastered all eleven objectives (see Appendix G, p. 61).
In writing, eighth grade students averaged 5.0 points on a scale of 2 through 8. The state's goal is that all students be able to produce an organized, well-supported piece of writing, that is, a score of 7 or 8. Chart 3 (p. 15) illustrates that 20 percent of the students produced an organized, well-supported piece of writing (a 7 or an 8 score), and an additional 39 percent produced a paper which is generally well organized (a 5 or a 6 score). Another large group, 25 percent, scored a 4, which is defined as a "minimally proficient piece of writing." A total of 17 percent of the students scored a 2 or a 3, which is below the remedial standard.
Chart 1. Mathematics: Average Number of Objectives Mastered / Percent of Students Achieving Mastery for Each Objective

[Two bar charts, not reproducible here: the first illustrates the average number of mathematics objectives mastered, statewide; the second illustrates the percent of students, statewide, who mastered each of the 36 mathematics objectives.]
Chart 2. Language Arts: Average Number of Objectives Mastered / Percent of Students Achieving Mastery for Each Objective

[Two bar charts, not reproducible here: the first illustrates the average number of language arts objectives mastered, statewide; the second illustrates the percent of students, statewide, who mastered each of the eleven language arts objectives, grouped as follows. Writing Mechanics: capitalization and punctuation; spelling/homonyms/abbreviations; agreement; tone. Study Skills: locating information; notetaking/outlining. Listening Comprehension: literal; inferential/evaluative. Reading Comprehension: literal; inferential; evaluative.]
Chart 3. Writing Sample: Average Holistic Score / Percent of Students at Each Score Point

[Two bar charts, not reproducible here: the first illustrates the average holistic writing score of students, statewide (5.0 in 1986); the second illustrates the distribution of students who received each holistic writing score, statewide. Holistic writing scores are interpreted as follows: a student who scores 7 or 8 has produced a paper which is well written with developed supportive detail; a student who scores 5 or 6 has produced a paper which is generally well organized with supportive detail; a student who scores 4 is minimally proficient; and a student who scores 2 or 3 is in need of further diagnosis and possible remedial assistance.]
Chart 4. Degrees of Reading Power (DRP): Average DRP Unit Score / Percent of Students at Selected Ranges of DRP Unit Scores

[Two bar charts, not reproducible here: the first illustrates the average DRP unit score of students, statewide; the second illustrates the distribution of students, statewide, scoring in each of three DRP score categories (54 and below; 55-61; 62 and above). DRP score categories are interpreted as follows: a student who scores 62 DRP units or above can read, with high comprehension, materials which are typically used at grade 8 or above; a student who scores 55-61 units can read, with high comprehension, materials which are typically used below grade 8 but above the Remedial Standard; and a student who scores 54 DRP units or below is in need of further diagnosis and possible remedial assistance.]
In reading (Degrees of Reading Power Test), eighth grade students averaged 61 DRP units on a scale of 15 through 99. The state's goal is that all students be able to read with high comprehension materials typically used at the eighth grade or above, that is, at least 62 on the DRP scale. Chart 4 (p. 16) illustrates that 57 percent of the students scored at least 62 on the reading section, 16 percent scored between 55 and 61, and 27 percent scored below 55, which is the remedial standard. The average score of 61 suggests that Connecticut eighth graders typically can read, with high comprehension, materials normally used up to grade 8.
Test Results by District
Appendix H (p. 65) and Appendix I (p. 81) present a listing of the mathematics and language arts test results, respectively, for Connecticut school districts. School districts are listed alphabetically, followed by regional school districts. The Type of Community (TOC) designation in the second column indicates the group with which each district or school has been classified. A definition of the TOC classifications is provided in Appendix J (p. 89).
Because the most valid comparisons for district scores are longitudinal within each district, the State Department of Education advises against making school district comparisons. The following caution should also be noted:
o It is not appropriate or meaningful to sum across the different tests and subtests because of differences in test length, mastery, and remedial standards. These comparisons are inappropriate since it is impossible to identify, solely on the basis of the above information, how the average student has performed in the districts being compared. Average scores and standard deviations provide more appropriate comparative information on how well the average student is performing, although many factors may affect the comparability of these statistics as well.
Participation Rate Results
Appendix K (p. 91) presents the number of eighth-grade students in each district and the percents of students who participated in the grade eight mastery testing during the Fall 1986 statewide administration. The alphabetical listing of districts provides the following information for each district:
Column 1: The name of the district.
Column 2: The total eighth-grade population at the start of mastery testing.
Column 3: The number of students eligible for testing.
Column 4: The percent of total population exempted from testing.
Columns 5-8: The percent of eligible students tested in each content area.
The results in Appendix K illustrate that participation rates by school district on the eighth-grade CMT were quite high, with only a few exceptions.
APPENDIX A
Grade Eight Mathematics Objectives
The 36 objectives of the eighth grade mathematics test are listed below. There are four test items for each objective.
CONCEPTUAL UNDERSTANDINGS (44)
1. Order fractions.
2. Order decimals.
3. Round whole numbers.
4. Round decimals to the nearest whole number, tenth, and hundredth.
5. Multiply and divide whole numbers and decimals by 10, 100, and 1000.
6. Identify fractions, decimals, and percents from pictorial representations.
7. Convert fractions to decimals and vice versa.
8. Convert fractions and decimals to percents and vice versa.
9. Identify points on number lines, scales, and grids.
10. Identify ratios and fractional parts from given data.
11. Identify an appropriate procedure for making estimates with decimals and fractions.
COMPUTATIONAL SKILLS (40)
12. Add and subtract whole numbers less than 10,000.
13. Multiply and divide 2- and 3-digit whole numbers by 1- and 2-digit numbers.
14. Add and subtract decimals (to hundredths) in horizontal form.
15. Identify the correct placement of the decimal point in multiplication anddivision of decimals.
16. Add and subtract fractions and mixed numbers.
17. Multiply fractions and mixed numbers.
18. Determine the percent of a number.
19. Estimate sums and differences of whole numbers and decimals including making change.
20. Estimate products and quotients of whole numbers and decimals.
21. Estimate fractional parts and percents of whole numbers and money amounts.
PROBLEM SOLVING/APPLICATIONS (with calculator available) (40)
22. Compute sums, differences, products, and quotients using a calculator.
23. Interpret graphs, tables and charts.
24. Solve 1- and 2-step problems involving whole numbers and decimals including averaging.
25. Solve 1- and 2-step problems involving fractions.
Holistic scoring provided for all students. Analytic scoring provided for students who score below the remedial standard of 4 (on a scale of 2-8).
Performance on all eleven Language Arts objectives, the Degrees of Reading Power, and the Writing Sample is reported at the student, classroom, school, district and state levels.
(#) Number of items for each content area or objective.
APPENDIX C

Remedial (Grant) Standard-Setting Process
Background
There are several acceptable strategies for setting standards on criterion-referenced tests. Each of the proposed methods has one or more unique characteristics. One common element to the various methods is that they all offer to the individuals who are setting the standards some process which reduces the arbitrariness of the resulting standard. Different methods accomplish this in different ways. All methods systematize the standard-setting process so that the result accurately reflects the collective informed judgment of those setting the standard.
Types of Standard-Setting Methods
Standard-setting methods can generally be categorized into three types: test question review, individual performance review and group performance review. Test question review methods specify a procedure for standard setters to review each test question and make a judgment about that question. For example, standard setters might be asked to rate the difficulty or the importance of each question. These judgments are then combined mathematically to produce a standard. Individual performance review methods also require standard setters to make judgments, but the judgments are made on the basis of examining data that indicate how well individual students perform on test items. These data may be based on actual pilot test results or projected results using mathematical theories. In this method, additional student information, such as grades, may also be used to inform the standard setters. Group performance review methods provide for judgments to be made based on the performance of a reference group of students. That is, standard setters review the group performance and make a determination where the standard should be set based on the group results.
Selection of a Standard-Setting Method
Several factors affect the choice of a particular standard-setting method. The type of test is one consideration. For example, some methods are only appropriate for multiple choice questions or for single correct answer questions while other methods are more flexible. Time constraints are another consideration if student performance data are necessary. In this case, a pilot test must be conducted and the test results must be analyzed prior to setting the standards. Another consideration is the relative importance of the decisions that will be made on the basis of the standard. For example, a classroom test affecting only a few students would not require as stringent a procedure as would a statewide test determining whether a student is allowed to graduate from high school. Other relevant factors include the number of test items, permanence of the standard, purpose of the test, and the extent of available financial and other resources to support the standard-setting process.
On February 4, 1985, the Mastery Test Psychometrics Committee met to consider the issue of standard-setting procedures and voted unanimously to approve the following proposal.
A PROPOSAL FOR SETTING THE REMEDIAL STANDARDS ON THE CONNECTICUT MASTERY TESTS
1. Two standard-setting committees will be created: one for mathematics and one for reading and writing.
2. This description of a minimally proficient student will be given to each of the committees:
Imagine a student who is just proficient enough in reading, writing, or mathematics to successfully participate in his/her regular eighth-grade coursework.
3.A In mathematics, an adaptation of the Angoff procedure will be used. The committee will be provided with each item appearing on one form of the mathematics test. The committee will be given the following directions:
Consider a group of 100 of these students who are just proficient enough to be successful in regular eighth-grade coursework. How many of them would be expected to correctly answer each of the questions?
The committee will rate each item. The committee will then be given the opportunity to discuss their rating of each item. Sample pilot data will be presented. Committee members will be given the opportunity to adjust their item ratings. The item ratings will then be averaged in accordance with the Angoff procedure in order to produce a recommended test standard.
3.B In reading, the committee will review and discuss each passage of the Degrees of Reading Power (DRP) test. Student performance data will be presented. The committee will consider the reading difficulty that should be expected of a student at the grade level being tested. The committee members will identify the passage that has the appropriate level of reading difficulty consistent with the above description of a minimally proficient student.
3.C In writing, the committee will read four sample essays. These essays will have been prescored holistically (on a scale from 2 to 8) in order to rank the quality of the essays. Committee members will classify essays into one of three categories: 1) definitely NOT proficient, 2) borderline, and 3) definitely proficient. These classifications will be discussed in light of the holistic scores. The committee will then classify approximately twenty-five additional essays. The essay ratings will be discussed in the same manner as the original four essays. When all essays have been discussed, the essays which fell in the borderline category will be focused upon to determine the standard. The committee will determine where among the borderline essays the standard should be established.
4. The standards recommended in step 3 will be presented to the Mastery TestImplementation Advisory Committee for discussion and action.
Connecticut's Strategy
Several steps were employed to create an acceptable and valid test standard for Connecticut tests. Initially, a separate standard-setting committee was convened for each test on which standards were to be set. Individuals were chosen to serve as members on the committee on the basis of their familiarity with the area being assessed and the nature of the examinees. One source of such members was the test content committees related to the project. For example, members of the Mathematics Committee were represented on the committee setting standards for the mathematics mastery test.
The actual procedures used to set standards were an adaptation of a method proposed by William Angoff (1970). This test question review method required members of a standard-setting committee to estimate the probability that a question would be correctly answered by examinees who possess no more than the minimally acceptable knowledge or skill in the areas being assessed. Standard setters then reviewed pilot test data for sample items as further evidence of the appropriateness of the judgments being made. The original probability estimates assigned to each test question were reviewed and adjustments made by the standard setters. The final individual item probabilities were summed to yield a suggested test standard for each member of the committee. The suggested standards were averaged across members of the committee to produce the recommended test standard.
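The Angoff computation just described (sum each judge's item probabilities into a suggested standard, then average across judges) reduces to a few lines. This is a minimal sketch with an invented function name and invented ratings, not the committee's actual software or data.

```python
def angoff_standard(ratings_by_judge):
    """Compute a recommended test standard from Angoff-style ratings.

    `ratings_by_judge[j][i]` is judge j's estimate of the probability that
    a minimally proficient examinee answers item i correctly.  Each
    judge's probabilities are summed into a suggested raw-score standard;
    the suggested standards are then averaged across judges.
    """
    suggested = [sum(item_probs) for item_probs in ratings_by_judge]
    return sum(suggested) / len(suggested)
```

For example, two judges who each rate four items at 0.5 and 0.75 respectively would yield suggested standards of 2.0 and 3.0, for a recommended standard of 2.5.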
The recommended test standard was presented to the Mastery Test Implementation Advisory Committee and the State Board of Education. In mid-March, the Mathematics and Language Arts standard-setting committees met to set the remedial standards for the Grade Eight Mastery Test. The following information summarizes the results of the standard-setting activities conducted by CSDE staff:
I. Mathematics (144-item test)

Using the procedures previously outlined, the standard setters rated each item and considered the pilot data. Committee members discussed items and were given the opportunity to adjust their initial ratings. The final ratings were averaged to produce a remedial standard. It is recommended that a raw score of 79 be the remedial mathematics standard. Below is a summary of the ratings.
Procedure   # Judges   Range % Correct   Mean % Correct   Raw Score
Angoff      20         25.7-67.7         54               78

II. Reading (Degrees of Reading Power, 77-item test)
Standard setters used two procedures to establish a remedial reading standard. First, they examined the passages in the Degrees of Reading Power (DRP) test, asking themselves which passage is too difficult for the student who is just proficient enough to successfully participate in eighth-grade coursework. Discussion occurred throughout this selection process.
Second, they examined textbooks which are typically used in grades 7 and 8 and selected those textbooks which a minimally proficient student would not be expected to read in order to successfully participate in eighth-grade coursework. Discussion occurred throughout this selection process.
The average readability values of the selected passages and textbooks and the pilot test data were then revealed to the standard setters. The standard setters discussed the readability values and the pilot test data and recommended the DRP unit score of 55 as the remedial standard. The standard was accepted by the State Board of Education at the 80% comprehension level. Below is a summary of the ratings.
Procedure                # Judges   Readability Range
A. Test Passage Review   26         53-62 DRP Units
B. Textbook Review       26         48-60 DRP Units
Recommended Remedial Standard: 55 DRP Units

III. Writing (45-minute writing sample)
Using the procedure previously outlined, standard setters read and rated 21 essays written to a persuasive prompt and 21 essays written to an expository prompt. After discussions and final ratings, the holistic scores for the papers were revealed to the group. The committee then discussed the appropriate remedial writing standard in light of the degree to which their ratings matched the holistic scores. It was the recommendation of the committee that a holistic writing score of 4 be used as the remedial writing standard. Below is a summary of the ratings.
LANGUAGE ARTS STANDARD-SETTING COMMITTEE

Dell Britt, Newtown Public Schools
Fred Brucoli, New London Public Schools
Patricia Dobson, Stafford Public Schools
Donald Falcetti, Litchfield Public Schools
Bill Farr, Bolton Public Schools
James Foley, Waterbury Public Schools
Dorothy French, Litchfield Public Schools
Marguerite Fuller, Bridgeport Public Schools
Sara Godek, Stafford Public Schools
Nina Grecenko, Newtown Public Schools
Mary Haylon, Hartford Public Schools
Karen Karcheski, Danbury Public Schools
Jean Klein, Newtown Public Schools
Mark Kristoff, New London Public Schools
Thomas Lane, Old Saybrook Public Schools
Lucretia Leaves, Hartford Public Schools
Edward Moore, Danbury Public Schools
Mary Murray, Putnam Public Schools
Dick Nelson, Old Saybrook Public Schools
Olive S. Niles, East Hartford Public Schools
Anne L. Rash, Bolton Public Schools
Bernice Wagge, Waterbury Public Schools
Mary Wilson, Hartford Public Schools
Barbara Zamagni, Putnam Public Schools
Robert Kinder, CT State Department of Education
Mary Weinland, CT State Department of Education
MATHEMATICS STANDARD-SETTING COMMITTEE
Barbara Bailey, New Haven Public Schools
Pat Banning, Windham Public Schools
George Caouette, Manchester Public Schools
Pearl Caouette, Manchester Public Schools
Tony Ditrio, Norwalk Public Schools
Don Flis, West Hartford Public Schools
Marian Frascino, Norwalk Public Schools
Charles Framularo, Bridgeport Public Schools
Sheryl Hershonick, New Haven Public Schools
Mable McCarthy, Middletown Public Schools
Michele Nahas, Windham Public Schools
Judy Narveson, Farmington Public Schools
Mary Ann Papa, West Hartford Public Schools
Jim Pinto, Bloomfield Public Schools
Helen Prescitt, Ashford Public Schools
Dolores Vecchiarelli, Westport Public Schools
Sylvia Webb, Middletown Public Schools
Frank Whittaker, Bridgeport Public Schools
Betsy Carter, CT State Department of Education
Steve Leinwand, CT State Department of Education
APPENDIX D
Marker Papers for Holistic Scoring
CONNECTICUT MASTERY TEST GRADE II WRITING SAMPLE
I, .. Al ' f I al ts 41 MI V 4I
OB:hi I r A 0 ; litLiA0 l 1 IWO nbC114 (o r1±(Cif 10 ..YY1/111. nfrA i -.114 risuicti 1,-..(e. iNvvroo.1%. Am. . ". . t. 1 -,
ctIciCPoLt.i.QtecLikintrizl______41.0,1 40-irer A. -ika.4-9 Ocin Ax,
.Cat b A 0_161.4711 -1. t k 1 A i i (1 1ff 1 LP+...41.P ..2S
as A/ /a I a 5 lb
4 5. P ... g. z . , I Al S II I 4 Ise4. 4 I le 0 II ii. $ 2 6 I
There is cies: Lwidence this t:udent saw the prompt: however. the response is a'discussion of an election. There is no support of a single candidate.
CONNECTICUT MASTERY TEST GRADE 8 WRITING SAMPLE

[Handwritten student paper; not legible in this reproduction.]

Score Point: 1

This response is an attempt to respond to the task, but there is no sustained discourse. It reads like an outline of a speech. Additional clarification or some transitional linking is needed for a higher score.
CONNECTICUT MASTERY TEST GRADE 8 WRITING SAMPLE

[Handwritten student paper; not legible in this reproduction.]

Score Point: 2

Although this paper has no more information than the previous paper, it has the needed transitions which create sustained discourse.
CONNECTICUT MASTERY TEST GRADE 8 WRITING SAMPLE

[Handwritten student paper; not legible in this reproduction.]

Score Point: 2

This response has a number of points. Most are vague, but "give his hardest" and "a late bus" have some clarification.
CONNECTICUT MASTERY TEST GRADE
Score Point
WRITING SAMPLE 11111121RE
There are a number otc poaints. citligvibl:nictlecajoLanflag!i:b !:31s,lebaltirle the lisA:4e1 !weds
enhancesdeiail;* for 4
istrt.geariv,tEllisr!ItipefiLic:r: i2hLiner sco . r-nr lilll K eI.ukcii to 0 o wit'ria7t- riiii--.e . He will
LiLic, hie 54. in Someone nd4- ell.9r+0 . u n ir 4frn hte lIth cA- ii- is SG veior A e u.,Ill 10
-1-i ,y1 -or s .1 Onii 3" . rI.1_c; PM t71 el fl d POO
. . -I 0 , +1 a .0.
.1i _kr- evPrsjoriP h ke ever. , , .... , - . hi 7 ,
1 4 A F.0 I 1 S. I 0 I a I, a r L. I - e 4 I
I like ,Cool- ha/1ft C I d CII:,r2r4,4 elf., .
fi PI Ott ST)r 1- 4/*1 e a t7r)yeyjh ,pp 'JOU Ay 4.1-61-e- A t.,ii ccor Erknnl concill
, I ill .,. 1e.(ultra a rejq r-I0 ;173 rto p0 ) u.,i4-4. arier.5,flinLa I & tiP 1S kJ o-te d h e to; // ne74- 6-e 04 44-n ritai nip WM A P 0 a.9ir,74,....rspek iipicr
ipr, Or r n r i v sy ,1 brs nr- ., I- 4.9 rila g;ltr.r.t.. or 11,-,,4 e Ifit,s. -4-4, liar,i I I s . - N II I /
il e Ca 0 4 o -1.-1,k+_ k
S-0 fr. on r n s f IIA . I . .
I
CONNECTICUT MASTERY TEST GRADE 8 WRITING SAMPLE
[Handwritten student response; illegible in this reproduction.]
Score Point: 3
This response has numerous points -- some are clarified and
some have supporting detail. It is a "3" because of the list-like
quality, the repetition, and the lack of overall control.
[Handwritten student response; illegible in this reproduction.]
CONNECTICUT MASTERY TEST GRADE 8 WRITING SAMPLE
[Handwritten student response; illegible in this reproduction.]
Score Point: 4
This response has a number of points, several of which have
additional supporting detail. The response is organized and controlled.
APPENDIX E
Analytic Rating Guide and Marker Papers for Analytic Scoring
GRADE EIGHT ANALYTIC RATING GUIDE
FOCUS: How effectively does the writer unify the paper by a dominant topic?
1 = switches and/or drifts frequently from the dominant topic
2 = switches and/or drifts somewhat from the dominant topic
3 = stays on topic throughout the response
ORGANIZATION: Is there a plan that clearly governs the sequence from the beginning to the end of the response, and is the plan effectively signaled?

1 = no discernible plan
2 = inferable plan and/or discernible sequence; some signals may be present
3 = controlled, logical sequence with a clear plan
SUPPORT/ELABORATION: To what extent is the narrative developed by details that describe and explain the narrative elements (character, action, and setting)?

1 = vague or sketchy details that add little to the clarity of the response, or specific details but too few to be called list-like
2 = details that are clear and specific but are list-like, or uneven, or not developed
3 = well-developed details that enhance the clarity of the response
SENTENCE FORMATION: Are sentences correctly formed?

1 = many run-ons, "on-and-ons," fragments, and/or awkward constructions--may cause confusion
2 = some run-ons, "on-and-ons," fragments, and/or awkward constructions--may cause confusion
3 = few errors and/or awkward constructions--no confusion
MECHANICS: To what extent does the student use the conventions of standard written English (e.g., spelling, usage, capitalization, punctuation)?

1 = many errors
2 = some errors
3 = few errors
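The analytic guide above rates each paper on five dimensions, each on a 1-3 scale. As an illustrative sketch only (the dimension names come from the guide, but this helper function is hypothetical and not part of the official scoring materials), the scheme can be represented as a simple validation routine:

```python
# Illustrative sketch of the analytic rating scheme described above.
# The five dimensions and the 1-3 scale come from the rating guide;
# the function itself is hypothetical, not an official scoring tool.

ANALYTIC_DIMENSIONS = (
    "Focus",
    "Organization",
    "Support/Elaboration",
    "Sentence Formation",
    "Mechanics",
)

def validate_analytic_scores(scores):
    """Check that a paper carries exactly one 1-3 rating per dimension."""
    if set(scores) != set(ANALYTIC_DIMENSIONS):
        raise ValueError("scores must cover exactly the five dimensions")
    for dim, value in scores.items():
        if value not in (1, 2, 3):
            raise ValueError(f"{dim}: rating must be 1, 2, or 3")
    return scores

# One of the marker papers shown in this appendix:
paper = validate_analytic_scores({
    "Focus": 3,
    "Organization": 2,
    "Support/Elaboration": 1,
    "Sentence Formation": 3,
    "Mechanics": 1,
})
```

Note that the five ratings are reported separately rather than summed; the analytic profile is diagnostic, not a single score.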
[Handwritten student response; illegible in this reproduction.]
Analytic Score Points
Focus: 3
Organization: 2
Support/Elaboration: 1
Sentence Formation: 3
Mechanics: 1
CONNECTICUT MASTERY TEST GRADE 8 WRITING SAMPLE
[Handwritten student response; illegible in this reproduction.]
Analytic Score Points
Focus: 3
Organization: 1
Support/Elaboration: 1
Sentence Formation: 3
Mechanics: 2
[Handwritten student response; illegible in this reproduction.]
Analytic Score Points
Focus: 3
Organization: 2
Support/Elaboration: 1
Sentence Formation: 1
Mechanics: 2
[Illegible scanned page; not reproducible.]
APPENDIX F
Sample Grade Eight Mastery Test Score Reports
o Class Diagnostic Report - Mathematics
o School by Class Report - Mathematics
o District by School Report - Mathematics
o Class Diagnostic Report - Language Arts
o School by Class Report - Language Arts
o District by School Report - Language Arts
o Parent/Student Diagnostic Report
CONNECTICUT MASTERY TESTING PROGRAM CLASS DIAGNOSTIC REPORT
GRADE 8 FORM A
TESTING DATE:
NUMBER OF STUDENTS TESTED:
NUMBER OF STUDENTS NEEDING FURTHER DIAGNOSIS IN MATHEMATICS:
MATHEMATICS OBJECTIVES TESTED (MASTERY CRITERIA: 3 OF 4 ITEMS CORRECT FOR EACH OBJECTIVE)

CONCEPTUAL UNDERSTANDINGS
1. ORDER FRACTIONS
2. ORDER DECIMALS
3. ROUND WHOLE NUMBERS
4. ROUND DECIMALS
5. MULT/DIV WHOLE #'S & DEC. BY 10, 100, 1000
6. IDENTIFY FRACTIONS, DEC., %'S FROM PICTURES
7. CONVERT FRACTIONS - DECIMALS
8. CONVERT FRACTIONS/DECIMALS - PERCENTS
9. IDENTIFY PTS. ON NUMBER LINES, SCALES, GRIDS
10. IDENTIFY RATIOS AND FRACTIONAL PARTS
11. IDENTIFY PROCEDURE FOR FRAC/DEC. ESTIMATION

COMPUTATIONAL SKILLS
12. ADD AND SUBTRACT WHOLE NUMBERS
13. MULTIPLY AND DIVIDE WHOLE NUMBERS
14. ADD AND SUBTRACT DECIMALS
15. ID CORRECT DECIMAL PT IN PROD/QUOT OF DECIMALS
16. ADD/SUBTRACT FRACTIONS AND MIXED NUMBERS
17. MULTIPLY FRACTIONS AND MIXED NUMBERS
18. DETERMINE PERCENT OF A NUMBER
19. ESTIMATE SUMS/DIFFS. OF WHOLE #'S AND DECIMALS
20. ESTIMATE PROD/QUOT OF WHOLE #'S AND DECIMALS
21. ESTIMATE FRACTIONAL PARTS/%'S OF WHOLE #'S
SEE MATHEMATICS PART 2 FOR OBJECTIVES 22-36 AND SUMMARY TOTALS.
MATHEMATICS PART 1 OF 2
NUMBER/PERCENT OF STUDENTS MASTERING EACH OBJECTIVE
CLASS / SCHOOL / DISTRICT (# / %)
* INDICATES A SCORE BELOW THE REMEDIAL STANDARD. THIS STUDENT MUST RECEIVE FURTHER DIAGNOSIS.
COPYRIGHT © 1986 BY CONNECTICUT STATE BOARD OF EDUCATION. ALL RIGHTS RESERVED. PRINTED IN THE UNITED STATES OF AMERICA.
CLASS DIAGNOSTIC REPORT MATHEMATICS PART 2 OF 2
GRADE 8 FORM A
TESTING DATE:
NUMBER OF STUDENTS TESTED:
NUMBER OF STUDENTS NEEDING FURTHER DIAGNOSIS IN MATHEMATICS:
NUMBER/PERCENT OF STUDENTS MASTERING EACH OBJECTIVE (MASTERY CRITERIA: 3 OF 4 ITEMS CORRECT)
CLASS / SCHOOL / DISTRICT
MATHEMATICS OBJECTIVES TESTED:
PROBLEM SOLVING/APPLICATIONS
22. ADD/SUBT/MULT/DIV WITH A CALCULATOR
23. INTERPRET GRAPHS, TABLES AND CHARTS
24. SOLVE 1- AND 2-STEP PROBS - WHOLE #'S & DEC.
25. SOLVE 1- AND 2-STEP PROBLEMS - FRACTIONS
26. SOLVE PROBLEMS INVOLVING MEASUREMENT
27. SOLVE PROBS. INVOLVING ELEM. PROBABILITY
28. ESTIMATE A REASONABLE ANSWER
29. SOLVE PROBLEMS WITH EXTRANEOUS INFORMATION
30. IDENTIFY NEEDED INFORMATION IN PROBLEMS
31. SOLVE PROCESS PROBLEMS - ORGANIZING DATA

MEASUREMENT/GEOMETRY
32. IDENTIFY FIGURES USING GEOMETRIC TERMS
33. MEASURE AND DETERMINE PERIMETERS AND AREAS
34. ESTIMATE LENGTH/AREA/VOLUME/ANGLE MEASURE
35. SELECT APPROPRIATE METRIC/CUSTOMARY UNIT
36. MAKE MEASUREMENT CONVERSIONS W/IN SYSTEMS
TOTAL NUMBER OF OBJECTIVES MASTERED
AVERAGE # OF OBJECTIVES MASTERED
NUMBER OF ITEMS CORRECT
NUMBER/PERCENT OF STUDENTS BELOW REMEDIAL STANDARD
MATHEMATICS REMEDIAL STANDARD 78 OF 144
* INDICATES A SCORE BELOW THE REMEDIAL STANDARD. THIS STUDENT MUST RECEIVE FURTHER DIAGNOSIS.
COPYRIGHT © 1986 BY CONNECTICUT STATE BOARD OF EDUCATION. ALL RIGHTS RESERVED. PRINTED IN THE UNITED STATES OF AMERICA.
CONNECTICUT MASTERY TESTING PROGRAM SCHOOL BY CLASS REPORT MATHEMATICS PART 1 OF 2
GRADE 8 FORM A
TESTING DATE:
SCORES INDICATE NUMBER/PERCENT OF STUDENTS MASTERING EACH OBJECTIVE
CONCEPTUAL UNDERSTANDINGS
1. ORDER FRACTIONS
2. ORDER DECIMALS
3. ROUND WHOLE NUMBERS
4. ROUND DECIMALS
5. MULT/DIV WHOLE #'S & DEC. BY 10, 100, 1000
6. IDENTIFY FRACTIONS, DEC., %'S FROM PICTURES
7. CONVERT FRACTIONS - DECIMALS
8. CONVERT FRACTIONS/DECIMALS - PERCENTS
9. IDENTIFY PTS. ON NUMBER LINES, SCALES, GRIDS
10. IDENTIFY RATIOS AND FRACTIONAL PARTS
11. IDENTIFY PROCEDURE FOR FRAC/DEC. ESTIMATION

COMPUTATIONAL SKILLS
12. ADD AND SUBTRACT WHOLE NUMBERS
13. MULTIPLY AND DIVIDE WHOLE NUMBERS
14. ADD AND SUBTRACT DECIMALS
15. ID CORRECT DECIMAL PT IN PROD/QUOT OF DECIMALS
16. ADD/SUBTRACT FRACTIONS AND MIXED NUMBERS
17. MULTIPLY FRACTIONS AND MIXED NUMBERS
18. DETERMINE PERCENT OF A NUMBER
19. ESTIMATE SUMS/DIFFS. OF WHOLE #'S AND DECIMALS
20. ESTIMATE PROD/QUOT OF WHOLE #'S AND DECIMALS
21. ESTIMATE FRACTIONAL PARTS/%'S OF WHOLE #'S

(MASTERY CRITERIA: 3 OF 4 ITEMS CORRECT FOR EACH OBJECTIVE)
SEE MATHEMATICS PART 2 FOR OBJECTIVES 22-36 AND SUMMARY TOTALS.
*REMEDIAL STANDARD IS 78 OF 144 ITEMS CORRECT. COPYRIGHT © 1986 BY CONNECTICUT STATE BOARD OF EDUCATION. ALL RIGHTS RESERVED. PRINTED IN THE U.S.A.
SCHOOL BY CLASS REPORT MATHEMATICS PART 2 OF 2
GRADE 8 FORM A
TESTING DATE
SCORES INDICATE NUMBER/PERCENT OF STUDENTS MASTERING EACH OBJECTIVE
PROBLEM SOLVING/APPLICATIONS
22. ADD/SUBT/MULT/DIV WITH A CALCULATOR
23. INTERPRET GRAPHS, TABLES AND CHARTS
24. SOLVE 1- AND 2-STEP PROBS - WHOLE #'S & DEC.
25. SOLVE 1- AND 2-STEP PROBLEMS - FRACTIONS
26. SOLVE PROBLEMS INVOLVING MEASUREMENT
27. SOLVE PROBS. INVOLVING ELEM. PROBABILITY
28. ESTIMATE A REASONABLE ANSWER
29. SOLVE PROBLEMS WITH EXTRANEOUS INFORMATION
30. IDENTIFY NEEDED INFORMATION IN PROBLEMS
31. SOLVE PROCESS PROBLEMS - ORGANIZING DATA

MEASUREMENT/GEOMETRY
32. IDENTIFY FIGURES USING GEOMETRIC TERMS
33. MEASURE AND DETERMINE PERIMETERS AND AREAS
34. ESTIMATE LENGTH/AREA/VOLUME/ANGLE MEASURE
35. SELECT APPROPRIATE METRIC/CUSTOMARY UNIT
36. MAKE MEASUREMENT CONVERSIONS W/IN SYSTEMS
AVERAGE NUMBER OF OBJECTIVES MASTERED
NUMBER/PERCENT OF STUDENTS BELOW THE REMEDIAL STANDARD*
*REMEDIAL STANDARD IS 78 OF 144 ITEMS CORRECT. COPYRIGHT © 1986 BY CONNECTICUT STATE BOARD OF EDUCATION. ALL RIGHTS RESERVED. PRINTED IN THE U.S.A.
CONNECTICUT MASTERY TESTING PROGRAM DISTRICT BY SCHOOL REPORT MATHEMATICS PART 1 OF 2
GRADE 8 FORM A
TESTING DATE:
SCORES INDICATE NUMBER/PERCENT OF STUDENTS MASTERING EACH OBJECTIVE
CONCEPTUAL UNDERSTANDINGS
1. ORDER FRACTIONS
2. ORDER DECIMALS
3. ROUND WHOLE NUMBERS
4. ROUND DECIMALS
5. MULT/DIV WHOLE #'S & DEC. BY 10, 100, 1000
6. IDENTIFY FRACTIONS, DEC., %'S FROM PICTURES
7. CONVERT FRACTIONS - DECIMALS
8. CONVERT FRACTIONS/DECIMALS - PERCENTS
9. IDENTIFY PTS. ON NUMBER LINES, SCALES, GRIDS
10. IDENTIFY RATIOS AND FRACTIONAL PARTS
11. IDENTIFY PROCEDURE FOR FRAC/DEC. ESTIMATION

COMPUTATIONAL SKILLS
12. ADD AND SUBTRACT WHOLE NUMBERS
13. MULTIPLY AND DIVIDE WHOLE NUMBERS
14. ADD AND SUBTRACT DECIMALS
15. ID CORRECT DECIMAL PT IN PROD/QUOT OF DECIMALS
16. ADD/SUBTRACT FRACTIONS AND MIXED NUMBERS
17. MULTIPLY FRACTIONS AND MIXED NUMBERS
18. DETERMINE PERCENT OF A NUMBER
19. ESTIMATE SUMS/DIFFS. OF WHOLE #'S AND DECIMALS
20. ESTIMATE PROD/QUOT OF WHOLE #'S AND DECIMALS
21. ESTIMATE FRACTIONAL PARTS/%'S OF WHOLE #'S

PROBLEM SOLVING/APPLICATIONS
22. ADD/SUBT/MULT/DIV WITH A CALCULATOR
23. INTERPRET GRAPHS, TABLES AND CHARTS
24. SOLVE 1- AND 2-STEP PROBS - WHOLE #'S & DEC.
25. SOLVE 1- AND 2-STEP PROBLEMS - FRACTIONS
26. SOLVE PROBLEMS INVOLVING MEASUREMENT
27. SOLVE PROBS. INVOLVING ELEM. PROBABILITY
28. ESTIMATE A REASONABLE ANSWER
29. SOLVE PROBLEMS WITH EXTRANEOUS INFORMATION
30. IDENTIFY NEEDED INFORMATION IN PROBLEMS
31. SOLVE PROCESS PROBLEMS - ORGANIZING DATA

MEASUREMENT/GEOMETRY
32. IDENTIFY FIGURES USING GEOMETRIC TERMS
33. MEASURE/DETERMINE PERIMETERS AND AREAS
34. ESTIMATE LENGTH/AREA/VOLUME/ANGLE MEASURE
35. SELECT APPROPRIATE METRIC/CUSTOMARY UNIT
36. MAKE MEASUREMENT CONVERSIONS W/IN SYSTEMS
**INDICATES A SCORE BELOW THE REMEDIAL STANDARD. THIS STUDENT MUST RECEIVE FURTHER DIAGNOSIS.
***ANALYTIC SCORING INFORMATION IS GIVEN ONLY FOR THOSE STUDENTS WHO SCORED BELOW THE REMEDIAL STANDARD: 1 = NEEDS REMEDIAL ASSISTANCE, 2 = BORDERLINE PERFORMANCE, 3 = SATISFACTORY PERFORMANCE.
COPYRIGHT © 1986 BY CONNECTICUT STATE BOARD OF EDUCATION. ALL RIGHTS RESERVED. PRINTED IN THE UNITED STATES OF AMERICA.
CONNECTICUT MASTERY TESTING PROGRAM SCHOOL BY CLASS REPORT LANGUAGE ARTS
TESTING DATE:
SCORES INDICATE NUMBER/PERCENT OF STUDENTS MASTERING EACH OBJECTIVE
HOLISTIC WRITING SCORE - NUMBER/PERCENT OF STUDENTS:
WELL WRITTEN WITH DEVELOPED SUPPORTIVE DETAIL: 7 OR 8
GENERALLY WELL ORGANIZED WITH SUPPORTIVE DETAIL: 5 OR 6
MINIMALLY PROFICIENT: 4
BELOW THE REMEDIAL STANDARD*: BELOW 4

DEGREES OF READING POWER (DRP) - NUMBER/PERCENT OF STUDENTS:
AT OR ABOVE THE READING GOAL FOR BEGINNING EIGHTH GRADERS: 62+
BELOW THE READING GOAL FOR BEGINNING EIGHTH GRADERS BUT ABOVE THE REMEDIAL STANDARD: 55 TO 61
BELOW THE REMEDIAL STANDARD**: BELOW 55
*REMEDIAL STANDARD IS 4 FOR WRITING. **REMEDIAL STANDARD IS 55 DRP UNITS FOR READING.
COPYRIGHT © 1986 BY CONNECTICUT STATE BOARD OF EDUCATION. ALL RIGHTS RESERVED. PRINTED IN THE U.S.A.
Connecticut Mastery Testing Program
PARENT/STUDENT DIAGNOSTIC REPORT
Your child's scores on the Connecticut Mastery Test are reported inside.
For a description of the Connecticut Mastery Testing Program, see the back cover of this folder.
For general information about your local district's testing program, please contact your superintendent of schools.
For further information on the Connecticut Mastery Testing Program, contact: Connecticut State Department of Education, Office of Research and Evaluation, Box 2219, Hartford, Connecticut 06145, (203) 566-4001 or 4008.
MATHEMATICS STUDENT OBJECTIVES ANALYSIS FOR
GRADE: SCHOOL
FORM: DISTRICT
TEACHER: TESTING DATE
CONNECTICUT MASTERY TESTING PROGRAM
OBJECTIVES TESTED
MASTERY CRITERIA
STUDENT SCORE
NUMBER OF ITEMS CORRECT
CONCEPTUAL UNDERSTANDINGS (each objective: 3 of 4 items correct)
1. Order fractions
2. Order decimals
3. Round whole numbers
4. Round decimals to the nearest whole number, tenth and hundredth
5. Multiply and divide whole numbers and decimals by 10, 100 and 1000
6. Identify fractions, decimals and percents from pictorial representations
7. Convert fractions to decimals and vice versa
8. Convert fractions and decimals to percents and vice versa
9. Identify points on number lines, scales and grids
10. Identify ratios and fractional parts from given data
11. Identify an appropriate procedure for making estimates with decimals and fractions
COMPUTATIONAL SKILLS (each objective: 3 of 4 items correct)
12. Add and subtract whole numbers less than 10,000
13. Multiply and divide 2- and 3-digit whole numbers by 1- and 2-digit numbers
14. Add and subtract decimals (to hundredths) in horizontal form
15. Identify the correct placement of the decimal point in multiplication and division of decimals
16. Add and subtract fractions and mixed numbers
17. Multiply fractions and mixed numbers
18. Determine the percent of a number
19. Estimate sums and differences of whole numbers and decimals including making change
20. Estimate products and quotients of whole numbers and decimals
21. Estimate fractional parts and percents of whole numbers and money amounts
THE PSYCHOLOGICAL CORPORATION, HARCOURT BRACE JOVANOVICH, PUBLISHERS
GRADE 8 REPORT PART 1
OBJECTIVES TESTED
MASTERY CRITERIA
STUDENT SCORE
NUMBER OF ITEMS CORRECT
PROBLEM SOLVING/APPLICATIONS (each objective: 3 of 4 items correct)
22. Compute sums, differences, products and quotients using a calculator
23. Interpret graphs, tables and charts
24. Solve 1- and 2-step problems involving whole numbers and decimals including averaging
25. Solve 1- and 2-step problems involving fractions
26. Solve problems involving measurement
27. Solve problems involving elementary probability
28. Estimate a reasonable answer to a given problem
29. Solve problems with extraneous information
30. Identify needed information in problem situations
31. Solve process problems involving the organization of data
MEASUREMENT/GEOMETRY (with calculator available; each objective: 3 of 4 items correct)
32. Identify figures using geometric terms
33. Measure and determine perimeters and areas
34. Estimate lengths, areas, volumes and angle measures
35. Select appropriate metric or customary units and measures
36. Make measurement conversions within systems (without calculator available)
TOTAL NUMBER OF OBJECTIVES MASTERED (out of 36)
NUMBER OF ITEMS CORRECT (out of 144) (Remedial Standard is 78 of 144 items correct)
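To make the totals above concrete: each of the 36 mathematics objectives is tested with 4 items, mastery of an objective requires 3 of the 4 items correct, and a total below 78 of the 144 items places a student below the remedial standard. A minimal sketch of that arithmetic follows; the function and field names are hypothetical, not part of the official reporting system:

```python
# Sketch of the Grade 8 mathematics scoring rules described above.
# Constants come from the report; the function itself is illustrative.
MASTERY_CRITERION = 3      # items correct needed, out of 4, per objective
ITEMS_PER_OBJECTIVE = 4
NUM_OBJECTIVES = 36
REMEDIAL_STANDARD = 78     # of 144 total items

def summarize_mathematics(items_correct_per_objective):
    """items_correct_per_objective: list of 36 counts, each 0-4."""
    assert len(items_correct_per_objective) == NUM_OBJECTIVES
    assert all(0 <= n <= ITEMS_PER_OBJECTIVE
               for n in items_correct_per_objective)
    objectives_mastered = sum(
        1 for n in items_correct_per_objective if n >= MASTERY_CRITERION
    )
    total_items_correct = sum(items_correct_per_objective)
    return {
        "objectives_mastered": objectives_mastered,       # out of 36
        "total_items_correct": total_items_correct,       # out of 144
        "needs_further_diagnosis": total_items_correct < REMEDIAL_STANDARD,
    }

# A student who gets exactly 2 of 4 items on every objective masters
# no objectives, and a total of 72 falls below the 78-item standard.
result = summarize_mathematics([2] * 36)
```

Note that objective mastery and the remedial standard are computed independently: a student can master some objectives yet still fall below 78 total items, or vice versa.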
LANGUAGE ARTSSTUDENT OBJECTIVES ANALYSIS FOR
GRADE:
FORM:
TEACHER:
SCHOOL
DISTRICT
TESTING DATE:
CONNECTICUT
MASTERY TESTING
PROGRAM
THE PSYCHOLOGICAL CORPORATION, HARCOURT BRACE JOVANOVICH, PUBLISHERS
1. Capitalization & Punctuation (9 of 12)
2. Spelling (6 of 8)
3. Agreement (verb tense, subject-object-verb, and pronoun referents) (11 of 15)
4. Tone (3 of 4)
STUDY SKILLS
5. Locating Information (schedules, maps, indexes, glossaries, dictionaries) (9 of 12)
6. Notetaking and Outlining (3 of 4)
LISTENING COMPREHENSION
7. Literal (understands the meanings of ideas clearly stated by a speaker) (3 of 4)
8. Inferential & Evaluative (understands the meanings of ideas not clearly stated, but implied, by a speaker and is able to make critical judgments about them) (12 of 16)
READING COMPREHENSION
9. Literal (understands the meanings of ideas clearly stated within a passage) (6 of 8)
10. Inferential (understands the meanings of ideas not stated, but implied, within a passage) (10 of 14)
11. Evaluative (able to make critical judgments about statements and inferences within a passage) (10 of 14)
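Unlike mathematics, where every objective uses the same 3-of-4 criterion, the eleven language arts objectives carry different mastery criteria (9 of 12 items, 6 of 8, 11 of 15, and so on). A hedged sketch of how a "total objectives mastered (out of 11)" count follows from those per-objective criteria; the criteria values come from the report, but the data structure and function are illustrative assumptions:

```python
# Mastery criteria for the eleven language arts objectives, as listed
# in the report: (objective name, items needed, items total).
# The short names here are illustrative labels, not official ones.
LANGUAGE_ARTS_CRITERIA = [
    ("Capitalization & Punctuation", 9, 12),
    ("Spelling", 6, 8),
    ("Agreement", 11, 15),
    ("Tone", 3, 4),
    ("Locating Information", 9, 12),
    ("Notetaking and Outlining", 3, 4),
    ("Listening: Literal", 3, 4),
    ("Listening: Inferential & Evaluative", 12, 16),
    ("Reading: Literal", 6, 8),
    ("Reading: Inferential", 10, 14),
    ("Reading: Evaluative", 10, 14),
]

def objectives_mastered(scores):
    """scores: dict mapping objective name -> items correct."""
    return sum(
        1
        for name, needed, _total in LANGUAGE_ARTS_CRITERIA
        if scores.get(name, 0) >= needed
    )

# A student who exactly meets each criterion masters all eleven objectives.
perfect = {name: needed for name, needed, _total in LANGUAGE_ARTS_CRITERIA}
```

The per-objective thresholds all sit near 75 percent of the items for that objective, which keeps the mastery bar roughly comparable across objectives of different lengths.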
TOTAL NUMBER OF OBJECTIVES MASTERED (out of 11)
WRITING SAMPLE
Holistic Writing Score
STUDENT SCORE
Remedial Standard is 4 of 8
DEGREES OF READING POWER (DRP)
STUDENT SCORE: DRP Units
Remedial Standard is 55 DRP Units
Reading Goal is 62 DRP Units
(Degrees of Reading Power and DRP are trademarks owned by the College Entrance Examination Board.)
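The cut-offs above amount to simple band classifications: a holistic writing score below 4 (of 8 possible) falls below the writing remedial standard, and a DRP score below 55 units falls below the reading remedial standard, with 62 DRP units as the reading goal for beginning eighth graders. An illustrative sketch, assuming the banding described in the school-by-class report (the function names are hypothetical):

```python
# Cut-offs taken from the report; the classification helpers are
# illustrative, not official scoring code.
WRITING_REMEDIAL_STANDARD = 4   # of 8 possible holistic points
DRP_REMEDIAL_STANDARD = 55      # DRP units
DRP_READING_GOAL = 62           # DRP units

def reading_band(drp_units):
    """Classify a DRP score into the three bands used in the reports."""
    if drp_units >= DRP_READING_GOAL:
        return "at or above the reading goal"
    if drp_units >= DRP_REMEDIAL_STANDARD:
        return "below the goal but above the remedial standard"
    return "below the remedial standard"

def writing_needs_diagnosis(holistic_score):
    """A summed two-rater score below 4 triggers further diagnosis."""
    return holistic_score < WRITING_REMEDIAL_STANDARD
```

The middle reading band (55 to 61 DRP units) is the one the report singles out as below the goal but not remedial.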
Inside you will find the results of the Connecticut Mastery Test administered to your child earlier this fall. The test results help to show you and the school district's professional staff how well your child is performing on those skills identified by the State of Connecticut as important for students entering eighth grade to have mastered.
These tests are designed to determine the specific skill levels of students. The test results will be used to: provide your school with information for use in assessing the progress of individual students over time; provide your school with information based on which improvements in the general instructional program can be made; and provide information on appropriate basic skills remedial assistance for students so indicated.
Mastery testing will occur each fall in grades four, six, and eight.
If you have any questions about these test results, please ask your child's teacher(s). The teacher(s) will share with you other observations and recommendations based on experience in working with your son or daughter during the last several months.
PARENT/STUDENT DIAGNOSTIC REPORT
Description of the Test
Mathematics: The mathematics test assesses thirty-six (36) specific objectives in four general areas: (1) Conceptual Understandings; (2) Computational Skills; (3) Problem Solving/Applications; and (4) Measurement/Geometry. Test items evaluate a student's ability to: order fractions and decimals; round whole numbers and decimals; make conversions among fractions, decimals and percents; compute with whole numbers, decimals and fractions; estimate with whole numbers, decimals and fractions; solve 1- and 2-step problems involving whole numbers, decimals, fractions, measurement and elementary probability (with a calculator available); estimate a reasonable answer to a problem; solve problems with extraneous information and identify needed information in problem situations; measure and/or estimate lengths, areas, volumes and angle measures; make measurement conversions; and select appropriate measurement units.
Language Arts: The language arts test covers two general areas: Reading/Listening Comprehension and Writing/Study Skills. There are eleven (11) objectives and two holistic measures of reading and writing.
The content of Reading/Listening Comprehension consists of narrative, expository, and persuasive passages on a variety of topics measuring a student's reading and listening ability in: (1) Literal Comprehension; (2) Inferential or Interpretive Comprehension; and (3) Evaluative or Critical Comprehension. Audio tapes are used to assess a student's listening comprehension ability. Also used is the "Degrees of Reading Power" (DRP) Test, which includes eleven (11) passages and seventy-seven (77) test items. It is designed to measure a student's ability to understand nonfiction English prose on a graduated scale of reading difficulty.
The content of Writing/Study Skills consists of three components. First, writing skills are directly assessed. A student is asked to write on a designated topic. The writing is judged on the student's demonstrated ability to convey information in a coherent and organized fashion. Second, the test assesses the mechanics of good writing, which are defined as: (1) Capitalization and Punctuation; (2) Spelling; (3) Agreement; and (4) Tone. Finally, the test assesses Study Skills, which have been defined as Locating Information (schedules, maps, index references, and dictionary usage) and Outlining and Notetaking. This part of the test measures a student's ability to find and use information from listed sources, and to make notes from audio tapes.
APPENDIX G
Number of Objectives Mastered
o Mathematics
o Language Arts
MATHEMATICS: AVERAGE NUMBER OF OBJECTIVES MASTERED

This bar chart illustrates the average number of mathematics objectives mastered statewide: 23.7 of 36 in 1986.

MATHEMATICS: PERCENT OF STUDENTS ACHIEVING MASTERY BY NUMBER OF OBJECTIVES MASTERED

This bar chart illustrates the distribution of students, statewide, who mastered mathematics objectives within each of the seven score categories (0, 1-7, 8-14, 15-21, 22-28, 29-35, and 36 objectives mastered).
LANGUAGE ARTS: AVERAGE NUMBER OF OBJECTIVES MASTERED

This bar chart illustrates the average number of language arts objectives mastered statewide: 7.5 of 11 in 1986.

LANGUAGE ARTS: PERCENT OF STUDENTS ACHIEVING MASTERY BY NUMBER OF OBJECTIVES MASTERED

This bar chart illustrates the distribution of students, statewide, who mastered language arts objectives within each of the seven score groupings (0, 1-2, 3-4, 5-6, 7-8, 9-10, and 11 objectives mastered).
Appendix H
State by District Report October 1986
Grade Eight Mathematics Test Results
STATE BY DISTRICT REPORTCONNECTICUT MASTERY TESTING PROGRAM GRADE 8
DATE TESTED: 10-86
Mastery Criteria for each objective is 3 of the 4 items correct. Remedial Standard is 78 of the 144 items correct.
OBJECTIVES TESTED
PROBLEM SOLVING AND APPLICATIONS
MEASUREMENT/GEOMETRY
TOTAL MATHEMATICS
[Rotated column headings naming objectives 22-36; illegible in this reproduction.]
DISTRICTS
# OF STUDENTS TESTED
SCORES INDICATE THE PERCENT OF STUDENTS MASTERING EACH OBJECTIVE
EAST HAMPTON, EAST HARTFORD, EAST HAVEN, EAST LYME, EASTON, EAST WINDSOR, ELLINGTON, ENFIELD, FAIRFIELD, FARMINGTON, FRANKLIN, GLASTONBURY, GRANBY, GREENWICH, GRISWOLD, GROTON, GUILFORD, HAMDEN, HARTFORD, HARTLAND, KENT, KILLINGLY, LEBANON, LEDYARD, LISBON, LITCHFIELD, MADISON
PARTICIPATION RATES FOR EIGHTH-GRADE STUDENTS BY DISTRICT, SCHOOL YEAR 1986-1987
COLUMNS: TOTAL EIGHTH-GRADE POPULATION; STUDENTS ELIGIBLE FOR TESTING; PERCENT OF STUDENT POPULATION EXEMPT FROM TESTING; PERCENT OF ELIGIBLE STUDENTS TESTED (MATHEMATICS, LANGUAGE ARTS, WRITING, READING)
STRATFORD, SUFFIELD, THOMASTON, THOMPSON, TOLLAND, TORRINGTON, TRUMBULL, UNION, VERNON, VOLUNTOWN, WALLINGFORD, WATERBURY, WATERFORD, WATERTOWN, WESTBROOK, WEST HARTFORD, WEST HAVEN, WESTON, WESTPORT, WETHERSFIELD, WILLINGTON, WILTON, WINCHESTER, WINDHAM, WINDSOR, WINDSOR LOCKS, WOLCOTT, WOODSTOCK, REGION IV, REGION V, REGION VI, REGION VII, REGION VIII, REGION X, REGION XI, REGION XII, REGION XIII, REGION XIV, REGION XV, REGION XVI, REGION XVII