DEVELOPMENTAL READING ASSESSMENT (DRA)
Published by Pearson
Reviewed by: Carolyn Kick
Transcript
  • 1. DEVELOPMENTAL READING ASSESSMENT (DRA). Published by Pearson. Reviewed by: Carolyn Kick

• 2. AUTHORS
Joetta Beaver: With a bachelor of science in elementary education and a master's degree in reading from The Ohio State University, Joetta Beaver worked as an elementary teacher (K-5) for 30 years, as well as a K-5 Language Arts/Assessment coordinator and an Early Education teacher-leader. She is the primary author of DRA2 K-3, co-author of DRA2 4-8 and the Developing Writers Assessment (DWA), a consultant, and a speaker.
Mark Carter, PhD: With assessment the focus of much of his professional work, Mark Carter served as a coordinator of assessment for Upper Arlington Schools (where he currently teaches fifth grade), conducted numerous seminars, and co-authored DRA2 4-8, DWA, and Portfolio Assessment in the Reading Classroom. He received his doctorate from The Ohio State University, where he also taught graduate courses in education as an adjunct professor.

• 3. OVERVIEW OF THE DRA
The Developmental Reading Assessment is a set of individually administered, criterion-referenced assessments for grades K-8.
- Purpose: identify a student's reading level based on accuracy, fluency, and comprehension.
- Other purposes: identify a student's strengths and weaknesses at their independent reading level, plan instruction, monitor reading growth, and prepare for testing expectations.
- The assessment is administered one-on-one, requiring students to read specifically selected leveled assessment texts that increase in difficulty.
- It is administered, scored, and interpreted by classroom teachers.

• 4. DRA HISTORY & REVISIONS
- 1988-1997: DRA is researched and developed by Joetta Beaver and the Upper Arlington School District
- 1997: DRA K-3 is published by Pearson
- 1999: Evaluation of the Development of Reading (EDL)
- 2002: DRA 4-8
- 2004: DRA Word Analysis
- 2005: DRA Second Edition (DRA2), K-3 & 4-8
- 2006: Evaluation of the Development of Reading
- 2007: More than 250,000 classrooms use DRA and EDL
- 2008: Pearson partners with Liberty Source on the DRA2 Handheld Tango Edition
- 2009: DRA2 Handheld Tango wins a CODiE Award

• 5. DRA READING ASSESSMENT CRITERIA
Criteria assessed, beginning in grade K: oral reading fluency, word analysis, and comprehension.

• 6. ORAL READING AND FLUENCY
The total number of oral reading errors is converted to an accuracy score (the percentage of words read correctly), and rate is measured in words correct per minute (WCPM). Expression, phrasing, rate, and accuracy are each rated on a 4-point scale. Fluency rating begins at level 14, the transitional level (grades 1 and 2).
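To make the accuracy and rate arithmetic concrete, here is a minimal sketch in Python. It is illustrative only and uses a hypothetical passage; actual DRA scoring relies on Pearson's published conversion tables and level-specific cut points, not this function.

```python
def oral_reading_scores(total_words, errors, seconds):
    """Convert an oral reading record into percent accuracy and
    words correct per minute (WCPM).

    Illustrative arithmetic only: DRA's published conversion tables
    and cut points govern actual scoring.
    """
    words_correct = total_words - errors
    accuracy = 100.0 * words_correct / total_words  # percent of words read correctly
    wcpm = words_correct / (seconds / 60.0)         # rate in words correct per minute
    return accuracy, wcpm

# Hypothetical example: a 204-word passage, 6 errors, read in 150 seconds
accuracy, wcpm = oral_reading_scores(total_words=204, errors=6, seconds=150)
print(f"accuracy = {accuracy:.1f}%, WCPM = {wcpm:.1f}")
# accuracy = 97.1%, WCPM = 79.2
```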
• 7. COMPREHENSION
At levels 3-16, once the oral reading is over, the student takes the book and reads it again silently. This gives the student another opportunity to check comprehension before retelling. Students then retell what happens in the story.
- Underline information that the student gives on his or her own.
- Note information that the student is able to give, but which requires prompting, with a TP (teacher prompt).
- Follow-up questions follow the summary and, if used, need to be tallied to the left. The number of prompts needed to elicit more information is calculated as part of the comprehension score (see the sketch below).
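The bookkeeping in that last bullet can be pictured with a small sketch. The helper below is hypothetical; it shows only the tallying the slide describes, not the DRA rubric's actual mapping from retell quality and prompt counts to a comprehension score.

```python
def retell_tally(independent_ideas, prompted_ideas):
    """Tally a retelling: ideas the student gives unaided (underlined
    on the record) versus ideas given only after a teacher prompt (TP).
    Hypothetical helper; DRA's rubric defines the real score mapping.
    """
    tally = {
        "independent": len(independent_ideas),
        "teacher_prompts": len(prompted_ideas),  # each TP is tallied to the left
    }
    tally["total_ideas"] = tally["independent"] + tally["teacher_prompts"]
    return tally

print(retell_tally(
    independent_ideas=["characters", "problem", "resolution"],
    prompted_ideas=["setting"],  # given only after a TP
))
# {'independent': 3, 'teacher_prompts': 1, 'total_ideas': 4}
```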
• 8. WORD ANALYSIS
Assesses phonological awareness, metalanguage, letter/word recognition, phonics, and structural analysis in grades K-3. DRA Word Analysis is included in the new second edition of DRA K-3.

• 9. READING LEVELS
- Emergent: Levels A-3 (Kindergarten)
- Early: Levels 4-12 (Grade 1)
- Transitional: Levels 14-24 (Grades 1 & 2)
- Extending: Levels 28-38 (Grades 2 & 3)
- Intermediate/Middle School: Levels 40-80 (Grades 4-8)

• 10. INFORMATION ON DRA

• 11. VALIDITY

• 12. KORETZ ON VALIDITY
"Validity ... is the single most important criterion for evaluating achievement testing. ... but tests themselves are not valid or invalid. Rather, it is an inference based on test scores that is valid or invalid. ... Validity is also a continuum: inferences are rarely perfect. The question to ask is how well supported the conclusion is" (Koretz, 2008, p. 31).

• 13. VALIDITY CONT.
Messick (1994) would argue that construct validity refers to the inferences that are drawn about score meaning, specifically the score interpretation and the implications for test use (quantitative). This theoretical framework becomes subject to empirical challenges, a unified approach to validity. What is the test measuring? Can it measure what it intends to measure?

• 14. FOUR TYPES OF VALIDATION
- Criterion-oriented validity: predictive and concurrent
- Content validity
- Construct validity

• 15. CRITERION VALIDITY
Predictive validity is where we draw an inference from test scores to future performance. Concurrent validity is studied when a test is proposed as a substitute for another, or a test is shown to correlate with some contemporary criterion (Cronbach & Meehl, 1955).

• 16. CONTENT VALIDITY
According to Yu, content validity is when we draw inferences from test scores to a larger domain of items similar to those on the test (the sampled population). This selection of content is usually done by experts. Experts, however, may lack experience in the field, and it is assumed that all involved are experts.

• 17. CONSTRUCT VALIDITY
According to Hunter and Schmidt (1990), construct validity is a quantitative question rather than a qualitative distinction such as "valid" or "invalid"; it is a matter of degree. Construct validity can be measured by the correlation between the intended independent variable (construct) and the proxy independent variable (indicator, sign) that is actually used (Yu).
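Treating construct validity as a matter of degree means reporting a correlation coefficient. Here is a minimal sketch, with entirely made-up data, of correlating the indicator actually used (DRA2 level) with scores on the intended construct (an external reading measure):

```python
from statistics import correlation  # Pearson's r; Python 3.10+

# Hypothetical data: DRA2 levels (the proxy indicator) paired with
# scores on an external reading measure (the intended construct).
dra_levels = [4, 8, 12, 16, 20, 24, 28]
criterion_scores = [310, 365, 420, 450, 510, 540, 600]

r = correlation(dra_levels, criterion_scores)
print(f"construct validity as a degree: r = {r:.2f}")
```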
• 18. PEARSON EDUCATION ON DRA VALIDITY
When Pearson refers to the validity of an assessment, one looks at the extent to which the assessment actually measures what it is supposed to measure. Questions to be asked when examining validity include: Does this assessment truly measure reading ability? Can teachers make accurate inferences about the true reading ability of a student based upon DRA2 assessment results?

• 19. PEARSON EDUCATION ON CONTENT-RELATED VALIDITY OF THE DRA
The content validity of a test relates to the adequacy with which the content is covered in the test. Drawing on a theoretical framework and research, the DRA2 incorporates reading domains reviewed and researched with consultants and educators studying good readers. Content validity was built into the DRA and DRA2 assessments during the development process.

• 20. PEARSON CRITERION-RELATED VALIDITY ON THE DRA
Criterion-related validity refers to the extent to which a measure predicts performance on some other significant measure (called a criterion) other than the test itself. Criterion validity may be broken down into two components: concurrent and predictive. Concurrent validity correlates the DRA to many other reading tests:
- Gray Oral Reading Test, 4th Edition (GORT-4; Wiederholt & Bryant, 2001)
- DIBELS Oral Reading Fluency Test, 6th Edition
- Correlations between DRA2 and teacher ratings

• 21. DRA REVIEW, NATALIE RATHVON, PH.D.
The following evidence of validation is based upon the review of the DRA completed by Natalie Rathvon, Ph.D., Assistant Clinical Professor, George Washington University, Washington, DC; private practice psychologist and school consultant, Bethesda, MD (August 2006).

• 22. DRA CONTENT VALIDITY
In the review by Natalie Rathvon, Ph.D.:
- Oral fluency, running record: derived only from Clay's Observational Survey (Clay, 1993).
- Teacher surveys (return rates were 46%; ns of 80 to 175) revealed that the DRA provided teachers with information describing reading behaviors and identifying instructional goals.
- There were also concerns about the adequacy and accuracy of the comprehension assessment, and about the accuracy of text leveling prior to 2003, before the Lexile framework evaluated the readability of the DRA texts.
- There were concerns about who developed and reviewed the assessment: there is no evidence that external reviewers participated in the development, revision, or validation process.
- Rathvon states, "Means, standard deviations, and standard errors of measurement should be presented for accuracy, rate, and comprehension scores for field test students reading adjacent text levels to document level-to-level progression."

• 23. CONSTRUCT VALIDITY EVIDENCE
Results from Louisiana statewide DRA administrations for spring of 2000 through 2002 for students in grades 1 through 3 (ns = 4,162 to 74,761) show an increase in DRA levels across grades, as well as changes in DRA level for a matched sample of students (n = 32,739) over a three-year period. This indicates that the skills being measured are developmental, and is evidence that the DRA can detect changes in reading levels. Two studies evaluating the relationship between Lexile Scale measures and the DRA provide evidence that the running-record format is a valid method of assessing reading comprehension.

• 24. SUMMARY OF WHAT DRA IS:
- An attractive reading battery modeled after an informal reading inventory, based on Clay's Observational Survey (Clay, 1993)
- Authentic texts
- Instructionally relevant measures of fluency and comprehension
- Provides meaningful results for classroom teachers, parents, and other stakeholders
- Provides encouraging evidence that use of the DRA predicts future reading achievement for primary grade students

• 25. DRA CRITERION-RELATED VALIDITY
- No concurrent validity evidence is presented documenting the relationship between the DRA and standardized or criterion-referenced tests of reading, vocabulary, language, or other relevant domains for students in kindergarten or grades 4 through 8.
- Studies examining the extent to which individual students obtain identical performance levels on the DRA and validated reading measures are especially needed.
- No information is provided to document the relationship between the DRA Word Analysis and any criterion measure.
- No concurrent validity evidence is presented for any of the DRA assessments in terms of the relationship between DRA performance and contextually relevant performance measures, such as teacher ratings of student achievement or classroom grades.

• 26. SUMMARY OF WHAT DRA IS:
- Responsive to intervention for primary grade students
- An assessment model that has raised teacher awareness of student reading levels, matching students with appropriate texts
- Teacher-reviewed and surveyed based on classroom practice (return rates were 46%; ns of 80 to 175) (Rathvon, 2006)
- Supported by evidence that the Lexile Scale measures and the DRA running-record format are a valid method of assessing reading comprehension

• 27. SUMMARY OF WHAT DRA IS NOT:
Like other informal reading inventories, which lack reliability and validity (Invernizzi et al.; Spector, 2005), the DRA does not:
- Provide evidence of text equivalence within levels
- Provide evidence for overall reading level for half the grade levels
- Have a consistent process of text selection, scoring, and administering; it is vulnerable to teacher inconsistencies and judgments (improved since the Lexile model)
- Provide enough evidence of criterion-related validity for older students
- Provide concurrent validity evidence documenting the relationship between the DRA and standardized or criterion-referenced tests of reading, vocabulary, or language in kindergarten and grades 4-8
- Provide documentation of the relationship between the DRA Word Analysis and any criterion measure

• 28. SUMMARY OF WHAT DRA IS NOT:
The DRA does not:
- Provide sufficient evidence that teachers can select texts aligned with students' actual reading level (or achieve acceptable levels of scorer consistency and accuracy)
- Provide evidence regarding demographic groups
- Include external reviewers in the development, revision, and validation of any DRA series
- Provide complete field-testing reporting
- Provide a theoretical rationale or empirical data supporting the omission of a standard task to estimate student reading level
- Provide means, standard deviations, and standard errors of measurement ensuring accuracy

• 29. WHAT DOES ALL OF THIS MEAN?
Learning about the validity of the Developmental Reading Assessment was difficult. I have yet to administer one, but would like to go through the process. There is no empirical evidence that consistently supports the validity of the DRA. There are far too many variables, and too many opportunities for human behavior to alter results and affect the variability. However, the difference in how teachers approach diagnosing students' reading levels, and the awareness of getting leveled texts into the classroom, has changed dramatically over the past few years. The changes in reading instruction based on results of the DRA (though not validated) have reshaped reading instruction in our district.

• 30. RESOURCES
http://mypearsontraining.com/pdfs/TG_DRA2_ProgramComponents.pdf
DRA K-3: PMDBSUBCATEGORYID=&PMDBSITEID=2781&PMDBSUBSOLUTIONID=&PMDBSOLUTIONID=&PMDBSUBJECTAREAID=&PMDBCATEGORYID=&PMDbProgramID=23662