
Jodi Herold
Reading Notes
ED 633

Citation: Balfanz, R., Legters, N., West, T., & Weber, L. (2007)

Keywords: NCLB, high schools, AYP, graduation rates

Introduction:
- Only 60% of students who begin their high school careers graduate.
- Many of the students attending these high schools are Latino (40%) and African American (nearly 50%).
- NCLB was created to identify failing schools and to ensure that all students receive a high-quality education.
- Schools that do not pass AYP (Adequate Yearly Progress) over a period of years receive remedies from the district and state.
- AYP is based on students' state test scores in academic skills (e.g., reading, writing, and math); AYP is the means of determining improvement.
- There are weaknesses in the law, both in theory and in implementation, that call into question its ability to effectively identify and encourage improvement.

Methods:
Research Questions:
- Is NCLB working as proposed?
- Does it accurately identify and encourage improvement in low-performing high schools?
- Is it an effective tool to reform high schools?
A problem area is that states have latitude in developing standards, graduation qualifications, and baseline numbers for subgroup accountability. This can make it difficult for schools to meet AYP.

Data:
- Three sources: the Common Core of Data (CCD), state and district report cards for 2003-2004 and 2004-2005, and the National Assessment of Educational Progress (NAEP).
- Sample: 2,030 low-performing high schools across the country; a 10% random sample (202 high schools).

Findings:
- In 2005, the majority of low-performing high schools in the sample did not make AYP.
- Success varied by state: about half of the schools in the South made AYP, versus roughly one third elsewhere.
- Tables 1-4 show school characteristics and whether schools made AYP (e.g., free lunch, Title I, teacher-student ratio, urban vs. rural, ethnic groups). There was clustering, attributed to possible school characteristics or effective reform.
- Poverty, Title I status, and higher teacher-student ratios separated the schools that made AYP from those that did not.
- Low-performing schools with fewer accountable subgroups are more likely to pass; southern states and rural areas bear out this finding.
- Fewer kids means a lower teacher-student ratio and fewer students in each subgroup, which makes a school more likely to make AYP (see the sketch below).
- NCLB's theory is that the stiffer the sanctions and interventions a school might face, the more motivated it will be to initiate reform. In practice, more pressure to pass AYP went with less chance of making it.
- States design their own performance standards and assessments, determine proficiency, and decide which students belong in a subgroup; each state determines how schools are progressing toward AYP.
- States can make AYP while graduation rates decrease; schools can be improving significantly and still not make AYP.
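The subgroup finding follows from simple arithmetic: if every reportable subgroup must clear its own AYP target, each added subgroup is another hurdle that can trip the school. The numbers below are hypothetical, purely to illustrate the mechanism these notes describe; they are not from the study.

```python
# Hypothetical illustration (not data from Balfanz et al.): treat each
# reportable subgroup as an independent AYP target the school must meet.
def p_make_ayp(p_per_target: float, n_subgroups: int) -> float:
    """Chance of clearing every one of n independent subgroup targets."""
    return p_per_target ** n_subgroups

for n in (1, 3, 5, 8):
    print(f"{n} subgroup(s): {p_make_ayp(0.80, n):.2f}")
# 1 -> 0.80, 3 -> 0.51, 5 -> 0.33, 8 -> 0.17
```

Even with an 80% chance of meeting any single target, a school accountable for eight subgroups makes AYP less than one time in five, which is one reason small rural schools with few reportable subgroups fare better.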

Page 2: Suggestions - WOU Homepagejherold08/ED633/Research Notes.docx  · Web viewThis study proves that teachers need to use appropriate math vocabulary and not slang or “dummied down”

There is no strong correlation between NAEP exam results and NCLB test results.

Implications:
Four points:
1. There are three tiers of high schools: high-performing (10-20%), low-performing (30-40%), and average (12-15%).
2. Low-performing schools have the hardest battle. Children are losing attachment to schooling, and attendance is an issue. Many live in poverty and are below grade level. Over half of these schools are located in urban areas.
3. There is a pattern to dropping out: low grades lead to worse attendance, then repeating freshman year, then trying alternative high school programs.
4. Improving a low-performing school takes money, time, and a tremendous amount of effort.

Proposals:
1. Focus on the problem areas, not on AYP and the kids who "bring the school down."
2. Determine the goals of Title I funds and redefine school entitlement (poverty and educational challenges).
3. Schools need support from federal, state, and local entities to provide guidance, trainings, and resources. Fix them or replace them.

Research Notes:
- When comparing test score data state vs. state or region vs. region, look at other testing completed nationwide.
- AYP is not a useful tool for comparing states' proficiency rates because each state has the authority to define proficiency.
- Look at alternative ways within my classroom, school, and district to demonstrate progress and improvement.
- AYP does not account for data that shows progress when the school still misses AYP.

Balfanz, R., Legters, N., West, T., & Weber, L. (2007). Are NCLB's measures, incentives, and improvement strategies the right ones for the nation's low-performing high schools? American Educational Research Journal, 44(3), 559-593.


Citation: Graue, E., & Hatch, K. (2007)

Keywords: class-size reduction, elementary education, professional development, team teaching

Introduction:
- CSR (class-size reduction) is a proposal to increase academic achievement, build foundational skills in the primary grades, and reduce the effects of economic and social inequities.
- In the last two decades, more than 30 states and the federal government have implemented CSR programs.
- First-generation researchers noted higher student achievement and better student and teacher attitudes.
- Second-generation researchers noted that teachers were able to do activities that created opportunities for student learning.
- Third-generation researchers noted that classroom practice is embedded in a school culture created by teacher beliefs, administration, and district or state policy.
- CSR's theory is that smaller class sizes mean more opportunities for positive teacher-student interactions, fewer behavioral issues, and more effort to teach and learn. Teachers can build deeper relationships with parents and tailor instruction to students' specific needs.
- Problems: CSR requires more teachers and more space. Gaining space may mean cutting other programs (i.e., art, special ed, libraries, computer labs) to create more classrooms. Schools do not have the budget.
- Class sizes were 15:1 or 30:2.
- SAGE is a comprehensive legislative reform in Wisconsin designed to reduce the impact of poverty on student achievement. Its components are CSR, professional development for teachers, schools available after hours, and challenging curriculum.

Methods:
Research Questions:
- How is teacher action related to the resource of CSR?
- How is this resource facilitated or constrained by the physical layout of the school?
- How is implementation shaped by educator beliefs about team teaching and working with colleagues?

Data:
- Eight half-day visits for observations.
- Standardized assessment of environments using the Assessment of Practices in Early Elementary Classrooms (APEEC) and the Early Language and Literacy Classroom Observation Toolkit (ELLCO).
- Collection of artifacts (e.g., lesson plans, curriculum, examples of home-school communication, report cards, assessment instruments, photographs).
- Interviews with classroom teachers, principals, and students.
- Wisconsin state reading comprehension tests, grade 3.
- Wisconsin state knowledge and concepts test, grade 4.

Sample: nine high-poverty schools in urban, semiurban, and rural communities.

Findings:
- Classroom practice could not be described by test scores and standardized observations alone.
- Principals in schools that were high achieving or showing improvement were readily willing to get creative in facing space issues; principals in low-achieving schools were not.
- Class-size configuration is only a small part of what creates student outcomes.
- Teachers are able to differentiate instruction with smaller class sizes.
- Space allowed students to learn in their "bubble"; small-group activities were more feasible; teachers could interact more with the students.
- Space is a resounding issue in CSR, especially in 30:2 classrooms. Some schools involved in CSR were constrained by the school's square footage.
- 15:1 classrooms had slightly higher ratings on the APEEC and ELLCO (assessments of environment), except for ELLCO literacy activities.


- Issues in 30:2 classrooms: teachers found it hard to work together without CSR professional development; they had the same resources as a 15:1 classroom; when one teacher was absent, a substitute was not provided; and one teacher took the lead while the other did miscellaneous tasks.
- In 30:2 classrooms, "tag team" teaching was prevalent but not effective. It did allow for less lesson planning and for one teacher to deal with behavioral issues.
- In 30:2 classrooms, students needed to be split into two groups to provide engaging and meaningful activities.
- Professional development focused on balanced literacy, visits to other schools, or PLCs, not on CSR.
- The authors did not receive final outcomes answering the question of student outcomes for children in small classrooms.

Discussion/Implications:
1. SAGE's biggest challenges were space and staffing.
2. Student achievement increased through the strengthening of student-teacher interactions.
3. Teachers with ample space felt at liberty to do various activities: small and large groups, pull-out sessions, and individualized activities.
4. Teachers in small spaces grappled with effective instruction and could not do many hands-on activities or small-group work.
5. In 30:2 classrooms, teachers tend to "tag team" teach.
6. SAGE/CSR made teaching feasible. Teachers talked with each other and communicated with parents, and the bookkeeping and housekeeping lessened. Teachers could differentiate instruction as well.
7. Problem areas: teachers must change teaching styles, get used to the new space setup, develop curriculum for different ability levels, and differentiate instruction.

Suggestions:
1. Examine instructional activities and learn more about strategies that are effective for small groups and co-teaching settings.
2. Provide help for teachers and staff to develop new strategies to complement small class sizes.

Research Notes:
- CSR is still a new approach that needs more research.
- Small class sizes or a 30:2 classroom do not automatically produce higher student achievement.
- CSR classrooms need effective strategies for balancing large-group, small-group, and individualized activities, with clearly identified responsibilities for each teacher.
- Both teachers must take co-ownership of the classroom and be willing to work together and change teaching styles to be effective.

Graue, E., & Hatch, K. (2007). The wisdom of class-size reduction. American Educational Research Journal, 44(3), 670-700. doi: 10.3102/0002831207306755


Citation: Spires, H., Lee, J., & Turner, K. (2008)

Keywords: technologies, academic engagement, middle grades, student perspectives

Introduction:
- Children are comfortable using various forms of technology to access information and communicate globally; they are "digital natives."
- Examples include the Internet, handheld devices, cell phones, and gaming devices. Students use this technology outside of school, but it is not found in many classrooms as a tool for learning. Schools often prohibit these devices, leaving students "unplugged."
- Workplaces are saturated with digital communications and information management systems. Employers expect potential workers to have advanced technological skills.
- Much of the world now competes on the same economic level, prompting educational reform proposals.
- Reformed education needs to "bridge the gap between how students live and how they learn."

Methods:
Research Question:
- What engages a middle schooler to achieve in school?

Data:
- Surveys and focus groups.
- Sample: 4,000 middle grades students (sixth, seventh, and eighth grades) who were members of a North Carolina statewide after-school program.

Findings:
- Students used computers more at school than at home, and more at after-school programs than at a public library.
- Students are knowledgeable in word processing and spreadsheets (learned at school).
- Students are more likely to use the Internet than a book to find information.
- They are high users of the Internet and video games.
- They use cell phones for communication more than non-e-mail Internet technologies (e.g., chat rooms, IM).
- Students liked using computers and doing research on the Internet most; they liked worksheets and listening to the teacher least.
- The majority of students use a variety of technologies outside of school in genuine, personal, and social ways. Students feel these are an integral part of their everyday lives and that teachers do not understand this.
- Outside school, technologies are for communication and entertainment; inside school, use is very traditional (word processing, tests, finding information on the Internet).
- Social communication tools are not used for academic purposes.
- Students want to do projects that use technology as a tool to learn information and to have experiences that relate to possible career choices.
- Students expressed concerns about the constraints on computer use in schools.

Discussion/Implications:
1. Students want aesthetically pleasing environments with the newest technology, e.g., Wi-Fi, sitting areas, open areas.
2. They view school as a place that often confines their access to technology. High users of technology use computers at home more often.
3. There is a link between "the use of technologies in school and their academic engagement."
4. Students want curriculum and projects based on possible future careers, and they want to be validated in their abilities to make new uses of technology.
5. Schools need 21st-century curriculum, instruction, and assessment. This requires current tools and connectivity, as well as maintenance and upgrades. Funding and lack of resources are huge problems.


Research Notes:I need to write some grants to get more technology into my classroom and incorporate it more into my instruction and projects. With the librarians cut from elementary schools, I need to pick up the baton and teach students computer skills. I am interested in studies from elementary students’ point of view.

Spires, H. A., Lee, J. K., Turner, K. A., & Johnson, J. (2008). Having our say: middle grade student perspectives on school, technologies, and academic engagement. Journal of Research on Technology in Education, 40(4), 497-515.


Citation: Martiniello, M. (2008)

Keywords: limited English-proficient students, academic achievement, word problems (Mathematics), educational tests and measurements

Introduction:
- One of the most critical topics in high-stakes educational assessment is guaranteeing that measures of content-area skills are valid for students not proficient in English.
- It is difficult to determine whether an ELL student receives a low score due to lack of understanding of the question, lack of mastery, or both.
- The ELL subgroup is a component used in determining AYP.
- ELLs are the lowest-scoring group in both national and state assessments. This poses a great challenge to school systems.

Methods:
Research Question:
- What is the nature of the linguistic complexity in math word problems that presents comprehension difficulties for ELLs?

The test examined is the English version of the fourth-grade math exam given statewide in the spring of 2003. This is a standards-based state assessment aligned with the Massachusetts Math Standards.

39 publicly released test items were used: 29 multiple choice, 5 short answer, and 5 open-ended.

Data:
- The author gave test items to students and had them think aloud; sessions were audiotaped and transcribed.
- Two DIF (differential item functioning) detection methods were used. (DIF refers to differences in how difficult an item is for two groups of equal proficiency; one standard method is sketched after the Sample note below.)
- Items showing DIF disfavoring ELLs were flagged; the extent of DIF, the strand, the linguistic complexity, and children's responses during think-alouds were described.
- Textual analyses of the test items examined the intricacy of their nonmathematical language.

Sample: twenty-four randomly selected 4th-grade ELL students in inner-city Massachusetts public schools (first- or second-generation Latin American immigrants).
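These notes do not name the two DIF detection methods the study used, so the following is only an illustration of the general idea using one common method, the Mantel-Haenszel statistic: match ELL (focal) and non-ELL (reference) students on total test score, then check whether the matched groups still differ on a single item. All data below are simulated.

```python
# Illustration only: a Mantel-Haenszel DIF check on simulated data.
import numpy as np

def mantel_haenszel_or(item, group, total):
    """Common odds ratio across score strata; >1 favors the reference group."""
    num = den = 0.0
    for s in np.unique(total):
        m = total == s
        n = m.sum()
        a = np.sum(m & (group == 0) & (item == 1))  # reference, correct
        b = np.sum(m & (group == 0) & (item == 0))  # reference, incorrect
        c = np.sum(m & (group == 1) & (item == 1))  # focal, correct
        d = np.sum(m & (group == 1) & (item == 0))  # focal, incorrect
        num += a * d / n
        den += b * c / n
    return num / den if den else float("nan")

rng = np.random.default_rng(0)
total = rng.integers(10, 31, size=500)            # matching score
group = rng.integers(0, 2, size=500)              # 0 = non-ELL, 1 = ELL
logit = (total - 20) / 3 + 0.8 * (group == 0)     # item biased toward group 0
item = rng.binomial(1, 1 / (1 + np.exp(-logit)))
print(f"MH odds ratio: {mantel_haenszel_or(item, group, total):.2f}")
```

An odds ratio well above 1 flags an item that disfavors the focal group even after matching on overall ability, which is exactly the pattern Martiniello describes for linguistically complex items.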

Findings:
- The author analyzed each test question and provided think-aloud scripts. She includes the math problem and possible answers, the strand, and the standard, plus the linguistic complexity of the problem and whether its words appear on the 3,000-word list of words known by fourth graders.
- E.g., test question #2 had multiclausal sentences; a child did not understand terms like spin, spinner, identical.
- E.g., on test question #30, students did not know the words chores, vacuum, rake, weed, inside chore, outside chore (words learned at home).
- ELLs showed a definite understanding of school-related words.
- There were common misinterpretations due to prepositional phrases, e.g., reading "Exactly 3/4 of the marbles in the bag are blue" to mean 3/4 of the marbles in the blue bag.
- Cultural references were a problem, e.g., "coupon for $1.00 off"; the word "off" is usually used in a different context.

Questions could be reworded to reduce linguistic complexity while maintaining their purpose.


Discussion:
There was a pattern of characteristics in the math problems that hampered reading comprehension:
1. Syntax: long phrases and multiple clauses (embedded adverbial and relative clauses).
2. Vocabulary: words not commonly known in grade 4, words learned at home (e.g., "chores"), multiple-meaning words, high-level academic words, and English idioms.

- Students did use cognates to aid understanding, but only with high-frequency Spanish words.
- Greater linguistic complexity amplifies the difficulty of math word problems for ELLs compared with non-ELLs of equivalent math proficiency.
- This confirms linguistic complexity as one source of DIF: math word problems with low linguistic complexity showed low DIF, and greater linguistic complexity showed high DIF.
- Items in the data analysis, statistics, and probability strand tended to be disproportionately more difficult for ELLs than for non-ELLs with equivalent math scores. This may be due to the greater syntactic and semantic complexity of items measuring this standard compared to the others, how the teacher taught the content, or how prepared the students were for the exam.
- The author suggests examining further the relationship between the syntactic and semantic complexity of data analysis, statistics, and probability items, apart from their content.

Implications:
1. Test writers need to write clearly and concisely and avoid irrelevant information.
2. Train test writers to preserve the content or skill being measured while decreasing the excessive linguistic complexity of word problems.
3. Perform cognitive laboratory research; don't rely solely on experts to review tests.
4. For teachers: you cannot separate the teaching of language from the teaching of math concepts. There must be linguistic scaffolding.

Research Notes:
This serves as a reminder that teachers need to be teaching language at all times, not relegating it to reading and writing. I would like to see analyses of other state tests to see whether there are patterns nationwide.

Martiniello, M. (2008). Language and the performance of English-language learners in math word problems. Harvard Educational Review, 78(2), 333-368.


Citation: Shaftel, J., Belton-Kocher, E., Glasnapp, D., & Poggio, J. (2006)

Keywords: Educational tests & measurements, students-- Rating of, comprehension-- Testing, grading & marking, scale items, students with disabilities, regression analysis, academic achievement, performance

Introduction:
- ELL students and special education students are subgroups in schools' performance data for NCLB and AYP.
- One concern is that hard-to-read items on math tests may unfairly impact the performance of some students (ELL students and special education students).
- Simplified-English assessments have been suggested to address the language factor in large-scale assessments.

Methods:
Research Questions:
1. Do linguistic features affect the difficulty of mathematics test items?
2. Do linguistic test item features affect English language learners and students with disabilities disproportionately when compared to a general student sample?
3. Which language features have the greatest impact on student performance?

Data:
- Used the Kansas math assessments for grades 4, 7, and 10 (spring 2000). Test items are based on state mathematics standards. All four versions of the assessments were used; all test items were multiple choice.
- A rubric was used to rate items on linguistic features; focus areas included number of words, sentences, and clauses, vocabulary, complex verbs, and passive voice.
- Mean item scores were used for each test question (a regression of this kind is sketched below).
Sample: 8,000 students were randomly selected (grades 4, 7, and 10), 20% at each grade level, including students with disabilities and ELL students.
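The notes say mean item scores were related to rated linguistic features; the exact model is not given, so the sketch below only shows a plausible shape of that analysis on fabricated data: ordinary least squares predicting each item's mean score from feature counts, where a negative coefficient marks a feature that makes items harder.

```python
# Fabricated data, not the Kansas items: regress mean item score on counts
# of rated linguistic features to see which features predict difficulty.
import numpy as np

rng = np.random.default_rng(1)
n_items = 40
# Hypothetical per-item feature counts: words, clauses, complex verbs,
# passive constructions, hard math vocabulary terms.
X = rng.poisson(lam=[25, 2, 1, 1, 2], size=(n_items, 5)).astype(float)
true_beta = np.array([-0.004, -0.03, -0.02, 0.00, -0.06])
mean_score = 0.85 + X @ true_beta + rng.normal(0, 0.03, n_items)

A = np.column_stack([np.ones(n_items), X])        # add intercept column
coef, *_ = np.linalg.lstsq(A, mean_score, rcond=None)
names = ["intercept", "words", "clauses", "complex_verbs", "passive", "math_vocab"]
for name, b in zip(names, coef):
    print(f"{name:>13}: {b:+.4f}")  # negative = feature lowers mean item score
```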

Findings:
- Grade 4: five linguistic elements stood out: prepositions, vague words, complex verbs (verbs with three or more words, e.g., "had been going"), pronouns, and math vocabulary. Students with disabilities had more trouble with items containing whole numbers.
- The greater the number of linguistic elements, the more difficult the item. Positive effects: passive voice and infinitive verb forms.
- Grade 7: two linguistic elements: math vocabulary and comparative terms (greater than, less than).
- Grade 10: one linguistic element: math vocabulary.
- Sentence length and clause counts increased at each grade level.

Discussion/Implications:
1. Do linguistic features affect the difficulty of mathematics test items? Yes, linguistic features in these test items had a significant impact on student achievement. The lower the grade, the larger the effect.

2. Do linguistic test item features affect English language learners and students with disabilities disproportionately when compared to a general student sample? "This research found no disproportionate impact on potentially vulnerable student groups, ELL and SWD, as a result of these linguistic test item elements, either as a set or individually." (p. 120)

3. Which language features have the greatest impact on student performance?
a. Difficult math vocabulary showed a steady impact for all student groups at all grade levels (a math content issue, not a language factor).
b. For grade 4, slang/conversational words and multiple-meaning words posed difficulty; test writers need to examine test items containing these types of words carefully.
c. For grade 7, problems with comparative terms had a noteworthy impact on student performance (many test items included specific references to inequalities).
d. When math vocabulary and comparative terms were excluded and the data were reevaluated, grade 4 still had the same results; for grade 7, the number of words in a test item emerged as an issue; and grade 10 had no additional linguistic factors arise.


e. It is possible that fourth graders have less sophisticated verbal skills and are therefore more susceptible to complex language in word problems.

Suggestions:
1. Test developers/item writers need to take into account students' general language development (especially younger students) and avoid wording that turns items into comprehension problems.

2. Conduct a study creating pairs of items requiring the same computations, one as a word problem and the other in a calculation-ready format, to see how large the language factor is relative to calculation.

3. Conduct research on the relationship between math and reading achievement and language proficiency for student groups or AYP subgroups.

Research Notes:

I am interested in studying more about the relationship between math and reading achievement and language proficiency across student groups. This study shows that teachers need to use appropriate math vocabulary, not slang or "dummied down" words (e.g., take-away or minus vs. subtract).

Citation:

Shaftel, J., Belton-Kocher, E., Glasnapp, D., & Poggio, J. (2006). The impact of language characteristics in mathematics test items on the performance of English language learners and students with disabilities. Educational Assessment, 11(2), 105-126. doi:10.1207/s15326977ea1102_2


Citation: Abedi, J. (2002)

Keywords: Achievement tests, English as a second language, tests and scales/social aspects, tests and scales/reliability

Introduction:
- ELL students perform lower than non-ELL students in reading, science, and math; English language proficiency is a factor in assessment.
- The higher the language demands of a content area, the higher the impact of language proficiency.
- Fairness and validity are becoming top priorities nationally, and the ELL population has skyrocketed.
- NCLB (2001) now includes all students in assessment and accountability. This is meant to provide fair assessment and uphold standards of instruction for all students.
- Standardized achievement tests are used to classify ELL students, and results drive instruction and student learning. These tests also determine when a student is no longer ELL.
- Content-based tests are written for non-ELL students; inadvertently, they may behave as English language proficiency tests.
- ELL students have issues with assessments: they are unfamiliar with linguistically complex structures and vocabulary, may interpret test questions literally, and read more slowly.
- Language factors lower the validity and reliability of assessments, so results do not demonstrate students' knowledge.

Methods:
Research Questions:
1. Could the performance difference between ELL and non-ELL students partly be explained by language factors in the assessment?
2. Could the linguistic complexity of test items, as a possible source of measurement error, influence the reliability of the assessment?
3. Could the linguistic complexity of test items, as a possible source of construct-irrelevant variance, influence the validity of the assessment?

Data/Sample:
- Four locations across the U.S. (undisclosed).
- Site 1: a large urban school district; used Iowa Tests of Basic Skills (ITBS) data, grades 3 through 8, 1999.
- Site 2: a state with a very large number of ELL students; used Stanford Achievement Test, 9th edition (Stanford 9) data, grades 2 to 11, 1997-1998 school year.
- Site 3: an urban school district; Stanford 9 data, grades 10 and 11, 1997-1998 school year.
- Site 4: a state with a large number of ELL students; Stanford 9 data, grades 3, 6, 8, and 10, 1997-1998 school year.

- Data from the four sites were analyzed and cross-validated.
- Internal consistency coefficients were computed for the different tests across categories (ELL status, family income, parent education); a sketch of one such coefficient follows below.
- Multiple-group factor analyses examined the relationship of test items to test scores.
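These notes do not name the internal consistency coefficient Abedi computed, but Cronbach's alpha is the standard one, so here is a minimal sketch computing it separately by group on simulated, dichotomously scored responses. A lower alpha for one group would suggest the test measures that group less reliably.

```python
# Cronbach's alpha by subgroup on simulated 0/1 item responses.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: (n_students, n_items) matrix of scored responses."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(2)
ability = rng.normal(size=300)
items = (ability[:, None] + rng.normal(0, 1.2, (300, 20)) > 0).astype(float)
ell = rng.integers(0, 2, 300).astype(bool)       # hypothetical group labels
print(f"alpha (ELL):     {cronbach_alpha(items[ell]):.2f}")
print(f"alpha (non-ELL): {cronbach_alpha(items[~ell]):.2f}")
```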

Findings:
- ELL students scored significantly lower than non-ELL students, but the gap was not consistent across content areas. Reading and writing had the largest gaps.
- Math and science (with lower language demands) had small gaps, and the gap was nonexistent in math computation.
- The performance gap between ELL and non-ELL students was smaller in the lower grades.
- The performance gap between ELL and non-ELL students lessened as the language demand of the test questions diminished.
- Students in the free/reduced-price lunch program scored lower in all subject areas than those who did not participate.
- Students whose parents had more education scored higher.
- Language background had a higher impact on assessment than income level or parent education.


- Analyses of the math subscales: math analytical skills, concepts and estimation, and problem solving have high language demands; math computation has low language demand. ELL and non-ELL students performed at the same level in math computation.
- Nonbilingual students generally outperformed bilingual students. The differences arise between bilingual and nonbilingual students in grade 3 and beyond. The largest gap between bilingual and nonbilingual students was in reading.
- The next largest gaps were in content areas with more language demand, e.g., problem solving and data interpretation.
- The higher the language demand of an area, the higher the impact of language factors on students' performance.
- When assessing ELL students, language factors are a major component of measurement error due to unnecessary linguistic complexity in content-based areas.
- Complex language in content-based assessments for ELL students may reduce the validity and reliability of conclusions about students' content-based knowledge; the linguistic complexity of test items may invalidate achievement on tests.

Discussion/Implications:
1. Results showed that students' assessment results may be confounded with language background variables. The higher the language demand of an area, the higher the impact of language factors on students' performance, and thus the stronger the impact of language proficiency on assessment.
2. The lower the language proficiency of the student, the lower the score.
3. Test writers need to examine how language factors impact test reliability and need to create a test reliability index separately for ELL students.
4. Linguistic complexity of test items unrelated to the content being assessed may be at least partly responsible for the performance gap between ELL and non-ELL students.

Suggestions:
1. Continue to examine language factors that impact assessments for ELL students.
2. Psychometric characteristics (techniques of assessment) should be cautiously reviewed for use with ELL students.
3. Language background variables should be taken into consideration when assessing, and efforts should be made to decrease the confounding effects of language background on assessment outcomes.

Research Notes:
I would like to read studies in which students take both the original and a modified version (without the linguistic complexity) and see the results. Also, do bilingual assessments help students score higher? This serves as a reminder to constantly use correct content vocabulary and teach it intentionally.

Citation:
Abedi, J. (2002). Standardized achievement tests and English language learners: psychometrics issues. Educational Assessment, 8(3), 231-257. doi: 10.1207/S15326977EA0803_02


Citation: Abedi, J. & Lord, C. (2001)

Keywords: educational tests & measurements; language & education; word problems (Mathematics); mathematical ability testing

Introduction:
- A hot topic is the importance of language in student performance on assessments in content-based areas such as mathematics.
- Nationwide, children perform 10% to 30% worse on word problems than on the same problems in numeric format, suggesting a problem beyond math skills.
- ELL students score lower on standardized math tests and the Scholastic Aptitude Test (SAT).
- There is no evidence that the basic abilities of ELL students differ from those of non-ELLs.
- This study compares the performance of ELLs and non-ELLs on National Assessment of Educational Progress (NAEP) word problems.
- Two studies: Study #1, the Student Perceptions Study, paired original NAEP math items with parallel revised items (with simplified language). Study #2, the Accuracy Test Study, used paper-and-pencil math tests, modified vs. original, to see which receives higher scores.

- The 1992 NAEP math test was used: 69 released items.
- Items that posed difficulty were modified (nonmath vocabulary and linguistic structures); math vocabulary was not changed. For items with language that might be difficult for students, simpler versions were drafted, keeping the math task the same but modifying nonmath vocabulary.
- Examples of modifications: passive voice changed to active voice, unfamiliar words changed to familiar words, conditional clauses broken into two sentences, relative clauses reworded, and complex questions changed to simple questions. (A toy checker for features like these is sketched below.)
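As a rough illustration of the surface features Abedi and Lord targeted, the toy checker below flags long sentences, a passive-voice pattern, and words outside a small familiar-word list. The regex and word list are crude stand-ins of mine, not the authors' instruments.

```python
# Toy linguistic-complexity flagger: long sentences, passive voice,
# and words outside a (deliberately tiny) familiar-word list.
import re

FAMILIAR = {"a", "the", "of", "to", "in", "is", "are", "what", "how", "many",
            "marbles", "bag", "blue", "each", "box", "number", "students"}
PASSIVE = re.compile(r"\b(is|are|was|were|been|being)\s+\w+(ed|en)\b", re.I)

def flag_item(text: str, max_words: int = 15) -> list[str]:
    flags = []
    for sent in re.split(r"(?<=[.?!])\s+", text.strip()):
        words = re.findall(r"[a-z']+", sent.lower())
        if len(words) > max_words:
            flags.append(f"long sentence ({len(words)} words)")
        if PASSIVE.search(sent):
            flags.append("passive voice")
        unknown = sorted(set(words) - FAMILIAR)
        if unknown:
            flags.append("unfamiliar words: " + ", ".join(unknown))
    return flags

print(flag_item("A sample of 25 students was selected from the class."))
```

A real revision pass would then rewrite flagged items, e.g., "A sample was selected" becomes "The teacher picked 25 students," before checking that the math task is unchanged.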

Methods:
Research Questions:
Study #1:
1. For mathematics items with parallel mathematics content, do students respond differently to items that contain different linguistic structures?
2. Do students find linguistically simpler items easier to comprehend?
3. Do they show a preference for items with simpler language?

Study #2:
1. Are there significant differences in the math performance of English language learners and proficient speakers of English?
2. Does modifying the linguistic structures in math test items affect students' test performance?
3. Do student background variables play a role?

Data:
Study #1: Students were interviewed to determine their perceptions and preferences.
Study #2: Scores were compared on original and modified tests.

Sample:
Study #1 (Student Perceptions Study): 36 eighth-grade students from 4 school sites in the Los Angeles area.
Study #2 (Accuracy Test Study): 1,174 eighth-grade students from 11 schools in the Los Angeles area.
In both studies, students had a range of language, socioeconomic, and ethnic backgrounds.

Findings:
Study #1:
- The majority of students picked the revised version of the items.
- Students stated that the revised items were easier to understand: "Makes more sense"; "It seems simpler; you get a clear idea of what they want you to do."


- Some students mentioned that time was a factor in taking tests.
- Some students commented on the difficulty of the vocabulary in items; the revised versions contained words they could relate to.
- Students preferred items with fewer words.

Original items:
- When students read the questions aloud, they would pause at unfamiliar words, which probably causes comprehension issues.
- Students would change verbs from passive to active when reading aloud (they did not do that with the revised items).

Some kids chose the original item because it was more interesting, or kept with the original because it seemed the same as the revision and the original was read first.

Results show:
- Math items can be simplified in ways that are meaningful to students.
- A larger group of students is needed to see whether these preferences would translate into actual differences in math scores.

Study #2:
- Proficient English speakers had higher average scores than ELL students, a significant difference.
- Large performance differences were found between students in different SES categories (free and reduced lunch recipients) and types of math class. Students in the low SES group scored lower than students in the high SES category.
- Students in higher math classes received higher scores.

Overall Results:
- Proficient English speakers achieved significantly higher scores.
- Average student scores were greater on the revised items.
- Simplifying the language of math test items helped students improve their performance. The biggest improvement discrepancy was between students in different levels of math classes.
- Lower-level students benefited more from the revisions.
- For the highest (honors) students, the revised items had a small negative effect.
- Students in ESL math classes showed 0.9% progress in their math performance on the revised items.

Discussion/Implications:
1. ELL students scored significantly lower than non-ELL students.
2. Changing linguistic structures can affect student achievement.
3. Students preferred the revised items.
4. Across the 1,000+ scores, the difference in overall mean scores is small but still statistically significant.
5. ELL students benefited more than non-ELL students.
6. Lower-income students benefited more than others.
7. Lower-level and average math students benefited more than higher-level math students.
8. Students with limited problem-solving experience and low semantic schema rely more on the text.
9. Unfamiliar or rarely used vocabulary and passive voice hinder comprehension.
10. Complex language puts low-achieving students at a disadvantage.

Suggestions:
1. Continue research on other linguistic features.
2. Continue research on the connections between socioeconomic levels and other background variables and the role of language in content-area assessments.

Research Notes:I would like to read some studies about poverty vs. student achievement. This study reminds me to continue using a variety of vocabulary in the classroom and speak in an active voice. Students need lots of practice with problem solving and determining what exactly the problem is asking.


Citation:
Abedi, J. & Lord, C. (2001). The language factor in mathematics tests. Applied Measurement in Education, 14(3), 219-234.


Citation: Abedi, J. (2009)

Keywords: English language -- Study & teaching; limited English-proficient students; mathematics; lexicon (Linguistics); examinations; encyclopedias & dictionaries; English language -- Pronunciation; reference sources; education; testing accommodations; educational support services

Introduction:
- ELL students are now included in NCLB.
- The ELL population has grown 57% since 1995; 10% of all U.S. public school students are ELL students.
- The challenge of serving and assessing ELL students continues to grow.
- There is a performance gap between ELL and non-ELL students in all content areas.
- The performance gap narrows as the language demand of test items lessens, so linguistic complexity may interfere with ELLs' ability to demonstrate what they know and can do.
- Various accommodations have been offered to help ELL students, but they are the ones used for students with disabilities and do not really help ELL students, who have language needs rather than disabilities, as well as varied needs and backgrounds.
- Under "NCLB legislation, accommodations can include, 'to the extent practicable, assessments in the language and form most likely to yield accurate data on what [ELL] students know and can do in content areas' (see NCLB Title I, Part A, Sec. 1111 (3)(C)(ix)(II))" (p. 2).
- Many of the accommodations NCLB allows do not help ELL students.
- Dictionaries/glossaries could help ELL students and are allowed under NCLB; however, research has shown that a dictionary or bilingual dictionary may give ELL students an unfair advantage.
- Some language-based accommodations may change the concepts being assessed and thus not yield valid assessments.
- Computer modifications and linguistic modifications have been shown to be successful in reducing the performance gap while producing valid assessment outcomes.

In this study the accommodations were: computerized tests, a pop-up glossary, a customized English dictionary, extra testing time, and small-group testing. The dictionary was customized and did not include any content-related vocabulary.

The focus is on computer accommodations that can incorporate linguistic supports, and on whether computer accommodations change performance levels compared with traditional ones.

Methods:
Research Questions:
1. Is computer testing as a form of accommodation effective in making content-based assessments more accessible for ELL students? (effectiveness)
2. Does the computer testing accommodation impact the validity of the assessment, i.e., change the content of the assessment? (validity)
3. Can the computer accommodation be easily implemented? (feasibility)

Data:
- Math tests used publicly released items from the National Assessment of Educational Progress (NAEP) and the Third International Mathematics and Science Study, grades 4 and 8.
- NAEP scoring rubrics were used for both the math and reading tests (open-ended questions).
- Linguistic complexity ratings were based on a rubric.
- Because students were at various proficiency levels, they were tested with the fluency subscale of the Language Proficiency Scales (LAS), a National Assessment of Educational Progress (NAEP) reading block, and a test of English word recognition.
- Questionnaires were given to collect students' opinions of the accommodations and background information.
- Teachers were given a questionnaire about educational background, experience, and accommodations used in the classroom.

"Grade 4 students were tested under five accommodation conditions: (a) customized English dictionary, (b) small-group testing, (c) extended time, (d) computer testing, and (e) standard testing condition with no accommodations. This generated 10 cells for Grade 4 students—5 accommodation conditions for ELL students and 5 for non-ELL students. For students in Grade 8, the accommodations were (a) customized English dictionary, (b) computer testing, and (c) no accommodation. Thus, for Grade 8, 6 cells were generated." (p. 5)

Students who did not receive the extra-time accommodation were given a word list and told to check off the test words that were hard to understand.

Sample:
- Intact classrooms in Grades 4 and 8 in a single urban public school district in Southern California were tested in math and reading on two consecutive days.
- Grade 4: 666 students (304 ELL). Grade 8: 643 students from 27 classrooms (290 ELL).
- 607 Grade 4 students and 542 Grade 8 students were used in the subsequent analyses.

Findings:
Grade 4
Effectiveness:
- ELL students with some form of accommodation did better than peers without any accommodation.
- Students who used the computer scored a little higher.
- Students performed better under all accommodation conditions except small-group testing (note: very few students had this accommodation).
Validity:
- Non-ELL students performed the same with or without accommodations.
Feasibility:
- Tests must be programmed with pop-up glossaries.
- More time is needed in computer labs, which takes away from teaching time.
- Internet access and a quiet testing room are needed.
- With the paper version, some students did not know how to use a dictionary.

Grade 8
Effectiveness:
- ELL students performed higher under both accommodations (customized English dictionary or computer testing) than with no accommodation.
- The increased performance of ELL students under computer testing was significant; under the customized English dictionary it was not.
Validity:
- For non-ELL students, the averages for computer testing and for the customized dictionary were slightly higher than with no accommodation, but no comparisons were significant.
Feasibility:
- Feasibility increased due to students' experience using dictionaries and computers.

Discussion/Implications:
Grade 4
1. Extra time and computer testing were effective accommodations for ELL students. These did not affect non-ELL students' performance.
2. Paper-test students did not perform better than those who used the computer.
3. For the more linguistically complex items, all the accommodations made a significant difference for ELL students. For the less complex math items, the computer and extra-time accommodations were still significant for ELL students.

Grade 8
1. Only computer testing was an effective accommodation for ELL students. It had no impact on non-ELL students; therefore, computer testing can be put into operation without validity concerns.
2. ELL students spent three times as much time glossing and glossed twice as many words as non-ELL students.
3. Paper-test students performed slightly better than those who used the computer, but not significantly.


4. For ELL students, computer accommodations were significant for the more linguistically complex items but not for the less linguistically complex ones.

For Grades 4 and 8:
1. Students used the pop-up glossary extensively but rarely used the customized English dictionary.
2. For non-ELL students there were no significant accommodation effects, so there are no validity concerns.

Suggestion:
1. The computer accommodation is effective and valid; it does not change the focal concepts. Therefore, Abedi recommends implementing computer accommodations for ELL students.

Research Notes:
Since Oregon uses a computerized testing system, I need to teach students how to use a glossary, a dictionary, and pop-up features. I am interested in seeing the results of studies comparing computer testing with pop-up features vs. paper versions, both with linguistic modifications.

Citation:
Abedi, J. (2009). Computer testing as a form of accommodation for English language learners. Educational Assessment, 14(3/4), 195-211. doi:10.1080/10627190903448851


Citation: Hofstetter, C. (2003)

Keywords: Mathematics/Achievements, Student; Middle school students/Non-English speaking; Tests and scales/Foreign language translations; Mathematics/Tests and scales

Introduction:
- Standards-based education now includes ELL students in large-scale, content-based assessments to show equal opportunity, representation, and accountability.
- There are concerns about the validity of these tests for ELL (LEP) students.
- Accommodations like extra time and exams in the native language are meant to help ELL students participate.
- Test scores often measure language proficiency rather than content knowledge.

- An appropriate accommodation is one that improves the performance of English learners but leaves the performance of non-English learners unchanged.
- The usual accommodation categories are (a) setting, (b) timing/scheduling, (c) presentation format, (d) response format, and (e) other (e.g., use of bilingual word lists or dictionaries).
- Common accommodations are small groups, individual administration, oral reading in English, and bilingual word lists.
- Modified versions, such as linguistic modifications, are less commonly used and are barred by some states, but evidence shows they could close the achievement gap between ELL and non-ELL students.
- Factors that influence test performance: level of fluency and exposure to the language in academic and social environments. ELL students take longer on tests because they decode and use translation strategies. It takes 5 to 7 years for a student's cognitive-academic language proficiency (CALP) to develop.
- "Translated tests are not necessarily equivalent in content, reliability, difficulty level, or validity to the original, untranslated version. Many languages have multiple dialects, so test words, test content, or contexts do not necessarily translate from one language to another. Words may not be understood across cultures and may have different frequency rates and difficulty levels (see AERA, APA, & NCME, 1999, for more information). Multiple forms of bias may be present related to test content, linguistic, and cultural factors (e.g., language proficiency, communicative competence in academic and social settings), school factors (e.g., language of instruction, opportunity to learn at school, students' exposure to test content and item formats) and testing conditions (e.g., timed tests, lack of cultural sensitivity by test administrator). Threats to validity and equivalence emerge also because English learners have not been proportionally included in test norming samples. This trend, however, is changing." (p. 5)

Students were randomly selected to take the same NAEP test but with one of two different accommodations (modified English or original Spanish translation) or with no accommodation at all (original English).

Students were in various classroom types (English instruction and Spanish instruction) and levels (eighth-grade mathematics, prealgebra, algebra).

Modifications included: (1) changing unfamiliar words to more familiar words, (2) changing passive verbs to active verbs, (3) shortening long nominals, and (4) rewriting complex questions as simple questions.

Methods:
Research Questions:
1. What classroom and school factors are most related to English learners' performance on the NAEP mathematics assessment?
2. What interactions occur between contextual factors (student, classroom) and the type of test accommodation a student receives?
3. Do these effects vary by the students' LEP designation, either LEP or non-LEP?

Data:
- Math test items from the National Assessment of Educational Progress (NAEP), 1995-1996 Bilingual Version for Grade 8.
- Student language background questionnaire.
- 1992 NAEP reading proficiency test.


Sample:
- 849 8th-grade students in Southern California (large urban city, low income).
- LEP and non-LEP students; the LEP students were mostly of Hispanic descent.

Findings:
NAEP mathematics test:
- Notable differences in mathematics and reading test performance between students according to LEP status (years in the US, number of school changes, how well they read/write English), language of instruction, type of mathematics class, and type of accommodation.
- At a significant level, LEP students who received mathematics instruction in Spanish scored lower than students instructed in English.
- LEP students scored lower on the Original Spanish accommodation than on the English-language tests.
- LEP students did better when they took the test in the language in which they were instructed.

NAEP reading test:
- Major effects: language of instruction, level of math class, and type of test booklet.
- Significant differences in reading scores across math class levels for LEP students.
- LEP students with English-language instruction who received the Spanish test had lower reading scores than those with Modified English or Original English (but not significantly).
- For LEP students with English-language mathematics instruction, those with the Original Spanish test scored lower than students with the Modified English or Original English tests.

Overall:
- Reading level is strongly associated with math test performance.
- Language of instruction in the math class had a significant connection to math test performance for LEP students but not for non-LEP students.
- LEP and non-LEP students who received mathematics instruction in English and took the modified English test had slightly higher scores than peers without any accommodation, but the differences were not statistically significant, suggesting other, nonlinguistic variables are at work.
- When LEP students instructed in Spanish took the Spanish test, performance improved; for non-LEP students in Spanish math classes, there was no connection with the Spanish test.
- Matching the language of the mathematics test to the student's language of instruction helps LEP students.

Discussion:
1. Students' reading proficiency influences the math test performance of both LEP and non-LEP students.
2. Students who took the math test in their language of instruction did better.
3. Students in higher math classes performed better than students in lower classes.
4. Students with the modified English test did better than with the original English test (not statistically significant).
5. Students with the Spanish test performed lower than those with no accommodation unless they were receiving Spanish instruction; simply giving an LEP student a Spanish test does not automatically help test performance.

Research Notes:
I have always struggled with giving students the Spanish version when I have taught them in English. This study shows that whatever language students are instructed in should be the language of the test. I am interested in reading more studies about this because this is the first I have heard of these results. Also: always teach in the language of the assessment.

Citation:
Hofstetter, C. (2003). Contextual and mathematics accommodation test effects for English-language learners. Applied Measurement in Education, 16(2), 159-188.


Citation: Sato, E., Rabinowitz, S., Gallagher, C., & Huang, C. (2010)

Keywords: Correlation; Standardized Tests; Scoring; Test Construction; Student Surveys; Test Reliability; Factor Analysis; Test Items; English (Second Language); Limited English Speaking; Statistical Analysis; Middle School Students; Mathematics Tests; Test Bias; Scores; Testing Accommodations

Introduction:
- States are trying to determine whether their assessments are consistent with NCLB for ELL students.
- There is a large gap between ELLs and non-ELLs.
- Problems include test format, presentation of test information, sociocultural contexts, and culture-specific references.
- Students may need to focus on interpreting linguistic features rather than the targeted content.
- ELL students may be limited in demonstrating math knowledge by factors other than math-related knowledge and skills.
- Research has shown that math tests can be modified to reduce linguistic complexity without altering the construct.

This study focuses on linguistic modification of math assessments for middle school students (grades 7 and 8). Three subgroups: English language learners (EL), non-English language arts proficient (NEP) students, and English language arts proficient (EP) students.

Two test sets of 25 multiple-choice items were created: one with original math items (from the grade 8 National Assessment of Educational Progress and the grade 7 California state test) and one with linguistic modifications. The test items came from two standards: measurement and number sense/operations.

Linguistic modifications included reducing sentence length and complexity and using common or familiar words and concrete language.

Each subgroup was divided in half: one half took the original form and the other half the modified form.

Methods:
Research Questions:
1. Does the effect of linguistic modification on students' math performance vary across the three subgroups of students (EL, NEP, and EP students)?
2. If so, did the linguistic modification improve the math understanding scores of the EL and NEP students relative to the EP students?

Data:
- Expert reviews.
- Test scores.
- Cognitive interviews (think-alouds with 9 students).
- Student language background survey.
- Student achievement history data (state test scores, ELP level, and ELA level).

Sample: 4,617 middle school students (grades 7 and 8) from 13 middle schools in California (urban, rural, and small city).

Findings:
- On the linguistically modified test, there was a significant difference between ELL students and English language arts proficient students.
- Pattern: the average difference between the two test forms was consistent for each subgroup.
- The biggest difference in performance was for ELL students, followed by non-English language arts proficient students.
- For English language arts proficient students, the difference in scores between the two test formats was very close to zero.
- The effect of linguistic modification was greatest for EL students, followed by NEP and EP students (see the sketch below).
- The dominant factor, math understanding, drove performance on both item sets.


The linguistically modified set measured the math understanding of ELL and NEP students more reliably than the original format.
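Because each subgroup was split between the two forms, the modification effect per subgroup is just the difference in mean scores across forms. A minimal sketch on fabricated scores follows; the "true effects" below are invented for illustration, not Sato et al.'s estimates.

```python
# Fabricated scores: per-subgroup effect of the modified form, estimated as
# the difference in means between the modified-form and original-form halves.
import numpy as np

rng = np.random.default_rng(3)
true_effects = {"EL": 0.8, "NEP": 0.4, "EP": 0.0}   # hypothetical
for subgroup, effect in true_effects.items():
    original = rng.normal(14, 4, 400)               # scores out of 25 items
    modified = rng.normal(14 + effect, 4, 400)
    diff = modified.mean() - original.mean()
    se = np.sqrt(original.var(ddof=1) / 400 + modified.var(ddof=1) / 400)
    print(f"{subgroup}: modified - original = {diff:+.2f} (SE {se:.2f})")
```

The pattern the study reports corresponds to a clearly positive difference for EL students, a smaller one for NEP students, and a difference near zero for EP students.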

Discussion/Implications:
1. Linguistically modifying the language of mathematics test items did not alter the targeted math construct (math understanding) being assessed.
2. Linguistic modifications would increase the validity of the test.

Suggestions:
1. Continue to systematically examine the effects of different linguistic modification strategies on the validity of measures of students' academic content understanding.
2. States need to base decisions about providing valid, appropriate, and fair assessments for students within the diverse ELL population on research.
3. Focus on those linguistic modification strategies through in-depth item-level analyses.
4. Examine linguistic modifications for other strands and content areas (science), other language backgrounds, literacy and language proficiency levels, and grade levels.
5. Examine the degree to which dictionaries, chunking texts, etc., affect linguistic modifications.

Research Notes:
I would like to see how language modifications affect science assessments, and more research comparing ELLs with low-level readers. Is there a difference, and does the test become a reading test as well? Does reading the test aloud to the student help with this?

Citation:
Sato, E., Rabinowitz, S., Gallagher, C., & Huang, C. (2010). Accommodations for English language learner students: the effect of linguistic modification of math test item sets (Final Report, NCEE 2009-4079). National Center for Education Evaluation and Regional Assistance.


Articles to be used in Literature Review

Abedi, J. & Lord, C. (2001). The language factor in mathematics tests. Applied Measurement in Education, 14(3), 219-234.

Abedi, J. (2002). Standardized achievement tests and English language learners: psychometrics issues. Educational Assessment, 8(3), 231-257. doi: 10.1207/S15326977EA0803_02

Abedi, J. (2009). Computer testing as a form of accommodation for English language learners. Educational Assessment, 14(3/4), 195-211. doi:10.1080/10627190903448851

Hofstetter, C. (2003). Contextual and mathematics accommodation test effects for English-language learners. Applied Measurement in Education, 16(2), 159-188.

Martiniello, M. (2008). Language and the performance of English-language learners in math word problems. Harvard Educational Review, 78(2), 333-368.

Sato, E., Rabinowitz, S., Gallagher, C., & Huang, C. (2010). Accommodations for English language learner students: the effect of linguistic modification of math test item sets (Final Report, NCEE 2009-4079). National Center for Education Evaluation and Regional Assistance.

Shaftel, J., Belton-Kocher, E., Glasnapp, D., & Poggio, J. (2006). The impact of language characteristics in mathematics test items on the performance of English language learners and students with disabilities. Educational Assessment, 11(2), 105-126. doi:10.1207/s15326977ea1102_2