DATA, DATA EVERYWHERE:
PROGRESS, CHALLENGES, AND
RECOMMENDATIONS
FOR STATE DATA SYSTEMS
CCSSO NCSA June 19, 2011
Presenters:
Sunny Becker, Human Resources Research Organization (HumRRO)
Shelby Dietz, Center on Education Policy (CEP)
Jennifer Stegman, Oklahoma State Department of Education
Kris Ellington, Florida Department of Education
Discussant: Lauress Wise, HumRRO
Moderator: Hilary Campbell, HumRRO
2
Data, Data Everywhere: Progress, Challenges, and Recommendations for State Data Systems
Sunny Becker
Presented to:
CCSSO NCSA Conference
June 20, 2011
Demands on State Data Systems
[Slides 3-6: graphics illustrating demands on state data systems; not captured in this transcript]
7
CEP Investigation of Achievement Trends
• Center on Education Policy (CEP) conducted a multi-year study of the impact of NCLB.
• The study used up-to-date state assessment data for all 50 states and DC.
• All data were verified by states prior to inclusion.
Products of this Research
1. A rich repository of verified data
– Available on CEP’s website for free use by secondary researchers
2. 50-51 individual state reports (“profiles”) each year
3. Series of reports on national trends
– Took care to tally trends rather than merge dissimilar data
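The “tally trends rather than merge dissimilar data” approach can be sketched in a few lines: because each state uses its own test, gains are counted per state on that state’s own metric instead of averaging scores across states. All state abbreviations and values below are invented for illustration.

```python
# Hypothetical sketch: count states showing gains on their own tests,
# rather than pooling scores from dissimilar assessments.

percent_proficient = {
    # state: {year: percent proficient-and-above on that state's own test}
    "AL": {2006: 62.0, 2009: 68.5},
    "FL": {2006: 55.1, 2009: 54.0},
    "OK": {2006: 70.2, 2009: 74.9},
}

def tally_gains(data, start, end):
    """Count states up, down, or flat between two years on their own metric."""
    tally = {"gain": 0, "decline": 0, "flat": 0}
    for state, years in data.items():
        diff = years[end] - years[start]
        if diff > 0:
            tally["gain"] += 1
        elif diff < 0:
            tally["decline"] += 1
        else:
            tally["flat"] += 1
    return tally

print(tally_gains(percent_proficient, 2006, 2009))
# {'gain': 2, 'decline': 1, 'flat': 0}
```

The point of the design is that a count of states gaining or declining remains meaningful even when the underlying score scales are not comparable.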
8
CEP Reports: A Sample
• Answering the Question That Matters Most: Has Student Achievement Increased Since No Child Left Behind?
• State Test Score Trends Through 2007-08, Part 1: Is the Emphasis on Proficiency Shortchanging Higher- and Lower-Achieving Students?
• STSTT 2007-08, Part 2: Is There a Plateau Effect in Test Scores?
• STSTT 2007-08, Part 3: Are Achievement Gaps Closing and Is Achievement Rising for All?
• STSTT 2007-08, Part 4: Has Progress Been Made in Raising Achievement for Students with Disabilities?
• STSTT 2007-08, Part 5: Are There Differences in Achievement Between Boys and Girls?
• STSTT 2007-08, Part 6: Has Progress Been Made in Raising Achievement for English Language Learners?
• STSTT 2008-09, Part 1: Rising Scores on State Tests and NAEP
• STSTT 2008-09, Part 2: Slow and Uneven Progress in Narrowing Gaps
• STSTT 2008-09, Part 3: Student Achievement at 8th Grade
9
Data Collected
• Test data
– Percentage of students at each achievement level
• Overall and disaggregated by sex, race/ethnicity, income, Title I, students with disabilities, ELL
• Grades 3-8 and high school
– Mean scale scores, standard deviations, and number of students tested
• Overall and disaggregated by sex, race/ethnicity, income, Title I, students with disabilities, ELL
• Grades 3-8 and high school
– Years 1998-99 through 2008-09
• Test characteristics
– What test is used at each grade level?
– What years are comparable?
• Not always a straightforward question
10
Tight Timeline to Report Current Results
Annual state assessment results [summer-fall]
Collect, verify & analyze data [winter]
Report trends [spring]
Next annual state assessment results [summer-fall]
11
Data Collection Approach
12
HumRRO pulled information from state web sites
Materials were standardized by HumRRO
Materials sent to state
State revised as necessary
State returned verified materials
Considerations
• Overwhelmed and understaffed departments
• State staff changes
• Individual state staff members with:
– Data expertise
– Policy expertise
– “On the ground” expertise
• Uncommon analyses are “out of the blue” for state staff
– We analyzed trends at basic-and-above as well as proficient-and-above.
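The basic-and-above versus proficient-and-above comparison amounts to a cumulative sum over an achievement-level distribution. A minimal sketch, with hypothetical level names and percentages (not any state’s actual data):

```python
# Illustrative only: "at or above" percentages from an achievement-level
# distribution. Level names and values are invented for this example.

levels = ["below_basic", "basic", "proficient", "advanced"]  # low to high
distribution = {"below_basic": 12.0, "basic": 28.0, "proficient": 45.0, "advanced": 15.0}

def percent_at_or_above(dist, cut_level):
    """Sum the percentages at cut_level and every level above it."""
    start = levels.index(cut_level)
    return sum(dist[name] for name in levels[start:])

print(percent_at_or_above(distribution, "proficient"))  # 60.0
print(percent_at_or_above(distribution, "basic"))       # 88.0
```

The same distribution thus yields both trend lines, which is why a state reporting only proficient-and-above can still support the basic-and-above analysis once level-by-level percentages are provided.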
13
“Out of the Blue” Questions
1. Relative scores on Reading for males and females reversed in 2009 from previous years.
2. The number of tested 11th grade students varied quite a bit from year to year.
3. The NAEP Mapping Study reported trend lines for each state. Some year spans disagreed with ours.
14
So What Did We Learn?
Let’s look at a couple of indicators:
• State capacity
• Data availability
15
Trends in State Capacity to Provide Data
First, some background….
• Our techniques improved over time.
– Clarity regarding missing data and questions about trends
– Sample graphs to show basic trend lines and intended comparisons
– State familiarity with the requests, in some cases
• We logged all communications with each state.
– To keep track of where we stood
– To trigger a reminder to silent states
– To improve the process over time
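A communications log of this kind can be kept as simple (year, state) entries, from which per-year totals and per-state averages and maxima are easy to derive. A hedged sketch; the entries below are invented, not the study’s actual log:

```python
# Minimal sketch of a per-state communications log (structure assumed):
# one (year, state) tuple per logged contact.

from collections import Counter

log = [
    (2006, "FL"), (2006, "FL"), (2006, "OK"), (2007, "OK"), (2007, "FL"),
]

def per_state_stats(entries, year):
    """Total, average-per-state, and maximum-per-state counts for one year."""
    counts = Counter(state for y, state in entries if y == year)
    total = sum(counts.values())
    return {
        "total": total,
        "average_per_state": total / len(counts),
        "maximum_per_state": max(counts.values()),
    }

print(per_state_stats(log, 2006))
# {'total': 3, 'average_per_state': 1.5, 'maximum_per_state': 2}
```

Keeping the raw contact events, rather than only yearly totals, is what makes both the average and the outlier (maximum) views recoverable afterward.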
16
Trends in State Capacity to Provide Data
After Year 1, the total number of communications leveled out. Outliers persisted.
17
[Charts: "Per-State Communications Trends" (average and maximum per state) and "Total Communications Logged," 2006-2009]
Trends in Data Availability Over Time
• Greater availability of disaggregated demographic groups
– Students without disabilities
– Non-ELL students
– Title I and non-Title I
• Greater availability of metrics not required by NCLB
– Mean scale scores and standard deviations
18
Recommended Data Collection Strategies
• Take time to research each state’s testing programs and policies
• Plan for back-and-forth with state experts to resolve anomalies
• Carefully construct “out of the blue” questions
19
Recommendations for State Data Repositories
• Documentation, documentation, documentation
• Clear, easy-to-locate instructions for data users
Jennifer Stegman, Assistant State Superintendent
Oklahoma State Department of Education
June 2011
Jennifer Stegman, Oklahoma State Department of Education 30
The Wave
◦ Created in 2004
◦ SIF-compliant student information system
P20 Data Coordinating Council
◦ Created in 2009
◦ Oversees the coordination and creation of a unified longitudinal student data system
Information Technology Consolidation and Coordination Act
◦ Created in 2011
◦ Maintain proper audit procedures to assure high standards of data quality and reliability.
◦ Ensure effective mechanisms to maintain confidentiality of student records and adherence to Family Educational Rights and Privacy Act (FERPA) requirements.
◦ Implement a student identifier that can be used from preschool through adulthood, connecting all three education agencies and the Oklahoma Employment Security Commission (OESC).
◦ Create linkages between and among data systems so data can be transferred across systems and among interested parties to address questions that cut across levels of the educational system and agencies.
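The linkage recommendation above can be sketched as a merge keyed on a shared student identifier. All agency names, fields, and the ID format below are hypothetical placeholders, not Oklahoma’s actual schemas:

```python
# Hedged sketch of cross-agency record linkage via a common student ID.
# Systems and field names are invented for illustration.

k12 = {"OK0001": {"grade": 12, "district": "Tulsa"}}
higher_ed = {"OK0001": {"institution": "OU", "enrolled": True}}
employment = {"OK0001": {"employed": True, "quarter": "2011Q1"}}

def link_student(student_id, *systems):
    """Merge one student's records across systems keyed by the same ID."""
    merged = {"student_id": student_id}
    for system in systems:
        merged.update(system.get(student_id, {}))
    return merged

record = link_student("OK0001", k12, higher_ed, employment)
print(record["district"], record["institution"], record["employed"])
# Tulsa OU True
```

A shared identifier turns cross-agency questions (does a district’s graduate enroll, persist, and find work?) into simple key lookups instead of error-prone name-and-birthdate matching.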
Further recommendations:
◦ Connect essential data elements relating to student-level course work and course grades.
◦ Incorporate college-readiness measures into the data system.
◦ Provide help to, and enable appropriate access to, the unified longitudinal database for a wide range of stakeholders to serve a variety of purposes, including improving teaching and learning, informing public policy, fostering a culture of evidence-based decision making, conducting research, evaluating system and program effectiveness, and providing reports to various stakeholder groups.
Further recommendations for the P-20 Data System:
◦ Incorporate teacher preparation attributes (e.g., certification type, school of origin) into the data system.
◦ Incorporate analysis and business management tools into the system.
◦ Implement greater interactive reporting capabilities to respond to a range of stakeholders.
◦ Include student groups not currently included (e.g., home-schooled students) in the data system.
◦ Complete basic policies such as data use/access protocols, data quality standards, and governance.
Past
◦ Special populations reported as Pass or Fail
◦ Reports may have lagged one to two years
Present
◦ Variety of disaggregations and performance categories
◦ Beginning to standardize terms
Future
◦ On-demand reports
◦ Longitudinal and student-connected data
I asked a statistician for his phone number ... and he gave me an estimate.
38
Requests for data come from a variety of stakeholders, each needing a different slice of the data.
Legislature
Research Institutions and Research Companies
Graduate Students
Schools and Districts
Parents
USDE
Moving toward a highly developed technology and information system
◦ Oklahoma has been moving toward on-demand reporting but still provides only “canned” reports.
Identifying and developing clearly defined variables and quality control procedures
Centralizing data systems to increase security and manage accessibility
Data, Data, … In Florida
Kris Ellington, Deputy Commissioner
Florida Department of Education June 2011
The Data Focus
Florida
◦ Data Systems
◦ Statewide Longitudinal Data Systems Modernization Project
Partnership for Assessment of Readiness for College and Careers (PARCC) Plans
◦ Interactive Data Tool
◦ Common Data Standards
June 2011 Slide 43 K. Ellington, Florida Department of Education
Florida’s Educational Data Systems
Data Quality Campaign: Meet all 10 Essential Elements; Meet 7 of 10 State Actions
Public pre-kindergarten through graduate school student level data for public schools, community colleges, career and technical education, adult education, and the state university system
Staff, facilities, finance, and financial aid
Post-school employment and non-education system program data
Necessary Ingredients
State level imperatives, funding
System of common course numbers and directories
Statewide articulation agreements and oversight committee
A state level culture of data management and interagency sharing
A means to follow the records of individual students across geographic areas, education sectors, and related programmatic areas
67 school districts, 28 community colleges, 11 state universities
Very open public records laws
A Brief History…
1975
◦ State universities data system deployed
◦ Statewide K-12 student assessment data collected/reported