
Overview of 2010 EHC-CAPI Field Test and Objectives

Jason Fields

Housing and Household Economic Statistics Division

US Census Bureau

Presentation to the ASA/SRM SIPP Working Group

November 17, 2009

“Re-SIPP” Development

* Following successful completion of the EHC Paper Field Test
* Develop the 2010 plan to test an electronic EHC instrument
* Broad involvement across Census Bureau: DID, FLD, TMO, DSD, HHES, DSMD, SRD

Primary Goals of 2010 Test

(1) Strong evidence of comparable data quality
- How well do the calendar year 2009 data from the 2010 EHC-CAPI Field Test match data from the 2008 SIPP panel?
- Especially for income transfer programs

(2) Strong evidence to guide development and refinement before implementation in 2013 as the production SIPP instrument

Basic Design Features (1)

8,000 Sample Addresses
- could have been larger!
- enough sample and budget to support research and field activities

“High Poverty” Sample Stratum
- to evaluate how well income transfer program data are collected

State-Based Design
- likely (possible?) access to admin records

RO    State           Sample N   Notes
BOS   Connecticut          204
      Massachusetts        465
      New York             366   covers upstate (non-NYC) NY
      Rhode Island         120
      (RO total)         1,155
NY    New York           1,681   covers NYC portion of NY
PHIL  Maryland             280
CHI   Illinois             620   excludes 57 IL addresses in KC-RO
      Wisconsin            132
      (RO total)           752
DAL   Texas              1,382
      Louisiana            325
      (RO total)         1,707
LA    California         2,407   excludes 445 CA addresses in SEA-RO

TOTAL N: 7,982
TOTAL ADMIN RECS (?) N: 6,736
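As a quick consistency check, the per-state sample counts above sum to the reported total of 7,982. A minimal sketch, with counts transcribed from the table (variable names are illustrative):

# Sample addresses by state, as transcribed from the table above.
sample_n = {
    "Connecticut": 204, "Massachusetts": 465, "New York (upstate)": 366,
    "Rhode Island": 120, "New York (NYC)": 1681, "Maryland": 280,
    "Illinois": 620, "Wisconsin": 132, "Texas": 1382, "Louisiana": 325,
    "California": 2407,
}
print(sum(sample_n.values()))  # 7982 -- matches TOTAL N: 7,982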

Basic Design Features (2)

Field Period: Early Jan - mid March 2010
- collect data about calendar year 2009

Field Representative training in Dec/Jan
- goal: minimize # of FRs with post-training “down-time”
- evaluation and improvement of training

Use FRs with a wide range of experience

Expand RO involvement

Research Agenda

1. Quantify likely cost savings
2. Test the data processing system
3. Evaluate data quality
4. Evaluate “field support” materials
5. Evaluate FR training
6. Identify & document instrument “bugs”
7. Identify “interview process” issues
8. Identify usability issues (esp. EHC)

HOW CAN WE IMPROVE FOR 2013?

Special Methods

1. Quantify likely cost savings
- new cost code(s) established
- timing interview length
- trade-off between a single 12-month recall interview and three interviews per year


Special Methods

2. Test the data processing system

The data collected in this test will be used to develop and test a new data processing system.


Special Methods

3. Evaluate data quality
- administrative records
- recording of selected interviews
- extract SIPP 2008 panel data; compare CY2009 estimates from the two surveys

(Details) Interview Recording

- close-to-RO FRs (approximately 80)
- 3 recording windows (early Jan, late Jan, mid Feb)
- message: “record the next two interviews”
- approximately 480 recorded interviews
- with consent; adults only (21+)
- record R’s entire continuous “turn”
- in RO, with the assistance of the ROCS, transfer recordings to the secure HQ network
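The target of approximately 480 recordings follows from the design parameters above. A minimal arithmetic sketch (variable names are illustrative):

# Approximate recording workload implied by the design above.
n_frs = 80        # close-to-RO field representatives
n_windows = 3     # early Jan, late Jan, mid Feb
per_window = 2    # "record the next two interviews" in each window
print(n_frs * n_windows * per_window)  # 480 -- consistent with ~480 recorded interviews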


Special Methods

4. Evaluate “field support” materials
(advance letter, brochure, calendar aid)
- Respondent debriefing instrument block
- FR debriefing sessions
- recording of selected interviews

(Details) R Debriefing Block

- at end of interview (status=“complete”)
- focus on “field support” materials: advance letter, brochure, calendar aid
- very brief question set: “did you see [X]?” “did you read [X]?” “did [X] have [+/-/0] impact?”
- with most convenient respondent


(Details) FR Debriefings

- at (or near) end of field period
- at least one session per RO
- with 8-10 FRs/SFRs
- guided 2-3 hour discussion
- wide range of issues – e.g., training, EHC procedures, usability, interview “process” issues, etc.
- improvements for 2013


Special Methods

5. Evaluate FR training
- recording of selected interviews
- certification (and other) testing
- HQ (and RO) training observation
- HQ (and RO) interview observation
- FR debriefing sessions
- FR feedback instrument block
- FR training assessment form
- Trainers’ debriefing

(Details) HQ/RO Interview Observations

- intensive HQ/RO observation of field test
- key observation themes:
  - use of EHC techniques (landmarks, cross-domain referencing, calendar aid)
  - instrument usability/navigation
  - FR preparedness/training
  - R interest/engagement
- R debriefing regarding landmarks


(Details) FR Feedback Block

- at end of interview (status=“complete”)
- brief set of Qs about:
  - use of EHC methods (domains; success)
  - EHC instrument bugs
  - perceived +/- R reactions
  - training gaps


Special Methods

6. Identify & document instrument “bugs”
- HQ (and RO) interview observations
- FR debriefing sessions
- FR feedback instrument block
- item-level notes

(Details) Item-Level Notes

- accessible throughout Blaise interview
  - non-calendar sections’ standardized Q “script”
- FR training will encourage & instruct
- focus on “bugs” – instrument not working as planned, e.g.:
  - wrong/missing fills
  - garbled wording
  - wrong/missing Qs
  - FR “work-arounds”
  - missing help screens
  - confusing/inapp./redundant/etc. Qs


Special Methods

7. Identify “interview process” issues
(interview “flow,” R interest/engagement, EHC interaction, mix of structured/unstructured Qs)
- HQ (and RO) interview observations
- FR debriefing sessions
- FR feedback instrument block
- recording of selected interviews


Special Methods

8. Identify usability issues (esp. EHC)
(instrument navigation, FRs’ ability to access and use special features of the EHC)
- HQ (and RO) interview observations
- FR debriefing sessions
- FR feedback instrument block
- recording of selected interviews
- FR testing sessions at HQ

Summary: Research Agenda

Lots of Extra “Stuff” – 2010 Test is Loaded
- Data quality
- Instrument quality
- Training quality

GOAL: Fully Exploit the Test’s Information Potential

Improvements/Refinements for 2013

What’s Missing from 2010?

- Attrition/mover effects in an annual interview
- Year-to-year data quality - seams between waves of a 12-month reference period interview
- Wave 2+ instrument and procedures
- In Development – 2011 / 2012 Testing Plans

Thanks!

Questions?

contact: Jason.M.Fields@census.gov
