Page 1

Survey research techniques: Matching the method with the research question

Robert M. Stein

Department of Political Science

Rice University

March 2015

Page 2

Popular survey techniques

• Face-to-face interviews
• Mail surveys
• Self-administered surveys
• Telephone interviews
– Live interviewer
– Interactive voice recorded (IVR)

• Web-based surveys (with/without telephone recruitment, e.g., Mechanical Turk, Knowledge Networks, Google Consumer Surveys)

• Focus groups

Page 3

Major research issues with surveys

• Sampling

• Cost

• Reactivity
– Psychology of the survey response

– The ‘don’t know’ response

– Response time

– Social desirability

• Question wording and placement

• Meaning of survey responses

Page 4

Survey methods for studying who votes

– Web-based surveys

• Strengths: Cost, sample size, timely data collection, survey panels
• Weaknesses: Selection bias (e.g., the digital divide)

– Telephone and face-to-face surveys
• Strengths: Sample size, can be limited to registered voters
• Weaknesses: Social desirability, reactivity, recall problems, sample selection bias (e.g., unlisted phone numbers)

– Exit polls

• Strengths: Validated voters, limited number of questions
• Weaknesses: Reactivity, unable to study non-voters

– IVR polls
• Strengths: Large sample, less invasive, lower cost
• Weaknesses: Low response rate, limited number of questions, skewed sample

– ‘Surveyless’ surveys (e.g., annotated voter histories)
• Strengths: Unambiguous, valid and reliable information
• Weaknesses: Limited sample (i.e., registered voters) and a biased time series

Page 5

Overreporting voting

• Robert Bernstein, Anita Chadha and Robert Montjoy, “Overreporting Voting: Why It Happens and Why It Matters,” Public Opinion Quarterly 65(2002):22-44.

• Allyson L. Holbrook and Jon A. Krosnick, “Measuring voter turnout by using the randomized response technique: Evidence calling into question the method’s validity,” Public Opinion Quarterly 74(2010):328-343.

• Brian D. Silver, Paul R. Abramson and Barbara A. Anderson, “The presence of others and overreporting of voting in American National Elections,” Public Opinion Quarterly 50(1986):229-239.

• Robert F. Belli, Sean E. Moore and John VanHoewyk, “An experimental comparison of question forms used to reduce vote overreporting,” Electoral Studies 25(2006):751-759.

Page 6

Social expectation and overreporting of voting

• Treatment #1 – In talking to people about elections, we often find that a lot of people were not able to vote because they weren't registered, they were sick, or they just didn't have time. How about you--did you vote in the elections this November?

• Treatment #2 – In talking to people about elections, we often find that a lot of people were not able to vote because they weren't registered, they were sick, or they just didn't have time. Which of the following statements best describes you: One, I did not vote (in the election this November); Two, I thought about voting this time - but didn't; Three, I usually vote, but didn't this time; or Four, I am sure I voted?
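
The two treatments form a simple question-wording experiment: each respondent is randomly given one of the two forms. A minimal sketch of the random-assignment step, assuming hypothetical respondent IDs (not the actual CATI implementation used in these studies):

```python
import random

def assign_wording(respondent_ids, seed=2015):
    """Randomly assign each respondent to question wording 'A' or 'B' with equal probability."""
    rng = random.Random(seed)
    return {rid: rng.choice(["A", "B"]) for rid in respondent_ids}

# Hypothetical respondent IDs; in a CATI system the assignment would determine
# which wording the interviewer (or recorded voice) reads to the respondent.
assignments = assign_wording(range(1, 11))
print(assignments)
```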

Page 7

Live telephone and IVR surveys of 2009 Houston mayoral election voters

• Live telephone interview
– 500 registered voters interviewed 11/15/09 - 11/18/09
– Margin of error: +/- 4.5%
– Interview length: 7.5-12 minutes (95%)
– Response rate: 32%

• IVR interview
– 586 registered voters interviewed 11/15/09 - 11/18/09
– Margin of error: +/- 3.3%
– Interview length: 1-3.5 minutes (95%)
– Response rate: 8.5%
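
For reference, the +/- figures are margins of error, which shrink as the sample grows. A minimal sketch of the textbook simple-random-sample calculation at 95% confidence; the slide's reported figures may reflect a different confidence level, rounding, or design adjustments, so this is illustrative only:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion under simple random sampling."""
    return z * math.sqrt(p * (1 - p) / n)

for n in (500, 586):
    print(f"n = {n}: +/- {100 * margin_of_error(n):.1f} percentage points")
```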

Page 8

Live telephone interview

Wording A: In talking to people about elections, we often find that a lot of people were not able to vote because they weren't registered, they were sick, or they just didn't have time. How about you--did you vote in the elections this November?

Wording 'A'           Voted       Did not vote
Reported voting       116 (98%)   72 (58%)
Reported not voting   3 (2%)      52 (42%)

Wording B: In talking to people about elections, we often find that a lot of people were not able to vote because they weren't registered, they were sick, or they just didn't have time. Which of the following statements best describes you: One, I did not vote (in the election this November); Two, I thought about voting this time - but didn't; Three, I usually vote, but didn't this time; or Four, I am sure I voted?

Wording 'B'           Voted       Did not vote
Reported voting       114 (91%)   65 (50%)
Reported not voting   11 (9%)     65 (50%)
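
One way to read these tables is to compute the overreporting rate, i.e., the share of validated non-voters who nonetheless report voting, under each wording, and to test whether the wordings differ. A minimal sketch using the counts above (scipy assumed available):

```python
from scipy.stats import chi2_contingency

# Validated non-voters only; rows = wording A, B; columns = (reported voting, reported not voting)
nonvoters = [[72, 52],   # Wording A
             [65, 65]]   # Wording B

for label, (over, honest) in zip("AB", nonvoters):
    print(f"Wording {label}: {100 * over / (over + honest):.0f}% of validated non-voters reported voting")

chi2, p, dof, _ = chi2_contingency(nonvoters)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.3f}")
```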

Page 9

Interactive voice recorded telephone interview

Wording A: In talking to people about elections, we often find that a lot of people were not able to vote because they weren't registered, they were sick, or they just didn't have time. How about you--did you vote in the elections this November?

Ballot language 'A'   Voted        Did not vote
Reported voting       232 (100%)   0 (0%)
Reported not voting   0 (0%)       88 (100%)

Wording B: In talking to people about elections, we often find that a lot of people were not able to vote because they weren't registered, they were sick, or they just didn't have time. Which of the following statements best describes you: One, I did not vote (in the election this November); Two, I thought about voting this time - but didn't; Three, I usually vote, but didn't this time; or Four, I am sure I voted?

Ballot language 'B'   Voted        Did not vote
Reported voting       183 (100%)   0 (0%)
Reported not voting   0 (0%)       83 (100%)

Page 10

Response time (latency) measures of contextual effects on voting behavior *

Problem: Studies of contextual processes have always involved the possibility that if individuals’ aggregation into geographic units is not exogenous to their values on the dependent variable, then what appear to be “contextual processes” may be due solely to selection effects.

Solution: Measure contextual variables separately from the individual-level variables, and later connect them to the survey data by means of linkage variables, here latency measures of response time to survey-generated questions.

Methodology: CATI (computer-assisted telephone interviewing) technology used to generate latency measures for the question querying respondents' assessment of the partisanship of their neighbors (a minimal timing sketch follows below).

* M. Johnson, W. Phillips Shively and R.M. Stein, “Contextual data and the study of elections and voting behavior: connecting individuals to environments,” Electoral Studies 21(2002):219-233.
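
The key measurement idea is simply timing how long a respondent takes to answer the context question. A minimal stand-in sketch of that timing logic (a console prompt rather than the actual CATI software):

```python
import time

def timed_question(prompt):
    """Display a question and record the latency between its display and the answer."""
    start = time.monotonic()
    answer = input(prompt + " ")
    return answer, time.monotonic() - start

answer, latency = timed_question(
    "Generally speaking, do you usually think of your neighborhood as "
    "Republican, Democratic or Independent?")
print(f"answer = {answer!r}, latency = {latency:.2f} seconds")
```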

Page 11

Hypotheses

1. When the perception of the partisanship of one's neighbors (i.e., "Generally speaking, do you usually think of your neighborhood as Republican, Democratic or Independent?") is latent (i.e., faster response time), context should have a significant and positive effect on vote choice.

2. Republicans (Democrats) residing in neighborhoods that they perceive to be Democratic (Republican) will be more likely to vote for Democrats (Republicans) when their perception of the partisan make-up of their neighborhood is readily accessible.

3. The accessibility of one’s context will be greater when that context is congruent with a respondent’s personal preference.

Page 12

Research design

• Live telephone interview with 750 registered voters in Harris County, Texas, conducted September 23-29, 1999

• Retrospective vote choice in the 1996 Presidential election, partisanship and perception of partisanship of neighborhood

• Response times collected on the CATI system, similar to Bassili and Fletcher (1991).

Page 13

Table 1. Response times for persons with congruent and incongruent perceptions about the partisan make-up of their neighborhoods.

              Individual partisanship RT*   Contextual partisanship RT
Congruent     4.17                          4.66
Incongruent   4.34                          4.91
              (t = 1.957, p = .051)         (t = 2.241, p = .025)

* Tabled values indicate the mean cleaned, logged response times, as discussed in footnote 10.
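
The table's comparison is a difference of mean cleaned, logged response times tested with t-tests. A minimal sketch of that comparison on hypothetical latencies; the actual cleaning rules are those described in the cited article's footnote 10, and the trimming cutoffs below are assumptions:

```python
import numpy as np
from scipy import stats

def clean_log_rt(rt, low=0.5, high=30.0):
    """Drop implausibly fast or slow latencies (assumed cutoffs) and log the rest."""
    rt = np.asarray(rt, dtype=float)
    return np.log(rt[(rt >= low) & (rt <= high)])

# Hypothetical raw latencies (seconds) for respondents whose perceived neighborhood
# partisanship is congruent vs. incongruent with their own partisanship.
rng = np.random.default_rng(0)
congruent = clean_log_rt(rng.lognormal(1.4, 0.4, 200))
incongruent = clean_log_rt(rng.lognormal(1.5, 0.4, 200))

t, p = stats.ttest_ind(congruent, incongruent)
print(f"means: {congruent.mean():.2f} vs {incongruent.mean():.2f}; t = {t:.2f}, p = {p:.3f}")
```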

Page 14

Table 2. Logit estimates for 1996 presidential vote choice for respondents living in neighborhoods perceived to have partisan identities at variance with their own.

                                        Coefficient   s.e.      Wald
Party ID (self-reported)                3.685**       (0.718)   26.333
Neighborhood Partisanship (perceived)   -0.493        (0.744)   0.439
Response time for Context               -2.737*       (1.591)   2.959
Context * Response time for Context     1.442*        (0.778)   3.434
Baseline Response time                  -1.296        (1.574)   0.678
Context * Baseline Response time        0.302         (0.765)   0.156
Constant                                -4.769        (2.186)   4.783

** p < 0.01; * p < 0.05 (one-tailed test)
N = 148; Nagelkerke R2 = .70; Percent predicted correctly = 88.4
Dependent variable: 1996 presidential vote choice (1 = Democrat Bill Clinton; 0 = Republican Bob Dole)
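
Table 2 is a logit of 1996 vote choice on party ID, perceived neighborhood partisanship, response-time terms, and their interactions. A minimal sketch of how such a specification could be estimated with statsmodels; the file name and column names are hypothetical, not the authors' data or code:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file and column names: vote_clinton (1/0), party_id, context
# (perceived neighborhood partisanship), rt_context, rt_baseline.
df = pd.read_csv("respondents.csv")

model = smf.logit(
    "vote_clinton ~ party_id + context + rt_context + context:rt_context"
    " + rt_baseline + context:rt_baseline",
    data=df,
).fit()
print(model.summary())
```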

Page 15

Table 3. Probability of voting for Clinton by personal partisanship, context and response time.

Respondent    Context       Fast RT   Slow RT
Democrat      Independent   .996      .995
              Republican    .989      .997
Independent   Democrat      .94       .76
              Republican    .69       .89*
Republican    Democrat      .28       .07
              Independent   .13       .11

* Most typical case, given data frequencies.
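
The probabilities in Table 3 come from plugging covariate profiles into the fitted logit and applying the inverse-logit transform, p = 1 / (1 + exp(-xb)). A minimal sketch of that arithmetic using the Table 2 coefficients; the covariate coding below is an assumption, so the printed value is illustrative only:

```python
import math

# Coefficients from Table 2
b = {"const": -4.769, "party": 3.685, "context": -0.493,
     "rt_ctx": -2.737, "ctx_x_rt": 1.442, "rt_base": -1.296, "ctx_x_base": 0.302}

def prob_clinton(party, context, rt_ctx, rt_base):
    """Inverse-logit of the Table 2 linear predictor; covariate coding here is illustrative."""
    xb = (b["const"] + b["party"] * party + b["context"] * context
          + b["rt_ctx"] * rt_ctx + b["ctx_x_rt"] * context * rt_ctx
          + b["rt_base"] * rt_base + b["ctx_x_base"] * context * rt_base)
    return 1.0 / (1.0 + math.exp(-xb))

# The numeric output depends entirely on how the covariates were coded in the
# original analysis (not shown here); this call only demonstrates the transform.
print(round(prob_clinton(party=1, context=1, rt_ctx=0.0, rt_base=0.0), 3))
```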

Page 16

Analyzing surveys

• When are answers to survey questions revealing of individual perceptions, preferences and behavior?

• How might multiple survey responses reveal more information than single responses?

• An Answer: Risk perceptions and evacuations from hurricanes.

Page 17

Google Consumer Surveys (GCS)

• GCS is a new tool developed by Google that surveys a sample of Internet users as they attempt to view content on the websites of online publishers who are part of the program.
• These sites require users to complete a survey created by researchers in order to access premium content provided by the publishers.
• Currently, GCS is available in four countries (USA, Canada, UK, and Australia), and it takes about 48 hours to field a survey of any size.
• The current cost structure of GCS makes very short surveys (even as short as one question) the most economically attractive.
• Since respondents are not part of an online panel, GCS has not collected demographic information from them at an earlier date (as is the case in many panels). Instead, GCS provides researchers with a set of respondents' “inferred demographics” (location, income, age, gender) derived from their browsing histories and IP addresses.

Page 18

Example of Survey Wall

Page 19

Cost structure of GCS

Page 20

Limitations of GCS

• GCS recommends that questions be no longer than 125 characters and sets the maximum limit at 175 characters (see the length-check sketch after this list).
• Censored questions and populations: GCS places restrictions on sensitive demographic information by prohibiting researchers from asking respondents for their age, gender, ethnicity, religion, and immigration status. Researchers can only ask these questions if the response choices include “I prefer not to say” as the opt-out answer.
• GCS also requires an option for respondents to decline to answer the question by clicking an “I don't know, show me another question” link.
• Observational studies that require statistical models with a long (or even small) list of control variables can likely be accomplished as efficiently using other survey tools.
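
Because the length limits are hard constraints on question design, a simple pre-flight check can catch over-long questions before fielding. A minimal sketch, assuming only the 125/175-character figures from the slide:

```python
RECOMMENDED_MAX = 125  # characters, per the slide
HARD_MAX = 175

def check_question_length(text):
    """Flag question text that exceeds GCS's recommended or maximum length."""
    n = len(text)
    if n > HARD_MAX:
        return f"rejected: {n} characters exceeds the {HARD_MAX}-character limit"
    if n > RECOMMENDED_MAX:
        return f"allowed but over the recommended {RECOMMENDED_MAX} characters ({n})"
    return f"ok ({n} characters)"

print(check_question_length("Did you vote in the elections this November?"))
```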

Page 21

Strengths of GCS

• First, if survey experiments can be implemented in one question, then the considerable cost advantages of GCS over its competitors can be realized.
• Second, if true randomization of treatments can be achieved (and we argue that GCS's methodology does so, and demonstrate the resulting balance in a set of measured covariates), then no additional control variables are necessary to make strong inferences (see the balance-check sketch below).
• Finally, given that most experimenters are much more concerned with the internal validity of their inferences than with their external validity, unresolved questions about the representativeness of the GCS sample may be of secondary concern, and the sample is certainly better than that of most lab experiments.
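
The balance claim in the second point can be checked directly: compare the distribution of measured (inferred) covariates across randomly assigned treatment arms. A minimal sketch of such a check; the file and column names are hypothetical:

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical file and column names: one row per respondent, a 'treatment' arm,
# and GCS-style inferred demographics such as age bracket, gender, and region.
df = pd.read_csv("gcs_responses.csv")

for covariate in ["age_bracket", "gender", "region"]:
    table = pd.crosstab(df["treatment"], df[covariate])
    chi2, p, dof, _ = chi2_contingency(table)
    print(f"{covariate}: chi-square = {chi2:.2f}, p = {p:.3f}")
# Large p-values are consistent with balance across treatment arms, i.e., with
# successful randomization on that covariate.
```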