Thomas Songer, PhD. Introduction to Research Methods in the Internet Era: Bias, Assessing Validity of Association.

Transcript

Page 1

Thomas Songer, PhD

Introduction to Research Methods in the Internet Era

Bias

Assessing Validity of Association

Page 2

Learning Objectives:

1. Identify the possible alternative explanations for statistical associations:

--- Chance
--- Bias
--- Confounding

2. Distinguish between the major types of bias in epidemiologic studies.

Page 3

Research Process

Research question

Hypothesis

Identify research design

Data collection

Presentation of data

Data analysis

Interpretation of data

Polgar, Thomas

Page 4

Epidemiologic Reasoning

Assess validity of association
• True relationship between the exposure and disease

- Does the observed association really exist?
- Is the association valid?

- Are there alternative explanations for the association?
  - Chance (Random Error)
  - Bias (Systematic Error)
  - Confounding

Page 5

Evaluating Associations

A “valid” statistical association implies “Internal Validity” in the study

Internal Validity: The results of an observation are correct for the particular group being studied

What about “external validity”?

Do the results of the study apply (“generalize”) to people who were not in the study (e.g. the target population)?

Page 6

Evaluating Associations

• Internal Validity – strength of the measurement tools, assessment methods of exposure and outcome variables in the study, and control for study effects

• External Validity – strength of the study sample with regard to generalizability

Page 7

• Threats to validity in research studies
• Random error
  • Sample size
• Systematic error
  – Selection bias
  – Measurement bias
  – Loss to follow-up
• Hawthorne Effect
• Confounding
• Regression to the mean

Page 8

Evaluating Associations

Note: DO NOT compromise internal validity in pursuit of generalizability

* An invalid result cannot be generalized

* Thus, internal validity should never be compromised in an attempt to achieve generalizability

Page 9

Evaluating Associations

Note: Keep in mind that even if chance, bias, and confounding have been sufficiently ruled out (or taken into account), it does not necessarily mean that the valid association observed is causal.

The observed association may simply be a coincidence.

(e.g. in the last 10 years, incidence rates for prostate cancer have increased, as have sales of plasma TV screens).

Page 10

How do we know that the associations observed in epidemiologic studies are real?

Page 11

Evaluating Associations

Evaluating the validity of an association:

In any epidemiologic study, there are at least 3 alternative explanations for the observed results:

1. CHANCE (random error)
2. BIAS (systematic error)
3. CONFOUNDING

These explanations are not mutually exclusive -- more than one can be present in the same study

Page 12

Bias or Systematic Error

• Systematic, non-random deviation of results from the truth

[Figure: target diagrams contrasting high systematic error with low systematic error]
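A minimal Python sketch of the distinction drawn above (not from the lecture; the numbers are invented): random error shrinks as the number of measurements grows, while systematic error persists no matter how large the study gets.

import random

random.seed(1)
true_mean = 100.0   # the true value we are trying to estimate
bias = 5.0          # systematic error added to every measurement

for n in (10, 100, 10_000):
    # each measurement has random noise (sd 15) plus a constant bias
    measurements = [random.gauss(true_mean, 15) + bias for _ in range(n)]
    estimate = sum(measurements) / n
    print(f"n={n:>6}  estimate={estimate:6.1f}  error={estimate - true_mean:+5.1f}")

# As n grows, the random part of the error averages away, but the estimate
# still settles near 105: the +5 systematic error remains.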

Page 13

Bias

• Potential biases must be considered and addressed in all epidemiologic studies

• We often assume that exposed and unexposed groups are comparable

• This is not necessarily true

Page 14

Systematic Error (Bias)

BIAS: Systematic error in the design, conduct, or analysis of a study that results in a mistaken estimate of an exposure/disease relationship

1. SELECTION BIAS

2. INFORMATION BIAS

* Recall Bias
* Interviewer Bias
* Reporting Bias
* Surveillance Bias

Page 15

Selection Bias

• A distortion in a measure of disease frequency or association resulting from the manner in which subjects are selected for the study

• Result of deficiencies in study design

• E.g., in a case-control study, exposure status may influence selection of subjects to a different extent in cases and controls

- self-selection bias

Page 16

Bias

SELECTION BIAS: Any systematic error that arises in the process of identifying the two study groups to be compared

• Results in the study groups being non-comparable, unless some type of statistical adjustment can be made

Page 17

Selection Bias

EXAMPLE: Case Control Study

Outcome: Hemorrhagic stroke
Exposure: Appetite suppressant products that contain phenylpropanolamine (PPA)
Cases: Persons who experienced a stroke
Controls: Persons in the community without stroke

Bias: Control subjects were recruited by random-digit dialing from 9:00 AM to 5:00 PM. This resulted in over-representation of unemployed persons, who may not represent the study base in terms of use of appetite suppressant products.
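A hedged numerical sketch of the mechanism described above (the counts are invented and are not data from the PPA study): if controls are drawn from a group whose exposure prevalence differs from that of the study base, an association can appear even when none exists.

def odds_ratio(exp_cases, unexp_cases, exp_controls, unexp_controls):
    # standard case-control odds ratio: (a*d) / (b*c)
    return (exp_cases * unexp_controls) / (unexp_cases * exp_controls)

# Suppose 10% of the study base uses the product and there is no true
# association, so controls sampled fairly give an odds ratio of 1.0:
print(odds_ratio(10, 90, 100, 900))   # 1.0

# Daytime telephone sampling instead reaches a group in which only 5%
# use the product; the same cases now yield a spurious association:
print(odds_ratio(10, 90, 50, 950))    # about 2.1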

Page 18

Selection Bias

EXAMPLE: Non-Response

• If refusal or non-response is related to exposure, the estimate of effect may be biased. For example, if controls are selected by use of a household survey, non-response may be related to demographic and lifestyle factors associated with employment.

• Responders often differ systematically from persons who do not respond.

Page 19

Berkson’s Bias

• A form of selection bias that affects hospital-based epidemiology studies.

• People in hospital are likely to suffer from multiple diseases and engage in unhealthy behaviours (e.g. smoking)

• As a result, they are atypical of the population in the community
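One way Berkson's bias can arise, shown as a small Python simulation (all probabilities are invented for illustration): exposure and disease are independent in the community, but because hospital admission requires having some illness, the two become associated among hospital patients.

import random

random.seed(2)
a = b = c = d = 0   # 2x2 cells counted among hospitalized people only

for _ in range(200_000):
    exposed = random.random() < 0.30                  # e.g. smoking
    disease = random.random() < 0.10                  # outcome of interest, independent of exposure
    other_illness = random.random() < (0.20 if exposed else 0.05)
    if not (disease or other_illness):                # admission requires some illness
        continue
    if exposed and disease:       a += 1
    elif not exposed and disease: b += 1
    elif exposed:                 c += 1
    else:                         d += 1

# True odds ratio in the community is 1.0; among hospital patients it comes
# out around 0.25 here, a spurious (inverse) exposure-disease association.
print((a * d) / (b * c))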

Page 20

Healthy Worker Effect

• A form of selection bias that affects epidemiology studies of workers.

• Ill and disabled people are likely to be unemployed. The employed (workers) are healthier than other segments of the population.

• As a result, they are atypical of the population in the community

Page 21

Information Bias

Definition: Systematic differences in the way in which data on exposure and outcome are obtained from the various study groups.

Some Types/Sources of Information Bias:

• Bias in abstracting records
• Bias in interviewing
• Bias from surrogate interviews
• Surveillance bias
• Reporting and recall bias

Page 22

Information Bias

• Results from systematic differences in the way data on exposure or outcome are obtained

• May result from measurement defects or questionnaires or interviews that do not measure what they claim to

• Examples of information bias
  – Recall bias: self-reported information may be inaccurate due to low levels of recall

Page 23

Recall Bias

DEFINITION: Study group participants systematically differ in the way data on exposure or outcome are recalled

• Particularly problematic in case-control studies

• Individuals who have experienced a disease or adverse health outcome may tend to think about possible “causes” of the outcome. This can lead to differential recall

Page 24

Recall Bias - Example

Outcome: Cleft palate
Exposure: Systemic infection during pregnancy
Cases: Mothers giving birth to children with cleft palate
Controls: Mothers giving birth to children free of cleft palate

Bias: Mothers who have given birth to a child with cleft palate may recall colds and other infections experienced during pregnancy more thoroughly
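A hypothetical sketch of this effect (the figures are invented, not from a real cleft-palate study): the true infection prevalence is identical in both groups, but case mothers report it more completely than control mothers, which inflates the observed odds ratio.

def odds_ratio(exp_cases, unexp_cases, exp_controls, unexp_controls):
    return (exp_cases * unexp_controls) / (unexp_cases * exp_controls)

cases = controls = 500
truly_exposed = 100        # 20% of each group really had an infection

# Case mothers recall 95% of true infections, control mothers only 70%.
reported_cases = round(truly_exposed * 0.95)     # 95
reported_controls = round(truly_exposed * 0.70)  # 70

print(odds_ratio(reported_cases, cases - reported_cases,
                 reported_controls, controls - reported_controls))
# about 1.4, even though the true odds ratio is 1.0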

Page 25

Interviewer Bias

DEFINITION: Systematic difference in the soliciting, recording, or interpretation of information from study participants

• Can affect every type of epidemiologic study

• May occur when interviewers are not “blinded” to exposure or outcome status of participants.

Page 26

Interviewer Bias

• Interviewer’s knowledge of subjects’ disease status may result in differential probing of exposure history

• Similarly, interviewer’s knowledge of subjects’ exposure history may result in differential probing and recording of the outcome under examination

• Placebo control is one method used to maintain observer blindness in randomized trials.

Page 27

Reporting Bias

DEFINITION: Selective suppression or revealing of information such as past history of sexually transmitted disease.

• Often occurs because subjects are reluctant to report an exposure due to attitudes, beliefs, and perceptions

• “Wish bias” may occur among subjects who have developed a disease and seek to show that the disease “is not their fault.”

Page 28

Surveillance Bias

• If a population is monitored over a period of time, disease ascertainment may be better in the monitored population than in the general population (“surveillance bias”).

• May lead to a biased estimate of the exposure/disease relationship.

Page 29

Misclassification Bias

DEFINITION: Erroneous classification of the exposure or disease status of an individual into a category to which it should not be assigned

Misclassification of the exposure or outcome. Example:
--- Cases incorrectly classified as controls
--- Controls incorrectly classified as cases
--- Exposed incorrectly classified as non-exposed
--- Non-exposed incorrectly classified as exposed
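A brief hedged sketch of one consequence, using exposure misclassification only (counts and error rate are invented): when the same fraction of exposed subjects is misrecorded as unexposed in both groups (non-differential misclassification), the observed odds ratio moves toward the null value of 1.0.

def odds_ratio(exp_cases, unexp_cases, exp_controls, unexp_controls):
    return (exp_cases * unexp_controls) / (unexp_cases * exp_controls)

# True 2x2 table:
a, b, c, d = 200, 300, 100, 800
print(odds_ratio(a, b, c, d))             # 5.33

# 20% of exposed subjects are misrecorded as unexposed in BOTH groups:
print(odds_ratio(0.8 * a, b + 0.2 * a,
                 0.8 * c, d + 0.2 * c))   # about 4.8, closer to 1.0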

Page 30

Control of Bias

Can only be prevented and controlled during the design and conduct of a study

• Choice of a study population

• Methods of data collection

• Sources of case ascertainment and risk factor information

Sever

Page 31

Good Study Design Protects Against All Forms of Error
