Statistics for the Social Sciences Psychology 340 Fall 2013 Tuesday, November 19 Chi-Squared Test of Independence.


Statistics for the Social Sciences
Psychology 340, Fall 2013
Tuesday, November 19

Chi-Squared Test of Independence

Homework #13 due 11/28

Ch 17 # 13, 14, 19, 20

Last Time:

• Clarification and review of some regression concepts

• Multiple regression

• Regression in SPSS

This Time:

• Review of multiple regression

• New Topic: Chi-squared test of independence

• Announcements:

• Final project due date extended from Dec. 5 to Dec. 6. It must be turned in to the psychology department by 4 p.m.

• Extra credit is due by the start of class (Dec. 5) to receive credit. Evidence of academic dishonesty regarding extra credit will be referred for disciplinary action.

• Exam IV (emphasizing correlation, regression, and the chi-squared test) is on Tuesday, December 3

• Final exam is on Tuesday, 12/10 at 7:50 a.m.

Multiple Regression

• Typically researchers are interested in predicting with more than one explanatory variable

• In multiple regression, an additional predictor variable (or set of variables) is used to predict the residuals left over from the first predictor.

Multiple Regression

• Bi-variate regression prediction model:

Y = intercept + slope(X) + error

The intercept-plus-slope part is the "fit"; the error term is the "residual."

• Multiple regression prediction model, with one slope term per explanatory variable:

Y = intercept + slope1(X1) + slope2(X2) + slope3(X3) + slope4(X4) + error

X1 through X4 are the first, second, third, and fourth explanatory variables; the error term is whatever variability is left over.
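To make the two model forms concrete, here is a minimal Python sketch, not from the lecture, that fits both with ordinary least squares; the simulated data and coefficient values are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x1 = rng.normal(size=n)   # first explanatory variable
x2 = rng.normal(size=n)   # second explanatory variable
y = 2.0 + 0.8 * x1 + 0.3 * x2 + rng.normal(scale=0.5, size=n)  # invented model

# Bi-variate model: Y = intercept + slope(X1) + error
X_bi = np.column_stack([np.ones(n), x1])
b_bi, *_ = np.linalg.lstsq(X_bi, y, rcond=None)

# Multiple regression model: one slope term per explanatory variable
X_multi = np.column_stack([np.ones(n), x1, x2])
b_multi, *_ = np.linalg.lstsq(X_multi, y, rcond=None)

print("bi-variate fit [intercept, slope]:         ", b_bi)
print("multiple fit   [intercept, slope1, slope2]:", b_multi)
```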

Multiple Regression

• Predict test performance based on:
  • Study time
  • Test time
  • What you eat for breakfast
  • Hours of sleep

• Typically your analysis consists of testing multiple regression models to see which "fits" best (comparing the R² values of the models): for example, Model #1 versus Model #2 versus Model #3.

Multiple Regression

Model #1: Some covariance between the two variables

• Response variable: total variability in test performance
• Total study time: r = .6
• R² for the model = .36, leaving 64% of the variance unexplained
• If we know the total study time, we can predict 36% of the variance in test performance

Multiple Regression

Model #2: Add test time to the model

• Total study time: r = .6; test time: r = .1
• R² for the model = .37, leaving 63% of the variance unexplained
• There is little covariance between test performance and test time, so we can explain only slightly more of the variance in test performance

Multiple Regression

Model #3: No covariance between test performance and breakfast food

• Total study time: r = .6; test time: r = .1; breakfast: r = .0
• R² for the model = .37, leaving 63% of the variance unexplained
• Breakfast is not related to test performance, so we can NOT explain any more of the variance

Multiple Regression

Model #4: Some covariance between test performance and hours of sleep

• Total study time: r = .6; test time: r = .1; breakfast: r = .0; hours of sleep: r = .45
• R² for the model = .45, leaving 55% of the variance unexplained
• We can explain more of the variance, but notice what happens with the overlap (covariation between explanatory variables): you can't just add the r's or r²'s (see the sketch below)
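The overlap point lends itself to a quick demonstration. In this minimal sketch (simulated data; the variable names and effect sizes are assumptions, not the lecture's dataset), two correlated predictors each correlate with the outcome, yet the R² of the joint model is well below the sum of the squared bivariate r's:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Two correlated predictors, loosely labeled study time and hours of sleep
study = rng.normal(size=n)
sleep = 0.6 * study + rng.normal(scale=0.8, size=n)   # overlaps with study
score = 0.6 * study + 0.4 * sleep + rng.normal(size=n)

def r_squared(y, predictors):
    """R^2 from an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y))] + predictors)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

r1 = np.corrcoef(score, study)[0, 1]
r2 = np.corrcoef(score, sleep)[0, 1]

print(f"r1^2 + r2^2       = {r1**2 + r2**2:.3f}")   # naive sum of squared r's
print(f"R^2 (joint model) = {r_squared(score, [study, sleep]):.3f}")  # smaller
```

The joint R² is smaller because the shared variability between the predictors gets counted only once.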

Multiple Regression

The "least squares" regression equation when there are multiple intercorrelated predictor (x) variables is found by calculating "partial regression coefficients" for each x.

A partial regression coefficient for x1 shows the relationship between y and x1 while statistically controlling for the other x variables (or holding the other x variables constant).

Multiple Regression

The formula for the partial regression coefficient is:

b1 = [(rY1 - rY2 * r12) / (1 - r12²)] * (sY / s1)

where
rY1 = correlation of x1 and y
rY2 = correlation of x2 and y
r12 = correlation of x1 and x2
sY = standard deviation of y
s1 = standard deviation of x1
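A direct transcription of this formula into Python; the numeric inputs in the example call are hypothetical, not values from the lecture:

```python
def partial_b1(r_y1, r_y2, r_12, s_y, s_1):
    """Partial regression coefficient for x1, controlling for x2:
    b1 = (r_y1 - r_y2 * r_12) / (1 - r_12**2) * (s_y / s_1)
    """
    return (r_y1 - r_y2 * r_12) / (1 - r_12**2) * (s_y / s_1)

# Hypothetical inputs: r_y1 = .6, r_y2 = .45, r_12 = .3,
# s_y = 10 (test points), s_1 = 2 (hours of study)
print(partial_b1(0.6, 0.45, 0.3, 10.0, 2.0))  # about 2.55
```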

Multiple Regression

• The multiple correlation coefficient (R) is an estimate of the relationship between the dependent variable (y) and the best linear combination of predictor variables (the correlation of y and predicted y).

• R² tells you the amount of variance in y explained by the particular multiple regression model being tested.

Multiple Regression in SPSS

Setup as before: variables (explanatory and response) are entered into columns.

• There are a couple of different ways to use SPSS to compare different models.

Regression in SPSS

• Analyze => Regression => Linear

Multiple Regression in SPSS

• Method 1: enter all the explanatory variables together. Enter:
  • All of the predictor variables into the Independent Variable field
  • The predicted (criterion) variable into the Dependent Variable field

Multiple Regression in SPSS

The output shows:

• The variables in the model
• r for the entire model
• r² for the entire model
• Unstandardized coefficients: a coefficient for var1 (var name) and for var2 (var name)
• Standardized coefficients: a coefficient for var1 (var name) and for var2 (var name)

Multiple Regression

• Which coefficient should you use, standardized or unstandardized?
  – Unstandardized b's are easier to use if you want to predict a raw score based on raw scores (no z-scores needed).
  – Standardized β's are nice for directly comparing which variable is most "important" in the equation (a conversion sketch follows).
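The two coefficient types are linked by a simple rescaling, β = b * (sx / sY), so converting between them is one line; the numbers here are hypothetical:

```python
def standardized_beta(b, s_x, s_y):
    """Convert an unstandardized b to a standardized beta (z-score units)."""
    return b * (s_x / s_y)

# Hypothetical: b = 2.5 test points per study hour, s_x = 2 hours, s_y = 10 points
print(standardized_beta(2.5, s_x=2.0, s_y=10.0))  # 0.5
```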

Multiple Regression in SPSS

• Method 2: enter the first model, then add another variable for the second model, etc. Enter:
  • The predicted (criterion) variable into the Dependent Variable field
  • The first predictor variable into the Independent Variable field
  • Click the Next button

Multiple Regression in SPSS

• Method 2, continued. Enter:
  • The second predictor variable into the Independent Variable field
  • Click the Statistics button and check the 'R squared change' box

Multiple Regression in SPSS

The output shows the results of two models:

• Model 1: the variables in the first model (math SAT), the r² for the first model, and the coefficient for var1 (var name)
• Model 2: the variables in the second model (math and verbal SAT), the r² for the second model, and the coefficients for var1 and var2 (var names)
• Change statistics: is the change in r² from Model 1 to Model 2 statistically significant? (A sketch of this test follows.)
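The significance test behind the 'R squared change' box is an F test on nested models. A minimal sketch, with hypothetical R² values and sample size rather than real SPSS output:

```python
from scipy import stats

def r2_change_test(r2_reduced, r2_full, n, k_reduced, k_full):
    """F test for the change in R^2 between nested regression models."""
    df1 = k_full - k_reduced           # number of predictors added
    df2 = n - k_full - 1               # error df for the full model
    f = ((r2_full - r2_reduced) / df1) / ((1 - r2_full) / df2)
    p = stats.f.sf(f, df1, df2)        # upper-tail p-value
    return f, p

# Hypothetical: math SAT alone gives R^2 = .36; adding verbal SAT gives .42 (n = 100)
f, p = r2_change_test(0.36, 0.42, n=100, k_reduced=1, k_full=2)
print(f"F change = {f:.2f}, p = {p:.4f}")
```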

Cautions in Multiple Regression

• We can use as many predictors as we wish, but we should be careful not to use more predictors than is warranted.
  – Simpler models are more likely to generalize to other samples.
  – If you use as many predictors as you have participants in your study, you can predict 100% of the variance. Although this may seem like a good thing, it is unlikely that your results would generalize to any other sample, and thus they are not valid.
  – You probably should have at least 10 participants per predictor variable (and probably should aim for about 30).

New (Final) Topic

Chi-Squared Test of Independence

Chi-Squared Test for Independence

A manufacturer of watches takes a sample of 200 people. Each person is classified by age, Young (under 30) vs. Old (over 30), and watch type preference (digital vs. analog).

The question: is there a relationship between age and watch preference?

Chi-Squared Test for Independence

Statistical analysis follows design: we have finished the top part of the chart! Focus on this section for the rest of the semester.

Chi-Squared Test for Independence

Step 1: State the hypotheses
– H0: Preference is independent of age ("no relationship")
– HA: Preference is related to age ("there is a relationship")

[Observed scores table]

Chi-Squared Test for Independence

Step 2: Compute your degrees of freedom and get the critical value:

df = (#Columns - 1) * (#Rows - 1) = (3 - 1) * (2 - 1) = 2

• Go to the chi-square statistic table (B-8) and find the critical value
• For this example, with df = 2 and α = 0.05, the critical chi-squared value is 5.99
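If you prefer software to table B-8, the same critical value comes out of SciPy (a quick check, not part of the lecture):

```python
from scipy.stats import chi2

# Critical value for alpha = .05 with df = 2; matches the table's 5.99
print(round(chi2.ppf(1 - 0.05, df=2), 2))  # 5.99
```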

Chi-Squared Test for Independence

Step 3: Collect the data. Obtain row and column totals (sometimes called the marginals) and calculate the expected frequencies.

[Observed scores table with marginals]

Spot check: make sure the row totals and the column totals add up to the same thing (the total N).

Chi-Squared Test for Independence

Step 3 (continued): calculate the expected frequencies. Each one is

fe = (row total * column total) / N

Expected scores:

             Digital   Analog   Undecided
  Under 30      70        56        14
  Over 30       30        24         6

"Expected frequencies": if the null hypothesis is correct, then these are the frequencies that you would expect.

Computing the Chi-square

Step 3 (continued): compute the χ²:

• Find the residuals (fo - fe) for each cell
• Square these differences
• Divide the squared differences by fe
• Sum the results

In symbols: χ² = Σ (fo - fe)² / fe. A worked sketch follows.
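A NumPy sketch of Steps 3 and 4 end to end. The slide's observed table did not survive the transcript, so the counts below are an assumption, chosen to reproduce the slide's marginals (140/60 rows; 100/80/20 columns), its expected frequencies, and its χ² of about 38.09:

```python
import numpy as np

# ASSUMED observed counts (rows: Under 30, Over 30; columns: digital, analog, undecided)
observed = np.array([[90, 40, 10],
                     [10, 40, 10]])

row_totals = observed.sum(axis=1, keepdims=True)
col_totals = observed.sum(axis=0, keepdims=True)
n = observed.sum()

expected = row_totals * col_totals / n          # fe = row total * column total / N
chi2_stat = ((observed - expected) ** 2 / expected).sum()

print(expected)                 # [[70. 56. 14.] [30. 24.  6.]]
print(f"{chi2_stat:.2f}")       # 38.10, matching the slide's 38.09 up to rounding
```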

Chi-Squared, the Final Step

Step 4: Compare this computed statistic (38.09) against the critical value (5.99) and make a decision about your hypotheses.

Since 38.09 > 5.99, here we reject H0 and conclude that there is a relationship between age and watch preference.

In SPSS

Analyze => Descriptives => Crosstabs. Select the two variables you want to examine (usually they are nominal or ordinal) and click the arrow to move one into the "Rows" box and one into the "Columns" box.

Click on the "Statistics" button, and check the "Chi-square" box.

Click "Continue," then click "OK."

SPSS Output

Look at the "Chi-Square Tests" box. The top row of this box gives results for "Pearson Chi-Square":

• "Value" is the value of the χ² statistic
• "df" is the degrees of freedom for the test
• "Asymp. Sig. (2-sided)" is the probability (p-value) associated with the test

The chi-squared distribution, like the F-distribution, is "squared," so a 1-tailed test is not possible. (A SciPy equivalent is sketched below.)
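For readers without SPSS, SciPy's chi2_contingency reports the same Pearson chi-square, df, and p-value; this reuses the assumed observed counts from the earlier sketch:

```python
from scipy.stats import chi2_contingency

observed = [[90, 40, 10],    # assumed counts, as in the earlier sketch
            [10, 40, 10]]

chi2_stat, p, dof, expected = chi2_contingency(observed)
print(f"Value = {chi2_stat:.2f}, df = {dof}, Asymp. Sig. (2-sided) = {p:.3g}")
```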
