Multiple Regression
Page 1: Multiple regression

Multiple Regression

Page 7: Multiple regression

One variable vs. multiple variables

Page 10: Multiple regression

• Example 1: a study of the relation of the amount of body fat to three body measurements

• Sample: 20 healthy females, 25-34 years old

• Y: body fat
• X1: triceps skinfold thickness
• X2: thigh circumference
• X3: midarm circumference

• It would be very helpful if a regression model with some or all of these predictor variables could provide reliable estimates of the amount of body fat.

Page 14: Multiple regression

• Assume X1 is the only predictor variable in the model

• SSR(X1): the regression sum of squares when only X1 is in the model

• SSE(X1): the corresponding error sum of squares

• SSR(X2|X1): the extra sum of squares, which measures the marginal effect of adding another variable (here X2) to the regression model when X1 is already in the model
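In symbols (standard definitions that the slide does not spell out), with \(\hat{Y}_i\) the fitted values from the model containing only X1 and \(\bar{Y}\) the mean response:

\[
\mathrm{SSE}(X_1)=\sum_{i=1}^{n}\bigl(Y_i-\hat{Y}_i\bigr)^2,\qquad
\mathrm{SSR}(X_1)=\sum_{i=1}^{n}\bigl(\hat{Y}_i-\bar{Y}\bigr)^2,\qquad
\mathrm{SSR}(X_2\mid X_1)=\mathrm{SSE}(X_1)-\mathrm{SSE}(X_1,X_2).
\]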

Page 15: Multiple regression

SSR(X2|X1) = SSE(X1) − SSE(X1, X2) = 143.12 − 109.95 = 33.17

= SSR(X1, X2) − SSR(X1) = 385.44 − 352.27 = 33.17

• SSR(X2|X1) reflects the additional (extra) reduction in the error sum of squares (SSE) associated with X2, given that X1 is already included in the model; equivalently, it is the marginal increase in the regression sum of squares (SSR). A computational sketch follows.
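A minimal sketch of how this quantity can be computed with ordinary least squares, using statsmodels (its .ssr attribute is the residual sum of squares). The data below are a synthetic stand-in, not the actual body-fat measurements, so the printed value will not equal 33.17.

import numpy as np
import statsmodels.api as sm

# Synthetic stand-in for the body-fat data (n = 20); not the real measurements.
rng = np.random.default_rng(0)
n = 20
X1 = rng.normal(25, 5, n)   # triceps skinfold thickness
X2 = rng.normal(50, 5, n)   # thigh circumference
Y = 0.5 * X1 + 0.3 * X2 + rng.normal(0, 2, n)   # body fat

def sse(y, *predictors):
    """Error sum of squares SSE for an OLS fit (with intercept) on the given predictors."""
    X = sm.add_constant(np.column_stack(predictors))
    return sm.OLS(y, X).fit().ssr   # .ssr is the residual (error) sum of squares

# Extra sum of squares: SSR(X2|X1) = SSE(X1) - SSE(X1, X2)
print(sse(Y, X1) - sse(Y, X1, X2))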

Page 17: Multiple regression

• An extra sum of squares for adding X3:

SSR(X3|X1, X2) = SSE(X1, X2) − SSE(X1, X2, X3) = 109.95 − 98.41 = 11.54
= SSR(X1, X2, X3) − SSR(X1, X2) = 396.98 − 385.44 = 11.54

• An extra sum of squares for adding X2 and X3 together:

SSR(X2, X3|X1) = SSE(X1) − SSE(X1, X2, X3) = 143.12 − 98.41 = 44.71
= SSR(X1, X2, X3) − SSR(X1) = 396.98 − 352.27 = 44.71

Page 18: Multiple regression

• A variety of decompositions of SSR into extra sums of squares

• Considering two X variables:

SSTO = SSR(X1) + SSE(X1)

= SSR(X1) + SSR(X2|X1) + SSE(X1, X2)

SSTO = SSR(X1, X2) + SSE(X1, X2)

SSR(X1, X2) = SSR(X1) + SSR(X2|X1)
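Plugging in the body-fat values quoted earlier verifies both routes to SSTO and the decomposition of SSR(X1, X2):

SSTO = SSR(X1) + SSE(X1) = 352.27 + 143.12 = 495.39
SSTO = SSR(X1, X2) + SSE(X1, X2) = 385.44 + 109.95 = 495.39
SSR(X1, X2) = SSR(X1) + SSR(X2|X1) = 352.27 + 33.17 = 385.44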

Page 19: Multiple regression

• Decomposition SSR(X1, X2) = SSR(X1) + SSR(X2|X1)

• SSR(X1): measures the contribution of including X1 alone in the model

• SSR(X2|X1): measures the additional contribution when X2 is included, given that X1 is already in the model

• The order of the X variables is arbitrary

• SSR(X1, X2) = SSR(X2) + SSR(X1|X2)

Page 20: Multiple regression

• When the regression model contains three X variables (X1, X2, X3), several decompositions are possible (a computational sketch follows these identities):

SSR(X1, X2, X3) = SSR(X1) + SSR(X2|X1) + SSR(X3|X1, X2)

= SSR(X2) + SSR(X3|X2) + SSR(X1|X2, X3)

= SSR(X3) + SSR(X1|X3) + SSR(X2|X1, X3)

= SSR(X1) + SSR(X2, X3|X1)
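These sequential decompositions are exactly what a Type I (sequential) ANOVA table reports. A minimal sketch with statsmodels, again on synthetic stand-in data rather than the actual body-fat measurements:

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Synthetic stand-in for the body-fat data; not the real measurements.
rng = np.random.default_rng(1)
n = 20
df = pd.DataFrame({
    "X1": rng.normal(25, 5, n),   # triceps skinfold thickness
    "X2": rng.normal(50, 5, n),   # thigh circumference
    "X3": rng.normal(28, 2, n),   # midarm circumference
})
df["Y"] = 0.5 * df.X1 + 0.3 * df.X2 - 0.2 * df.X3 + rng.normal(0, 2, n)

# Type I (sequential) sums of squares: the rows give SSR(X1), SSR(X2|X1),
# SSR(X3|X1, X2), and the Residual row gives SSE(X1, X2, X3).
fit = smf.ols("Y ~ X1 + X2 + X3", data=df).fit()
print(anova_lm(fit, typ=1)["sum_sq"])

Reordering the terms in the formula (e.g. "Y ~ X2 + X3 + X1") reproduces the other decompositions listed above.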

Page 25: Multiple regression

More about Predictors

Page 29: Multiple regression

• Residual plots can easily be used to examine other aspects of the aptness of the model; a sketch of such a plot follows.
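A minimal residuals-versus-fitted-values sketch on synthetic data (not the body-fat measurements); systematic patterns or a funnel shape in this plot would suggest nonlinearity or nonconstant error variance.

import numpy as np
import statsmodels.api as sm
import matplotlib.pyplot as plt

# Fit a small synthetic multiple regression as a stand-in example.
rng = np.random.default_rng(2)
X = sm.add_constant(rng.normal(size=(20, 3)))
y = X @ np.array([1.0, 0.5, 0.3, -0.2]) + rng.normal(0, 1, 20)
fit = sm.OLS(y, X).fit()

# Plot residuals against fitted values for a quick adequacy check.
plt.scatter(fit.fittedvalues, fit.resid)
plt.axhline(0, linestyle="--")
plt.xlabel("Fitted values")
plt.ylabel("Residuals")
plt.show()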