Multivariate Linear Regression Models
Shyh-Kang Jeng
Department of Electrical Engineering / Graduate Institute of Communication / Graduate Institute of Networking and Multimedia
2
Regression Analysis
A statistical methodology
– For predicting the value of one or more response (dependent) variables
– Predictions are made from a collection of predictor (independent) variables
3
Example 7.1 Fitting a Straight Line
Observed data:
z₁: 0 1 2 3 4
y:  1 4 3 8 9
Linear regression model (mean response): E(Y) = β₀ + β₁z₁
4
Example 7.1 Fitting a Straight Line
[Scatter plot of the observed (z, y) pairs, with z from 0 to 5 and y from 0 to 10.]
5
Classical Linear Regression Model

Y_j = β₀ + β₁z_{j1} + β₂z_{j2} + ⋯ + β_r z_{jr} + ε_j,  j = 1, …, n

In matrix notation, Y = Zβ + ε, where

Y = [Y₁, Y₂, …, Y_n]′ (n×1)
Z = the n×(r+1) design matrix whose j-th row is [1, z_{j1}, z_{j2}, …, z_{jr}]
β = [β₀, β₁, …, β_r]′ ((r+1)×1)
ε = [ε₁, ε₂, …, ε_n]′ (n×1)
6
Classical Linear Regression Model
Error assumptions:

E(ε_j) = 0, Var(ε_j) = σ², Cov(ε_j, ε_k) = 0 for j ≠ k

Equivalently, E(ε) = 0 and Cov(ε) = E(εε′) = σ²I.
7
Example 7.1
Y = Zβ + ε, i.e., Y_j = β₀ + β₁z_{j1} + ε_j for j = 1, …, 5, with

y′ = [1 4 3 8 9]
Z′ = [1 1 1 1 1
      0 1 2 3 4]
8
Examples 6.6 & 6.7
Samples from three populations: (9, 6, 9), (0, 2), (3, 1, 2)

SS_obs = 216, SS_mean = 128
SS_tr = 78, d.f. = 3 − 1 = 2
SS_res = 10, d.f. = (3 − 1) + (2 − 1) + (3 − 1) = 5

F = [SS_tr/(g − 1)] / [SS_res/(Σn_ℓ − g)] = (78/2)/(10/5) = 19.5 > F_{2,5}(0.01) = 13.27

H₀: τ₁ = τ₂ = τ₃ = 0 is rejected at the 1% level.
9
Example 7.2 One-Way ANOVA
Y_j = β₀ + β₁z_{j1} + β₂z_{j2} + β₃z_{j3} + ε_j, where
z_{ji} = 1 if the j-th observation is from population i, and 0 otherwise

y′ = [9 6 9 0 2 3 1 2]
Z′ = [1 1 1 1 1 1 1 1
      1 1 1 0 0 0 0 0
      0 0 0 1 1 0 0 0
      0 0 0 0 0 1 1 1]
10
Method of Least Squares
Select b = [b₀, b₁, …, b_r]′ so as to minimize

S(b) = Σ_{j=1}^n (y_j − b₀ − b₁z_{j1} − ⋯ − b_r z_{jr})² = (y − Zb)′(y − Zb)

β̂ = argmin_b S(b): least squares estimate
ŷ = Zβ̂: fitted values
ε̂ = y − ŷ = y − Zβ̂: residuals
11
Result 7.1
If Z has full rank r + 1 ≤ n, then

β̂ = (Z′Z)⁻¹Z′y
ŷ = Zβ̂ = Hy, where H = Z(Z′Z)⁻¹Z′
ε̂ = y − ŷ = (I − H)y, with Z′ε̂ = 0 and ŷ′ε̂ = 0

Residual sum of squares:

ε̂′ε̂ = y′(I − H)y = y′y − y′Zβ̂
12
Proof of Result 7.1
β̂ = (Z′Z)⁻¹Z′y satisfies Z′(y − Zβ̂) = Z′(I − Z(Z′Z)⁻¹Z′)y = (Z′ − Z′)y = 0. Then, for any b,

S(b) = (y − Zb)′(y − Zb)
     = (y − Zβ̂ + Z(β̂ − b))′(y − Zβ̂ + Z(β̂ − b))
     = (y − Zβ̂)′(y − Zβ̂) + (β̂ − b)′Z′Z(β̂ − b)

since the cross term 2(β̂ − b)′Z′(y − Zβ̂) = 0. Hence S(b) ≥ S(β̂), i.e., β̂ = argmin_b S(b).
13
Proof of Result 7.1
Since ε̂ = (I − H)y with I − H symmetric and idempotent, (I − H)′(I − H) = I − H, so

ε̂′ε̂ = y′(I − H)′(I − H)y = y′(I − H)y = y′y − y′Zβ̂

Also, from Z′ε̂ = 0, ŷ′ε̂ = β̂′Z′ε̂ = 0.
14
Example 7.1 Fitting a Straight Line
Observed data:
z₁: 0 1 2 3 4
y:  1 4 3 8 9
Linear regression model (mean response): E(Y) = β₀ + β₁z₁
15
Example 7.3

Z′Z = [ 5 10 ; 10 30 ],  (Z′Z)⁻¹ = [ 0.6 −0.2 ; −0.2 0.1 ],  Z′y = [25, 70]′

β̂ = (Z′Z)⁻¹Z′y = [1, 2]′, so β̂₀ = 1, β̂₁ = 2

ŷ′ = [1 3 5 7 9],  ε̂′ = [0 1 −2 1 0]

Residual sum of squares: ε̂′ε̂ = 6
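The slide's arithmetic can be replayed numerically. A minimal NumPy sketch, assuming nothing beyond the data and formulas of Examples 7.1 and 7.3:

```python
import numpy as np

# Data from Example 7.1: z1 = 0..4, y = (1, 4, 3, 8, 9).
z = np.array([0., 1., 2., 3., 4.])
y = np.array([1., 4., 3., 8., 9.])
Z = np.column_stack([np.ones_like(z), z])   # design matrix with intercept column

# Least squares (Result 7.1): beta_hat = (Z'Z)^{-1} Z'y
beta_hat = np.linalg.solve(Z.T @ Z, Z.T @ y)

y_hat = Z @ beta_hat          # fitted values
resid = y - y_hat             # residuals
rss = resid @ resid           # residual sum of squares

print(Z.T @ Z)        # [[ 5. 10.] [10. 30.]]
print(beta_hat)       # [1. 2.]  -> fitted line  y = 1 + 2z
print(resid)          # [ 0.  1. -2.  1.  0.]
print(rss)            # 6.0
```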
16
Coefficient of Determination
Since ŷ′ε̂ = 0 and 1′ε̂ = 0,

Σ_{j=1}^n (y_j − ȳ)² = Σ_{j=1}^n (ŷ_j − ȳ)² + Σ_{j=1}^n ε̂_j²

R² = Σ_{j=1}^n (ŷ_j − ȳ)² / Σ_{j=1}^n (y_j − ȳ)² = 1 − ε̂′ε̂ / Σ_{j=1}^n (y_j − ȳ)²
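For the straight-line fit of Example 7.1 the decomposition works out as follows (a sketch; the 40/46 split is computed from the slide's data, not stated on the slide):

```python
import numpy as np

# Straight-line fit of Example 7.1/7.3.
z = np.array([0., 1., 2., 3., 4.])
y = np.array([1., 4., 3., 8., 9.])
Z = np.column_stack([np.ones_like(z), z])
beta_hat = np.linalg.solve(Z.T @ Z, Z.T @ y)
y_hat = Z @ beta_hat
resid = y - y_hat

sst = np.sum((y - y.mean())**2)       # total (mean-corrected) sum of squares
ssr = np.sum((y_hat - y.mean())**2)   # regression sum of squares
sse = resid @ resid                   # residual sum of squares

r2 = 1 - sse / sst
print(sst, ssr, sse)   # 46.0 40.0 6.0
print(r2)              # 0.869565...
```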
17
Geometry of Least Squares
(observed vector) y, and (vector in the model plane) Zβ, the plane spanned by the columns of Z:

S(b) = (y − Zb)′(y − Zb)

β̂ = argmin_b S(b): the fitted vector ŷ = Zβ̂ lies in the model plane, and the residual ε̂ = y − Zβ̂ is perpendicular to that plane.
18
Geometry of Least Squares
[Figure: projection of y onto the plane spanned by the columns of Z.]
19
Projection Matrix
Let q₁, …, q_{r+1} be an orthonormal basis for the plane spanned by the columns of Z (e.g., from Gram–Schmidt, q_i = e_i(e_i′e_i)^{−1/2}). The projection of y on the model plane constructed by q₁, …, q_{r+1} is

Σ_{i=1}^{r+1} (q_i′y)q_i = (Σ_{i=1}^{r+1} q_i q_i′) y = Z(Z′Z)⁻¹Z′y = Zβ̂

so the projection matrix is H = Z(Z′Z)⁻¹Z′ = Σ_{i=1}^{r+1} q_i q_i′.
20
Result 7.2
For the least squares estimator β̂ = (Z′Z)⁻¹Z′Y:

E(β̂) = β, Cov(β̂) = σ²(Z′Z)⁻¹

For the residuals ε̂ = (I − H)Y, with H = Z(Z′Z)⁻¹Z′:

E(ε̂) = 0, Cov(ε̂) = σ²(I − H), E(ε̂′ε̂) = (n − r − 1)σ²

so s² = ε̂′ε̂/(n − r − 1) is an unbiased estimator of σ². Also, β̂ and ε̂ are uncorrelated.
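A sketch of Result 7.2's estimates on the Example 7.1 data: s², the estimated covariance s²(Z′Z)⁻¹, and the orthogonality Z′ε̂ = 0 (all numbers follow from the earlier slides):

```python
import numpy as np

z = np.array([0., 1., 2., 3., 4.])
y = np.array([1., 4., 3., 8., 9.])
Z = np.column_stack([np.ones_like(z), z])
n, r_plus_1 = Z.shape

beta_hat = np.linalg.solve(Z.T @ Z, Z.T @ y)
resid = y - Z @ beta_hat

# Unbiased variance estimate: s^2 = ehat'ehat / (n - r - 1)
s2 = (resid @ resid) / (n - r_plus_1)
cov_beta = s2 * np.linalg.inv(Z.T @ Z)   # estimated Cov(beta_hat) = s^2 (Z'Z)^{-1}

print(s2)            # 2.0  (= 6 / 3)
print(cov_beta)      # [[ 1.2 -0.4] [-0.4  0.2]]
print(Z.T @ resid)   # ~[0, 0]: residuals are orthogonal to the columns of Z
```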
21
Proof of Result 7.2
β̂ = (Z′Z)⁻¹Z′Y = (Z′Z)⁻¹Z′(Zβ + ε) = β + (Z′Z)⁻¹Z′ε

E(β̂) = β + (Z′Z)⁻¹Z′E(ε) = β
Cov(β̂) = (Z′Z)⁻¹Z′Cov(ε)Z(Z′Z)⁻¹ = σ²(Z′Z)⁻¹

ε̂ = (I − H)Y = (I − H)(Zβ + ε) = (I − H)ε, since (I − H)Z = Z − Z(Z′Z)⁻¹Z′Z = 0

E(ε̂) = (I − H)E(ε) = 0
Cov(ε̂) = (I − H)Cov(ε)(I − H)′ = σ²(I − H)
22
Proof of Result 7.2
Cov(β̂, ε̂) = E[(β̂ − β)ε̂′] = (Z′Z)⁻¹Z′E(εε′)(I − H)′ = σ²(Z′Z)⁻¹Z′(I − H) = 0

E(ε̂′ε̂) = E[ε′(I − H)ε] = E[tr((I − H)εε′)] = tr((I − H)σ²I) = σ² tr(I − H)

tr(I − H) = n − tr(Z(Z′Z)⁻¹Z′) = n − tr((Z′Z)⁻¹Z′Z) = n − (r + 1)

so E(ε̂′ε̂) = (n − r − 1)σ².
23
Result 7.3 (Gauss Least Squares Theorem)
Let Y = Zβ + ε with E(ε) = 0 and Cov(ε) = σ²I. For any c, the estimator

c′β̂ = c₀β̂₀ + c₁β̂₁ + ⋯ + c_rβ̂_r

of c′β has the smallest possible variance among all estimators of the form a′Y = a₁Y₁ + a₂Y₂ + ⋯ + a_nY_n that are unbiased for c′β.
24
Proof of Result 7.3
For a′Y an unbiased estimator of c′β: E(a′Y) = a′Zβ + a′E(ε) = a′Zβ = c′β for all β, so Z′a = c.

c′β̂ = c′(Z′Z)⁻¹Z′Y = a*′Y, with a* = Z(Z′Z)⁻¹c and Z′a* = c.

Var(a′Y) = Var(a′Zβ + a′ε) = σ²a′a = σ²(a − a* + a*)′(a − a* + a*)
         = σ²[(a − a*)′(a − a*) + a*′a*]

since (a − a*)′a* = (Z′a − Z′a*)′(Z′Z)⁻¹c = 0. The variance is minimum when a = a*, i.e., when a′Y = a*′Y = c′β̂ (BLUE).
25
Result 7.4
Let Y = Zβ + ε, where ε ~ N_n(0, σ²I) and Z has full rank r + 1. Then:

– The maximum likelihood estimator of β is the same as the least squares estimator: β̂ = (Z′Z)⁻¹Z′Y
– β̂ ~ N_{r+1}(β, σ²(Z′Z)⁻¹), independent of ε̂
– σ̂² = ε̂′ε̂/n is the maximum likelihood estimator of σ²
– nσ̂² = ε̂′ε̂ ~ σ²χ²_{n−r−1}
26
Proof of Result 7.4
L(β, σ²) = Π_{j=1}^n (2πσ²)^{−1/2} e^{−ε_j²/2σ²} = (2π)^{−n/2}(σ²)^{−n/2} e^{−ε′ε/2σ²}
        = (2π)^{−n/2}(σ²)^{−n/2} e^{−(y − Zβ)′(y − Zβ)/2σ²}

For fixed σ²,

argmax_β L(β, σ²) = argmin_β (y − Zβ)′(y − Zβ)

so β̂ = (Z′Z)⁻¹Z′y, independent of σ².
27
Proof of Result 7.4
σ̂² = argmax_{σ²} L(β̂, σ²) gives

σ̂² = (y − Zβ̂)′(y − Zβ̂)/n = ε̂′ε̂/n
28
Proof of Result 4.11
Exponent of L(μ, Σ):

−(1/2) tr[Σ⁻¹ Σ_{j=1}^n (x_j − x̄)(x_j − x̄)′] − (n/2)(x̄ − μ)′Σ⁻¹(x̄ − μ)

Maximizing over μ and Σ:

μ̂ = x̄, Σ̂ = (1/n) Σ_{j=1}^n (x_j − x̄)(x_j − x̄)′ = ((n − 1)/n)S

max_{μ,Σ} L(μ, Σ) = (2π)^{−np/2} |Σ̂|^{−n/2} e^{−np/2}
29
Proof of Result 7.4
Stack the estimators as a linear function of ε:

[ β̂ − β ; ε̂ ] = [ (Z′Z)⁻¹Z′ ; I − Z(Z′Z)⁻¹Z′ ] ε = Aε

Cov[ β̂ ; ε̂ ] = σ² [ (Z′Z)⁻¹  0′ ; 0  I − Z(Z′Z)⁻¹Z′ ]

Since β̂ and ε̂ are jointly normal with zero covariance, they are independent.
30
Proof of Result 7.4
I − Z(Z′Z)⁻¹Z′ is symmetric and idempotent, with

tr[I − Z(Z′Z)⁻¹Z′] = n − tr[(Z′Z)⁻¹Z′Z] = n − (r + 1)

Its eigenvalues are 1 (with multiplicity n − r − 1) and 0 (with multiplicity r + 1), so by the spectral decomposition

I − Z(Z′Z)⁻¹Z′ = Σ_{i=1}^{n−r−1} e_i e_i′

for orthonormal eigenvectors e₁, …, e_{n−r−1}.
31
Proof of Result 7.4
Let V_i = e_i′ε. Then E(V_i) = 0 and Cov(V_i, V_k) = e_i′Cov(ε)e_k = σ²e_i′e_k, so the V_i are independent N(0, σ²). Hence

nσ̂² = ε̂′ε̂ = ε′(I − Z(Z′Z)⁻¹Z′)ε = Σ_{i=1}^{n−r−1} (e_i′ε)² = Σ_{i=1}^{n−r−1} V_i² ~ σ²χ²_{n−r−1}
32
χ² Distribution
If X ~ N(μ, σ²), then Z = (X − μ)/σ ~ N(0, 1). For independent Z₁, …, Z_n ~ N(0, 1), the sum Σ_{i=1}^n Z_i² has the χ²_n distribution, with density

f(x) = [2^{n/2} Γ(n/2)]⁻¹ x^{n/2−1} e^{−x/2},  x > 0

(a Gamma distribution with α = n/2 and β = 2). n: degrees of freedom (d.f.).
33
Result 7.5
Let Y = Zβ + ε, ε ~ N_n(0, σ²I). A 100(1 − α)% confidence region for β is

(β − β̂)′Z′Z(β − β̂) ≤ (r + 1)s² F_{r+1,n−r−1}(α)

Simultaneous 100(1 − α)% confidence intervals for the β_i:

β̂_i ± √(Var̂(β̂_i)) √((r + 1)F_{r+1,n−r−1}(α))

where Var̂(β̂_i) = s² × (diagonal element of (Z′Z)⁻¹ corresponding to β̂_i).
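A sketch of the simultaneous intervals of Result 7.5, applied for illustration to the small Example 7.1 fit (this application is not one of the slide examples); scipy.stats supplies the F percentile:

```python
import numpy as np
from scipy import stats

# Example 7.1 data, used here only to exercise the general formulas.
z = np.array([0., 1., 2., 3., 4.])
y = np.array([1., 4., 3., 8., 9.])
Z = np.column_stack([np.ones_like(z), z])
n, rp1 = Z.shape                          # rp1 = r + 1

ZtZ_inv = np.linalg.inv(Z.T @ Z)
beta_hat = ZtZ_inv @ Z.T @ y
resid = y - Z @ beta_hat
s2 = (resid @ resid) / (n - rp1)

alpha = 0.05
# Simultaneous half-width factor: sqrt((r+1) F_{r+1, n-r-1}(alpha))
radius = np.sqrt(rp1 * stats.f.ppf(1 - alpha, rp1, n - rp1))
se = np.sqrt(s2 * np.diag(ZtZ_inv))       # sqrt of estimated Var(beta_hat_i)

lower, upper = beta_hat - radius * se, beta_hat + radius * se
for i in range(rp1):
    print(f"beta_{i}: [{lower[i]:.3f}, {upper[i]:.3f}]")
```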
34
Proof of Result 7.5
V = β̂ − β ~ N_{r+1}(0, σ²(Z′Z)⁻¹), so V* = (Z′Z)^{1/2}V/σ ~ N_{r+1}(0, I) and

(β̂ − β)′Z′Z(β̂ − β)/σ² = V*′V* ~ χ²_{r+1}

which is independent of (n − r − 1)s²/σ² ~ χ²_{n−r−1}. Therefore

[(β̂ − β)′Z′Z(β̂ − β)/(r + 1)] / s² ~ F_{r+1,n−r−1}

Confidence region: (β̂ − β)′Z′Z(β̂ − β) ≤ (r + 1)s²F_{r+1,n−r−1}(α) holds with probability 1 − α.
35
Example 7.4 (Real Estate Data)
20 homes in a Milwaukee, Wisconsin, neighborhood

Regression model: Y_j = β₀ + β₁z_{j1} + β₂z_{j2} + ε_j

z₁: total dwelling size (hundreds of square feet)
z₂: assessed value (thousands of dollars)
Y: selling price (thousands of dollars)
36
Example 7.4
Diagonal elements of (Z′Z)⁻¹: 5.1523, 0.0512, 0.0067

β̂ = (Z′Z)⁻¹Z′y = [30.967, 2.634, 0.045]′

Fitted equation: ŷ = 30.967 + 2.634z₁ + 0.045z₂, with s = 3.473, R² = 0.834

Standard errors s√[(Z′Z)⁻¹]_ii: 7.88, 0.785, 0.285

95% confidence interval for β₂:
β̂₂ ± t₁₇(0.025)√(Var̂(β̂₂)) = 0.045 ± 2.110(0.285), i.e., (−0.556, 0.647)
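The interval on this slide is a one-at-a-time t interval, and its arithmetic can be replayed from the quantities shown (0.045, 0.285, 17 d.f.). The slide's 0.647 endpoint reflects a slightly more precise standard error than the rounded 0.285:

```python
from scipy import stats

# Quantities from this slide (Example 7.4): beta2_hat, its standard error,
# and n - r - 1 = 20 - 2 - 1 = 17 degrees of freedom.
beta2_hat, se2, df = 0.045, 0.285, 17

t_crit = stats.t.ppf(0.975, df)     # t_17(0.025) = 2.110 to three decimals
lo = beta2_hat - t_crit * se2
hi = beta2_hat + t_crit * se2
print(round(t_crit, 3))             # 2.11
print(round(lo, 3), round(hi, 3))   # -0.556 0.646 (slide rounds the upper end to 0.647)
```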
37
Result 7.6
Let Y = Zβ + ε with ε ~ N_n(0, σ²I), and partition

Z = [Z₁ | Z₂],  β = [β₍₁₎ ; β₍₂₎]

where Z₁ has q + 1 columns and β₍₂₎ contains the last r − q coefficients. The likelihood ratio test rejects H₀: β₍₂₎ = 0 if

F = [SS_res(Z₁) − SS_res(Z)]/(r − q) / s² > F_{r−q,n−r−1}(α)

where SS_res(Z₁) = (y − Z₁β̂₍₁₎)′(y − Z₁β̂₍₁₎) with β̂₍₁₎ = (Z₁′Z₁)⁻¹Z₁′y (and similarly SS_res(Z) = (y − Zβ̂)′(y − Zβ̂) for the full model), and s² = SS_res(Z)/(n − r − 1).
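Result 7.6 can be wrapped in a small helper. The function below is a sketch (extra_ss_f_test is a name invented here, not from the slides); the toy check asks whether a quadratic term improves the Example 7.1 straight-line fit:

```python
import numpy as np
from scipy import stats

def extra_ss_f_test(Z, Z1, y, alpha=0.05):
    """Extra sum of squares (likelihood ratio) test of Result 7.6:
    H0: the predictors in Z but not in Z1 have zero coefficients."""
    def rss(X):
        b = np.linalg.lstsq(X, y, rcond=None)[0]
        e = y - X @ b
        return e @ e
    n = len(y)
    r1, q1 = Z.shape[1], Z1.shape[1]          # r + 1 and q + 1
    s2 = rss(Z) / (n - r1)
    F = (rss(Z1) - rss(Z)) / (r1 - q1) / s2
    crit = stats.f.ppf(1 - alpha, r1 - q1, n - r1)
    return F, crit, F > crit

# Toy check: does z^2 add anything to the Example 7.1 straight-line fit?
z = np.array([0., 1., 2., 3., 4.])
y = np.array([1., 4., 3., 8., 9.])
Z1 = np.column_stack([np.ones_like(z), z])
Z = np.column_stack([Z1, z**2])
print(extra_ss_f_test(Z, Z1, y))   # small F: the quadratic term is not needed
```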
38
Effect of Rank
In situations where Z is not of full rank, rank(Z) replaces r + 1 and rank(Z₁) replaces q + 1 in Result 7.6.
39
Proof of Result 7.6

max_{β,σ²} L(β, σ²) = (2π)^{−n/2} σ̂^{−n} e^{−n/2}

where the maximum occurs at β̂ = (Z′Z)⁻¹Z′y and nσ̂² = (y − Zβ̂)′(y − Zβ̂).

Under H₀: β₍₂₎ = 0, Y = Z₁β₍₁₎ + ε and

max_{β₍₁₎,σ²} L(β₍₁₎, σ²) = (2π)^{−n/2} σ̂₁^{−n} e^{−n/2}

where the maximum occurs at β̂₍₁₎ = (Z₁′Z₁)⁻¹Z₁′y and nσ̂₁² = (y − Z₁β̂₍₁₎)′(y − Z₁β̂₍₁₎).
40
Proof of Result 7.6
Reject H₀: β₍₂₎ = 0 for small values of the likelihood ratio

max_{β₍₁₎,σ²} L / max_{β,σ²} L = (σ̂²/σ̂₁²)^{n/2} = [1 + (σ̂₁² − σ̂²)/σ̂²]^{−n/2}

equivalently, for large σ̂₁²/σ̂², or for large (σ̂₁² − σ̂²)/σ̂². With nσ̂₁² = SS_res(Z₁) and nσ̂² = SS_res(Z),

[(nσ̂₁² − nσ̂²)/(r − q)] / [nσ̂²/(n − r − 1)] = [SS_res(Z₁) − SS_res(Z)]/(r − q) / s² = F ~ F_{r−q,n−r−1}
41
Wishart Distribution
W_m(A|Σ): the distribution of A = Σ_{j=1}^m Z_jZ_j′, where Z₁, …, Z_m are independent N_p(0, Σ). For A positive definite, the density is

w_{n−1}(A|Σ) = |A|^{(n−p−2)/2} e^{−tr(AΣ⁻¹)/2} / [2^{p(n−1)/2} π^{p(p−1)/4} |Σ|^{(n−1)/2} Π_{i=1}^p Γ((n − i)/2)]

Properties:
– A₁ ~ W_{m₁}(A₁|Σ) independent of A₂ ~ W_{m₂}(A₂|Σ) ⇒ A₁ + A₂ ~ W_{m₁+m₂}(A₁ + A₂|Σ)
– A ~ W_m(A|Σ) ⇒ CAC′ ~ W_m(CAC′|CΣC′)
42
Generalization of Result 7.6
Let C be an (r − q) × (r + 1) matrix of full rank. Reject H₀: Cβ = 0 at level α if

(Cβ̂)′[C(Z′Z)⁻¹C′]⁻¹(Cβ̂) / ((r − q)s²) > F_{r−q,n−r−1}(α)

since Cβ̂ ~ N_{r−q}(Cβ, σ²C(Z′Z)⁻¹C′).
43
Example 7.5 (Service Ratings Data)
44
Example 7.5: Design Matrix
45
Example 7.5
Two-way model with location (3 levels), gender (2 levels), and interaction terms γ₁₁, γ₁₂, γ₂₁, γ₂₂, γ₃₁, γ₃₂; n = 18.

Z: full design matrix, rank(Z) = 6, SS_res(Z) = 2977.4, d.f. = 18 − 6 = 12
Z₁: first six columns of Z (no interaction terms), rank(Z₁) = 4, SS_res(Z₁) = 3419.1, d.f. = 18 − 4 = 14

H₀: γ₁₁ = γ₁₂ = γ₂₁ = γ₂₂ = γ₃₁ = γ₃₂ = 0

F = [(SS_res(Z₁) − SS_res(Z))/2] / [SS_res(Z)/12] = 0.89

insignificant for any appropriate level. We can further verify that there is no location effect, but that the gender effect is significant.
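The F statistic quoted on this slide follows directly from the two residual sums of squares:

```python
# Arithmetic behind the Example 7.5 F statistic:
# SS_res(Z1) = 3419.1 on 14 d.f., SS_res(Z) = 2977.4 on 12 d.f.
ss_res_reduced = 3419.1
ss_res_full = 2977.4
df_num, df_den = 2, 12          # rank(Z) - rank(Z1) = 6 - 4 = 2; 18 - 6 = 12

F = ((ss_res_reduced - ss_res_full) / df_num) / (ss_res_full / df_den)
print(round(F, 2))   # 0.89
```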
46
Result 7.7
Let Y = Zβ + ε with ε ~ N_n(0, σ²I), and let z₀′ = [1, z₀₁, …, z₀ᵣ]. Then:

– z₀′β̂ is the unbiased estimator of E(Y₀|z₀) = z₀′β with minimum variance
– Var(z₀′β̂) = σ² z₀′(Z′Z)⁻¹z₀

A 100(1 − α)% confidence interval for E(Y₀|z₀) = z₀′β is

z₀′β̂ ± t_{n−r−1}(α/2) s√(z₀′(Z′Z)⁻¹z₀)
47
Proof of Result 7.7
z₀′β̂ is a linear combination of the β̂_i, so by Result 7.3 it is the unbiased estimator of z₀′β with minimum variance, and

Var(z₀′β̂) = z₀′Cov(β̂)z₀ = σ² z₀′(Z′Z)⁻¹z₀

Since β̂ ~ N_{r+1}(β, σ²(Z′Z)⁻¹), z₀′β̂ ~ N(z₀′β, σ² z₀′(Z′Z)⁻¹z₀), independent of s², with (n − r − 1)s²/σ² ~ χ²_{n−r−1}. Hence

(z₀′β̂ − z₀′β) / (s√(z₀′(Z′Z)⁻¹z₀)) ~ t_{n−r−1}

which gives the confidence interval.
48
Result 7.8
Let Y₀ = z₀′β + ε₀, where ε₀ ~ N(0, σ²) is independent of ε (and hence of β̂ and s²). Then:

– z₀′β̂ is the unbiased predictor of Y₀
– Var(Y₀ − z₀′β̂) = σ²(1 + z₀′(Z′Z)⁻¹z₀)

A 100(1 − α)% prediction interval for Y₀ is

z₀′β̂ ± t_{n−r−1}(α/2) s√(1 + z₀′(Z′Z)⁻¹z₀)
49
Proof of Result 7.8
Forecast error: Y₀ − z₀′β̂ = ε₀ + z₀′(β − β̂)

E(Y₀ − z₀′β̂) = E(ε₀) + z₀′E(β − β̂) = 0
Var(Y₀ − z₀′β̂) = Var(ε₀) + Var(z₀′β̂) = σ² + σ² z₀′(Z′Z)⁻¹z₀ = σ²(1 + z₀′(Z′Z)⁻¹z₀)

(Y₀ − z₀′β̂)/(σ√(1 + z₀′(Z′Z)⁻¹z₀)) ~ N(0, 1), independent of (n − r − 1)s²/σ² ~ χ²_{n−r−1}, so

(Y₀ − z₀′β̂) / (s√(1 + z₀′(Z′Z)⁻¹z₀)) ~ t_{n−r−1}

which gives the prediction interval.
50
Example 7.6 (Computer Data)
51
Example 7.6
z₀′ = [1, 130, 7.5], β̂ = [8.42, 1.08, 0.42]′, s = 1.204, n − r − 1 = 7 − 3 = 4

(Z′Z)⁻¹ = [ 8.17969  −0.06411   0.08831
           −0.06411   0.00052  −0.00107
            0.08831  −0.00107   0.01440 ]

ŷ = z₀′β̂ = 8.42 + 1.08(130) + 0.42(7.5) = 151.97

s√(z₀′(Z′Z)⁻¹z₀) = 0.71, t₄(0.025) = 2.776

95% confidence interval for the mean CPU time:
z₀′β̂ ± t₄(0.025) s√(z₀′(Z′Z)⁻¹z₀) = 151.97 ± 2.776(0.71), or (150.00, 153.94)

95% prediction interval at z₀:
z₀′β̂ ± t₄(0.025) s√(1 + z₀′(Z′Z)⁻¹z₀), or (148.08, 155.86)
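Both intervals can be reproduced from the quantities on this slide. The off-diagonal signs of (Z′Z)⁻¹ are not legible in the source; the signs used below were inferred so that s√(z₀′(Z′Z)⁻¹z₀) comes out to the slide's 0.71, and the interval endpoints then match to within rounding:

```python
import numpy as np
from scipy import stats

# Slide quantities for Example 7.6; off-diagonal signs of (Z'Z)^{-1} inferred.
z0 = np.array([1., 130., 7.5])
beta_hat = np.array([8.42, 1.08, 0.42])
ZtZ_inv = np.array([[ 8.17969, -0.06411,  0.08831],
                    [-0.06411,  0.00052, -0.00107],
                    [ 0.08831, -0.00107,  0.01440]])
s, df = 1.204, 4                       # n - r - 1 = 7 - 3 = 4

y0_hat = z0 @ beta_hat                 # 151.97
h = z0 @ ZtZ_inv @ z0                  # z0'(Z'Z)^{-1} z0
t_crit = stats.t.ppf(0.975, df)        # t_4(0.025) = 2.776

ci = y0_hat + np.array([-1, 1]) * t_crit * s * np.sqrt(h)      # mean response
pi = y0_hat + np.array([-1, 1]) * t_crit * s * np.sqrt(1 + h)  # new observation
print(np.round(ci, 2))   # close to the slide's (150.00, 153.94)
print(np.round(pi, 2))   # close to the slide's (148.08, 155.86)
```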
52
Adequacy of the Model
Residuals:

ε̂_j = y_j − β̂₀ − β̂₁z_{j1} − ⋯ − β̂_r z_{jr},  j = 1, 2, …, n
ε̂ = (I − Z(Z′Z)⁻¹Z′)y = (I − H)y

ε̂ is an estimate of ε ~ N(0, σ²I); if the model is adequate, the residuals should behave like independent N(0, σ²) draws.
53
Residual Plots
54
Q-Q Plots and Histograms
Used to detect the presence of unusual observations or severe departures from normality that may require special attention in the analysis.
If n is large, minor departures from normality will not greatly affect inferences about β.
55
Test of Independence of Time
Test constructed from the first autocorrelation of the residuals:

r₁ = Σ_{j=2}^n ε̂_j ε̂_{j−1} / Σ_{j=1}^n ε̂_j²

Durbin–Watson test:

D = Σ_{j=2}^n (ε̂_j − ε̂_{j−1})² / Σ_{j=1}^n ε̂_j² ≈ 2(1 − r₁)
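Both statistics are one-liners; below they are applied to the residuals [0, 1, −2, 1, 0] of Example 7.3 (an illustration, not a slide example). Here the end residuals are zero, so D equals 2(1 − r₁) exactly:

```python
import numpy as np

def lag1_autocorrelation(resid):
    """r1 = sum_{j>=2} e_j e_{j-1} / sum_j e_j^2 (formula on this slide)."""
    e = np.asarray(resid, dtype=float)
    return np.sum(e[1:] * e[:-1]) / np.sum(e**2)

def durbin_watson(resid):
    """D = sum_{j>=2} (e_j - e_{j-1})^2 / sum_j e_j^2, roughly 2(1 - r1)."""
    e = np.asarray(resid, dtype=float)
    return np.sum(np.diff(e)**2) / np.sum(e**2)

e = [0., 1., -2., 1., 0.]     # residuals of the Example 7.3 fit
r1 = lag1_autocorrelation(e)
dw = durbin_watson(e)
print(r1, dw)                 # -2/3 and 10/3
```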
56
Example 7.7: Residual Plot
57
Leverage
"Outliers" in either the response or explanatory variables may have a considerable effect on the analysis and determine the fit.
Leverage for simple linear regression with one explanatory variable z:

h_jj = 1/n + (z_j − z̄)² / Σ_{i=1}^n (z_i − z̄)²,  average = (r + 1)/n
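A sketch on the Example 7.1 design: the simple-regression formula agrees with the diagonal of the hat matrix H = Z(Z′Z)⁻¹Z′, and the leverages average (r + 1)/n:

```python
import numpy as np

z = np.array([0., 1., 2., 3., 4.])
n = len(z)

# Simple-regression leverages: h_jj = 1/n + (z_j - zbar)^2 / sum_i (z_i - zbar)^2
h = 1 / n + (z - z.mean())**2 / np.sum((z - z.mean())**2)

# Same numbers from the diagonal of the hat matrix H = Z(Z'Z)^{-1}Z'
Z = np.column_stack([np.ones_like(z), z])
H = Z @ np.linalg.inv(Z.T @ Z) @ Z.T

print(h)            # [0.6 0.3 0.2 0.3 0.6]
print(np.diag(H))   # same values
print(h.mean())     # 0.4 = (r + 1)/n with r = 1, n = 5
```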
58
Mallow's C_p Statistic
Select variables from all possible combinations:

C_p = (residual sum of squares for the subset model with p parameters, including an intercept) / (residual variance for the full model) − (n − 2p)
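A sketch of C_p over subsets. The data here are synthetic (y depends on z1 only), invented purely for illustration; note that the full model always has C_p = p exactly:

```python
import numpy as np

def mallows_cp(Z_full, y, cols):
    """Mallow's C_p for the subset using the intercept plus columns `cols`:
    C_p = RSS_subset / s2_full - (n - 2p), p = number of parameters."""
    n = len(y)
    def rss(X):
        b = np.linalg.lstsq(X, y, rcond=None)[0]
        e = y - X @ b
        return e @ e
    s2_full = rss(Z_full) / (n - Z_full.shape[1])   # residual variance, full model
    X = Z_full[:, [0] + list(cols)]                 # column 0 is the intercept
    return rss(X) / s2_full - (n - 2 * X.shape[1])

# Synthetic data: y depends on z1 but not on z2.
rng = np.random.default_rng(0)
z1 = np.arange(10.0)
z2 = rng.normal(size=10)
y = 1 + 2 * z1 + rng.normal(scale=0.5, size=10)
Z = np.column_stack([np.ones(10), z1, z2])

for cols in [(1,), (2,), (1, 2)]:
    print(cols, round(mallows_cp(Z, y, cols), 2))
```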
59
Usage of Mallow's C_p Statistic
60
Stepwise Regression
1. The predictor variable that explains the largest significant proportion of the variation in Y is the first variable to enter.
2. The next to enter is the one that makes the highest contribution to the regression sum of squares. Use Result 7.6 to determine the significance (F-test).
61
Stepwise Regression
3. Once a new variable is included, the individual contributions to the regression sum of squares of the other variables already in the equation are checked using F-tests. If the F-statistic is small, the variable is deleted.
4. Steps 2 and 3 are repeated until all possible additions are non-significant and all possible deletions are significant.
62
Treatment of Collinearity
If Z is not of full rank, Z′Z does not have an inverse: the columns of Z are collinear.
Exact collinearity is not likely, but it is possible for a linear combination of the columns of Z to be nearly 0.
Can be overcome somewhat by:
– Deleting one of a pair of predictor variables that are strongly correlated
– Relating the response Y to the principal components of the predictor variables
63
Bias Caused by a Misspecified Model
Mean Corrected Form of the Regression Model

Y_j = β₀ + β₁z_{j1} + ⋯ + β_r z_{jr} + ε_j
    = β₀* + β₁(z_{j1} − z̄₁) + ⋯ + β_r(z_{jr} − z̄_r) + ε_j

with β₀* = β₀ + β₁z̄₁ + ⋯ + β_r z̄_r. The mean corrected design matrix is

Z_c = [ z_{11} − z̄₁ ⋯ z_{1r} − z̄_r
        z_{21} − z̄₁ ⋯ z_{2r} − z̄_r
        ⋮
        z_{n1} − z̄₁ ⋯ z_{nr} − z̄_r ]

Since 1′Z_c = 0,

[1 Z_c]′[1 Z_c] = [ n 0′ ; 0 Z_c′Z_c ]
108
Mean Corrected Form of the Regression Model

β̂* = [ β̂₀* ; β̂_c ], with β̂₀* = ȳ and β̂_c = (Z_c′Z_c)⁻¹Z_c′y

Cov(β̂*) = σ² [ 1/n 0′ ; 0 (Z_c′Z_c)⁻¹ ]

i.e., Var(β̂₀*) = σ²/n, Cov(β̂_c) = σ²(Z_c′Z_c)⁻¹, and Cov(β̂₀*, β̂_c) = 0.

The original intercept is recovered as β̂₀ = β̂₀* − β̂_c′z̄ = ȳ − β̂_c′z̄.
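The block-diagonal structure can be verified on the Example 7.1 data: mean-correcting the predictor decouples the intercept (ȳ) from the slope, and the original intercept is recovered as ȳ − β̂_c′z̄:

```python
import numpy as np

# Example 7.1 data in the mean corrected form.
z = np.array([0., 1., 2., 3., 4.])
y = np.array([1., 4., 3., 8., 9.])

Zc = (z - z.mean()).reshape(-1, 1)             # mean corrected design matrix
beta_c = np.linalg.solve(Zc.T @ Zc, Zc.T @ y)  # slope estimate(s)
beta0_star = y.mean()                          # intercept in the corrected form

print(beta_c)       # [2.] -> same slope as the uncorrected fit
print(beta0_star)   # 5.0  = ybar
# Recover the original intercept: beta0_hat = ybar - beta_c' zbar
print(beta0_star - beta_c @ np.array([z.mean()]))   # 1.0
```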
109
Mean Corrected Form for Multivariate Multiple Regressions
Least squares estimates of the coefficient vectors for the i-th response:

β̂_{(i)} = [ ȳ_i ; (Z_c′Z_c)⁻¹Z_c′y_{(i)} ]

With standardized input variables (z_{jk} − z̄_k)/√((n − 1)s_{z_k z_k}), the slope estimates β̃̂ are the correspondingly rescaled versions of β̂.
110
Relating the Formulations
Result 7.13: ŷ = ȳ + β̂_Y′(z − z̄), with β̂_Y = S_zz⁻¹ s_zY

Mean corrected form: ŷ = β̂₀* + β̂_c′(z − z̄), with β̂₀* = ȳ and β̂_c = (Z_c′Z_c)⁻¹Z_c′y

These agree, because

Z_c′Z_c = (n − 1)S_zz, Z_c′y = (n − 1)s_zY

so β̂_c = (Z_c′Z_c)⁻¹Z_c′y = S_zz⁻¹ s_zY = β̂_Y.
111
Example 7.15
Example 7.6: classical linear regression model.
Example 7.12: joint normal distribution, best predictor as the conditional mean.
Both approaches yielded the same predictor of Y₁:

ŷ = 8.42 + 1.08z₁ + 0.42z₂
112
Remarks on Both Formulations
Conceptually different.
Classical model:
– Input variables are set by the experimenter
– Optimal among linear predictors
Conditional mean model:
– Predictor values are random variables observed with the response values
– Optimal among all choices of predictors
113
Example 7.16 Natural Gas Data
114
Example 7.16: First Model

Sendout = 1.858 + 5.874 DHD + 1.405 Lag DHD + 1.315 Windspeed − 15.857 Weekend

R² = 0.952. All coefficients are significant, except the intercept. But the lag-1 autocorrelation of the residuals,

r₁(ε̂) = Σ_{j=2}^n ε̂_j ε̂_{j−1} / Σ_{j=1}^n ε̂_j² = 0.52

is large.
115
Example 7.16: Second Model
Replace the independent errors with an autoregressive noise model and apply SAS to get the fitted model:

Sendout = 2.130 + 5.810 DHD + 1.426 Lag DHD + 1.207 Windspeed − 10.109 Weekend

with autoregressive noise coefficients 0.470 and 0.240, and σ̂² = 228.89. The autocorrelations of the residuals are all negligible.