Linear Models for Regression

Henrik I Christensen
Robotics & Intelligent Machines @ GT
Georgia Institute of Technology, Atlanta, GA 30332-0280
[email protected]
Line Estimation
Least-squares minimization:

- Line equation: $y = ax + b$
- Error in fit: $\sum_i (y_i - a x_i - b)^2$
- Solution (normal equations, written in terms of sample means):
$$\begin{pmatrix} \overline{xy} \\ \overline{y} \end{pmatrix} = \begin{pmatrix} \overline{x^2} & \overline{x} \\ \overline{x} & 1 \end{pmatrix} \begin{pmatrix} a \\ b \end{pmatrix}$$
- Minimizes vertical errors. Non-robust!
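The normal equations above map directly to code; a minimal sketch in NumPy, with synthetic noisy samples of an assumed line $y = 2x + 1$:

```python
import numpy as np

# Illustrative noisy samples of the line y = 2x + 1 (data is assumed, not from the slides).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + 0.01 * rng.standard_normal(x.size)

# Normal equations in terms of sample means:
# [mean(x^2) mean(x); mean(x) 1] [a; b] = [mean(xy); mean(y)]
A = np.array([[np.mean(x * x), np.mean(x)],
              [np.mean(x),     1.0]])
rhs = np.array([np.mean(x * y), np.mean(y)])
a, b = np.linalg.solve(A, rhs)
print(a, b)  # close to 2 and 1
```

Because only vertical errors enter the sum, a single far-off point can pull the fit badly, which is what "non-robust" refers to.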
LSQ on Lasers
- Line model: $r_i \cos(\phi_i - \theta) = \rho$
- Error model: $d_i = r_i \cos(\phi_i - \theta) - \rho$
- Optimize: $\operatorname{argmin}_{(\rho,\theta)} \sum_i (r_i \cos(\phi_i - \theta) - \rho)^2$
- Error model derived in Deriche et al. (1992)
- Well suited for "clean-up" of Hough lines
Total Least Squares
- Line equation: $ax + by + c = 0$
- Error in fit: $\sum_i (a x_i + b y_i + c)^2$ subject to $a^2 + b^2 = 1$
- Solution:
$$\begin{pmatrix} \overline{x^2} - \bar{x}\,\bar{x} & \overline{xy} - \bar{x}\,\bar{y} \\ \overline{xy} - \bar{x}\,\bar{y} & \overline{y^2} - \bar{y}\,\bar{y} \end{pmatrix} \begin{pmatrix} a \\ b \end{pmatrix} = \mu \begin{pmatrix} a \\ b \end{pmatrix}$$
where $\mu$ is a scale factor: this is an eigenvalue problem, and $(a, b)$ is the eigenvector for the smallest eigenvalue.
- $c = -a\bar{x} - b\bar{y}$
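The eigenvalue system above can be solved with a symmetric eigendecomposition; a sketch on the same kind of assumed synthetic data (the smallest-eigenvalue eigenvector gives the unit normal $(a, b)$):

```python
import numpy as np

# Illustrative noisy samples of y = 2x + 1; total least squares minimizes
# perpendicular (not vertical) distance to the line.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + 0.01 * rng.standard_normal(x.size)

# Scatter matrix of mean-removed coordinates, as in the slide's solution.
M = np.array([[np.mean(x*x) - np.mean(x)**2,           np.mean(x*y) - np.mean(x)*np.mean(y)],
              [np.mean(x*y) - np.mean(x)*np.mean(y),   np.mean(y*y) - np.mean(y)**2]])
evals, evecs = np.linalg.eigh(M)   # eigenvalues in ascending order
a, b = evecs[:, 0]                 # eigenvector of the smallest eigenvalue
c = -a * np.mean(x) - b * np.mean(y)
# The fitted line satisfies a*x + b*y + c ~ 0 for all samples.
```

For $y = 2x + 1$ the line normal is proportional to $(2, -1)$, so $a/b \approx -2$ up to the eigenvector's arbitrary sign.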
Line Representations
- The line representation is crucial
- Often a redundant model is adopted: line parameters vs. end-points
- Important for fusion of segments
- End-points are less stable
Sequential Adaptation
In some cases one-at-a-time estimation is more suitable. This is also known as stochastic gradient descent:
$$w^{(\tau+1)} = w^{(\tau)} - \eta \nabla E_n = w^{(\tau)} + \eta\,\big(t_n - w^{(\tau)T}\phi(x_n)\big)\,\phi(x_n)$$
(for the squared error $E_n$, the gradient is $\nabla E_n = -(t_n - w^{(\tau)T}\phi(x_n))\,\phi(x_n)$, so the two minus signs cancel).

Known as least-mean-squares (LMS). An open issue is how to choose $\eta$.
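The LMS loop above can be sketched as follows; the basis $\phi(x) = [1, x]$, the synthetic targets, and the step size $\eta$ are all illustrative assumptions:

```python
import numpy as np

# Sequential LMS for a linear model t ~ w^T phi(x), phi(x) = [1, x].
# Targets follow an assumed true model t = 1 + 2x.
rng = np.random.default_rng(2)
w = np.zeros(2)
eta = 0.05                      # step size: too large diverges, too small converges slowly
for _ in range(2000):
    xn = rng.uniform(0.0, 1.0)
    tn = 1.0 + 2.0 * xn
    phi = np.array([1.0, xn])
    w = w + eta * (tn - w @ phi) * phi   # w <- w - eta * grad(E_n)
print(w)  # approaches [1, 2]
```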
Regularized Least Squares
As seen in lecture 2, some control of the parameters is sometimes useful.

Consider the error function:
$$E_D(w) + \lambda E_W(w)$$
which generates
$$\frac{1}{2}\sum_{i=1}^{N} \{t_i - w^T\phi(x_i)\}^2 + \frac{\lambda}{2} w^T w$$
which is minimized by
$$w = \left(\lambda I + \Phi^T \Phi\right)^{-1} \Phi^T t$$
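The closed-form solution maps directly to code; a minimal sketch, where the polynomial basis, the sinusoidal data, and the value of $\lambda$ are assumed for illustration:

```python
import numpy as np

# Regularized least squares: w = (lambda*I + Phi^T Phi)^{-1} Phi^T t.
rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 30)
t = np.sin(2.0 * np.pi * x) + 0.1 * rng.standard_normal(x.size)

Phi = np.vander(x, 6, increasing=True)   # design matrix: powers x^0 .. x^5
lam = 1e-3
w = np.linalg.solve(lam * np.eye(Phi.shape[1]) + Phi.T @ Phi, Phi.T @ t)
pred = Phi @ w
```

Solving the regularized system with `np.linalg.solve` rather than forming the inverse explicitly is the numerically preferable route.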
Outline
1 Introduction
2 Preliminaries
3 Linear Basis Function Models
4 Bayesian Linear Regression
5 Bayesian Model Comparison
6 Summary
Bayesian Linear Regression
Define a conjugate prior over $w$:
$$p(w) = \mathcal{N}(w \mid m_0, S_0)$$
Given the likelihood function, standard Bayesian analysis yields the posterior
$$p(w \mid t) = \mathcal{N}(w \mid m_N, S_N)$$
where
$$m_N = S_N\left(S_0^{-1} m_0 + \beta \Phi^T t\right), \qquad S_N^{-1} = S_0^{-1} + \beta \Phi^T \Phi$$
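The posterior update is a few lines of linear algebra; a sketch where the basis $\phi(x) = [1, x]$, the noise precision $\beta$, the prior $(m_0, S_0)$, and the true weights are all assumed:

```python
import numpy as np

# Bayesian linear regression posterior: m_N, S_N from an assumed Gaussian prior.
rng = np.random.default_rng(4)
x = rng.uniform(-1.0, 1.0, 20)
t = 0.5 - 0.3 * x + 0.1 * rng.standard_normal(x.size)   # true w = [0.5, -0.3]

Phi = np.column_stack([np.ones_like(x), x])
beta = 1.0 / 0.1**2            # noise precision (assumed known)
m0 = np.zeros(2)
S0 = 10.0 * np.eye(2)          # broad prior

S0_inv = np.linalg.inv(S0)
SN_inv = S0_inv + beta * Phi.T @ Phi
SN = np.linalg.inv(SN_inv)
mN = SN @ (S0_inv @ m0 + beta * Phi.T @ t)
# mN concentrates near the true weights as data accumulate.
```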
Bayesian Linear Regression (2)
A common choice is the zero-mean isotropic prior
$$p(w) = \mathcal{N}(w \mid 0, \alpha^{-1} I)$$
so that
$$m_N = \beta S_N \Phi^T t, \qquad S_N^{-1} = \alpha I + \beta \Phi^T \Phi$$
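A sketch of this special case, mirroring the examples that follow; $\alpha$, $\beta$, the basis, and the true weights are illustrative assumptions:

```python
import numpy as np

# Zero-mean isotropic prior: S_N^{-1} = alpha*I + beta*Phi^T Phi, m_N = beta*S_N*Phi^T t.
rng = np.random.default_rng(5)
alpha, beta = 2.0, 25.0
w_true = np.array([-0.3, 0.5])

def posterior(n):
    """Posterior mean and covariance from n fresh samples of the assumed model."""
    x = rng.uniform(-1.0, 1.0, n)
    Phi = np.column_stack([np.ones(n), x])
    t = Phi @ w_true + rng.standard_normal(n) / np.sqrt(beta)
    SN = np.linalg.inv(alpha * np.eye(2) + beta * Phi.T @ Phi)
    mN = beta * SN @ Phi.T @ t
    return mN, SN

mN, SN = posterior(20)
# With more observations the posterior covariance S_N shrinks,
# which is what the 0/1/2/20-point example figures illustrate.
```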
Examples: No Data / 1 Data Point / 2 Data Points / 20 Data Points

[Figures: prior and posterior over $w$ as data points are added; per the update equations, the posterior tightens as data accumulate.]
Bayesian Model Comparison
How does one select an appropriate model?

Assume for a minute we want to compare a set of models $\mathcal{M}_i$, $i \in \{1, \ldots, L\}$, on a dataset $D$. We could compute
$$p(\mathcal{M}_i \mid D) \propto p(D \mid \mathcal{M}_i)\, p(\mathcal{M}_i)$$
Bayes factor: the ratio of evidence for two models,
$$\frac{p(D \mid \mathcal{M}_i)}{p(D \mid \mathcal{M}_j)}$$
The mixture distribution approach
We could use all the models:
$$p(t \mid x, D) = \sum_{i=1}^{L} p(t \mid x, \mathcal{M}_i, D)\, p(\mathcal{M}_i \mid D)$$
Or simply go with the most probable model.
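The mixture prediction is just a posterior-weighted average; a toy sketch in which the model posteriors and per-model predictions are made-up numbers, not derived from any dataset:

```python
import numpy as np

# Model averaging vs. picking the single best model.
post_model = np.array([0.7, 0.3])    # p(M_i | D), assumed values
pred_means = np.array([1.2, 0.8])    # E[t | x, M_i, D] per model, assumed values

t_avg = post_model @ pred_means                 # mixture (model-averaged) prediction
t_best = pred_means[np.argmax(post_model)]      # "most probable model" prediction
print(t_avg, t_best)   # 1.08 and 1.2
```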
Model Evidence
We can compute the model evidence
$$p(D \mid \mathcal{M}_i) = \int p(D \mid w, \mathcal{M}_i)\, p(w \mid \mathcal{M}_i)\, dw$$
This allows evaluation of model fit integrated over the whole parameter range, rather than at a single parameter setting.
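For the Gaussian linear model with an isotropic prior, this integral has a closed form (the expression below follows Bishop, PRML eq. 3.86; $\alpha$, $\beta$, and the data are assumed). Comparing a linear and a needlessly flexible polynomial model on truly linear data illustrates how the evidence penalizes complexity:

```python
import numpy as np

def log_evidence(Phi, t, alpha, beta):
    """Closed-form log marginal likelihood ln p(t | alpha, beta)."""
    N, M = Phi.shape
    A = alpha * np.eye(M) + beta * Phi.T @ Phi
    mN = beta * np.linalg.solve(A, Phi.T @ t)
    E = beta / 2 * np.sum((t - Phi @ mN) ** 2) + alpha / 2 * mN @ mN
    return (M / 2 * np.log(alpha) + N / 2 * np.log(beta) - E
            - 0.5 * np.linalg.slogdet(A)[1] - N / 2 * np.log(2 * np.pi))

rng = np.random.default_rng(6)
x = np.linspace(-1.0, 1.0, 40)
t = 1.0 + 2.0 * x + 0.1 * rng.standard_normal(x.size)   # truly linear data

# Evidence for a degree-1 vs. a degree-9 polynomial model.
ev = [log_evidence(np.vander(x, d + 1, increasing=True), t, 2.0, 100.0)
     for d in (1, 9)]
```

The Bayes factor between the two models is `exp(ev[0] - ev[1])`; here the simpler model wins.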
Evaluation of Parameters
Evaluation of the posterior over parameters:
$$p(w \mid D, \mathcal{M}_i) = \frac{p(D \mid w, \mathcal{M}_i)\, p(w \mid \mathcal{M}_i)}{p(D \mid \mathcal{M}_i)}$$
This raises the question: how good is a model?
Model Comparison
Consider evaluation of a model with parameters $w$:
$$p(D) = \int p(D \mid w)\, p(w)\, dw \approx p(D \mid w_{MAP})\, \frac{\sigma_{posterior}}{\sigma_{prior}}$$
Then
$$\ln p(D) \approx \ln p(D \mid w_{MAP}) + \ln\left(\frac{\sigma_{posterior}}{\sigma_{prior}}\right)$$
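The approximation can be checked in one dimension, where the Gaussian integrals are exact. The model below is assumed for illustration: observations $y_i = w + \varepsilon$, $\varepsilon \sim \mathcal{N}(0, \sigma^2)$, prior $w \sim \mathcal{N}(0, \sigma_0^2)$:

```python
import numpy as np

# 1-D check of ln p(D) ~ ln p(D | w_map) + ln(sigma_post / sigma_prior).
rng = np.random.default_rng(7)
sigma, sigma0 = 0.5, 10.0
y = 0.2 + sigma * rng.standard_normal(50)

n = y.size
sigma_post = 1.0 / np.sqrt(n / sigma**2 + 1.0 / sigma0**2)   # posterior std of w
w_map = sigma_post**2 * np.sum(y) / sigma**2                 # posterior mean = MAP

loglik_map = np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                    - (y - w_map) ** 2 / (2 * sigma**2))
approx = loglik_map + np.log(sigma_post / sigma0)

# Exact evidence (Laplace is exact for a Gaussian posterior); the gap from the
# approximation is exactly the prior exponent w_map^2 / (2 sigma0^2), which is
# negligible for a broad prior.
exact = loglik_map + np.log(sigma_post / sigma0) - w_map**2 / (2 * sigma0**2)
```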
Model Comparison as Kullback-Leibler
From earlier we have comparison of distributions:
$$KL = \int p(D \mid \mathcal{M}_1) \ln \frac{p(D \mid \mathcal{M}_1)}{p(D \mid \mathcal{M}_2)}\, dD$$
Enables comparison of two different models.
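As one concrete instance, when the two distributions being compared are Gaussian the KL divergence has a closed form (the specific means and standard deviations below are illustrative):

```python
import numpy as np

def kl_gauss(mu0, s0, mu1, s1):
    """KL(N(mu0, s0^2) || N(mu1, s1^2)) in closed form."""
    return np.log(s1 / s0) + (s0**2 + (mu0 - mu1) ** 2) / (2 * s1**2) - 0.5

print(kl_gauss(0.0, 1.0, 0.0, 1.0))   # 0.0 for identical distributions
print(kl_gauss(0.0, 1.0, 1.0, 2.0))
```

KL is nonnegative and asymmetric, so "distance" between models depends on which one is taken as the reference.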
Summary
- Brief intro to linear methods for estimation of models
- Prediction of values and models
- Needed for adaptive selection of models (black-box/grey-box), evaluation of sensor models, ...
- Consideration of batch and recursive estimation methods
- Significant discussion of methods for evaluation of models and parameters
- Thus far, purely a discussion of linear models
Deriche, R., Vaillant, R., & Faugeras, O. (1992). From Noisy Edge Points to 3D Reconstruction of a Scene: A Robust Approach and Its Uncertainty Analysis. Series in Machine Perception and Artificial Intelligence, Vol. 2, pp. 71-79. World Scientific.