
Probabilistic Graphical Models

Introduction. Basic Probability and Bayes

Volkan Cevher, Matthias Seeger
Ecole Polytechnique Fédérale de Lausanne

26/9/2011

Outline

1 Motivation

2 Probability. Decisions. Estimation

3 Bayesian Terminology

Motivation

Benefits of Doubt

“Not to be absolutely certain is, I think, one of the essential things in rationality.” (B. Russell, 1947)

Real-world problems are uncertain:
- Measurement errors
- Incomplete, ambiguous data
- Model? Features?

If uncertainty is part of your problem . . .

Ignore/remove it:
- Costly
- Complicated
- Not always possible

Live with it:
- Quantify it: probabilities
- Compute it: Bayesian inference

Exploit it:
- Experimental design
- Robust decision making
- Multimodal data integration

Image Reconstruction

Reconstruction is Ill-Posed

Image Statistics

Whatever images are . . .

they are not Gaussian!

Image gradients are super-Gaussian (“sparse”)

Use a sparsity prior distribution P(u)

Posterior Distribution

- Likelihood P(y|u): Data fit
- Prior P(u): Signal properties
- Posterior distribution P(u|y): Consistent information summary

P(u|y) = P(y|u) × P(u) / P(y)

Estimation

Maximum a Posteriori (MAP) Estimation

u* = argmax_u P(y|u) P(u)

There are many solutions. Why settle for any single one?
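To make this concrete, here is a minimal sketch (not part of the slides) of MAP estimation by grid search for a toy scalar problem, with a Gaussian likelihood and a Laplace (“sparse”) prior; all numbers are illustrative.

```python
import numpy as np

# Toy MAP estimation: scalar signal u, noisy observation y ~ N(u, sigma^2),
# Laplace ("sparse") prior P(u); maximize log P(y|u) + log P(u) on a grid.
y, sigma, tau = 0.8, 0.5, 1.0              # observation, noise std, prior scale (made up)
u_grid = np.linspace(-3, 3, 2001)

log_lik = -0.5 * (y - u_grid) ** 2 / sigma ** 2   # log P(y|u), up to a constant
log_prior = -np.abs(u_grid) / tau                 # log P(u), up to a constant
u_map = u_grid[np.argmax(log_lik + log_prior)]
print("MAP estimate u* =", u_map)                 # shrunk towards 0 by the sparse prior
```

The single number u* throws away the posterior spread, which is exactly the objection raised here and addressed on the next slide by integrating over all solutions.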

Bayesian Inference

Use all solutions:
- Weight each solution by our uncertainty
- Average over them. Integrate, don’t prune

Bayesian Experimental Design

- Posterior: Uncertainty in reconstruction
- Experimental design: Find poorly determined directions
- Sequential search with interjacent partial measurements

Structure of Course

Graphical Models [≈ 6 weeks]
- Probabilistic database. Expert system
- Query for making optimal decisions
- Graph separation ↔ conditional independence ⇒ (more) efficient computation (dynamic programming)

Approximate Inference [≈ 6 weeks]
- Bayesian inference is never really tractable
- Variational relaxations (convex duality)
- Propagation algorithms
- Sparse Bayesian models

Course Goals

This course is not:
- Exhaustive (but we will give pointers)
- Playing with data until it works
- Purely theoretical analysis of methods

This course is:
- A computer scientist’s view on Bayesian machine learning: layers above and below the formulae

- Understand concepts (what to do and why)
- Understand approximations, relaxations, generic algorithmic schemata (how to do, above formulae)
- Safe implementation on a computer (how to do, below formulae)

- A common thread through models and algorithms, exposing roots in specialized computational mathematics

Probability. Decisions. Estimation

Why Probability?

Remember sleeping through Statistics 101 (p-value, t-test, . . . )? Forget that impression!
Probability leads to beautiful, useful insights and algorithms. Not much would work today without probabilistic algorithms and decisions from incomplete knowledge.

- Numbers, functions, moving bodies ⇒ Calculus
- Predicates, true/false statements ⇒ Predicate logic
- Uncertain knowledge about numbers, predicates, . . . ⇒ Probability

Machine learning? Have to speak probability!
Crash course here. But dig further, it’s worth it:

- Grimmett, Stirzaker: Probability and Random Processes
- Pearl: Probabilistic Reasoning in Intelligent Systems

Reasons to use probability (forget the “classical” straitjacket):
- We really don’t / cannot know (exactly)
- It would be too complicated/costly to find out
- It would take too long to compute
- Nondeterministic processes (given measurement resolution)
- Subjective beliefs, interpretations

Probability over Finite/Countable Sets
You know databases? You know probability!

- Probability distribution P: Joint table/hypercube (all entries ≥ 0, summing to 1)

- Random variable F: Index of table
- Event E: Part of table
- Probability P(E): Sum over cells in E
- Marginal distribution P(F): Projection of table (sum over others)

Marginalization (Sum Rule)
Not interested in variable(s) right now? ⇒ Marginalize over them!

Fruit \ Box    red     blue
apple          1/10    9/20
orange         3/10    3/20


P(F) = Σ_{B=r,b} P(F, B)
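A small numpy sketch of the sum rule on this table (the array layout and variable names are mine):

```python
import numpy as np

# Joint table P(F, B): rows = Fruit (apple, orange), columns = Box (red, blue).
P = np.array([[1/10, 9/20],
              [3/10, 3/20]])
assert np.isclose(P.sum(), 1.0)       # all entries >= 0, summing to 1

# Sum rule: marginalize out the variable you are not interested in.
P_F = P.sum(axis=1)                   # P(F) = sum_B P(F, B)
P_B = P.sum(axis=0)                   # P(B) = sum_F P(F, B)
print("P(F):", P_F)                   # [0.55 0.45]
print("P(B):", P_B)                   # [0.4  0.6 ]
```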

Probability over Finite/Countable Sets (II)

Conditional probability: Factorization of table

- Chop out part you’re sure about (don’t marginalize: you know!)
- Renormalize to 1 (∝ marginal)

Conditioning (Product Rule)
Observed some variable/event? ⇒ Condition on it!
Joint = Conditional × Marginal

Information propagation (B → F)
- Predict
- Marginalize

P(F, B) = P(F|B) P(B)

P(F) = Σ_B P(F|B) P(B)
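The product rule on the same table, as a sketch (variable names are mine): conditioning renormalizes one column, and marginalizing P(F|B) P(B) over B recovers P(F).

```python
import numpy as np

# Joint table P(F, B): rows = Fruit (apple, orange), columns = Box (red, blue).
P = np.array([[1/10, 9/20],
              [3/10, 3/20]])
P_B = P.sum(axis=0)                    # marginal P(B)

# Conditioning: "chop out" the observed column and renormalize it.
P_F_given_B = P / P_B                  # P(F|B) = P(F, B) / P(B), column-wise
print(P_F_given_B.sum(axis=0))         # [1. 1.]  each column is a distribution

# Information propagation B -> F: predict with P(F|B), then marginalize over B.
P_F = (P_F_given_B * P_B).sum(axis=1)  # P(F) = sum_B P(F|B) P(B)
print(P_F)                             # [0.55 0.45]
```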

Probability over Finite/Countable Sets (III)

Bayes Formula

P(B|F) P(F) = P(F, B) = P(F|B) P(B)

P(B|F) = P(F|B) P(B) / P(F)

Inversion of information flow:
- Causal → diagnostic (diseases → symptoms)
- Inverse problem

Chain rule of probability

P(X1, . . . , Xn) = P(X1) P(X2|X1) P(X3|X1, X2) · · · P(Xn|X1, . . . , Xn−1)

- Holds in any ordering
- Starting point for Bayesian networks [next lecture]
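Bayes’ formula on the fruit/box table, as a short sketch (variable names are mine): invert the “causal” direction P(F|B) into the “diagnostic” direction P(B|F).

```python
import numpy as np

# Joint table P(F, B): rows = Fruit (apple, orange), columns = Box (red, blue).
P = np.array([[1/10, 9/20],
              [3/10, 3/20]])
P_B = P.sum(axis=0)                    # P(B)
P_F = P.sum(axis=1)                    # P(F)
P_F_given_B = P / P_B                  # P(F|B)

# Bayes formula: P(B|F) = P(F|B) P(B) / P(F).
P_B_given_F = P_F_given_B * P_B / P_F[:, None]
print(P_B_given_F)                     # e.g. P(B=red|F=apple) = 0.1/0.55 ~ 0.18
print(P_B_given_F.sum(axis=1))         # [1. 1.]  each row is a distribution
```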

Probability over Continuous Variables

Probability over Continuous Variables (II)

Caveat: Null sets [P({x = 5}) = 0; P({x ∈ N}) = 0]
- Every observed event is a null set!
- P(y|x) = P(y, x)/P(x) cannot work for P(x) = 0
- Define the conditional density P(y|x) s.t.
  P(y | x ∈ A) = ∫_A P(y|x) P(x) dx for all events A

Most cases in practice:
- Look at y ↦ P(y, x) (“plug in x”)
- Recognize density / normalize

Another (technical) caveat: Not all subsets can be events. ⇒ Events: “nice” subsets (measurable)

Expectation. Moments of a Distribution

Expectation

E[f(x)] = ∫ f(x) P(x) dx   or   Σ_x f(x) P(x)

P(x) complicated. How does x ∼ P(x) behave?
Moments: Essential behaviour of distribution

Mean (1st order)

E[x] = ∫ x P(x) dx

Covariance (2nd order)

Cov[x, y] = E[x y^T] − E[x] (E[y])^T = E[v_x v_y^T],  where v_x = x − E[x]

Cov[x] = Cov[x, x]
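When P(x) is complicated, these moments are usually estimated from samples. A minimal Monte Carlo sketch, with an arbitrary 2-D Gaussian standing in for P(x):

```python
import numpy as np

# Estimate E[x] and Cov[x] from samples x_i ~ P(x).
rng = np.random.default_rng(0)
mean_true = np.array([1.0, 2.0])
cov_true = np.array([[1.0, 0.8],
                     [0.8, 2.0]])
x = rng.multivariate_normal(mean_true, cov_true, size=100_000)

mean_hat = x.mean(axis=0)              # E[x] ~ (1/N) sum_i x_i
v = x - mean_hat                       # v_x = x - E[x]
cov_hat = v.T @ v / len(x)             # Cov[x] = E[v_x v_x^T]
print(mean_hat)
print(cov_hat)

# Any other expectation E[f(x)] works the same way, e.g. f(x) = ||x||:
print(np.linalg.norm(x, axis=1).mean())
```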

Decision Theory in 30 Seconds

Recipe for optimal decisions:
1 Choose a loss function L (depends on problem, your valuation, susceptibility)
2 Model actions (A) and outcomes (O). Compute P(O|A)
3 Compute risks (expected losses): R(A) = ∫ L(O) P(O|A) dO
4 Go for A* = argmin_A R(A)

- Special case: Pricing of A (option, bet, car): choose −R(A) + margin
- Next best measurement? Next best scientific experiment?
- Harder if timing plays a role (optimal control, etc.)
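The four-step recipe in a few lines, for a made-up discrete example (two actions, three outcomes; the loss values and the outcome model P(O|A) are invented):

```python
import numpy as np

# 1. Loss L(O) for outcomes O = {ok, minor problem, disaster} (invented numbers).
loss = np.array([0.0, 1.0, 10.0])
# 2. Outcome model P(O|A): rows = actions A, columns = outcomes O (invented numbers).
P_O_given_A = np.array([[0.70, 0.25, 0.05],
                        [0.90, 0.05, 0.05]])
# 3. Risks R(A) = sum_O L(O) P(O|A).
risk = P_O_given_A @ loss
# 4. Pick the action with minimal risk.
best = int(np.argmin(risk))
print("risks:", risk, "-> choose action", best)   # risks: [0.75 0.55] -> action 1
```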

Maximum Likelihood Estimation
Bayesian inversion is hard. In simple cases, with enough data:

Observed {x_i}. Interested in cause θ

- Construct sampling model P(x|θ)
- Likelihood L(θ) = P(D|θ) = ∏_i P(x_i|θ): should be high close to the “true” θ_0
- Maximum likelihood estimator: θ* = argmax L(θ) = argmax log L(θ) = argmax Σ_i log P(x_i|θ)

Method of choice for simple θ and lots of data:
- Well understood asymptotically
- Knowledge about θ besides D? Not used
- Breaks down if θ is “larger” than D
- And another problem . . .
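A minimal sketch for the simplest textbook case: a Gaussian sampling model with unknown mean θ, where the maximizer of the log likelihood is the sample mean (data are synthetic).

```python
import numpy as np

# ML estimation for x_i ~ N(theta, 1): maximize sum_i log P(x_i | theta).
rng = np.random.default_rng(1)
theta_true = 2.5
x = rng.normal(theta_true, 1.0, size=200)        # observed data D = {x_i}

thetas = np.linspace(0.0, 5.0, 1001)
log_lik = np.array([-0.5 * np.sum((x - t) ** 2) for t in thetas])  # up to constants
theta_ml = thetas[np.argmax(log_lik)]
print(theta_ml, x.mean())   # grid maximizer ~ sample mean, the closed-form MLE
```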

Overfitting

Overfitting Problem (Estimation)
- For finite D, more and more complicated models fit D better and better.
- With a huge brain, you just learn by heart.
- Generalization only comes with a limit on complexity!
- Marginalization solves this problem, but even Bayesian estimation (“half-way marginalization”) embodies complexity control.

Bayesian Terminology

Probabilistic Model

Model: Concise description of the joint distribution (generative process) of all variables of interest

Encoding assumptions:
- What are the entities?
- How do they relate?
Variables have different roles. Roles may change depending on what the model is used for.
Model specifies variables and their (in)dependencies ⇒ Graphical models [next lecture]

The Linear Model (Polynomial Fitting)
Fit data with a polynomial of degree k

Prior P(w)
- Restrictions on w
- Prior knowledge ⇒ Prefer smaller ‖w‖
Likelihood P(y|w)

Posterior

P(w|y) = P(y|w) P(w) / P(y)

Linear Model

y = X w + ε

y: Responses (observed)
X: Design (controlled)
w: Weights (query)
ε: Noise (nuisance)

Prediction: y* = x*^T E[w|y]

Marginal likelihood
P(y) = P(y|k) = ∫ P(y|w) P(w) dw
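In the Gaussian case the posterior and the prediction are available in closed form. A sketch with an isotropic prior P(w) = N(0, τ²I) and noise variance σ²; the data, degree, and variance values are placeholders, not taken from the slides.

```python
import numpy as np

# Bayesian linear model y = X w + eps, eps ~ N(0, sigma2 I), prior w ~ N(0, tau2 I).
rng = np.random.default_rng(2)
n, k = 30, 3                                   # data points, polynomial degree
x = rng.uniform(-2, 2, size=n)
X = np.vander(x, k + 1, increasing=True)       # design matrix: 1, x, x^2, ..., x^k
w_true = np.array([1.0, -0.5, 0.3, 0.1])
sigma2, tau2 = 0.25, 1.0                       # noise and prior variances (placeholders)
y = X @ w_true + rng.normal(0.0, np.sqrt(sigma2), size=n)

# Posterior P(w|y) = N(mu, Sigma):
Sigma = np.linalg.inv(X.T @ X / sigma2 + np.eye(k + 1) / tau2)
mu = Sigma @ X.T @ y / sigma2                  # E[w|y]

x_star = np.array([1.0, 0.5, 0.25, 0.125])     # powers of a test input x* = 0.5
print("prediction y* =", x_star @ mu)          # y* = x*^T E[w|y]
```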

Model Selection

[Figure: posterior P(w|y, k = 3) (left); marginal likelihood P(y|k) over k = 1, . . . , 4 (right)]

Simpler hypotheses considered as well ⇒ Occam’s razor

Model Averaging

Don’t know polynomial order k ⇒ Marginalize out

E[y*|y] = Σ_{k≥1} P(k|y) (x*^(k))^T E[w|y, k]
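A sketch of the whole pipeline for the Gaussian linear model: compute the marginal likelihood P(y|k) for each polynomial order (it is a zero-mean Gaussian in y), turn it into P(k|y) under a flat prior over k, and average the per-model predictions. Data, variances, and the flat prior over k are my own assumptions.

```python
import numpy as np

# Model averaging over polynomial order k for y = X_k w + eps (Gaussian prior and noise).
rng = np.random.default_rng(3)
x = rng.uniform(-2, 2, size=30)
y = 1.0 - 0.5 * x + 0.3 * x**2 + rng.normal(0.0, 0.5, size=30)   # synthetic data
sigma2, tau2 = 0.25, 1.0                                         # placeholder variances
x_star = 0.5                                                     # test input

def evidence_and_prediction(k):
    """Return log P(y|k) and the plug-in prediction x*(k)^T E[w|y,k]."""
    X = np.vander(x, k + 1, increasing=True)
    C = tau2 * X @ X.T + sigma2 * np.eye(len(x))          # marginal covariance of y given k
    _, logdet = np.linalg.slogdet(C)
    log_ev = -0.5 * (logdet + y @ np.linalg.solve(C, y) + len(x) * np.log(2 * np.pi))
    Sigma = np.linalg.inv(X.T @ X / sigma2 + np.eye(k + 1) / tau2)
    mu = Sigma @ X.T @ y / sigma2                         # E[w|y,k]
    return log_ev, np.vander([x_star], k + 1, increasing=True)[0] @ mu

log_evs, preds = map(np.array, zip(*(evidence_and_prediction(k) for k in range(1, 5))))
P_k_given_y = np.exp(log_evs - log_evs.max())
P_k_given_y /= P_k_given_y.sum()                          # flat prior over k = 1..4
print("P(k|y):", np.round(P_k_given_y, 3))
print("E[y*|y] =", P_k_given_y @ preds)                   # posterior-weighted prediction
```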
