Markov Random Fields and Gibbs Distributions



Qiang He, School of EE & CS

Oregon State University


Contents

1. Introduction

2. Nondirected graphs

3. Markov Random Fields

4. Gibbs Random Fields

5. Markov-Gibbs Equivalence

6. Inference tasks

7. Summary


1. Introduction


Markov random fields (MRFs)

A statistical theory for analyzing spatial and contextual dependencies of physical phenomena

A Bayesian labeling problem

A method to establish the probabilistic distributions of interacting labels

Widely used in image processing and computer vision


Properties of MRF

Not ad hoc; can be solved based on sound mathematical principles (maximum a posteriori probability, MAP)

Incorporating prior contextual information

Using local properties, which can be implemented in parallel


An example: image restoration using MRF


Image restoration process

Build the neighborhood systems and cliques

Define the clique potentials for prior probability

Derive the likelihood energy

Compute the posterior energy

Solve the MAP

Goals

Restore degraded and noisy images

Infer the true pixels from noisy ones


Definition of symbols

$f$ = hidden "true" pixel

$r$ = observed "noisy" pixel

$S$ = set of sites or nodes

$N$ = neighbors

$(S, N)$ = a nondirected graph


2. Nondirected graphs


Neighborhood Systems

A neighborhood system for $S$ is defined as

$N = \{N_i \mid \forall i \in S\}$

where $N_i$ is the set of sites neighboring $i$, e.g.

$N_i = \{i' \in S \mid \mathrm{dist}(\mathrm{pixel}_{i'}, \mathrm{pixel}_i) \le d,\ i' \ne i\}$

The neighboring relationship has the following properties:

(1) a site is not a neighbor of itself

(2) the neighboring relationship is mutual
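To make the definition concrete, here is a minimal Python sketch (not from the slides) that builds the familiar 4-neighborhood system on a small image lattice and checks the two properties above; the function name four_neighborhood is purely illustrative:

```python
# Minimal sketch (not from the slides): build the 4-neighborhood system
# N_i for every site i of an H x W image lattice.

def four_neighborhood(height, width):
    """Return a dict mapping each site (row, col) to its set of 4-neighbors."""
    neighbors = {}
    for r in range(height):
        for c in range(width):
            candidates = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            neighbors[(r, c)] = {(rr, cc) for rr, cc in candidates
                                 if 0 <= rr < height and 0 <= cc < width}
    return neighbors

if __name__ == "__main__":
    N = four_neighborhood(3, 3)
    # Property (1): a site is not its own neighbor; property (2): mutuality.
    assert all(i not in N[i] for i in N)
    assert all(all(i in N[j] for j in N[i]) for i in N)
    print(N[(1, 1)])   # {(0, 1), (2, 1), (1, 0), (1, 2)}
```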

Neighborhood Systems (figure)

Cliques

A clique is defined as a subset of sites in $S$ in which every pair of sites are neighbors of each other. The collections of single-site, double-site, and triple-site cliques are denoted by $C_1$, $C_2$, and $C_3$, ...

A collection of cliques is

$C = C_1 \cup C_2 \cup C_3 \cup \ldots$
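Continuing the illustration (again not from the slides), a short sketch that collects the single-site cliques $C_1$ and the two-site cliques $C_2$ from a neighborhood system; the 2 x 2 dictionary in the usage example is what the 4-neighborhood sketch above would produce:

```python
# Minimal sketch (not from the slides): enumerate single-site cliques C1 and
# two-site cliques C2 from a neighborhood system.

def cliques(neighbors):
    """Return (C1, C2): single-site cliques and unordered neighboring pairs."""
    C1 = [(i,) for i in neighbors]
    C2 = {tuple(sorted((i, j))) for i in neighbors for j in neighbors[i]}
    return C1, sorted(C2)

if __name__ == "__main__":
    # 4-neighborhood system of a 2 x 2 lattice (output of four_neighborhood(2, 2)).
    N = {(0, 0): {(0, 1), (1, 0)}, (0, 1): {(0, 0), (1, 1)},
         (1, 0): {(0, 0), (1, 1)}, (1, 1): {(0, 1), (1, 0)}}
    C1, C2 = cliques(N)
    print(len(C1), len(C2))   # 4 single-site cliques and 4 two-site cliques
```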

Cliques (figure)

3. Markov Random Fields


Basics

Random field: a family of random variables $F = \{F_1, \ldots, F_m\}$ defined on the set $S$

Configuration: a value assignment $f = \{f_1, \ldots, f_m\}$ to a random field

Probability:

-- discrete case: joint probability $P(f) = P(F = f) = P(F_1 = f_1, \ldots, F_m = f_m)$

-- continuous case: joint PDF


Markov random fields

Positivity: $P(f) > 0$ for all configurations $f$

Markovianity: $P(f_i \mid f_{S \setminus \{i\}}) = P(f_i \mid f_{N_i})$

Homogeneity: probability independent of the positions of sites

Isotropy: probability independent of the orientations of sites


Bayesian labeling problem

$P(f \mid r) = p(r \mid f)\,P(f)/p(r)$

$f^* = \arg\max_{f} P(F = f \mid r)$


4. Gibbs Random Fields (GRFs)


Gibbs distribution:

$P(f) = Z^{-1} \exp\left(-\tfrac{1}{T} U(f)\right)$

Partition function:

$Z = \sum_{f} \exp\left(-\tfrac{1}{T} U(f)\right)$   (summed over all configurations $f$)

Temperature: $T$

Energy function: $U(f) = \sum_{c \in C} V_c(f)$

Clique potentials: $V_c(f)$

Special case: Gaussian distribution
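As an illustration (not part of the original slides), the Gibbs distribution can be evaluated by brute force on a tiny 2 x 2 binary field with Ising-like pair potentials; the temperature and potential weight below are assumed values chosen only for the example:

```python
# Minimal sketch (not from the slides): the Gibbs distribution
# P(f) = Z^{-1} exp(-U(f)/T) with U(f) = sum of pair-clique potentials,
# evaluated by brute force on a tiny 2 x 2 binary field.
import itertools
import math

SITES = [(0, 0), (0, 1), (1, 0), (1, 1)]
PAIR_CLIQUES = [((0, 0), (0, 1)), ((1, 0), (1, 1)),
                ((0, 0), (1, 0)), ((0, 1), (1, 1))]
T = 1.0          # temperature (assumed)
V0 = 1.0         # pair-potential weight (assumed)

def energy(f):
    """U(f): penalize neighboring sites that disagree (an Ising-like prior)."""
    return sum(V0 * (f[i] != f[j]) for i, j in PAIR_CLIQUES)

def gibbs_distribution():
    configs = [dict(zip(SITES, vals))
               for vals in itertools.product([0, 1], repeat=len(SITES))]
    weights = [math.exp(-energy(f) / T) for f in configs]
    Z = sum(weights)                      # partition function
    return [(f, w / Z) for f, w in zip(configs, weights)]

if __name__ == "__main__":
    for f, p in gibbs_distribution():
        print(tuple(f.values()), round(p, 4))   # probabilities sum to 1
```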


5. Markov-Gibbs Equivalence


Proof: MRF = GRF

Conditional probability:

$P(f_i \mid f_{S \setminus \{i\}}) = \dfrac{P(f_i, f_{S \setminus \{i\}})}{P(f_{S \setminus \{i\}})} = \dfrac{P(f)}{\sum_{f'_i \in L} P(f')}$

where $f' = \{f_1, \ldots, f_{i-1}, f'_i, f_{i+1}, \ldots, f_m\}$ and $L$ is the label set.

Expand with the clique potentials, splitting $C = A \cup B$ into $A$, the cliques containing $i$, and $B$, the cliques not containing $i$:

$P(f_i \mid f_{S \setminus \{i\}}) = \dfrac{\exp\left(-\sum_{c \in C} V_c(f)\right)}{\sum_{f'_i} \exp\left(-\sum_{c \in C} V_c(f')\right)}$

Factor into two terms, one containing $i$ and one not:

$P(f_i \mid f_{S \setminus \{i\}}) = \dfrac{\exp\left(-\sum_{c \in A} V_c(f)\right)\exp\left(-\sum_{c \in B} V_c(f)\right)}{\sum_{f'_i}\left[\exp\left(-\sum_{c \in A} V_c(f')\right)\exp\left(-\sum_{c \in B} V_c(f')\right)\right]}$

Cancel the term not containing $i$ (for $c \in B$, $V_c(f') = V_c(f)$, so this factor is common to numerator and denominator):

$P(f_i \mid f_{S \setminus \{i\}}) = \dfrac{\exp\left(-\sum_{c \in A} V_c(f)\right)}{\sum_{f'_i} \exp\left(-\sum_{c \in A} V_c(f')\right)}$

The right-hand side depends only on $f_i$ and the labels of the neighbors of $i$, so $P(f_i \mid f_{S \setminus \{i\}}) = P(f_i \mid f_{N_i})$, i.e. a Gibbs random field satisfies the Markov property.
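A quick numeric sanity check of this equivalence (my addition, not from the slides): on a tiny 2 x 2 binary Gibbs field with pair potentials, the conditional probability of a site given all other sites coincides with its conditional given only its neighbors; the weight and temperature are assumed values:

```python
# Minimal numeric check (not from the slides) of Markov-Gibbs equivalence:
# on a 2 x 2 binary Gibbs field, P(f_(0,0) | all other sites) depends only on
# the neighbors (0,1) and (1,0), not on the non-neighbor (1,1).
import math

SITES = [(0, 0), (0, 1), (1, 0), (1, 1)]
PAIRS = [((0, 0), (0, 1)), ((1, 0), (1, 1)), ((0, 0), (1, 0)), ((0, 1), (1, 1))]

def weight(f, T=1.0, v0=1.0):
    """Unnormalized Gibbs probability exp(-U(f)/T), U = sum of pair potentials."""
    return math.exp(-sum(v0 * (f[i] != f[j]) for i, j in PAIRS) / T)

def conditional(site, f):
    """P(f_site | all other sites), computed from the unnormalized joint."""
    others = {s: v for s, v in f.items() if s != site}
    den = sum(weight({**others, site: v}) for v in (0, 1))
    return weight(f) / den

if __name__ == "__main__":
    # Two configurations agreeing on the neighbors of (0,0) but differing at the
    # non-neighbor (1,1); Markovianity says the conditionals must be equal.
    f1 = {(0, 0): 1, (0, 1): 0, (1, 0): 0, (1, 1): 0}
    f2 = {(0, 0): 1, (0, 1): 0, (1, 0): 0, (1, 1): 1}
    print(conditional((0, 0), f1), conditional((0, 0), f2))   # identical values
```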


MRF prior and Gibbs distribution

$U(f) = \sum_{c \in C} V_c(f) = \sum_{\{i\} \in C_1} V_1(f_i) + \sum_{\{i,j\} \in C_2} V_2(f_i, f_j) + \ldots$


Posterior MRF energy

Likelihood function: $p(r \mid F = f) = Z_r^{-1} \exp\left(-U(r \mid f)\right)$

Likelihood energy: $U(r \mid f)$

Posterior probability: $P(F = f \mid r) = Z_E^{-1} \exp\left(-E(f)\right)$

Posterior energy: $E(f) = U(f \mid r) = U(f)/T + U(r \mid f)$

MAP solution: $f^* = \arg\min_{f} U(f \mid r)$
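For a toy illustration of the MAP solution (not from the slides), the posterior energy can be minimized exhaustively on a small 1-D binary field; the observed data, pair-potential weight, noise variance and temperature are all assumed values:

```python
# Minimal sketch (not from the slides): MAP labeling f* = argmin_f U(f|r)
# with U(f|r) = U(f)/T + U(r|f), found by brute force on a tiny 1-D binary field.
import itertools

r = [0.9, 0.1, 0.8, 0.7]          # observed noisy values (assumed toy data)
v0, sigma2, T = 2.0, 0.25, 1.0    # assumed parameters

def prior_energy(f):
    """U(f): penalize neighboring labels that disagree."""
    return sum(v0 * (a != b) for a, b in zip(f, f[1:]))

def likelihood_energy(f, r):
    """U(r|f): quadratic data term (r_i - f_i)^2 / sigma^2."""
    return sum((ri - fi) ** 2 / sigma2 for ri, fi in zip(r, f))

def map_estimate(r):
    configs = itertools.product([0, 1], repeat=len(r))
    return min(configs,
               key=lambda f: prior_energy(f) / T + likelihood_energy(f, r))

if __name__ == "__main__":
    print(map_estimate(r))   # (1, 1, 1, 1): the prior smooths over the outlier 0.1
```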


6. Inference tasks


Goals

Solve the Bayesian labeling problem, that is, find the maximum a posteriori (MAP) configuration under the observation (via a simulated annealing process)

Compute a marginal probability $p(f \mid r)$ (via Gibbs sampling)

Parameter estimation

Tasks

Solve the MRF prior probability through the Gibbs distribution (since MRF = GRF)

Solve the likelihood function by estimating the likelihood energy and the posterior energy (coding method or least-squares error method)

Solve the MAP
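For the marginal-probability task, here is a minimal Gibbs-sampling sketch (not from the slides): single-site updates drawn from the conditional given the neighbors, used to estimate $p(f_i = 1 \mid r)$ on a toy 1-D field; all parameter values are assumed:

```python
# Minimal sketch (not from the slides): single-site Gibbs sampling to estimate
# the marginal p(f_i = 1 | r) under the posterior energy
# U(f|r) = sum_pairs v0*[f_i != f_j]/T + sum_i (r_i - f_i)^2 / sigma2.
import math
import random

r = [0.9, 0.1, 0.8, 0.7]        # observed noisy values (assumed toy data)
v0, sigma2, T = 2.0, 0.25, 1.0  # assumed model parameters

def local_energy(f, i, value):
    """Terms of U(f|r) that involve site i when f_i = value."""
    e = (r[i] - value) ** 2 / sigma2
    for j in (i - 1, i + 1):            # 1-D chain neighbors
        if 0 <= j < len(f):
            e += v0 * (value != f[j]) / T
    return e

def gibbs_sample(num_sweeps=5000, burn_in=500, seed=0):
    random.seed(seed)
    f = [random.choice([0, 1]) for _ in r]
    counts = [0] * len(r)
    for sweep in range(num_sweeps):
        for i in range(len(f)):
            # Sample f_i from its conditional given the neighbors (Markovianity).
            w1 = math.exp(-local_energy(f, i, 1))
            w0 = math.exp(-local_energy(f, i, 0))
            f[i] = 1 if random.random() < w1 / (w0 + w1) else 0
        if sweep >= burn_in:
            for i, v in enumerate(f):
                counts[i] += v
    return [c / (num_sweeps - burn_in) for c in counts]

if __name__ == "__main__":
    print(gibbs_sample())   # estimated P(f_i = 1 | r) for each site
```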


Look back at image restoration

Build the neighborhood systems and cliques: a 4-neighborhood system and two-site cliques

$N_i = \{j \in S \mid \mathrm{dist}(\mathrm{pixel}_j, \mathrm{pixel}_i) \le d,\ j \ne i\}$

$C = C_2 = \{\{i, j\} \mid j \in N_i,\ i \in S\}$

Define the prior clique potentials

$V_2(f_i, f_j) = v_{20}\, g(f_i - f_j)$

$g(x) = 1 - \delta(x)$, with $\delta$ the Kronecker delta, so neighboring pixels with different labels are penalized


Compute the likelihood energy

Observation model: $r_i = f_i + n$, with noise $n \sim N(0, \sigma^2)$

$U(r \mid f) = \sum_{i \in S} (r_i - f_i)^2 / \sigma^2$

Compute the posterior energy

$U(f \mid r) = \sum_{i \in S} \sum_{j \in N_i} v_{20}\, g(f_i - f_j)/T + \sum_{i \in S} (r_i - f_i)^2 / \sigma^2$
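Putting the restoration model together, here is a short sketch (not from the slides) that minimizes this posterior energy on a small noisy label image. The slides solve the MAP with a simulated annealing process; to keep the example brief this sketch uses a greedy ICM-style update instead, and all parameter values are assumed:

```python
# Minimal sketch (not from the slides): restore a small noisy label image by
# greedily minimizing the posterior energy
#   U(f|r) = sum_i sum_{j in N_i} v20*g(f_i - f_j)/T + sum_i (r_i - f_i)^2/sigma2
# with g(x) = 1 - delta(x). Greedy (ICM-style) updates stand in for the
# simulated annealing used in the slides.
import random

LABELS = [0, 1]
v20, T, sigma2 = 2.0, 1.0, 0.25     # assumed parameters for illustration

def neighbors4(i, j, h, w):
    return [(a, b) for a, b in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
            if 0 <= a < h and 0 <= b < w]

def local_posterior_energy(f, r, i, j, value):
    """Terms of U(f|r) that involve site (i, j) when f[i][j] = value."""
    h, w = len(f), len(f[0])
    prior = sum(v20 * (value != f[a][b]) for a, b in neighbors4(i, j, h, w)) / T
    return prior + (r[i][j] - value) ** 2 / sigma2

def restore(r, sweeps=10):
    h, w = len(r), len(r[0])
    f = [[min(LABELS, key=lambda v: (r[i][j] - v) ** 2) for j in range(w)]
         for i in range(h)]                      # initialize from the data term
    for _ in range(sweeps):
        for i in range(h):
            for j in range(w):
                f[i][j] = min(LABELS,
                              key=lambda v: local_posterior_energy(f, r, i, j, v))
    return f

if __name__ == "__main__":
    random.seed(0)
    truth = [[1 if j >= 2 else 0 for j in range(5)] for i in range(5)]
    noisy = [[t + random.gauss(0, 0.5) for t in row] for row in truth]
    for row in restore(noisy):
        print(row)   # labels close to the underlying step image
```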


7. Summary


MRF modeling solves the Bayesian labeling problem, that is, it finds the maximum a posteriori (MAP) configuration under the observation

The MRF factors the joint distribution into a product of clique-potential factors

MRF modeling provides a systematic approach to solving image processing and computer vision problems

Thank you very much!
