Page 1: Markov random field:       A brief introduction


Markov random field: A brief introduction

Tzu-Cheng Jen

Institute of Electronics, NCTU

2007-03-28

Page 2: Markov random field:       A brief introduction


Outline

Neighborhood system and cliques

Markov random field

Optimization-based vision problem

Solver for the optimization problem

Page 3: Markov random field:       A brief introduction


Neighborhood system and cliques

Page 4: Markov random field:       A brief introduction

Prior knowledge

To explain the concept of an MRF, we first introduce the following definitions:

1. i: site (pixel)

2. Ni: the set of sites neighboring i

3. S: the set of sites (the image)

4. fi: the value at site i (the intensity)

f1 f2 f3
f4 fi f6
f7 f8 f9

A 3×3 example image

Page 5: Markov random field:       A brief introduction

Neighborhood system

The sites in S are related to one another via a neighborhood system, defined as

N = \{ N_i \mid \forall i \in S \}

where Ni is the set of sites neighboring i.

The neighboring relationship has the following properties:

(1) A site is not a neighbor of itself: i \notin N_i

(2) The neighboring relationship is mutual: i' \in N_i \iff i \in N_{i'}

Page 6: Markov random field:       A brief introduction


Neighborhood system: Example

First order neighborhood system

Second order neighborhood system

Nth order neighborhood system
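As a concrete sketch, the two lowest-order systems can be generated programmatically. The following Python snippet (the function and variable names are our own, not from the slides) builds N_i for a site on an H × W grid:

```python
def neighbors(i, j, H, W, order=1):
    """Return the set N_i of sites neighboring site (i, j).

    order=1: the 4 horizontally/vertically adjacent sites (first order).
    order=2: the 8 surrounding sites (second order, adds the diagonals).
    """
    if order == 1:
        offsets = [(-1, 0), (1, 0), (0, -1), (0, 1)]
    else:
        offsets = [(di, dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)
                   if (di, dj) != (0, 0)]  # a site is not its own neighbor
    return {(i + di, j + dj) for di, dj in offsets
            if 0 <= i + di < H and 0 <= j + dj < W}

# The relationship is mutual: (0, 1) is in N_(0,0) and (0, 0) is in N_(0,1).
print(neighbors(0, 0, 3, 3, order=1))  # {(0, 1), (1, 0)} at an image corner
print(neighbors(1, 1, 3, 3, order=2))  # all 8 surrounding sites
```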

Page 7: Markov random field:       A brief introduction

Neighborhood system: Example

The neighbors of site i are m, n, and f.

The neighbors of site j are r and x.

Page 8: Markov random field:       A brief introduction

Clique

A clique c is a subset of sites in S in which every pair of distinct sites are neighbors; a single site is also a clique. Some examples follow.

Page 9: Markov random field:       A brief introduction

Clique: Example

Take the first-order and second-order neighborhood systems as examples:

For the first-order system, the clique types are single sites and horizontal/vertical pairs; the second-order system adds diagonal pairs, triples, and 2×2 quadruples.
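For illustration, the pair cliques can be enumerated directly from the neighborhood system. This sketch reuses the `neighbors` function from the earlier snippet (the counts shown are for a 3×3 grid):

```python
from itertools import product

def pair_cliques(H, W, order=1):
    """All two-site cliques {i, i'} with i' in N_i, each counted once."""
    cliques = set()
    for i, j in product(range(H), range(W)):
        for n in neighbors(i, j, H, W, order):
            cliques.add(frozenset({(i, j), n}))  # frozenset de-duplicates {i,i'} = {i',i}
    return cliques

print(len(pair_cliques(3, 3, order=1)))  # 12: 6 horizontal + 6 vertical pairs
print(len(pair_cliques(3, 3, order=2)))  # 20: the 12 above + 8 diagonal pairs
```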

Page 10: Markov random field:       A brief introduction


Markov random field

Page 11: Markov random field:       A brief introduction

Markov random field (MRF)

View the 2D image f as a collection of random variables (a random field).

A random field is said to be a Markov random field if it satisfies the following properties:

(1) P(f) > 0, \quad \forall f \in \mathcal{F} \quad \text{(positivity)}

(2) P(f_i \mid f_{S \setminus \{i\}}) = P(f_i \mid f_{N_i}) \quad \text{(Markovianity)}

where \mathcal{F} is the set of all possible image configurations f.

Page 12: Markov random field:       A brief introduction

Gibbs random field (GRF) and Gibbs distribution

A random field is said to be a Gibbs random field if and only if its configurations f obey a Gibbs distribution, that is,

P(f) = \frac{1}{Z} e^{-\frac{1}{T} U(f)}

where Z is the normalizing constant (partition function) and the energy U(f) is a sum of clique potentials over the clique set C:

U(f) = \sum_{c \in C} V_c(f) = \sum_{\{i\} \in C_1} V_1(f_i) + \sum_{\{i, i'\} \in C_2} V_2(f_i, f_{i'}) + \cdots = \sum_{i \in S} V_1(f_i) + \sum_{i \in S} \sum_{i' \in N_i} V_2(f_i, f_{i'}) + \cdots

U(f): energy function; T: temperature; V_c(f): clique potential

Design U for different applications
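To make the definition concrete, here is a brute-force sketch of a Gibbs distribution on a tiny binary field; the Ising-like pair potential is our illustrative choice, not from the slides:

```python
from itertools import product
import math

T = 1.0
cliques = [(0, 1), (1, 2), (2, 3)]  # first-order pair cliques on a 4-site chain

def U(f):
    """Energy: sum of clique potentials; V = 1 if a pair disagrees, else 0."""
    return sum(1.0 if f[i] != f[j] else 0.0 for i, j in cliques)

configs = list(product([0, 1], repeat=4))      # all 2^4 configurations
Z = sum(math.exp(-U(f) / T) for f in configs)  # partition function
P = {f: math.exp(-U(f) / T) / Z for f in configs}

print(P[(0, 0, 0, 0)])  # smooth configurations get the highest probability
print(sum(P.values()))  # 1.0, and P(f) > 0 for every f (positivity)
```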

Page 13: Markov random field:       A brief introduction

Markov-Gibbs equivalence

Hammersley-Clifford theorem: A random field F is an MRF if and only if F is a GRF.

Proof (⇐): Let P(f) be a Gibbs distribution on S with the neighborhood system N. We want to show the Markovianity property

P(f_i \mid f_{S \setminus \{i\}}) = P(f_i \mid f_{N_i})

Writing the conditional probability in terms of the Gibbs distribution (taking T = 1),

P(f_i \mid f_{S \setminus \{i\}}) = \frac{P(f)}{\sum_{f'_i} P(f')} = \frac{e^{-\sum_{c \in C} V_c(f)}}{\sum_{f'_i} e^{-\sum_{c \in C} V_c(f')}}

where f' = (f_1, \dots, f'_i, \dots, f_m) agrees with f at every site except i.

Page 14: Markov random field:       A brief introduction

Markov-Gibbs equivalence

Divide the cliques C into two sets A and B, where A consists of the cliques that contain i and B of the cliques that do not contain i:

P(f_i \mid f_{S \setminus \{i\}}) = \frac{\left[e^{-\sum_{c \in A} V_c(f)}\right]\left[e^{-\sum_{c \in B} V_c(f)}\right]}{\sum_{f'_i} \left[e^{-\sum_{c \in A} V_c(f')}\right]\left[e^{-\sum_{c \in B} V_c(f')}\right]}

For every clique c in B we have V_c(f) = V_c(f'), since f and f' differ only at site i and c does not contain i. The B factors therefore cancel, leaving

P(f_i \mid f_{S \setminus \{i\}}) = \frac{e^{-\sum_{c \in A} V_c(f)}}{\sum_{f'_i} e^{-\sum_{c \in A} V_c(f')}} = P(f_i \mid f_{N_i})

because every clique in A contains i, so its potential depends only on f_i and the neighbors of i.
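The equivalence can also be checked numerically on the 4-site chain from the earlier Gibbs sketch (reusing its `P`): conditioning site 1 on all other sites gives the same value as conditioning on its neighbors {0, 2} alone.

```python
def cond_full(v, f):
    """P(f_1 = v | all other sites of f)."""
    num = P[(f[0], v, f[2], f[3])]
    den = sum(P[(f[0], w, f[2], f[3])] for w in (0, 1))
    return num / den

def cond_neighbors(v, f):
    """P(f_1 = v | f_0, f_2): marginalize out the non-neighbor site 3."""
    num = sum(P[(f[0], v, f[2], v3)] for v3 in (0, 1))
    den = sum(P[(f[0], w, f[2], v3)] for w in (0, 1) for v3 in (0, 1))
    return num / den

f = (0, 1, 0, 1)
print(cond_full(1, f), cond_neighbors(1, f))  # equal, as the theorem predicts
```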

Page 15: Markov random field:       A brief introduction


Optimization-based vision problem

Page 16: Markov random field:       A brief introduction

Denoising

Illustration: a noisy signal d and the denoised signal f.

Page 17: Markov random field:       A brief introduction

MAP formulation for denoising problem

The signal denoising problem can be modeled as a MAP estimation problem, that is,

f^* = \arg\max_f \{ p(f \mid d) \}

By Bayes' rule:

f^* = \arg\max_f \{ p(d \mid f) \, p(f) \}

f: unknown data; d: observed data
p(f): prior model; p(d | f): observation model

Page 18: Markov random field:       A brief introduction

MAP formulation for denoising problem

Assume the observation is the true signal plus independent Gaussian noise, that is,

d_i = f_i + e_i, \qquad e_i \sim N(0, \sigma^2)

Under this assumption, the observation model can be expressed as

p(d \mid f) = \prod_{i=1}^{m} \frac{1}{\sqrt{2\pi}\,\sigma} e^{-(f_i - d_i)^2 / 2\sigma^2} = \frac{1}{(\sqrt{2\pi}\,\sigma)^m} e^{-U(d \mid f)}

where U(d \mid f) = \sum_{i=1}^{m} (f_i - d_i)^2 / 2\sigma^2 is the likelihood energy.

Page 19: Markov random field:       A brief introduction

MAP formulation for denoising problem

Assume the unknown data f is an MRF; the prior model is then

P(f) = \frac{1}{Z} e^{-\frac{1}{T} U(f)}

Based on the above, the posterior probability becomes

p(f \mid d) \propto P(d \mid f) \, P(f) = \frac{1}{(\sqrt{2\pi}\,\sigma)^m} e^{-\sum_{i=1}^{m} (f_i - d_i)^2 / 2\sigma^2} \cdot \frac{1}{Z} e^{-\frac{1}{T} U(f)}

Page 20: Markov random field:       A brief introduction

MAP formulation for denoising problem

The MAP estimator for the problem is:

f^* = \arg\max_f \{ p(f \mid d) \} = \arg\max_f \{ p(d \mid f) \, p(f) \}

= \arg\max_f \left\{ \frac{1}{(\sqrt{2\pi}\,\sigma)^m} e^{-\sum_{i=1}^{m} (f_i - d_i)^2 / 2\sigma^2} \cdot \frac{1}{Z} e^{-\frac{1}{T} U(f)} \right\}

= \arg\min_f \left\{ \sum_{i=1}^{m} (f_i - d_i)^2 / 2\sigma^2 + U(f)/T \right\}

= \arg\min_f \{ U(d \mid f) + U(f) \} \quad \text{(taking } T = 1\text{)}

What should the prior energy U(f) be?

Page 21: Markov random field:       A brief introduction

MAP formulation for denoising problem

Define the smoothness prior:

U(f) = \sum_i (f_i - f_{i-1})^2

Substituting this into the MAP estimator, we get:

f^* = \arg\max_f \{ p(f \mid d) \} = \arg\min_f \{ U(d \mid f) + U(f) \} = \arg\min_f \left\{ \sum_{i=1}^{m} \frac{(f_i - d_i)^2}{2\sigma^2} + \sum_i (f_i - f_{i-1})^2 \right\}

The first term is the observation model (a similarity measure); the second term is the prior model (a reconstruction constraint).
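A minimal sketch of this denoiser in Python, minimizing U(d|f) + U(f) by gradient descent (the step size, iteration count, and test signal are illustrative choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 0.3
clean = np.sin(np.linspace(0, 2 * np.pi, 100))  # a stand-in "true" signal
d = clean + rng.normal(0, sigma, clean.shape)   # observation: signal + noise

f = d.copy()                                    # initialize at the observation
step = 0.05
for _ in range(500):
    grad = (f - d) / sigma**2                   # gradient of the likelihood energy
    diff = f[1:] - f[:-1]                       # f_i - f_{i-1}
    grad[1:] += 2 * diff                        # d/df_i of (f_i - f_{i-1})^2
    grad[:-1] -= 2 * diff                       # d/df_{i-1} of the same term
    f -= step * grad

print(np.mean((d - clean) ** 2))  # noisy MSE
print(np.mean((f - clean) ** 2))  # denoised MSE: typically noticeably lower
```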

Page 22: Markov random field:       A brief introduction


Super-resolution

Super-Resolution (SR): A method to reconstruct high-resolution images/videos from low-resolution images/videos

Page 23: Markov random field:       A brief introduction

Super-resolution

Illustration: low-resolution frames d(1), d(2), d(3), d(4) are used to reconstruct the high-resolution frame f(1).

Page 24: Markov random field:       A brief introduction

MAP formulation for super-resolution problem

The super-resolution problem can be modeled as a MAP estimation problem, that is,

f^* = \arg\max_f \{ p(f \mid d^{(1)}, d^{(2)}, \dots, d^{(M)}) \}

By Bayes' rule:

f^* = \arg\max_f \{ p(d^{(1)}, d^{(2)}, \dots, d^{(M)} \mid f) \, p(f) \}

f: high-resolution image; d^{(i)}: low-resolution images
p(f): prior model; p(d^{(1)}, \dots, d^{(M)} | f): observation model

Page 25: Markov random field:       A brief introduction

MAP formulation for super-resolution problem

The conditional PDF can be modeled as a Gaussian distribution if the noise source is Gaussian:

p(d^{(1)}, d^{(2)}, \dots, d^{(M)} \mid f) \propto \exp(-H(d^{(1)}, d^{(2)}, \dots, d^{(M)}, f))

We also assume the prior model is a joint Gaussian distribution:

p(f) \propto \exp(-(f - M)^T \Sigma^{-1} (f - M))

where M is the mean of f and \Sigma is the covariance matrix.

Page 26: Markov random field:       A brief introduction

MAP formulation for super-resolution problem

Substituting the above relations into the MAP estimator, we get the following expression:

f^* = \arg\max_f \{ p(d^{(1)}, d^{(2)}, \dots, d^{(M)} \mid f) \, p(f) \}

= \arg\max_f \{ \exp(-(H(d^{(1)}, d^{(2)}, \dots, d^{(M)}, f) + (f - M)^T \Sigma^{-1} (f - M))) \}

= \arg\min_f \{ H(d^{(1)}, d^{(2)}, \dots, d^{(M)}, f) + (f - M)^T \Sigma^{-1} (f - M) \} = \arg\min_f E(f)

The first term is the observation model; the second term is the prior model.

Page 27: Markov random field:       A brief introduction


Solver for the optimization problem

Page 28: Markov random field:       A brief introduction

The solver of the optimization problem

In this section, we introduce different approaches for solving the optimization problem:

1. Brute-force search (global extremum)

2. Gradient descent search (usually a local extremum)

3. Genetic algorithm (global extremum)

4. Simulated annealing algorithm (global extremum)

Page 29: Markov random field:       A brief introduction


Gradient descent algorithm (1)

Page 30: Markov random field:       A brief introduction


Gradient descent algorithm (2)
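In outline, gradient descent repeatedly steps against the gradient of the energy E(f) until it stops decreasing; a standard form of the update, with step size \lambda:

```latex
f^{(k+1)} = f^{(k)} - \lambda \, \nabla E\!\left(f^{(k)}\right), \qquad k = 0, 1, 2, \dots
```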

Page 31: Markov random field:       A brief introduction

Simulation: SR by gradient descent algorithm

Six low-resolution frames (a)-(f) are used to reconstruct the high-resolution frame (g).

Page 32: Markov random field:       A brief introduction


Simulation: SR by gradient descent algorithm

Page 33: Markov random field:       A brief introduction

The problem of the gradient descent algorithm

The gradient descent algorithm may get trapped in a local extremum instead of reaching the global extremum.

Page 34: Markov random field:       A brief introduction

Genetic algorithm (GA)

The GA includes the following steps:
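In outline, a typical GA: (1) encodes each candidate solution f as a chromosome and initializes a random population; (2) evaluates each candidate's fitness, here the energy E(f) to be minimized; (3) selects the fitter candidates as parents; (4) produces offspring by crossover and mutation; and (5) repeats from step (2) until convergence, keeping the best candidate found.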

Page 35: Markov random field:       A brief introduction

Simulated annealing (SA)

The SA includes the following steps:
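In outline, a typical SA run starts from an initial configuration at a high temperature T, repeatedly proposes a random perturbation, always accepts proposals that lower the energy, accepts energy-raising proposals with probability e^{-\Delta E / T}, and gradually lowers T toward zero. A minimal sketch under these assumptions (the proposal, cooling schedule, and parameter values are illustrative choices, not from the slides):

```python
import math
import random

def anneal(E, f0, T0=1.0, cooling=0.999, steps=20000, scale=0.1):
    """Minimize the energy E starting from configuration f0."""
    f, best = list(f0), list(f0)
    T = T0
    for _ in range(steps):
        g = list(f)
        i = random.randrange(len(g))
        g[i] += random.gauss(0, scale)            # perturb one site at random
        dE = E(g) - E(f)
        if dE < 0 or random.random() < math.exp(-dE / T):
            f = g                                 # accept: always downhill,
        if E(f) < E(best):                        # uphill with prob e^(-dE/T)
            best = list(f)
        T *= cooling                              # gradually lower the temperature
    return best

# Toy usage: in the MRF setting, E would be U(d|f) + U(f) from the
# denoising slides; here a simple quadratic stands in for it.
E = lambda f: sum((x - 3.0) ** 2 for x in f)
print(anneal(E, [0.0, 0.0, 0.0]))  # each value approaches 3.0
```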