Transcript
Page 1: Applications of belief propagation in low-level vision

1

Applications of belief propagation in low-level vision

Bill Freeman

Massachusetts Institute of Technology

Jan. 12, 2010

Joint work with: Egon Pasztor, Jonathan Yedidia, Yair Weiss, Thouis Jones, Edward Adelson, Marshall Tappen.

Page 2: Applications of belief propagation in low-level vision

2

Derivation of belief propagation

$x_{1,\mathrm{MMSE}} = \operatorname{mean}_{x_1} \sum_{x_2} \sum_{x_3} P(x_1, x_2, x_3, y_1, y_2, y_3)$

[Figure: three-node chain Markov network. Hidden nodes x1, x2, x3 are linked by pairwise compatibilities Psi(x1, x2) and Psi(x2, x3); each hidden node xi is linked to its observation yi by Phi(xi, yi).]

Page 3: Applications of belief propagation in low-level vision

3

The posterior factorizes

$x_{1,\mathrm{MMSE}} = \operatorname{mean}_{x_1} \sum_{x_2} \sum_{x_3} P(x_1, x_2, x_3, y_1, y_2, y_3)$

$\phantom{x_{1,\mathrm{MMSE}}} = \operatorname{mean}_{x_1} \sum_{x_2} \sum_{x_3} \Phi(x_1, y_1)\, \Psi(x_1, x_2)\, \Phi(x_2, y_2)\, \Psi(x_2, x_3)\, \Phi(x_3, y_3)$

$\phantom{x_{1,\mathrm{MMSE}}} = \operatorname{mean}_{x_1} \Phi(x_1, y_1) \sum_{x_2} \Psi(x_1, x_2)\, \Phi(x_2, y_2) \sum_{x_3} \Psi(x_2, x_3)\, \Phi(x_3, y_3)$

[Figure: the three-node chain Markov network, with compatibilities Phi(xi, yi) and Psi(xi, xj), as above.]
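The factorization can be checked numerically. Below is a minimal sketch with hypothetical random potentials (all names are made up for illustration): pushing each sum inside, as in the derivation above, gives the same x1 marginal as brute-force summation over the full joint.

```python
import numpy as np

rng = np.random.default_rng(0)
K = 4  # number of discrete states per hidden node

# Hypothetical potentials for the 3-node chain (Phi: local evidence already
# evaluated at the observed y's; Psi: pairwise compatibilities).
phi1, phi2, phi3 = rng.random(K), rng.random(K), rng.random(K)
psi12, psi23 = rng.random((K, K)), rng.random((K, K))

# Brute force: marginal of x1 by summing the full joint over x2 and x3.
joint = (phi1[:, None, None] * psi12[:, :, None] * phi2[None, :, None]
         * psi23[None, :, :] * phi3[None, None, :])
marg_brute = joint.sum(axis=(1, 2))

# Factorized: push each sum inside, innermost first.
m32 = psi23 @ phi3            # sum over x3 of Psi(x2,x3) Phi(x3,y3)
m21 = psi12 @ (phi2 * m32)    # sum over x2 of Psi(x1,x2) Phi(x2,y2) m32(x2)
marg_fact = phi1 * m21

assert np.allclose(marg_brute, marg_fact)
```

The MMSE estimate then follows by taking the mean of x1 under this (normalized) marginal.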

Page 4: Applications of belief propagation in low-level vision

4

Propagation rules

[Figure: the three-node chain Markov network, with compatibilities Phi(xi, yi) and Psi(xi, xj), as above.]

$x_{1,\mathrm{MMSE}} = \operatorname{mean}_{x_1} \sum_{x_2} \sum_{x_3} P(x_1, x_2, x_3, y_1, y_2, y_3)$

$\phantom{x_{1,\mathrm{MMSE}}} = \operatorname{mean}_{x_1} \sum_{x_2} \sum_{x_3} \Phi(x_1, y_1)\, \Psi(x_1, x_2)\, \Phi(x_2, y_2)\, \Psi(x_2, x_3)\, \Phi(x_3, y_3)$

$\phantom{x_{1,\mathrm{MMSE}}} = \operatorname{mean}_{x_1} \Phi(x_1, y_1) \sum_{x_2} \Psi(x_1, x_2)\, \Phi(x_2, y_2) \sum_{x_3} \Psi(x_2, x_3)\, \Phi(x_3, y_3)$

Page 5: Applications of belief propagation in low-level vision

5

Propagation rules

[Figure: the three-node chain Markov network, with compatibilities Phi(xi, yi) and Psi(xi, xj), as above.]

$x_{1,\mathrm{MMSE}} = \operatorname{mean}_{x_1} \Phi(x_1, y_1) \sum_{x_2} \Psi(x_1, x_2)\, \Phi(x_2, y_2) \sum_{x_3} \Psi(x_2, x_3)\, \Phi(x_3, y_3)$

Writing the nested sums as messages, with $M_2^3(x_2) = \sum_{x_3} \Psi(x_2, x_3)\, \Phi(x_3, y_3)$:

$M_1^2(x_1) = \sum_{x_2} \Psi(x_1, x_2)\, \Phi(x_2, y_2)\, M_2^3(x_2)$

Page 6: Applications of belief propagation in low-level vision

6

Propagation rules

[Figure: the three-node chain Markov network, with compatibilities Phi(xi, yi) and Psi(xi, xj), as above.]

$x_{1,\mathrm{MMSE}} = \operatorname{mean}_{x_1} \Phi(x_1, y_1) \sum_{x_2} \Psi(x_1, x_2)\, \Phi(x_2, y_2) \sum_{x_3} \Psi(x_2, x_3)\, \Phi(x_3, y_3)$

Writing the nested sums as messages, with $M_2^3(x_2) = \sum_{x_3} \Psi(x_2, x_3)\, \Phi(x_3, y_3)$:

$M_1^2(x_1) = \sum_{x_2} \Psi(x_1, x_2)\, \Phi(x_2, y_2)\, M_2^3(x_2)$

Page 7: Applications of belief propagation in low-level vision

7

Belief propagation messages

$M_i^j(x_i) = \sum_{x_j} \Psi(x_i, x_j) \prod_{k \in N(j)\setminus i} M_j^k(x_j)$

where $M_i^j$ is the message passed to node $i$ from node $j$, and $N(j)\setminus i$ is the set of neighbors of $j$ other than $i$.

To send a message: multiply together all the incoming messages, except the one from the node you're sending to; then multiply by the compatibility matrix and marginalize over the sender's states.

A message can be thought of as a set of weights on each of the receiving node's possible states.
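The rule just stated can be sketched in a few lines. This is a hypothetical illustration, not code from the talk: `send_message` and the example potentials are made up, with the convention `psi_ij[x_i, x_j]`.

```python
import numpy as np

def send_message(psi_ij, incoming_to_j):
    """One BP message M_i^j: multiply all messages arriving at the sender j
    (the one from the recipient i is already excluded by the caller),
    multiply by the compatibility matrix psi_ij[x_i, x_j], and marginalize
    over the sender's states x_j."""
    prod = np.ones(psi_ij.shape[1])
    for m in incoming_to_j:          # product over k in N(j) \ i
        prod *= m
    msg = psi_ij @ prod              # sum over x_j
    return msg / msg.sum()           # normalization is optional but stabilizes

# Hypothetical 3-state example: two incoming messages at the sender.
psi = np.array([[0.9, 0.1, 0.1],
                [0.1, 0.9, 0.1],
                [0.1, 0.1, 0.9]])
m = send_message(psi, [np.array([0.5, 0.3, 0.2]), np.array([1.0, 1.0, 1.0])])
```

The returned vector is exactly the "set of weights on each of the receiving node's possible states" described above.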

Page 8: Applications of belief propagation in low-level vision

8

Belief propagation: the nosey neighbor rule

“Given everything that I’ve heard, here’s what I think is going on inside your house”

(Given my incoming messages, affecting my state probabilities, and knowing how my states affect your states, here’s how I think you should modify the probabilities of your states)

Page 9: Applications of belief propagation in low-level vision

9

Beliefs

$b_j(x_j) = \prod_{k \in N(j)} M_j^k(x_j)$

To find a node's beliefs: multiply together all the messages coming into that node.

(Show this for the toy example.)
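A minimal sketch of that toy example, with hypothetical random potentials: computing all messages on the three-node chain and multiplying them into beliefs reproduces the exact marginals of the joint.

```python
import numpy as np

rng = np.random.default_rng(1)
K = 3
# Hypothetical potentials for the x1-x2-x3 toy chain (Phi evaluated at y).
phi = [rng.random(K) for _ in range(3)]
psi12, psi23 = rng.random((K, K)), rng.random((K, K))

# Messages along the chain, convention Psi[x_i, x_j].
m32 = psi23 @ phi[2]                       # from x3 to x2
m21 = psi12 @ (phi[1] * m32)               # from x2 to x1
m12 = psi12.T @ phi[0]                     # from x1 to x2
m23 = psi23.T @ (phi[1] * m12)             # from x2 to x3

# Beliefs: local evidence times all incoming messages, then normalize.
b = [phi[0] * m21, phi[1] * m12 * m32, phi[2] * m23]
b = [x / x.sum() for x in b]

# Exact marginals from the full joint, for comparison.
joint = (phi[0][:, None, None] * psi12[:, :, None] * phi[1][None, :, None]
         * psi23[None, :, :] * phi[2][None, None, :])
for axes, bi in zip([(1, 2), (0, 2), (0, 1)], b):
    exact = joint.sum(axis=axes)
    assert np.allclose(bi, exact / exact.sum())
```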

Page 10: Applications of belief propagation in low-level vision

10

Optimal solution in a chain or tree: belief propagation

• “Do the right thing” Bayesian algorithm.

• For Gaussian random variables over time: Kalman filter.

• For hidden Markov models: forward/backward algorithm (and MAP variant is Viterbi).

Page 11: Applications of belief propagation in low-level vision

11

Markov Random Fields

• Allows rich probabilistic models for images.

• But built in a local, modular way: learn local relationships, get global effects out.

Page 12: Applications of belief propagation in low-level vision

12

MRF nodes as pixels

Winkler, 1995, p. 32

Page 13: Applications of belief propagation in low-level vision

13

MRF nodes as patches

[Figure: Markov network with observed image patches y and hidden scene patches x. Each scene patch xi is linked to its image patch yi by Phi(xi, yi); neighboring scene patches are linked by Psi(xi, xj).]

Page 14: Applications of belief propagation in low-level vision

14

Network joint probability

$P(\mathbf{x}, \mathbf{y}) = \frac{1}{Z} \prod_{(i,j)} \Psi(x_i, x_j) \prod_i \Phi(x_i, y_i)$

where $\mathbf{x}$ are the scene nodes and $\mathbf{y}$ the image nodes, $\Psi(x_i, x_j)$ is the scene-scene compatibility function between neighboring scene nodes, and $\Phi(x_i, y_i)$ is the image-scene compatibility function linking each scene node to its local observation.
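As a sketch of this joint probability (hypothetical toy sizes and potentials; the partition function `Z` is computed by exhaustive enumeration, which is only feasible for tiny models):

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(2)
K, n = 2, 3   # binary states, 3-node chain (hypothetical toy sizes)
Phi = rng.random((n, K))          # Phi[i, x_i], already evaluated at y_i
Psi = rng.random((n - 1, K, K))   # Psi[e, x_i, x_j] for chain edge e=(i, i+1)

def unnormalized(x):
    # Product of all image-scene and scene-scene compatibilities.
    p = np.prod([Phi[i, x[i]] for i in range(n)])
    p *= np.prod([Psi[e, x[e], x[e + 1]] for e in range(n - 1)])
    return p

# Z sums the product of potentials over every joint state.
Z = sum(unnormalized(x) for x in product(range(K), repeat=n))
P = {x: unnormalized(x) / Z for x in product(range(K), repeat=n)}
assert abs(sum(P.values()) - 1.0) < 1e-12
```

The point of belief propagation is precisely to avoid this exponential enumeration on chains and trees.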

Page 15: Applications of belief propagation in low-level vision

15

In order to use MRFs:

• Given the observations y and the parameters of the MRF, how do we infer the hidden variables x?

• How do we learn the parameters of the MRF?

Page 16: Applications of belief propagation in low-level vision

16

Inference in Markov Random Fields

Gibbs sampling, simulated annealing
Iterated conditional modes (ICM)
Belief propagation
    Application examples: super-resolution, motion analysis, shading/reflectance separation
Graph cuts
Variational methods

Page 17: Applications of belief propagation in low-level vision

17

Inference in Markov Random Fields

Gibbs sampling, simulated annealing
Iterated conditional modes (ICM)
Belief propagation
    Application examples: super-resolution, motion analysis, shading/reflectance separation
Graph cuts
Variational methods

Page 18: Applications of belief propagation in low-level vision

18

Derivation of belief propagation

$x_{1,\mathrm{MMSE}} = \operatorname{mean}_{x_1} \sum_{x_2} \sum_{x_3} P(x_1, x_2, x_3, y_1, y_2, y_3)$

[Figure: the three-node chain Markov network, with compatibilities Phi(xi, yi) and Psi(xi, xj), as before.]

Page 19: Applications of belief propagation in low-level vision

19

No factorization with loops!

[Figure: the same network with an added loop: an extra edge Psi(x1, x3) connects x1 and x3.]

With the extra compatibility $\Psi(x_1, x_3)$, the sums no longer factor:

$x_{1,\mathrm{MMSE}} = \operatorname{mean}_{x_1} \sum_{x_2} \sum_{x_3} \Phi(x_1, y_1)\, \Psi(x_1, x_2)\, \Phi(x_2, y_2)\, \Psi(x_2, x_3)\, \Phi(x_3, y_3)\, \Psi(x_1, x_3)$
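This failure is easy to verify numerically. A sketch with hypothetical random potentials: once the loop edge is present, the chain-style nested-sum computation no longer yields the correct marginal.

```python
import numpy as np

rng = np.random.default_rng(3)
K = 3
phi = [rng.random(K) for _ in range(3)]
psi12, psi23, psi13 = (rng.random((K, K)) for _ in range(3))

# Exact marginal of x1 for the loopy model (extra Psi(x1, x3) term).
joint = (phi[0][:, None, None] * psi12[:, :, None] * phi[1][None, :, None]
         * psi23[None, :, :] * phi[2][None, None, :] * psi13[:, None, :])
exact = joint.sum(axis=(1, 2))
exact /= exact.sum()

# Chain-style factorization that ignores the loop edge: no longer correct.
chain = phi[0] * (psi12 @ (phi[1] * (psi23 @ phi[2])))
chain /= chain.sum()

assert not np.allclose(exact, chain)   # the sums no longer push through
```

(With generic random potentials the two vectors essentially never coincide; in degenerate cases, e.g. a uniform Psi(x1, x3), they would.)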

Page 20: Applications of belief propagation in low-level vision

20

Applications of belief propagation in low-level vision

Bill Freeman

Massachusetts Institute of Technology

Jan. 12, 2010

Joint work with: Egon Pasztor, Jonathan Yedidia, Yair Weiss, Thouis Jones, Edward Adelson, Marshall Tappen.

Page 21: Applications of belief propagation in low-level vision

21

Belief, and message updates

$M_i^j(x_i) = \sum_{x_j} \Psi(x_i, x_j) \prod_{k \in N(j)\setminus i} M_j^k(x_j)$

$b_j(x_j) = \prod_{k \in N(j)} M_j^k(x_j)$

Page 22: Applications of belief propagation in low-level vision

22

Optimal solution in a chain or tree: belief propagation

• “Do the right thing” Bayesian algorithm.

• For Gaussian random variables over time: Kalman filter.

• For hidden Markov models: forward/backward algorithm (and MAP variant is Viterbi).

Page 23: Applications of belief propagation in low-level vision

23

Justification for running belief propagation in networks with loops

• Experimental results:

– Error-correcting codes (Kschischang and Frey, 1998; McEliece et al., 1998)

– Vision applications (Freeman and Pasztor, 1999; Frey, 2000)

• Theoretical results:

– For Gaussian processes, the means are correct (Weiss and Freeman, 1999)

– Large-neighborhood local maximum for MAP (Weiss and Freeman, 2000)

– Equivalent to the Bethe approximation in statistical physics (Yedidia, Freeman, and Weiss, 2000)

– Tree-weighted reparameterization (Wainwright, Willsky, Jaakkola, 2001)

Page 24: Applications of belief propagation in low-level vision

24

Results from Bethe free energy analysis

• Fixed points of the belief propagation equations are exactly the stationary points of the Bethe approximation.

• Belief propagation always has a fixed point.

• Connection with variational methods for inference: both minimize approximations to the free energy,

– variational: usually use primal variables.

– belief propagation: fixed-point equations for dual variables.

• Kikuchi approximations lead to more accurate belief propagation algorithms.

• Other Bethe free energy minimization algorithms: Yuille, Welling, etc.
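For reference, the Bethe free energy being made stationary can be written as follows (one common form, following Yedidia, Freeman, and Weiss; here $n_i$ is the degree of node $i$, and $b_i$, $b_{ij}$ are the single-node and pairwise beliefs):

$$
G_{\mathrm{Bethe}} = \sum_{(ij)} \sum_{x_i, x_j} b_{ij}(x_i, x_j)\,
\ln \frac{b_{ij}(x_i, x_j)}{\Psi(x_i, x_j)\,\Phi(x_i, y_i)\,\Phi(x_j, y_j)}
\;-\; \sum_i (n_i - 1) \sum_{x_i} b_i(x_i)\, \ln \frac{b_i(x_i)}{\Phi(x_i, y_i)}
$$

Setting its derivatives to zero, subject to normalization and marginalization constraints, recovers the BP message fixed-point equations.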

Page 25: Applications of belief propagation in low-level vision

25

References on BP and GBP

• J. Pearl, 1985
– the classic

• Y. Weiss, NIPS 1998
– inspires the application of BP to vision

• W. Freeman et al., learning low-level vision, IJCV 1999
– applications in super-resolution, motion, shading/paint discrimination

• H. Shum et al., ECCV 2002
– application to stereo

• M. Wainwright, T. Jaakkola, A. Willsky
– reparameterization version

• J. Yedidia, AAAI 2000
– the clearest place to read about BP and GBP

Page 26: Applications of belief propagation in low-level vision

26

Inference in Markov Random Fields

Gibbs sampling, simulated annealing
Iterated conditional modes (ICM)
Belief propagation
    Application examples: super-resolution, motion analysis, shading/reflectance separation
Graph cuts
Variational methods

Page 27: Applications of belief propagation in low-level vision

27

Super-resolution

• Image: low resolution image

• Scene: high resolution image

[Figure: paired image (low resolution) and scene (high resolution).]

ultimate goal...

Page 28: Applications of belief propagation in low-level vision

28

Polygon-based graphics images are resolution independent; pixel-based images are not.

[Figure: zooming comparison: pixel replication, cubic spline, cubic spline sharpened, and training-based super-resolution.]

Page 29: Applications of belief propagation in low-level vision

29

3 approaches to perceptual sharpening

(1) Sharpening; boost existing high frequencies.

(2) Use multiple frames to obtain higher sampling rate in a still frame.

(3) Estimate high frequencies not present in image, although implicitly defined.

In this talk, we focus on (3), which we’ll call “super-resolution”.

[Figure: amplitude vs. spatial frequency plots, contrasting boosting existing high frequencies with estimating new ones.]
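Approach (1) can be illustrated by unsharp masking: add back the difference between the image and a blurred copy. This is a minimal sketch; the 3x3 box blur and the step-edge test image are arbitrary choices, not from the talk.

```python
import numpy as np

def unsharp_mask(img, amount=1.0):
    """Boost existing high frequencies by adding back the difference
    between the image and a blurred copy (3x3 box blur, edge-padded)."""
    pad = np.pad(img, 1, mode="edge")
    blur = sum(pad[i:i + img.shape[0], j:j + img.shape[1]]
               for i in range(3) for j in range(3)) / 9.0
    return img + amount * (img - blur)

img = np.zeros((8, 8)); img[:, 4:] = 1.0     # a vertical step edge
sharp = unsharp_mask(img)
```

Note the characteristic overshoot and undershoot at the edge: sharpening amplifies frequencies that are already present, rather than estimating new ones as approach (3) does.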

Page 30: Applications of belief propagation in low-level vision

30

Super-resolution: other approaches

• Schultz and Stevenson, 1994

• Pentland and Horowitz, 1993

• fractal image compression (Polvere, 1998; Iterated Systems)

• astronomical image processing (e.g., Gull and Daniell, 1978; “pixons” http://casswww.ucsd.edu/puetter.html)

• Follow-on: Jianchao Yang, John Wright, Thomas S. Huang, Yi Ma: Image super-resolution as sparse representation of raw image patches. CVPR 2008

Page 31: Applications of belief propagation in low-level vision

31

Training images, ~100,000 image/scene patch pairs

Images from two Corel database categories: “giraffes” and “urban skyline”.

Page 32: Applications of belief propagation in low-level vision

32

Do a first interpolation

Zoomed low-resolution

Low-resolution

Page 33: Applications of belief propagation in low-level vision

33

Zoomed low-resolution

Low-resolution

Full frequency original

Page 34: Applications of belief propagation in low-level vision

34

[Figure: full-frequency original, its representation, and the zoomed low-frequency image.]

Page 35: Applications of belief propagation in low-level vision

35

[Figure: low-band input (contrast normalized, PCA fitted) and true high freqs., alongside the full-freq. original, its representation, and the zoomed low-freq. image.]

(To minimize the complexity of the relationships we have to learn, we remove the lowest frequencies from the input image and normalize the local contrast level.)

Page 36: Applications of belief propagation in low-level vision

36

Training data samples (magnified)


Gather ~100,000 patches

low freqs.

high freqs.

Page 37: Applications of belief propagation in low-level vision

37

True high freqs.Input low freqs.

Training data samples (magnified)


Nearest neighbor estimate

low freqs.

high freqs.

Estimated high freqs.

Page 38: Applications of belief propagation in low-level vision

38

Input low freqs.

Training data samples (magnified)


Nearest neighbor estimate

low freqs.

high freqs.

Estimated high freqs.

Page 39: Applications of belief propagation in low-level vision

39

Example: input image patch, and closest matches from database

Input patch

Closest imagepatches from database

Correspondinghigh-resolution

patches from database
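The nearest-neighbor lookup behind these matches can be sketched as follows. The database here is random stand-in data with hypothetical patch sizes, not the actual Corel training set of ~100,000 pairs.

```python
import numpy as np

rng = np.random.default_rng(4)
# Hypothetical stand-in database: paired low-freq / high-freq patch vectors.
low_db = rng.random((1000, 49))    # e.g. 7x7 low-freq patches, flattened
high_db = rng.random((1000, 25))   # corresponding 5x5 high-freq patches

def closest_matches(input_patch, k=10):
    """Return the k database high-freq patches whose low-freq halves are
    nearest (in squared L2 distance) to the input patch."""
    d = np.sum((low_db - input_patch) ** 2, axis=1)
    idx = np.argsort(d)[:k]
    return idx, high_db[idx]

idx, candidates = closest_matches(low_db[3])   # query with a known patch
assert idx[0] == 3                             # its own low-freq half wins
```

The k candidate high-resolution patches returned per input patch become the candidate states of one scene node in the Markov network.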

Page 40: Applications of belief propagation in low-level vision

40

Page 41: Applications of belief propagation in low-level vision

41

Scene-scene compatibility function, Ψ(xi, xj)

Assume the overlapped regions d of the hi-res. patches differ by Gaussian observation noise:

$\Psi(x_i, x_j) \propto \exp\!\left(-\,\lVert d(x_i) - d(x_j)\rVert^2 / 2\sigma^2\right)$

This is a uniqueness constraint, not a smoothness constraint.

Page 42: Applications of belief propagation in low-level vision

42

Image-scene compatibility function, Φ(xi, yi)

Assume Gaussian noise takes you from the observed image patch y to the synthetic sample’s low-frequency patch ỹ(x):

$\Phi(x_i, y_i) \propto \exp\!\left(-\,\lVert y_i - \tilde{y}(x_i)\rVert^2 / 2\sigma^2\right)$
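The two compatibility functions can be sketched together. This is a hypothetical implementation: the patch sizes, overlap indexing, and sigma values are illustrative assumptions, not details from the talk.

```python
import numpy as np

def psi(cands_i, cands_j, overlap_i, overlap_j, sigma=1.0):
    """Scene-scene compatibility Psi(x_i, x_j): candidate pairs whose
    overlapping hi-res regions d agree (up to Gaussian noise) score high.
    overlap_i/overlap_j index the shared pixels in each candidate patch."""
    di = cands_i[:, overlap_i]                     # (Ki, |d|)
    dj = cands_j[:, overlap_j]                     # (Kj, |d|)
    sq = ((di[:, None, :] - dj[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (2 * sigma ** 2))          # (Ki, Kj) matrix

def phi(cands_low, observed, sigma=1.0):
    """Image-scene compatibility Phi(x_i, y_i): Gaussian noise between the
    observed low-freq patch y and each candidate's low-freq half."""
    sq = ((cands_low - observed) ** 2).sum(-1)
    return np.exp(-sq / (2 * sigma ** 2))

# Hypothetical tiny example: 5 candidates, 9-pixel patches, 3-pixel overlap.
rng = np.random.default_rng(5)
c_i, c_j = rng.random((5, 9)), rng.random((5, 9))
Psi = psi(c_i, c_j, np.arange(6, 9), np.arange(0, 3))
Phi = phi(rng.random((5, 9)), rng.random(9))
```

These matrices and vectors are exactly the Ψ and Φ tables that the BP message updates multiply and marginalize over.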

Page 43: Applications of belief propagation in low-level vision

43

Markov network

[Figure: Markov network linking observed image patches y to hidden scene patches x via Phi(xi, yi), with neighboring scene patches linked by Psi(xi, xj).]

Page 44: Applications of belief propagation in low-level vision

44

[Figure: belief propagation on the input image, shown at iterations 0, 1, and 3.]

After a few iterations of belief propagation, the algorithm selects spatially consistent high-resolution interpretations for each low-resolution patch of the input image.

Page 45: Applications of belief propagation in low-level vision

45

Zooming 2 octaves

[Figure: 85x51 input; cubic spline zoom to 340x204; maximum-likelihood zoom to 340x204.]

We apply the super-resolution algorithm recursively, zooming up 2 powers of 2, i.e., a factor of 4 in each dimension.

Page 46: Applications of belief propagation in low-level vision

46

[Figure: original (50x58) and true (200x232) images; cubic spline implies a thin-plate prior.]

Now we examine the effect of the prior assumptions made about images on the high-resolution reconstruction. First, cubic spline interpolation.

Page 47: Applications of belief propagation in low-level vision

47

[Figure: original (50x58), cubic spline reconstruction, and true (200x232) image; cubic spline implies a thin-plate prior.]

Page 48: Applications of belief propagation in low-level vision

48

[Figure: original (50x58), true image, and random-noise training images.]

Next, train the Markov network algorithm on a world of random-noise images.

Page 49: Applications of belief propagation in low-level vision

49

[Figure: original (50x58), Markov network reconstruction, true image, and the noise training images.]

The algorithm learns that, in such a world, we add random noise when zooming to a higher resolution.

Page 50: Applications of belief propagation in low-level vision

50

[Figure: original (50x58), true image, and training images of vertically oriented rectangles.]

Next, train on a world of vertically oriented rectangles.