Image Segmentation: Cuts, Random Walks, and Phase-Space Embedding. Jianbo Shi, Robotics Institute, Carnegie Mellon University. Joint work with: Malik, Meila, Yu.


1

Image Segmentation

Jianbo Shi

Robotics Institute, Carnegie Mellon University

Cuts, Random Walks, and Phase-Space Embedding

Joint work with: Malik, Meila, Yu

2

Taxonomy of Vision Problems

Reconstruction: estimate parameters of the external 3D world.

Visual Control: visually guided locomotion and manipulation.

Segmentation: partition I(x,y,t) into subsets corresponding to separate objects.

Recognition: classes (face vs. non-face); activities (gesture, expression).

3

4

We see Objects

5

Outline

Problem formulation
Normalized Cut criterion & algorithm
The Markov random walks view of Normalized Cut
Combining pair-wise attraction & repulsion
Conclusions

6

Edge-based image segmentation

Edge detection by gradient operators

Linking by dynamic programming, voting, relaxation, …
Montanari 71, Parent&Zucker 89, Guy&Medioni 96, Shaashua&Ullman 88, Williams&Jacobs 95, Geiger&Kumaran 96, Heitger&von der Heydt 93

- Natural for encoding curvilinear grouping
- Hard decisions are often made prematurely

7

Region-based image segmentation

Region growing, split-and-merge, etc.
- Regions provide closure for free; however, the approaches are ad hoc.

Global criteria for image segmentation
• Markov Random Fields, e.g. Geman&Geman 84
• Variational approaches, e.g. Mumford&Shah 89
• Expectation-Maximization, e.g. Ayer&Sawhney 95, Weiss 97

- Global methods, but computational complexity precludes exact MAP estimation
- Problems due to local minima

8

Bottom line: it is hard, nothing has worked well; use edge detection, or just avoid the problem.

9

Global good, local bad.

Global decisions good, local bad
– Formulate segmentation as hierarchical graph partitioning

Efficient computation
– Draw on ideas from spectral graph theory to define an eigenvalue problem whose solution yields the segmentation

Develop suitable encodings of visual cues in terms of graph weights

Shi&Malik, 97

10

Image segmentation by pairwise similarities

Image = { pixels }
Segmentation = partition of the image into segments
Similarity between pixels i and j:  Sij = Sji ≥ 0

Objective: "similar pixels should be in the same segment, dissimilar pixels should be in different segments"

11

Segmentation as weighted graph partitioning

Pixels i ∈ I = vertices of graph G
Edges (i, j) = pixel pairs with Sij > 0

Similarity matrix S = [ Sij ] is the generalized adjacency matrix

di = Σj Sij   degree of i
vol A = Σi∈A di   volume of A ⊆ I

12

Cuts in a graph

(edge) cut = set of edges whose removal makes a graph disconnected

weight of a cut:   cut(A, B) = Σi∈A, j∈B Sij

the normalized cut:   NCut(A, B) = cut(A, B) ( 1/vol A + 1/vol B )
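To make these definitions concrete, here is a small numerical sketch (not from the slides; it assumes a symmetric similarity matrix S stored as a NumPy array and a boolean mask marking the set A):

    import numpy as np

    def ncut_value(S, in_A):
        """NCut(A, B) for the partition given by the mask in_A (B is the complement)."""
        d = S.sum(axis=1)                      # degrees d_i = sum_j S_ij
        cut = S[np.ix_(in_A, ~in_A)].sum()     # total weight crossing the partition
        vol_A, vol_B = d[in_A].sum(), d[~in_A].sum()
        return cut * (1.0 / vol_A + 1.0 / vol_B)

    # toy example: two tightly connected pairs {0,1} and {2,3} with weak links between them
    S = np.array([[0.0, 1.0, 0.1, 0.0],
                  [1.0, 0.0, 0.0, 0.1],
                  [0.1, 0.0, 0.0, 1.0],
                  [0.0, 0.1, 1.0, 0.0]])
    print(ncut_value(S, np.array([True, True, False, False])))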

13

Normalized Cut and Normalized Association

Minimizing similarity between the groups and maximizing similarity within the groups can be achieved simultaneously:

Ncut(A, B) = cut(A, B)/Vol(A) + cut(A, B)/Vol(B)

Nassoc(A, B) = assoc(A, A)/Vol(A) + assoc(B, B)/Vol(B)

Ncut(A, B) = 2 - Nassoc(A, B)

14

The Normalized Cut (NCut) criterion

Criterion:  min NCut(A, Ā)

Small cut between subsets of ~ balanced grouping

NP-Hard!

15

Some definitions

Let W be the association matrix, W(i,j) = wij;
Let D be the diagonal matrix, D(i,i) = Σj W(i,j);
Let x be a vector in {1, -1}^N, x(i) = 1 iff i ∈ A.

16

Normalized Cut As Generalized Eigenvalue problem

Rewriting Normalized Cut in matrix form:

Ncut(A, B) = cut(A, B)/Vol(A) + cut(A, B)/Vol(B)

           = (1+x)^T (D-W) (1+x) / (k 1^T D 1)  +  (1-x)^T (D-W) (1-x) / ((1-k) 1^T D 1)

where k = Σ_{xi>0} D(i,i) / Σi D(i,i)

= ...

17

More math…

18

Normalized Cut As Generalized Eigenvalue problem

After simplification, we get

Ncut(A, B) = y^T (D - W) y / (y^T D y),   with yi ∈ {1, -b}, b = k/(1-k), and y^T D 1 = 0

Relaxing y to take real values, this is minimized by the generalized eigenvalue problem

(D - W) y = λ D y
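A minimal sketch of the resulting two-way NCut algorithm, assuming a symmetric similarity matrix S; it uses SciPy's dense generalized symmetric eigensolver rather than the large-scale solvers used in the original work:

    import numpy as np
    from scipy.linalg import eigh

    def ncut_bipartition(S):
        """Two-way NCut: solve (D - W) y = lambda D y, threshold the second eigenvector."""
        W = S
        D = np.diag(W.sum(axis=1))
        vals, vecs = eigh(D - W, D)     # generalized eigenvalues in ascending order
        y2 = vecs[:, 1]                 # second smallest generalized eigenvector
        # split at zero (one could also search over thresholds for the best NCut value)
        return y2 >= 0

    S = np.array([[0.0, 1.0, 0.1, 0.0],
                  [1.0, 0.0, 0.0, 0.1],
                  [0.1, 0.0, 0.0, 1.0],
                  [0.0, 0.1, 1.0, 0.0]])
    print(ncut_bipartition(S))          # e.g. [ True  True False False]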

19

Interpretation as a Dynamical System

20

Interpretation as a Dynamical System

21

Brightness Image Segmentation

22

Brightness image segmentation

23

Results on color segmentation

24

Malik, Belongie, Shi, Leung, 99

25

26

Motion Segmentation with Normalized Cuts

Networks of spatial-temporal connections:

27

Results

28

Results

Shi&Malik, 98

29

Results

30

Results

31

Stereoscopic data

32

Conclusion I

Global is good, local is bad
– Formulated the Ncut grouping criterion
– Efficient Ncut algorithm using a generalized eigen-system

Local pair-wise similarities allow easy encoding and combining of Gestalt grouping cues

33

Goals of this work

Better understand why spectral segmentation works
– random walks view of the NCut algorithm
– complete characterization of the "ideal" case
– the ideal case is more realistic/general than previously thought

Learning feature combination / object shape model
– max cross-entropy method for learning

Meila&Shi, 00

34

The random walks view

Construct the matrix P = D^-1 S

D = diag(d1, d2, ..., dn),   S = [ Sij ]

P is a stochastic matrix: Σj Pij = 1
P is the transition matrix of a Markov chain with state space I

π = (1/vol I) [ d1 d2 ... dn ]^T is the stationary distribution
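A small sketch of this construction, assuming a symmetric similarity matrix S with positive degrees:

    import numpy as np

    S = np.array([[0.0, 1.0, 0.1, 0.0],
                  [1.0, 0.0, 0.0, 0.1],
                  [0.1, 0.0, 0.0, 1.0],
                  [0.0, 0.1, 1.0, 0.0]])
    d = S.sum(axis=1)                   # degrees
    P = S / d[:, None]                  # P = D^-1 S, each row sums to 1
    pi = d / d.sum()                    # stationary distribution: d_i / vol(I)

    print(P.sum(axis=1))                # [1. 1. 1. 1.]  -> P is stochastic
    print(np.allclose(pi @ P, pi))      # True -> pi is stationary (S symmetric)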

35

Reinterpreting the NCut criterion

NCut(A, Ā) = P_AĀ + P_ĀA

where P_AB = Pr[ A --> B | A ] under P

NCut looks for sets that "trap" the random walk
Related to the Cheeger constant and conductance in Markov chains

36

Reinterpreting the NCut algorithm

NCut system:   (D - W) y = λ D y,   eigenvalues λ1 = 0 ≤ λ2 ≤ ... ≤ λn,   eigenvectors y1, y2, ..., yn

Random walk:   P x = λ' x,   eigenvalues λ'1 = 1 ≥ λ'2 ≥ ... ≥ λ'n,   eigenvectors x1, x2, ..., xn

The two are related by   λ'k = 1 - λk   and   yk = xk

The NCut algorithm segments based on the second largest eigenvector of P
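This relationship can be checked numerically; the following sketch compares the generalized eigenvalues of (D - W) y = λ D y with the eigenvalues of P on a toy similarity matrix:

    import numpy as np
    from scipy.linalg import eig, eigh

    S = np.array([[0.0, 1.0, 0.1, 0.0],
                  [1.0, 0.0, 0.0, 0.1],
                  [0.1, 0.0, 0.0, 1.0],
                  [0.0, 0.1, 1.0, 0.0]])
    d = S.sum(axis=1)
    D = np.diag(d)
    P = S / d[:, None]

    lam = eigh(D - S, D, eigvals_only=True)        # ascending: 0 = lam_1 <= ...
    mu = np.sort(np.real(eig(P)[0]))[::-1]         # descending: 1 = mu_1 >= ...
    print(np.allclose(1 - lam, mu))                # True: mu_k = 1 - lam_k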

37

So far...

We showed that the NCut criterion and its approximation, the NCut algorithm, have simple interpretations in the Markov chain framework
– criterion: finds "almost ergodic" sets
– algorithm: uses x2 to segment

Now: we will use the Markov chain view to show when the NCut algorithm is exact, i.e. when P has K piecewise constant eigenvectors

38

Piecewise constant eigenvectors: Examples

[Figures: eigenvalues and eigenvectors for a block diagonal P (and S), and for a P with equal rows in each block]

39

Piecewise constant eigenvectors: general case

Theorem [Meila&Shi]. Let P = D^-1 S with D non-singular, and let Δ be a partition of I. Then P has K piecewise constant eigenvectors w.r.t. Δ iff P is block stochastic w.r.t. Δ and P̂ is non-singular.

[Figure: eigenvalues and eigenvectors of such a P]

40

Block stochastic matrices

Δ = ( A1, A2, ..., AK )   partition of I

P is block stochastic w.r.t. Δ iff

Σj∈As Pij = Σj∈As Pi'j   for i, i' in the same segment As'

Intuitively: the Markov chain can be aggregated so that the random walk over Δ is Markov

P̂ = transition matrix of the Markov chain over Δ
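A toy illustration (not from the slides): a hypothetical P that is block stochastic with respect to the partition {0,1} | {2,3}, whose aggregated chain P̂ has eigenvalues 1.0 and 0.5, and whose corresponding eigenvectors are piecewise constant:

    import numpy as np

    # Within each block, every row puts the same total mass on block 1 and on block 2
    # (0.8 / 0.2 for the first block, 0.3 / 0.7 for the second).
    P = np.array([[0.50, 0.30, 0.10, 0.10],
                  [0.20, 0.60, 0.05, 0.15],
                  [0.10, 0.20, 0.40, 0.30],
                  [0.25, 0.05, 0.30, 0.40]])

    P_hat = np.array([[0.8, 0.2],               # aggregated chain over the two segments
                      [0.3, 0.7]])
    print(np.sort(np.linalg.eigvals(P_hat))[::-1])   # [1.  0.5]

    vals, vecs = np.linalg.eig(P)
    for target in (1.0, 0.5):
        x = vecs[:, np.argmin(np.abs(vals - target))]
        print(np.round(np.real(x / x[0]), 3))   # ~[1, 1, 1, 1] and ~[1, 1, -1.5, -1.5]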

41

Learning image segmentation

Targets:  Pij* = 1/|A| for j ∈ A,  0 for j ∉ A   (i in segment A)

Model:  Sij = exp( Σq θq fij^q ),  normalized to give Pij

[Diagram: Image -> features fij^q -> model Sij -> normalize -> Pij;  Segmentation -> targets Pij*;  Learning compares Pij to Pij*]
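A sketch of how the targets and the model could be set up; the helper names and the (Q, n, n) layout of the feature array are assumptions, not from the slides:

    import numpy as np

    def target_transitions(labels):
        """P*_ij = 1/|A| if i and j are in the same segment A, else 0."""
        same = labels[:, None] == labels[None, :]
        sizes = np.bincount(labels)[labels]            # |A| for the segment of each i
        return same / sizes[:, None]

    def model_transitions(features, theta):
        """S_ij = exp(sum_q theta_q f^q_ij), then normalize rows to get P_ij."""
        S = np.exp(np.tensordot(theta, features, axes=1))   # features: (Q, n, n)
        return S / S.sum(axis=1, keepdims=True)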

42

The objective function

J = - (1/|I|) Σi∈I Σj∈I Pij* log Pij

J = KL( P* || P ),   with the convention 0 log 0 = 0, where (1/|I|) Pij* is the flow i -> j

Max entropy formulation:   max_Pij H( j | i )
   s.t.   <fij^q>_Pij = <fij^q>_Pij*   for all q

Convex problem with convex constraints: at most one optimum

The gradient:   ∂J/∂θq = <fij^q>_Pij* - <fij^q>_Pij
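A sketch of the corresponding learning loop, building on the target_transitions and model_transitions sketches above; the step direction drives the expected feature values under the model toward those under the targets:

    import numpy as np

    def learn_theta(features, labels, lr=0.1, iters=200):
        """Fit theta so that <f^q>_P matches <f^q>_P* (max-entropy / min-KL)."""
        P_star = target_transitions(labels)          # from the previous sketch
        theta = np.zeros(features.shape[0])
        n = features.shape[1]
        for _ in range(iters):
            P = model_transitions(features, theta)   # from the previous sketch
            # expected feature values under the targets minus those under the model
            grad = (features * P_star).sum(axis=(1, 2)) / n \
                 - (features * P).sum(axis=(1, 2)) / n
            theta += lr * grad                       # step that decreases KL(P*||P)
        return theta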

43

Experiments - the features

IC - intervening contour

fij^IC = max_{k ∈ line(i,j)} Edge(k)

Edge(k) = output of an edge filter at pixel k
Discourages transitions across edges

CL - colinearity/cocircularity

fij^CL = sum of two terms in the edge orientations at i and j (involving cos 2θi, cos 2θj)
Encourages transitions along flow lines

Random features

[Figure: pixels i, j, k along the line from i to j, with the local orientations and the edgeness profile]
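A sketch of the intervening-contour feature, assuming edge_map is a 2D array of edge-filter responses and pixels are given as (row, col) pairs:

    import numpy as np

    def f_ic(edge_map, p, q):
        """f_ij^IC = maximum edge response along the straight line from p to q."""
        n = max(abs(q[0] - p[0]), abs(q[1] - p[1])) + 1   # at least one sample per pixel step
        t = np.linspace(0.0, 1.0, n)
        rows = np.round(p[0] + t * (q[0] - p[0])).astype(int)
        cols = np.round(p[1] + t * (q[1] - p[1])).astype(int)
        return edge_map[rows, cols].max()

    edge_map = np.zeros((32, 32)); edge_map[16, :] = 1.0   # a horizontal edge
    print(f_ic(edge_map, (4, 4), (28, 28)))                # 1.0: the line crosses the edge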

44

Training examples

[Figures: training examples for the IC and CL features]

45

Test examples

[Figures: original image, edge map, segmentation]

46

Conclusions

Showed that the NCut segmentation method has a probabilistic interpretation

Introduced
– a principled method for combining features in image segmentation
– supervised learning and synthetic data to find the optimal combination of features

[Diagram: Graph Cuts <-> Generalized Eigen-system <-> Markov Random Walks]
