
Segmentation using eigenvectors

Papers: “Normalized Cuts and Image Segmentation”. Jianbo Shi and Jitendra Malik, IEEE, 2000

“Segmentation using eigenvectors: a unifying view”. Yair Weiss, ICCV 1999.

Presenter: Carlos Vallespi

cvalles@cs.cmu.edu

Image Segmentation

Image segmentation

How do you pick the right segmentation?

• Bottom-up segmentation: tokens belong together because they are locally coherent.
• Top-down segmentation: tokens are grouped because they lie on the same object.

“Correct” segmentation

There may not be a single correct answer.

Partitioning is inherently hierarchical.

One approach we will use in this presentation: “Use the low-level coherence of brightness, color, texture or motion attributes to come up with partitions.”

Outline

1. Introduction

2. Graph terminology and representation.

3. “Min cuts” and “Normalized cuts”.

4. Other segmentation methods using eigenvectors.

5. Conclusions.

Outline

1. Introduction

2. Graph terminology and representation.

3. “Min cuts” and “Normalized cuts”.

4. Other segmentation methods using eigenvectors.

5. Conclusions.

Graph-based Image Segmentation

Pipeline: Image (I) → Graph Affinities (W) → Eigenvector X(W) → Discretization

The affinities W are built from intensity, color, edge and texture cues.

Normalized cut:

$Ncut(A,B) = cut(A,B)\left(\frac{1}{vol(A)} + \frac{1}{vol(B)}\right)$

Indicator eigenvector:

$X(i) = 1$ if $i \in A$, $X(i) = 0$ if $i \notin A$, found by solving $(D - W)X = \lambda D X$

Slide from Timothee Cour (http://www.seas.upenn.edu/~timothee)

Outline

1. Introduction

2. Graph terminology and representation.

3. “Min cuts” and “Normalized cuts”.

4. Other segmentation methods using eigenvectors.

5. Conclusions.

Graph-based Image Segmentation

G = {V, E}

V: graph nodes (pixels)
E: edges connecting nodes (pixel similarity)

Slides from Jianbo Shi

Graph terminology

Similarity matrix:

Slides from Jianbo Shi

$W = [w_{i,j}]$, with $w_{i,j} = e^{-\frac{\|X(i) - X(j)\|_2^2}{\sigma_X^2}}$

Affinity matrix

Similarity of image pixels to a selected pixel (brighter means more similar).

Reshape: an N×M image yields an (N·M)×(N·M) affinity matrix W.

Warning: the size of W is quadratic in the number of pixels!
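To make the quadratic blow-up concrete, here is a minimal NumPy sketch (not from the slides; the function name and σ value are placeholders) that builds a dense Gaussian affinity matrix from per-pixel feature vectors:

```python
import numpy as np

def affinity_matrix(features, sigma=1.0):
    """Dense Gaussian affinity: W[i, j] = exp(-||f_i - f_j||^2 / sigma^2).

    `features` is an (n, d) array with one feature vector per pixel
    (e.g. its (row, col) position).  Note the quadratic cost: n pixels
    produce an n x n matrix.
    """
    sq_dists = np.sum((features[:, None, :] - features[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / sigma ** 2)

# Tiny usage example: positions of the pixels of a 4x4 image.
rows, cols = np.indices((4, 4))
feats = np.stack([rows.ravel(), cols.ravel()], axis=1).astype(float)
W = affinity_matrix(feats, sigma=2.0)   # shape (16, 16)
```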

Graph terminology

Degree of node:

Slides from Jianbo Shi

$d_i = \sum_j w_{i,j}$

Graph terminology

Volume of set:

Slides from Jianbo Shi

$vol(A) = \sum_{i \in A} d_i, \quad A \subseteq V$

Graph terminology

Cuts in a graph:

$cut(A, \bar{A}) = \sum_{i \in A,\, j \in \bar{A}} w_{i,j}$

Slides from Jianbo Shi

Representation

Partition matrix X (pixels × segments):

$X = [X_1, \ldots, X_K]$

Pair-wise similarity matrix W:

$W(i,j) = \text{aff}(i,j)$

Degree matrix D:

$D(i,i) = \sum_j w_{i,j}$

Laplacian matrix L:

$L = D - W$
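A short sketch (assuming the dense W from the previous example) that builds the degree matrix D and the Laplacian L = D − W exactly as defined on this slide:

```python
import numpy as np

def degree_and_laplacian(W):
    """Return D with D[i, i] = sum_j w_ij and the Laplacian L = D - W."""
    d = W.sum(axis=1)      # degree of each node
    D = np.diag(d)
    L = D - W
    return D, L

D, L = degree_and_laplacian(W)   # W from the previous sketch
```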

Pixel similarity functions

Intensity:

$W_I(i,j) = e^{-\frac{\|I(i) - I(j)\|^2}{\sigma_I^2}}$

Distance:

$W_X(i,j) = e^{-\frac{\|X(i) - X(j)\|^2}{\sigma_X^2}}$

Texture:

$W_c(i,j) = e^{-\frac{\|c(i) - c(j)\|^2}{\sigma_c^2}}$

Here c(x) is a vector of filter outputs. A natural thing to do is to square the outputs of a range of different filters at different scales and orientations, smooth the result, and stack these into a vector.
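For intuition, here is a hedged sketch of combining two of these cues by multiplying the Gaussians, as the Algorithm slide later does for intensity and distance; the σ values are placeholders to be tuned per problem:

```python
import numpy as np

def combined_affinity(intensity, coords, sigma_i=0.1, sigma_x=4.0):
    """w_ij = exp(-||I_i - I_j||^2 / sigma_I^2) * exp(-||X_i - X_j||^2 / sigma_X^2).

    `intensity` is a flat (n,) array of gray values and `coords` an (n, 2)
    array of pixel positions.
    """
    di = (intensity[:, None] - intensity[None, :]) ** 2
    dx = np.sum((coords[:, None, :] - coords[None, :, :]) ** 2, axis=-1)
    return np.exp(-di / sigma_i ** 2) * np.exp(-dx / sigma_x ** 2)
```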

Definitions

Methods that use the spectrum of the affinity matrix to cluster are known as spectral clustering.

Normalized cuts, Average cuts, Average association make use of the eigenvectors of the affinity matrix.

Why do these methods work?

Spectral Clustering

Data Similarities

* Slides from Dan Klein, Sep Kamvar, Chris Manning, Natural Language Group Stanford University

Eigenvectors and blocks

Block matrices have block eigenvectors:

W = [[1, 1, 0, 0],
     [1, 1, 0, 0],
     [0, 0, 1, 1],
     [0, 0, 1, 1]]

eigensolver → e1 = [.71, .71, 0, 0], e2 = [0, 0, .71, .71]
λ1 = 2, λ2 = 2, λ3 = 0, λ4 = 0

Near-block matrices have near-block eigenvectors:

W = [[ 1,  1, .2,  0],
     [ 1,  1,  0, -.2],
     [.2,  0,  1,  1],
     [ 0, -.2,  1,  1]]

eigensolver → e1 = [.71, .69, .14, 0], e2 = [0, -.14, .69, .71]
λ1 = 2.02, λ2 = 2.02, λ3 = -0.02, λ4 = -0.02

* Slides from Dan Klein, Sep Kamvar, Chris Manning, Natural Language Group Stanford University
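The near-block example above is easy to reproduce; this small NumPy check (an illustration, not part of the original slides) prints eigenvalues close to 2.02, 2.02, −0.02, −0.02 and the two near-block leading eigenvectors:

```python
import numpy as np

W_near_block = np.array([[1.0,  1.0, 0.2,  0.0],
                         [1.0,  1.0, 0.0, -0.2],
                         [0.2,  0.0, 1.0,  1.0],
                         [0.0, -0.2, 1.0,  1.0]])

vals, vecs = np.linalg.eigh(W_near_block)   # symmetric eigendecomposition
order = np.argsort(vals)[::-1]              # largest eigenvalue first
print(vals[order])                          # ~ [2.02, 2.02, -0.02, -0.02]
print(vecs[:, order[:2]])                   # two leading, near-block eigenvectors
```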

Spectral Space

Can put items into blocks by eigenvectors: plot each item i at the point (e1(i), e2(i)).

W = [[ 1,  1, .2,  0],
     [ 1,  1,  0, -.2],
     [.2,  0,  1,  1],
     [ 0, -.2,  1,  1]]

e1 = [.71, .69, .14, 0], e2 = [0, -.14, .69, .71]

Clusters are clear regardless of row ordering:

W = [[ 1, .2,  1,  0],
     [.2,  1,  0,  1],
     [ 1,  0,  1, -.2],
     [ 0,  1, -.2,  1]]

e1 = [.71, .14, .69, 0], e2 = [0, .69, -.14, .71]

* Slides from Dan Klein, Sep Kamvar, Chris Manning, Natural Language Group Stanford University

Outline

1. Introduction

2. Graph terminology and representation.

3. “Min cuts” and “Normalized cuts”.

4. Other segmentation methods using eigenvectors.

5. Conclusions.

How do we extract a good cluster?

Simplest idea: we want a vector x giving the association between each element and a cluster.

We want elements within this cluster to, on the whole, have strong affinity with one another.

We could maximize $x^T W x$, but we need the constraint $x^T x = 1$. This is an eigenvalue problem: choose the eigenvector of W with the largest eigenvalue.
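A minimal sketch of this idea (assuming a symmetric affinity matrix; it reuses W_near_block from the earlier check): the constrained maximizer of $x^T W x$ is the eigenvector with the largest eigenvalue.

```python
import numpy as np

def leading_eigenvector(W):
    """Maximize x^T W x subject to x^T x = 1 for symmetric W:
    the solution is the eigenvector with the largest eigenvalue."""
    vals, vecs = np.linalg.eigh(W)
    return vecs[:, np.argmax(vals)]

x = leading_eigenvector(W_near_block)   # association of each node with the cluster
```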

Criterion for partition:

Minimum cut

$\min cut(A,B) = \min \sum_{u \in A,\, v \in B} w(u,v)$

First proposed by Wu and Leahy.

(Figure: the ideal cut between clusters A and B versus cuts with lesser weight than the ideal cut.)

Problem! The weight of a cut is directly proportional to the number of edges in the cut, so minimum cut favors cutting off small, isolated sets of nodes.

Normalized Cut

Normalized cut or balanced cut:

$Ncut(A,B) = cut(A,B)\left(\frac{1}{vol(A)} + \frac{1}{vol(B)}\right)$

Finds a better cut.
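A small helper (a sketch assuming a nonnegative, symmetric W) that evaluates this Ncut formula for a given bipartition; it is reused in the discretization sketch later on:

```python
import numpy as np

def ncut_value(W, mask):
    """Ncut(A, B) = cut(A, B) * (1/vol(A) + 1/vol(B)) for the bipartition
    given by a boolean mask (True = node in A, False = node in B)."""
    A, B = mask, ~mask
    cut = W[np.ix_(A, B)].sum()   # weight crossing the partition
    vol_A = W[A, :].sum()         # assoc(A, V)
    vol_B = W[B, :].sum()         # assoc(B, V)
    return cut * (1.0 / vol_A + 1.0 / vol_B)
```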

Normalized Cut

Volume of set (or association):

$vol(A) = assoc(A,V) = \sum_{u \in A,\, t \in V} w(u,t)$

Normalized Cut

Volume of set (or association):

$vol(A) = assoc(A,V) = \sum_{u \in A,\, t \in V} w(u,t)$

Define normalized cut: “a fraction of the total edge connections to all the nodes in the graph”:

$Ncut(A,B) = \frac{cut(A,B)}{assoc(A,V)} + \frac{cut(A,B)}{assoc(B,V)}$

Define normalized association: “how tightly on average nodes within the cluster are connected to each other”:

$Nassoc(A,B) = \frac{assoc(A,A)}{assoc(A,V)} + \frac{assoc(B,B)}{assoc(B,V)}$

Observations(I)

Maximizing Nassoc is the same as minimizing Ncut, since they are related:

$Ncut(A,B) = 2 - Nassoc(A,B)$

How do we minimize Ncut? Rewrite the Ncut equation in matrix form. After simplifying:

$\min_x Ncut(x) = \min_y \frac{y^T (D - W) y}{y^T D y}, \quad \text{subject to } y^T D \mathbf{1} = 0$

This is a Rayleigh quotient, where $D(i,i) = \sum_j W(i,j)$.

NP-Hard! y’s values are quantized.

Observations(II)

Instead, relax into the continuous domain by solving the generalized eigenvalue system:

$\min_y\; y^T (D - W) y \quad \text{subject to} \quad y^T D y = 1$

Which gives:

$(D - W) y = \lambda D y$

Note that $(D - W)\mathbf{1} = 0$, so the first eigenvector is $y_0 = \mathbf{1}$ with eigenvalue 0.

The second smallest eigenvector is the real-valued solution to this problem!
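A minimal dense sketch of this relaxation using SciPy's generalized symmetric eigensolver (toy scale only; W is assumed nonnegative with positive degrees so that D is positive definite):

```python
import numpy as np
from scipy.linalg import eigh   # dense generalized symmetric eigensolver

def second_smallest_generalized(W):
    """Solve (D - W) y = lambda * D y and return the eigenvector with the
    second smallest eigenvalue, i.e. the relaxed Ncut solution."""
    D = np.diag(W.sum(axis=1))
    vals, vecs = eigh(D - W, D)   # eigenvalues in ascending order
    return vecs[:, 1]             # skip y0 = 1 (eigenvalue 0)

# Toy example: three tightly connected nodes weakly linked to a fourth one.
W_toy = np.array([[0.00, 1.00, 1.00, 0.05],
                  [1.00, 0.00, 1.00, 0.05],
                  [1.00, 1.00, 0.00, 0.05],
                  [0.05, 0.05, 0.05, 0.00]])
y = second_smallest_generalized(W_toy)
print(np.sign(y))   # opposite signs separate {0, 1, 2} from {3}
```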

Algorithm

1. Define a similarity function between two nodes, e.g.:

$w_{i,j} = e^{-\frac{\|F(i) - F(j)\|^2}{\sigma_I^2}} \cdot e^{-\frac{\|X(i) - X(j)\|^2}{\sigma_X^2}}$

2. Compute the affinity matrix (W) and degree matrix (D).

3. Solve $(D - W) y = \lambda D y$.

4. Use the eigenvector with the second smallest eigenvalue to bipartition the graph.

5. Decide whether to re-partition the current partitions.

Note: since precision requirements are low, W is very sparse and only a few eigenvectors are required, the eigenvectors can be extracted very fast using the Lanczos algorithm.
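A hedged sparse sketch of steps 2–4 (one level of the recursive bipartition), using SciPy's Lanczos-type solver `eigsh` on the equivalent symmetric problem; it assumes a nonnegative affinity matrix with positive degrees and reuses `W_toy` from the previous sketch:

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh   # Lanczos-type solver for symmetric matrices

def ncut_bipartition(W, threshold=0.0):
    """Solve (D - W) y = lambda D y via the symmetric form
    D^{-1/2} (D - W) D^{-1/2} z = lambda z with y = D^{-1/2} z,
    then threshold the second-smallest eigenvector."""
    W = sp.csr_matrix(W)
    d = np.asarray(W.sum(axis=1)).ravel()
    d_inv_sqrt = sp.diags(1.0 / np.sqrt(d))
    L_sym = d_inv_sqrt @ (sp.diags(d) - W) @ d_inv_sqrt
    vals, vecs = eigsh(L_sym, k=2, which='SM')      # two smallest eigenpairs
    y = d_inv_sqrt @ vecs[:, np.argsort(vals)[1]]   # second smallest, mapped back to y
    return y > threshold                            # boolean bipartition mask

mask = ncut_bipartition(W_toy)
print(mask)   # e.g. [ True  True  True False ] (or its complement)
```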

Discretization

Sometimes there is not a clear threshold to binarize since eigenvectors take on continuous values.

How do we choose the splitting point?

a) Pick a constant value (0, or 0.5).
b) Pick the median value as the splitting point.
c) Look for the splitting point that has the minimum Ncut value (see the sketch below):
   1. Choose n possible splitting points.
   2. Compute the Ncut value for each.
   3. Pick the minimum.
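A sketch of option (c), reusing the `ncut_value` helper defined after the Normalized Cut slide; the number of candidate splitting points is a placeholder:

```python
import numpy as np

def best_split(W, y, n_candidates=20):
    """Try several thresholds on the second eigenvector y and keep the one
    whose bipartition has the minimum Ncut value."""
    candidates = np.linspace(y.min(), y.max(), n_candidates + 2)[1:-1]
    best_t, best_val = None, np.inf
    for t in candidates:
        mask = y > t
        if mask.all() or not mask.any():   # skip degenerate partitions
            continue
        val = ncut_value(W, mask)
        if val < best_val:
            best_t, best_val = t, val
    return best_t, best_val
```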

Use k-eigenvectors

Recursive 2-way Ncut is slow. We can use more eigenvectors to re-partition the graph, however:

Not all eigenvectors are useful for partitioning (degree of smoothness).

Procedure: compute k-means with a high k, then follow one of these procedures (see the sketch below):

a) Merge segments that minimize the k-way Ncut criterion.
b) Use the k segments and find the partitions there using exhaustive search.

Compute Q (next slides).
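A rough sketch of the k-eigenvector route (over-segmentation with k-means on the spectral embedding; this is an illustrative reading, not the paper's exact procedure, and the choice of k is a placeholder):

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh
from scipy.cluster.vq import kmeans2

def spectral_oversegment(W, k=8):
    """Embed each node by its first k eigenvectors of the normalized
    Laplacian and over-segment with k-means; the resulting segments can
    then be merged using a k-way Ncut criterion."""
    W = sp.csr_matrix(W)
    d = np.asarray(W.sum(axis=1)).ravel()
    d_inv_sqrt = sp.diags(1.0 / np.sqrt(d))
    L_sym = sp.eye(W.shape[0]) - d_inv_sqrt @ W @ d_inv_sqrt
    vals, vecs = eigsh(L_sym, k=k, which='SM')   # k smallest eigenpairs
    _, labels = kmeans2(vecs, k, minit='++', seed=0)
    return labels
```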


Toy examples

Images from Matthew Brand (TR-2002-42)

Example (I)

(Figure: eigenvectors and the resulting segments.)

Example (II)

* Slide from Khurram Hassan-Shafique CAP5415 Computer Vision 2003

(Figure: original images and the resulting segments.)

Outline

1. Introduction

2. Graph terminology and representation.

3. “Min cuts” and “Normalized cuts”.

4. Other segmentation methods using eigenvectors.

5. Conclusions.

Other methods

Average association: use the eigenvector of W associated with the biggest eigenvalue for partitioning. Tries to maximize:

$\frac{assoc(A,A)}{|A|} + \frac{assoc(B,B)}{|B|}$

Has a bias to find tight clusters. Useful for Gaussian distributions.

Other methods

Average cut: tries to minimize:

$\frac{cut(A,B)}{|A|} + \frac{cut(A,B)}{|B|}$

Very similar to normalized cuts, but we cannot ensure that the partitions will have tight within-group similarity, since this equation does not have the nice properties of the normalized-cut equation.

Other methods

(Figure: comparison on a toy example where 20 points are randomly distributed in [0.0, 0.5] and 12 points in [0.65, 1.0], showing the partitions found by Normalized cut, Average cut and Average association.)

Other methods

Scott and Longuet-Higgins (1990). V contains the first eigenvectors of W. Normalize V by rows. Compute $Q = V^T V$. Values close to 1 belong to the same cluster.

(Figure: Data, W, Q, and the first and second eigenvectors.)
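One common reading of this construction in NumPy (stacking the leading eigenvectors of W as columns of V, normalizing each row, and forming the n×n matrix of pairwise dot products; the helper name and k are placeholders, and W_toy comes from the earlier sketch):

```python
import numpy as np

def slh_grouping_matrix(W, k=2):
    """Scott & Longuet-Higgins style grouping matrix: each point is embedded
    by its coordinates in the k leading eigenvectors, rows are normalized to
    unit length, and Q[i, j] is close to 1 for points in the same cluster."""
    vals, vecs = np.linalg.eigh(W)
    V = vecs[:, np.argsort(vals)[::-1][:k]]               # n x k
    V_hat = V / np.linalg.norm(V, axis=1, keepdims=True)  # normalize rows
    return V_hat @ V_hat.T                                # n x n

Q = slh_grouping_matrix(W_toy, k=2)
```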

Other applications

Costeira and Kanade (1995). Used to segment points in motion. Compute $M = (X\; Y)$. The affinity matrix W is computed as $W = M^T M$; this trick computes the affinity of every pair of points as an inner product. Compute $Q = V^T V$. Values close to 1 belong to the same cluster.

(Figure: Data, M, Q.)

Other applications

Face clustering in meetings: grab faces from video in real time (using a face detector + face tracker).

Compare all faces using a distance metric (e.g., projection error onto a representative basis).

Use normalized cuts to find the best clustering.

Outline

1. Introduction

2. Graph terminology and representation.

3. “Min cuts” and “Normalized cuts”.

4. Other segmentation methods using eigenvectors.

5. Conclusions.

Conclusions

Good news: simple and powerful methods to segment images; flexible and easy to apply to other clustering problems.

Bad news: high memory requirements (use sparse matrices); very dependent on the scale factor $\sigma_X$ in $W(i,j) = e^{-\frac{\|X(i) - X(j)\|^2}{\sigma_X^2}}$ for a specific problem.

Thank you!

The End!

Examples

$w_{i,j} = e^{-\frac{\|X(i) - X(j)\|^2}{\sigma_X^2}}$

Spectral Clustering

Images from Matthew Brand (TR-2002-42)

Spectral clustering

Makes use of the spectrum of the similarity matrix of the data to cluster the points.

Solve clustering for the affinity matrix W, with w(i,j) the distance from node i to node j.

Graph terminology

Similarity matrix: $W = [w_{i,j}]$.  Degree of node: $d_i = \sum_j w_{i,j}$.

Volume of set: $vol(A) = \sum_{i \in A} d_i$.  Graph cuts: $cut(A, \bar{A}) = \sum_{i \in A,\, j \in \bar{A}} w_{i,j}$.
