
Self-Validated Labeling of MRFs for Image Segmentation

Wei Feng 1,2, Jiaya Jia 2, and Zhi-Qiang Liu 1

1. School of Creative Media, City University of Hong Kong
2. Dept. of CSE, The Chinese University of Hong Kong

Accepted by IEEE TPAMI

Outline

Motivation
Graph formulation of MRF labeling
Graduated graph cuts
Experimental results
Conclusion


Self-Validated Labeling

Common problem: segmentation, stereo, etc.

Self-validated labeling has two parts:
Labeling quality: accuracy (i.e., likelihood) and spatial coherence
Labeling cost (i.e., the number of labels)

Bayesian framework: to minimize the Gibbs energy (equivalent form of MAP)

$E = E_{\text{likelihood}} + E_{\text{coherence}}$
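For reference, a standard pairwise form of such a Gibbs energy (the paper's exact potentials and weights may differ) is

$$E(f) = \sum_{p \in \mathcal{P}} D_p(f_p) \;+\; \lambda \sum_{(p,q) \in \mathcal{N}} V_{pq}(f_p, f_q),$$

where the first sum is the likelihood (data) term and the second is the spatial coherence (smoothness) term over neighboring sites.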

Motivation

Computational complexity remains a major weakness of the MRF/MAP scheme

Robustness to noise
Preservation of soft boundaries
Insensitivity to initialization

Motivation

Self-validation: how to determine the number of clusters?
Needed when segmenting a large number of images
Global optimization based methods are robust, but most are not self-validated
Split-and-merge methods are self-validated, but vulnerable to noise

Motivation

For a noisy image consisting of 5 segments

Let's see the performance of the state-of-the-art methods

Motivation

Normalized cut (NCut) [1]

Not self-validated (i.e., the user needs to indicate the number of segments; bad)
Robust to noise (good)
Average time: 11.38s (fast, good)

NCut cannot return a satisfying result when fed the right number of segments (5); it produces all the "right" boundaries, mixed with many "wrong" boundaries, only when fed a much larger number of segments (20).

[1] J. Shi and J. Malik, “Normalized cuts and image segmentation”, PAMI 2000.

Motivation

Bottom-up methods, e.g., Mean shift [2] and GBS [3]

Self-validated (good)
Very fast (< 1s, good)
But sensitive to noise (bad)

[2] D. Comaniciu and P. Meer, "Mean shift: A robust approach toward feature space analysis", PAMI 2002.
[3] P. F. Felzenszwalb and D. P. Huttenlocher, "Efficient graph-based image segmentation", IJCV 2004.

Motivation

Data-driven MCMC [4]

Self-validated (good)
Robust to noise (good)
But very slow (bad)

[4] Z. Tu and S.-C. Zhu, “Image segmentation by data-driven Markov chain Monte Carlo”, PAMI 2002.

Motivation

As a result, we need a self-validated segmentation method, which is fast and robust to noise.

Our method: graduated graph mincut
Tree-structured graph cuts (TSGC)
Net-structured graph cuts (NSGC)
Hierarchical graph cuts (HGC)

Method   Time    #Seg
TSGC     2.96s   5
NSGC     5.7s    5
HGC      2.01s   6

Motivation

(Figure: the tree-structured MRF model (TS-MRF) [5])

[5] C. D'Elia, G. Poggi, and G. Scarpa, "A tree-structured Markov random field model for Bayesian image segmentation," IEEE Trans. Image Processing, vol. 12, no. 10, pp. 1250-1264, 2003.

Outline

Motivation
Graph formulation of MRF labeling
Graduated graph cuts
Experimental results
Conclusion

Graph Formulation of MRFs

Graph formulation of MRFs (with the second-order neighborhood system N2): (a) graph G = <V, E> with K segments {L1, L2, ..., LK} and the observation Y; (b) the final labeling corresponds to a multiway cut of the graph G.

Graph Formulation of MRFs

Property: the Gibbs energy of a segmentation Seg(I) can be defined so that it equals the cost of the corresponding multiway cut of G

MRF-based segmentation ↔ multiway (K-way) graph mincut problem (NP-complete in general; exactly solvable for K = 2)
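To make the solvable K = 2 case concrete, here is a minimal sketch (not the authors' implementation) of binary MRF labeling solved exactly by an s-t mincut on a 4-connected grid; the squared-difference data terms, the Potts weight `lam`, and the use of networkx are illustrative assumptions.

```python
# Minimal sketch: exact binary (K = 2) MRF labeling via an s-t mincut.
import networkx as nx
import numpy as np

def binary_mrf_cut(image, mu0, mu1, lam):
    """Minimize sum_p (I_p - mu_{f_p})^2 + lam * sum_{p~q} [f_p != f_q]."""
    h, w = image.shape
    G = nx.DiGraph()
    s, t = 's', 't'
    for y in range(h):
        for x in range(w):
            p = (y, x)
            d0 = float((image[y, x] - mu0) ** 2)  # cost of assigning label 0
            d1 = float((image[y, x] - mu1) ** 2)  # cost of assigning label 1
            G.add_edge(s, p, capacity=d1)         # cut iff p ends up with label 1
            G.add_edge(p, t, capacity=d0)         # cut iff p ends up with label 0
            for q in ((y + 1, x), (y, x + 1)):    # 4-neighborhood coherence edges
                if q[0] < h and q[1] < w:
                    G.add_edge(p, q, capacity=lam)
                    G.add_edge(q, p, capacity=lam)
    cut_value, (source_side, _) = nx.minimum_cut(G, s, t)
    labels = np.array([[0 if (y, x) in source_side else 1
                        for x in range(w)] for y in range(h)])
    return labels, cut_value

# Toy usage: a noisy two-region strip.
img = np.array([[0.1, 0.2, 0.9, 0.8],
                [0.0, 0.3, 1.0, 0.7]])
print(binary_mrf_cut(img, mu0=0.1, mu1=0.9, lam=0.5))
```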

Outline

Motivation
Graph formulation of MRF labeling
Graduated graph cuts
Experimental results
Conclusion

Graduated Graph Mincut

Main idea: gradually adjust the optimal labeling according to the Gibbs energy minimization principle

A vertical extension of binary graph mincut (in contrast to the horizontal extensions, α-expansion and α-β swap)
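A schematic of this graduated, top-down splitting idea follows; it is a sketch, not the paper's exact TSGC/NSGC procedure, and `binary_split` and `gibbs_energy` are assumed helpers (e.g., the s-t mincut above and the Gibbs energy defined earlier).

```python
# A split is kept only if it lowers the total Gibbs energy, so the number of
# segments K is decided automatically (self-validation).
def graduated_segmentation(sites, binary_split, gibbs_energy):
    segments = [list(sites)]              # start from a single segment
    changed = True
    while changed:                        # keep splitting while it lowers the energy
        changed = False
        for seg in list(segments):
            part_a, part_b = binary_split(seg)            # K = 2 graph mincut
            if not part_a or not part_b:
                continue                                  # degenerate split, skip
            candidate = [s for s in segments if s is not seg] + [part_a, part_b]
            if gibbs_energy(candidate) < gibbs_energy(segments):
                segments = candidate                      # accept the split
                changed = True
    return segments                       # K - 1 accepted binary splits give K segments
```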

Graduated Graph Mincut

Binary Labeling of MRFs

Tree-structured Graph Cuts

(Figure: over-segmentation)

Net-structured Graph Cuts

Hierarchical Graph Cuts

Graduated Graph Cuts

Summary

An effective tool for self-validated labeling problems in low-level vision
An efficient energy minimization scheme via graph cuts
Converts the K-class clustering into a sequence of K−1 much simpler binary clusterings
Independent of initialization
Very close to the good local minima obtained by α-expansion and α-β swap

Segmentation Evolution

(Figure: panels for Iter #1, Iter #2, Iter #3, Iter #4, and the mean image)

Outline

Motivation
Graph formulation of MRF labeling
Graduated graph cuts
Experimental results
Conclusion

Comparative Results

Comparative Experiments

Robustness to Noise

Robust to noise

Preservation of Soft Boundary

Consistency to Ground Truth

Coarse-to-Fine Segmentation

Performance Summary

Outline

Motivation
Graph formulation of MRF labeling
Graduated graph cuts
Experimental results
Conclusion

Conclusion

An efficient self-validated labeling method that comes very close to good local minima and guarantees a stepwise global optimum

Provides a vertical extension of binary graph cut that is independent of initialization

Readily applicable to a wide range of clustering problems in low-level vision

Thanks!
