Page 1: Non-Parametric Bayesian Dictionary Learning for Sparse

1

Non-Parametric Bayesian Dictionary Learning for Sparse Image Representations

Mingyuan Zhou, Haojun Chen, John Paisley, Lu Ren, 1Guillermo Sapiro and Lawrence Carin

Department of Electrical and Computer Engineering, Duke University, Durham, NC, USA

1Department of Electrical and Computer Engineering, University of Minnesota, Minneapolis, MN, USA

NIPS, December 2009

Zhou 2009

Page 2: Non-Parametric Bayesian Dictionary Learning for Sparse

2

Outline

• Introduction

• Dictionary learning

• Model and inference

• Image denoising

• Image inpainting

• Compressive sensing

• Conclusions


Page 3: Non-Parametric Bayesian Dictionary Learning for Sparse

3

Introduction

• Sparse representations: simple models, interpretable dictionary elements and sparse coefficients.

• Applications: image denoising, inpainting, and compressive sensing.

• “Off-the-shelf” bases/dictionaries.

• An over-complete dictionary matched to the signals of interest may improve performance.


Page 4: Non-Parametric Bayesian Dictionary Learning for Sparse

4

Introduction: sparse coding

• Objective function: given the dictionary D and the signal x, find the sparsest coefficient vector, i.e., minimize ||α||_0 subject to ||x − Dα||_2 ≤ ε.

• Exact solution: an NP-hard problem.
• Approximate solutions:
  Greedy algorithms (OMP)
  Convex relaxation approaches (LARS, Lasso, BCS)

• Sparse representation under an appropriate dictionary enables data recovery.
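As a concrete illustration of the greedy route, here is a minimal orthogonal matching pursuit (OMP) sketch in plain NumPy; it is not the authors' code, and the toy dictionary and signal are made up for the example.

```python
import numpy as np

def omp(D, x, k, tol=1e-10):
    """Greedy OMP: approximately minimize ||alpha||_0 s.t. x ~ D @ alpha,
    using at most k nonzero coefficients.
    D: (P, K) dictionary with unit-norm columns; x: (P,) signal."""
    residual = x.copy()
    support = []
    coef = np.zeros(0)
    alpha = np.zeros(D.shape[1])
    for _ in range(k):
        # select the atom most correlated with the current residual
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in support:
            support.append(j)
        # refit the coefficients on the selected atoms by least squares
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        residual = x - D[:, support] @ coef
        if np.linalg.norm(residual) < tol:
            break
    alpha[support] = coef
    return alpha

# toy usage: recover a 3-sparse vector under a random unit-norm dictionary
rng = np.random.default_rng(0)
D = rng.standard_normal((64, 256))
D /= np.linalg.norm(D, axis=0)
alpha_true = np.zeros(256)
alpha_true[[3, 50, 200]] = [1.0, -2.0, 0.5]
x = D @ alpha_true
alpha_hat = omp(D, x, k=3)
print(np.allclose(D @ alpha_hat, x, atol=1e-6))
```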


Page 5: Non-Parametric Bayesian Dictionary Learning for Sparse

5

Introduction: dictionary learning

• “Off-the-shelf” bases/dictionaries: DFT, DCT, wavelets

Simple, fast computation

• Dictionaries adapted to the data under test: Improved performance

Better interpretation


Page 6: Non-Parametric Bayesian Dictionary Learning for Sparse

6

Dictionary Learning: General Approach

• Global objective function: minimize the reconstruction error over both the dictionary and the sparse codes, subject to a sparsity constraint on each code (see the formulas below).

• Sparse coding stage (fix the dictionary): solve for the sparse codes.

• Dictionary update stage
  Method of optimal directions, MOD (fix the sparse codes)
  K-SVD (fix the sparsity pattern, rank-1 approximation)
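As a reference for the equations that did not survive the transcript, the standard objective and the MOD least-squares update can be written as follows; this is the generic formulation, with notation of my choosing (X holds the signals as columns, A the sparse codes).

```latex
% Joint dictionary-learning objective over D (P x K) and codes A (K x N)
\[
\min_{\mathbf{D},\mathbf{A}} \; \|\mathbf{X} - \mathbf{D}\mathbf{A}\|_F^2
\quad \text{s.t.} \quad \|\boldsymbol{\alpha}_i\|_0 \le T, \quad i = 1,\dots,N.
\]
% MOD dictionary update with the sparse codes A held fixed (least squares)
\[
\mathbf{D} \leftarrow \mathbf{X}\mathbf{A}^{\top}\!\left(\mathbf{A}\mathbf{A}^{\top}\right)^{-1}.
\]
```

K-SVD instead revisits one atom at a time, updating the atom and its active coefficients from a rank-1 SVD of the corresponding residual while keeping the sparsity pattern fixed.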


Page 7: Non-Parametric Bayesian Dictionary Learning for Sparse

7

Dictionary Learning: Restrictions and solutions

• Restrictions of previous dictionary learning approaches:
  The noise variance or sparsity level is assumed to be known.
  The size of the dictionary needs to be set a priori.
  Only point estimates are provided.

• How to relax these restrictions?
  Introduce a non-parametric Bayesian dictionary learning approach.
  Use sparsity-promoting priors instead of enforcing the sparsity level/noise variance.
  Preset a large dictionary size and let the data infer an appropriate dictionary size.


Page 8: Non-Parametric Bayesian Dictionary Learning for Sparse

8

Dictionary Learning with Beta Process Priors

• Representation (naive form)

• Beta process formulation

• Binary weights

• Representation (with pseudo weights)
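The equations for these four items are not reproduced in the transcript. A beta-process factor-analysis (BPFA) construction of the kind used in this work reads roughly as below; the hyperparameter names (a, b, and the Gamma parameters) are my own labels.

```latex
% Representation: each sample is a sparse combination of dictionary atoms plus noise
\[
\mathbf{x}_i = \mathbf{D}\boldsymbol{\alpha}_i + \boldsymbol{\epsilon}_i,
\qquad
\boldsymbol{\alpha}_i = \mathbf{z}_i \circ \mathbf{s}_i
\quad (\text{binary weights } \mathbf{z}_i \text{ times pseudo weights } \mathbf{s}_i).
\]
% Priors on atoms, weights, and noise (finite approximation with K atoms)
\[
\mathbf{d}_k \sim \mathcal{N}\!\left(\mathbf{0}, P^{-1}\mathbf{I}_P\right),
\qquad
s_{ik} \sim \mathcal{N}\!\left(0, \gamma_s^{-1}\right),
\qquad
\boldsymbol{\epsilon}_i \sim \mathcal{N}\!\left(\mathbf{0}, \gamma_\epsilon^{-1}\mathbf{I}_P\right),
\]
\[
z_{ik} \sim \mathrm{Bernoulli}(\pi_k),
\qquad
\pi_k \sim \mathrm{Beta}\!\left(\tfrac{a}{K}, \tfrac{b(K-1)}{K}\right),
\qquad
\gamma_s, \gamma_\epsilon \sim \mathrm{Gamma}(\cdot,\cdot).
\]
```

As K grows large this finite construction approaches a beta process, so most π_k shrink toward zero and the data effectively select how many of the K atoms are actually used.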


Page 9: Non-Parametric Bayesian Dictionary Learning for Sparse

9

Hierarchical model

• Data are fully observed

• Data are partially observed
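The distinction matters only through the likelihood: with partially observed data, only the measured pixels of each sample enter the Gaussian likelihood. A hedged sketch, with Σ_i denoting a selection operator for the observed entries of sample i (notation mine):

```latex
% Fully observed sample: every pixel contributes
\[
\mathbf{x}_i \mid \mathbf{D}, \boldsymbol{\alpha}_i, \gamma_\epsilon
  \sim \mathcal{N}\!\left(\mathbf{D}\boldsymbol{\alpha}_i,\; \gamma_\epsilon^{-1}\mathbf{I}_P\right).
\]
% Partially observed sample: only the pixels selected by \Sigma_i contribute
\[
\boldsymbol{\Sigma}_i\mathbf{x}_i \mid \mathbf{D}, \boldsymbol{\alpha}_i, \gamma_\epsilon
  \sim \mathcal{N}\!\left(\boldsymbol{\Sigma}_i\mathbf{D}\boldsymbol{\alpha}_i,\; \gamma_\epsilon^{-1}\mathbf{I}\right).
\]
```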


Page 10: Non-Parametric Bayesian Dictionary Learning for Sparse

10

Hierarchical model

• Full likelihood

• Gibbs sampling inference
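Because every prior in the hierarchy is conjugate to the Gaussian likelihood, each Gibbs update has a closed form. Two representative conditionals, written under the finite beta-Bernoulli approximation above and an assumed Gamma(e, f) prior on the noise precision (standard conjugacy results, not transcribed from the slide):

```latex
% Atom-usage probability: beta-Bernoulli conjugacy
\[
\pi_k \mid \mathbf{Z} \sim \mathrm{Beta}\!\left(
  \tfrac{a}{K} + \sum_{i=1}^{N} z_{ik},\;
  \tfrac{b(K-1)}{K} + N - \sum_{i=1}^{N} z_{ik}\right).
\]
% Noise precision (fully observed data): Gamma-Gaussian conjugacy
\[
\gamma_\epsilon \mid - \sim \mathrm{Gamma}\!\left(
  e + \tfrac{NP}{2},\;
  f + \tfrac{1}{2}\sum_{i=1}^{N}\|\mathbf{x}_i - \mathbf{D}\boldsymbol{\alpha}_i\|_2^2\right).
\]
```

For partially observed data, NP/2 is replaced by half the number of observed pixels and the residual sum runs over observed pixels only.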


Page 11: Non-Parametric Bayesian Dictionary Learning for Sparse

11

Model comparison

• MOD: two stages, dictionary learning and sparse coding.

• K-SVD: two stages, dictionary learning (enforced sparsity pattern) and sparse coding.

• Dictionary learning with beta process priors: three stages, dictionary learning (enforced sparsity pattern), sparsity-pattern update, and pseudo-weights update.

• The three models differ markedly in how much previously obtained information they exploit.


Page 12: Non-Parametric Bayesian Dictionary Learning for Sparse

12

Sequential learning for large data sets

• Partition the whole data set into blocks and process them sequentially.

  Instead of directly computing the posterior of the dictionary given all of the data at once, we first compute its posterior given the first block.

  That posterior is then used as the prior on D when processing the next block, and so on (see the update below).
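The scheme is Bayes' rule applied block by block; with the data split into blocks X^(1), ..., X^(T) (block notation mine):

```latex
% First block: ordinary posterior update from the prior on D
\[
p\!\left(\mathbf{D} \mid \mathbf{X}^{(1)}\right) \propto
  p\!\left(\mathbf{X}^{(1)} \mid \mathbf{D}\right)\, p(\mathbf{D}).
\]
% Subsequent blocks: the previous posterior becomes the new prior
\[
p\!\left(\mathbf{D} \mid \mathbf{X}^{(1)},\dots,\mathbf{X}^{(t)}\right) \propto
  p\!\left(\mathbf{X}^{(t)} \mid \mathbf{D}\right)\,
  p\!\left(\mathbf{D} \mid \mathbf{X}^{(1)},\dots,\mathbf{X}^{(t-1)}\right),
\qquad t = 2,\dots,T.
\]
```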


Page 13: Non-Parametric Bayesian Dictionary Learning for Sparse

13

Non-Parametric Bayesian Dictionary Learning

• The noise variance/sparsity level need not be known.
• The dictionary size is automatically inferred.
• Training data are not required.
• The average sparsity level of the representation is inferred from the data itself, and based on the posterior, each sample has its own unique sparse representation.
• A single model is applicable to gray-scale, RGB, and hyperspectral image denoising & inpainting.


Page 14: Non-Parametric Bayesian Dictionary Learning for Sparse

14

Image denoising

[Figure: noisy image; K-SVD denoising with mismatched noise variance; K-SVD denoising with matched noise variance; BPFA denoising; learned dictionaries]


Page 15: Non-Parametric Bayesian Dictionary Learning for Sparse

15

Image inpainting

[Figure (80% pixels missing): original image, corrupted image, restored image, and learned dictionary]

[Figure (50% pixels missing): original image, corrupted image, restored image, and learned dictionary]
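To make the per-patch mechanics of inpainting concrete, here is a toy sketch that fills in missing pixels of one patch using a fixed dictionary; the actual method infers the dictionary, sparsity pattern, and weights jointly by Gibbs sampling, so the greedy solver and all names below are illustrative only.

```python
import numpy as np

def inpaint_patch(D, x_obs, mask, k):
    """Estimate a complete patch from its observed pixels only.
    D: (P, K) dictionary; mask: boolean (P,), True where a pixel was observed;
    x_obs: values of the observed pixels; k: number of atoms to use."""
    Dm = D[mask]                      # dictionary rows at observed pixels
    residual = x_obs.copy()
    support = []
    coef = np.zeros(0)
    for _ in range(k):
        # greedy atom selection against the observed-pixel residual
        j = int(np.argmax(np.abs(Dm.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(Dm[:, support], x_obs, rcond=None)
        residual = x_obs - Dm[:, support] @ coef
    alpha = np.zeros(D.shape[1])
    alpha[support] = coef
    return D @ alpha                  # reconstruct all pixels, observed and missing

# toy usage: an 8x8 patch (64 pixels) with roughly 80% of its pixels missing
rng = np.random.default_rng(1)
D = rng.standard_normal((64, 128))
D /= np.linalg.norm(D, axis=0)
alpha_true = np.zeros(128)
alpha_true[[5, 40]] = [2.0, -1.0]
x_full = D @ alpha_true
mask = rng.random(64) > 0.8
x_hat = inpaint_patch(D, x_full[mask], mask, k=2)
print(np.linalg.norm(x_hat - x_full) / np.linalg.norm(x_full))
```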


Page 16: Non-Parametric Bayesian Dictionary Learning for Sparse

16

RGB image inpainting

• 480×321 RGB image, 80% missing

[Figure: original image, corrupted image, restored image, and learned dictionary; plot of PSNR vs. learning round]


Page 17: Non-Parametric Bayesian Dictionary Learning for Sparse

17

RGB image inpainting

[Figure: original image, corrupted image, restored image, and learned dictionary]


Page 18: Non-Parametric Bayesian Dictionary Learning for Sparse

18

Hyperspectral image inpainting

• 150×150×210 hyperspectral urban image, 95% missing

[Figure: spectral bands 1 and 100; original image, corrupted image (10.9475 dB), and restored image (25.8839 dB)]


Page 19: Non-Parametric Bayesian Dictionary Learning for Sparse

19

Hyperspectral image inpainting

• 845×512×106 hyperspectral image, 98% missing

[Figure: spectral bands 50 and 90; original and restored images]

Page 20: Non-Parametric Bayesian Dictionary Learning for Sparse

20

Compressive sensing

• Image size: 480 by 320; 2,400 8-by-8 patches; 153,600 coefficients are estimated.

[Figure: relative reconstruction error vs. number of measurements (3 to 7.5 × 10^4), with a DCT dictionary and a learned dictionary; methods compared include PSBP BP, DP BP, BP, Online BP, BCS, Fast BCS, Basis Pursuit, LARS/Lasso, OMP, and STOMP-CFAR]
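For context, compressive-sensing recovery with a dictionary follows the usual two-step pattern; this is the generic formulation, not transcribed from the slide.

```latex
% Random projections y of a patch x that is sparse under the dictionary D
\[
\mathbf{y} = \boldsymbol{\Phi}\mathbf{x}, \qquad \mathbf{x} \approx \mathbf{D}\boldsymbol{\alpha}
\;\;\Rightarrow\;\; \mathbf{y} \approx \boldsymbol{\Phi}\mathbf{D}\boldsymbol{\alpha}.
\]
% Recover a sparse code under the effective dictionary \Phi D, then reconstruct the patch
\[
\hat{\boldsymbol{\alpha}} = \arg\min_{\boldsymbol{\alpha}} \|\boldsymbol{\alpha}\|_0
\;\; \text{s.t.} \;\; \|\mathbf{y} - \boldsymbol{\Phi}\mathbf{D}\boldsymbol{\alpha}\|_2 \le \epsilon,
\qquad
\hat{\mathbf{x}} = \mathbf{D}\hat{\boldsymbol{\alpha}}.
\]
```

The comparison in the figure then amounts to swapping D: a fixed DCT dictionary versus the dictionary learned under the beta-process prior.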


Page 21: Non-Parametric Bayesian Dictionary Learning for Sparse

21

Conclusions

• Non-parametric Bayesian dictionary learning.
• Gray-scale, RGB, and hyperspectral image denoising, inpainting, and compressive sensing.
• Automatically inferred dictionary size, noise variance, and sparsity level.
• Dictionary learning and data reconstruction on the data under test.
• A generative approach for data recovery from redundant, noisy, and incomplete observations.


Page 22: Non-Parametric Bayesian Dictionary Learning for Sparse

22

References
