From Learning Models of Natural Image Patches to Whole Image Restoration
Daniel Zoran, Interdisciplinary Center for Neural Computation, Hebrew University of Jerusalem
Yair Weiss, School of Computer Science and Engineering, Hebrew University of Jerusalem
Presented by Eric Wang, Duke University, 6/18/2012
Page 1: From Learning Models of Natural Image Patches to Whole Image Restoration

From Learning Models of Natural Image Patches to Whole Image Restoration

Daniel ZoranInterdisciplinary Center for Neural Computation

Hebrew University of Jerusalem

Yair WeissSchool of Computer Science and Engineering

Hebrew University of Jerusalem

Presented by Eric WangDuke University

6/18/2012

Page 2: Introduction

Introduction

• Patch-based learning on images has significant computational advantages over learning dictionaries over the entire image.

• A primary concern of this paper is the effect the choice of dictionary prior has on the performance of the model.

• This paper addresses 3 main questions:
– (1) Do priors that give high likelihoods yield better patch restoration performance?
– (2) Do priors that give high likelihoods yield better whole-image restoration performance?
– (3) Can we learn better priors?

Page 3: Motivation and Patch Restoration

Motivation and Patch Restoration

• Answer to question (1): Priors with higher likelihoods yield improved per-patch denoising performance, as shown with several popular priors.

PSNR of restoration on patches vs. dictionary (prior) likelihoods. Trained on 50,000 8x8 patches of natural images of faces. Tested on unseen patches.

Page 4: From Patches to Whole Image Restoration

From Patches to Whole Image Restoration

• Good patch-based restoration does not guarantee high-quality whole-image restoration; many reconstruction methods can generate significant artifacts.

Page 5: Expected Patch Log-Likelihood

Expected Patch Log-Likelihood

• Choosing random patches can help minimize artifacts, but most random patches will have low likelihood under a given dictionary.

• This paper presents an optimization algorithm that maximizes expected patch log-likelihood (EPLL) with the constraint that the reconstructed image be close to the corrupted observation.

• The EPLL of image x under prior p is defined as

EPLL_p(x) = Σ_i log p(P_i x)

• where P_i is a matrix (mask) that extracts the ith patch from image x.
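The definition above can be sketched directly in code. A minimal numpy sketch, assuming a grayscale image and an isotropic zero-mean Gaussian as a stand-in for any learned prior p (the function names `extract_patches`, `epll`, and `gaussian_log_p` are illustrative, not from the paper):

```python
import numpy as np

def extract_patches(x, p=8):
    """Extract all overlapping p x p patches of image x (the P_i x terms)."""
    H, W = x.shape
    return np.stack([x[i:i + p, j:j + p].ravel()
                     for i in range(H - p + 1)
                     for j in range(W - p + 1)])

def epll(x, log_p, p=8):
    """EPLL_p(x) = sum_i log p(P_i x), summed over all overlapping patches."""
    return sum(log_p(z) for z in extract_patches(x, p))

def gaussian_log_p(z, sigma=1.0):
    """Toy prior: zero-mean isotropic Gaussian over patch pixels."""
    d = z.size
    return -0.5 * (z @ z) / sigma**2 - 0.5 * d * np.log(2 * np.pi * sigma**2)

# Demo on a random 16x16 image (81 overlapping 8x8 patches).
rng = np.random.default_rng(0)
score = epll(rng.standard_normal((16, 16)), gaussian_log_p)
```

Note the sum runs over *all* overlapping patches, which is what distinguishes EPLL from likelihoods over disjoint patchifications.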

Page 6: Cost Function

Cost Function

• Let y = Ax + n be the corruption model on the image, where x is a vectorized image, A defines the corruption model, n is noise, and y is the vectorized noisy observation.

• The cost function to minimize is then

f_p(x | y) = (λ/2) ||Ax − y||² − EPLL_p(x)

• Direct optimization of this cost function is intractable, so an alternative method called half-quadratic splitting is used, where {z_i} are a set of per-patch auxiliary variables:

c_{p,β}(x, {z_i} | y) = (λ/2) ||Ax − y||² + Σ_i [ (β/2) ||P_i x − z_i||² − log p(z_i) ]
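The split cost above can be evaluated directly. A minimal numpy sketch, assuming a square grayscale image and an arbitrary log-prior function `log_p` (the name `split_cost` is illustrative):

```python
import numpy as np

def split_cost(x, Z, y, A, log_p, lam, beta, p=8):
    """Half-quadratic splitting cost
        c = (lam/2)||Ax - y||^2 + sum_i [(beta/2)||P_i x - z_i||^2 - log p(z_i)]
    x and y are vectorized images; row i of Z is the auxiliary patch z_i."""
    H = int(round(np.sqrt(x.size)))          # assume a square image for this sketch
    img = x.reshape(H, H)
    patches = np.stack([img[i:i + p, j:j + p].ravel()   # the P_i x terms
                        for i in range(H - p + 1)
                        for j in range(H - p + 1)])
    data_term = 0.5 * lam * np.sum((A @ x - y) ** 2)
    patch_terms = sum(0.5 * beta * np.sum((v - z) ** 2) - log_p(z)
                      for v, z in zip(patches, Z))
    return data_term + patch_terms
```

As β grows, the quadratic coupling pulls each z_i toward P_i x, so the split cost approaches the original (λ/2)||Ax − y||² − EPLL_p(x).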

Page 7: The Corruption Model

The Corruption Model

• The choice of the matrix A is determined by the application.

• For denoising, A is the identity matrix, and λ is the noise precision 1/σ².

• For deblurring, A is a convolution matrix with a known kernel.

• For inpainting, A is a diagonal matrix with zeros at the missing elements.
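The three choices of A can be built explicitly for small images. A minimal numpy sketch (dense matrices for clarity; the helper names are illustrative, and real implementations apply these operators without forming A):

```python
import numpy as np

def denoising_A(n):
    """Denoising: A is the identity (y = x + noise)."""
    return np.eye(n)

def inpainting_A(mask):
    """Inpainting: diagonal matrix with zeros at the missing pixels.
    `mask` is a boolean image, True where the pixel is observed."""
    return np.diag(mask.ravel().astype(float))

def deblurring_A(kernel, H, W):
    """Deblurring: each row of A applies the blur kernel around one pixel,
    so A @ x filters the vectorized H x W image x (zero boundary)."""
    kh, kw = kernel.shape
    A = np.zeros((H * W, H * W))
    for i in range(H):
        for j in range(W):
            for di in range(kh):
                for dj in range(kw):
                    ii, jj = i + di - kh // 2, j + dj - kw // 2
                    if 0 <= ii < H and 0 <= jj < W:
                        A[i * W + j, ii * W + jj] += kernel[di, dj]
    return A
```

A sanity check on the construction: a delta kernel should reduce deblurring to the identity, i.e. no corruption at all.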

Page 8: Optimization

Optimization

• The EPLL optimization alternates between two steps: (1) solving for x given the auxiliary patches {z_i}, which is a simple least-squares problem.

• And (2) solving for {z_i} given x, which is dependent on the dictionary and involves solving a MAP estimate of the most likely patch under the prior for each patch.

• β is either set by hand or set to 1/σ², where σ is the estimated noise standard deviation in the current estimate.
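The two alternating steps can be sketched end-to-end for denoising (A = identity). This is a minimal numpy sketch assuming a *single* zero-mean Gaussian patch prior N(0, C), for which the MAP step (2) reduces to a per-patch Wiener filter and step (1) has the closed form x = (λy + β Σ_i P_iᵀ z_i) / (λ + β Σ_i P_iᵀ P_i); the function names, the β schedule values, and the single-Gaussian prior are illustrative choices, not the paper's exact settings:

```python
import numpy as np

def patch_ops(H, W, p):
    """Pixel indices covered by each overlapping p x p patch (the P_i's)."""
    idx = np.arange(H * W).reshape(H, W)
    return np.stack([idx[i:i + p, j:j + p].ravel()
                     for i in range(H - p + 1)
                     for j in range(W - p + 1)])

def epll_denoise(y, sigma, C, p=8, betas=(1, 4, 8, 16, 32)):
    """Alternating EPLL denoising under a zero-mean Gaussian patch prior."""
    H, W = y.shape
    ops = patch_ops(H, W, p)
    lam = 1.0 / sigma**2
    counts = np.bincount(ops.ravel(), minlength=H * W)  # diag of sum_i P_i^T P_i
    x = y.ravel().copy()
    I = np.eye(p * p)
    for beta in betas:                                  # increasing beta schedule
        b = beta / sigma**2
        # Step (2): MAP patch estimates, here a Wiener filter
        #   z_i = C (C + I/b)^-1 P_i x
        Wf = C @ np.linalg.inv(C + I / b)
        Z = x[ops] @ Wf.T
        # Step (1): closed-form least squares for x (A = identity)
        acc = np.zeros(H * W)
        np.add.at(acc, ops.ravel(), Z.ravel())          # sum_i P_i^T z_i
        x = (lam * y.ravel() + b * acc) / (lam + b * counts)
    return x.reshape(H, W)
```

With a GMM prior, step (2) would instead pick the component with highest conditional weight for each patch and apply that component's Wiener filter.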

Page 9: Image Restoration

Image Restoration

• Answer to question (2): It is shown that priors with higher patch likelihoods yield improved whole-image restoration.

Page 10: Building a better dictionary via the GMM

Building a better dictionary via the GMM

• Zero-mean Gaussian data can usually be well represented by the top-m eigenvectors of its covariance matrix.

• This paper proposes clustering the patches via a GMM. Patches sharing a mixture component also share a dictionary (the eigenvectors of that component's covariance).

Sample dictionaries for six mixture components
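The "patches in a cluster share a dictionary" idea can be made concrete: under a zero-mean Gaussian assumption, each mixture component's dictionary is the top-m eigenvectors of its covariance. A minimal numpy sketch, where hard cluster labels stand in for a fitted GMM's assignments and `cluster_dictionaries` / `m` are illustrative names:

```python
import numpy as np

def cluster_dictionaries(patches, labels, m=8):
    """Per-cluster dictionaries as the top-m eigenvectors of each cluster's
    covariance matrix; all patches with the same label share a dictionary."""
    dicts = {}
    for k in np.unique(labels):
        Xk = patches[labels == k]
        C = np.cov(Xk.T)                      # cluster covariance (d x d)
        evals, evecs = np.linalg.eigh(C)      # ascending eigenvalue order
        dicts[k] = evecs[:, ::-1][:, :m]      # keep the top-m eigenvectors
    return dicts
```

Since `eigh` returns orthonormal eigenvectors, each dictionary is an orthonormal basis for the m directions of highest variance in its cluster.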

Page 11: Building a better dictionary via the GMM

Building a better dictionary via the GMM

• Answer to question (3): The GMM prior outperforms the other priors in both patch and whole-image restoration.

PSNR values, 200 GMM components

Page 12: Building a better dictionary via the GMM

Building a better dictionary via the GMM

• The GMM prior is also shown to outperform the ICA prior in whole-image reconstruction across noise levels using EPLL.

Page 13: Image Denoising

Image Denoising

• 68 images from the Berkeley dataset, 8x8 patches; comparison is in PSNR.

Page 14: Image Deblurring

Image Deblurring

• 68 images from the Berkeley dataset with known blur kernels and 1% white Gaussian noise (comparison is in PSNR).