Transcript
British Machine Vision Conference (BMVC) 2012, September 6, 2012
LOW-COMPLEXITY SINGLE-IMAGE SUPER-RESOLUTION BASED ON NONNEGATIVE NEIGHBOR EMBEDDING
Marco Bevilacqua (1,2), Aline Roumy (1), Christine Guillemot (1), Marie-Line Alberi Morel (2)
(1) Inria Rennes - Bretagne Atlantique  (2) Alcatel-Lucent - Bell Labs France
Overview
- Single-image Super-Resolution
  - What is it?
  - Methods used
  - Neighbor embedding SR
- Proposed algorithm
  - [KP1] Feature representation
  - [KP2] Neighbor embedding method
  - [KP3] Choice of the dictionary
- Results
- Conclusions
Single-image Super-Resolution
What is it? Given a low-resolution (LR) input image, the aim is to produce an enhanced, upscaled high-resolution (HR) image.
Our target
- Design a low-complexity yet efficient single-image SR algorithm
Methods
Single-image SR
- Inverse problem methods
- Machine learning methods
  - Example-based methods (⇒ correspondences of LR/HR "patches")
  - Direct local learning (Support Vector Regression, Ridge Regression, ...)
  - Nearest Neighbor (NN) estimation
Nearest Neighbor SR
1. Dictionary learning: learn correspondences of LR/HR patches
   - training images ⇒ external dictionary [Freeman et al., 2002, Yang et al., 2008, Chang et al., 2004]
   - correspondences learnt by exploiting image self-similarities [Glasner et al., 2009]
2. NN search: search for the best-matching patches for the LR input patches
3. Weight computation: compute the weights of the patch combinations by using the selected LR candidates
4. HR patch reconstruction: combine the corresponding HR candidates to reconstruct the HR output patches, according to the computed weights

- The whole procedure is carried out in a feature space
- One-pass or multi-pass approach
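As a rough illustration of steps 2-4 (step 1 is assumed done offline), here is a minimal NumPy sketch; the function name, the array layout, and the pluggable `solve_weights` callback are assumptions for illustration, not the authors' code:

```python
import numpy as np

def nn_super_resolve(lr_feats, D_lr, D_hr, K, solve_weights):
    """One-pass NN super-resolution of a set of LR input patch features.

    lr_feats : (N, d) feature vectors of the N LR input patches
    D_lr     : (M, d) LR dictionary patch features (step 1, built offline)
    D_hr     : (M, p) corresponding HR dictionary patches
    K        : number of nearest neighbors
    solve_weights(x, nbrs) -> (K,) combination weights (step 3)
    """
    hr_patches = np.empty((lr_feats.shape[0], D_hr.shape[1]))
    for i, x in enumerate(lr_feats):
        # Step 2 (NN search): K best-matching LR candidates for this patch
        dists = np.linalg.norm(D_lr - x, axis=1)
        idx = np.argsort(dists)[:K]
        # Step 3 (weight computation): weights from the selected LR candidates
        w = solve_weights(x, D_lr[idx])
        # Step 4 (HR reconstruction): same weights applied to the HR candidates
        hr_patches[i] = w @ D_hr[idx]
    return hr_patches
```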
LLE-based Nearest Neighbor SR
In [Chang et al., 2004] an example-based SR algorithm, inspired by LLE (Locally Linear Embedding), is proposed:
- The weights for the linear combination (step 3, weight computation), as in LLE, are the result of an LS problem that minimizes the approximation error under a sum-to-one constraint (SUM1-LS):
w^i = \arg\min_w \, \| x_t^i - X_d^i w \|^2 \quad \text{s.t.} \quad \mathbf{1}^T w = 1 \qquad (1)
- Gradient features are used for the LR patches; mean-subtracted luma values are used as features for the HR patches.
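The SUM1-LS problem (1) admits the standard LLE closed form: solve a small linear system against the local Gram matrix of differences to x, then rescale so the weights sum to one. A minimal sketch, with a regularizer added for near-singular neighborhoods (the function name and the regularization constant are assumptions):

```python
import numpy as np

def sum1_ls_weights(x, neighbors, reg=1e-6):
    """Solve Eq. (1): min_w ||x - neighbors.T @ w||^2  s.t.  sum(w) = 1.

    x         : (d,) LR input patch feature vector
    neighbors : (K, d) selected LR candidates, one per row
    """
    diff = neighbors - x                        # (K, d) differences to x
    G = diff @ diff.T                           # (K, K) local Gram matrix
    G += reg * np.trace(G) * np.eye(len(G))     # guard against singular G
    w = np.linalg.solve(G, np.ones(len(G)))     # solve G w = 1
    return w / w.sum()                          # enforce the sum-to-1 constraint
```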
Key points
Three key points
KP1 Features used to represent the LR and HR patches
KP2 Method used to compute the neighbor embedding (i.e. theweights of the patch combination)
KP3 Nature of the dictionary: external or “internal”?
[KP1] Representation by features
Each patch is represented by a feature vector
Role of the features
- To capture the most salient information in the LR patches in order to predict the HR details
- To enforce the hypothesis of manifold similarity

Various possibilities
- Simple luminance values
- Centered luminance values (with mean removal)
- Gradient values
- ...
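A sketch of two of these feature types on a square luminance patch; the exact gradient filters used in practice vary, so the simple first differences below are only illustrative:

```python
import numpy as np

def centered_luminance(patch):
    """F2: luminance values with the patch mean removed."""
    v = patch.astype(np.float64).ravel()
    return v - v.mean()

def gradient_features(patch):
    """F1: first-order gradients, horizontal and vertical, concatenated."""
    p = patch.astype(np.float64)
    gx = np.diff(p, axis=1).ravel()   # horizontal first differences
    gy = np.diff(p, axis=0).ravel()   # vertical first differences
    return np.concatenate([gx, gy])
```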
[KP1] Analysis of the features
F1: 1st-order gradient
F2: Centered luminance values
F1+F2: Concatenation of F1 and F2
- All curves present a fall (even dramatic in the case of F2)
- We decide to use F2: "low-cost" and best performing overall → Can we avoid the fall?
[KP1] Why the fall?
Observation 1
d: dimension of the LR vectors; K: number of neighbors
⇒ The i-th neighborhood matrix X_d^i has the highest possible rank w.h.p., i.e. r_i = min(d − 1, K).
Observation 2
For K = d the SUM1-LS problem is equivalent to a square linear system, as we have d equations in K = d unknowns ⇒ unique solution in the LR domain. Here, experimentally, we observe a "critical point" in the performance.
⇒ The fall is due to an overfitting problem!
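Observation 1 is easy to verify numerically: removing the mean makes every column of the neighborhood matrix orthogonal to the all-ones vector, so the columns span at most a (d − 1)-dimensional subspace. A small NumPy check with arbitrarily chosen sizes:

```python
import numpy as np

rng = np.random.default_rng(0)
d, K = 9, 12                          # feature dimension, number of neighbors
X = rng.standard_normal((d, K))       # stand-in for a neighborhood matrix
X -= X.mean(axis=0)                   # F2 features: remove each patch's mean
print(np.linalg.matrix_rank(X))       # prints 8 = min(d - 1, K), not min(d, K)
```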
[KP2] A nonnegative embedding?
- Idea: replace the sum-to-1 equality constraint by an inequality constraint to avoid the unique-solution problem
- ⇒ Patches are reconstructed only by additive combinations, according to the "intuitive notion of combining parts to form a whole"
- The LS problem (1) becomes a nonnegative least squares (NNLS) problem:
w^i = \arg\min_w \, \| x_t^i - X_d^i w \|^2 \quad \text{s.t.} \quad w \geq 0 \qquad (2)
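SciPy's `scipy.optimize.nnls` implements a Lawson-Hanson style active-set solver for exactly this problem, so (2) reduces to one call; the wrapper below is a hypothetical convenience, not part of the paper:

```python
import numpy as np
from scipy.optimize import nnls

def nonneg_weights(x, neighbors):
    """Solve Eq. (2): min_w ||x - neighbors.T @ w||^2  s.t.  w >= 0."""
    A = neighbors.T            # (d, K): one column per LR candidate
    w, _residual = nnls(A, x)  # active-set NNLS; w is typically sparse
    return w
```

A side effect worth noting: NNLS solutions tend to be sparse (many weights come out exactly zero), which fits the "combining parts" reading above and is consistent with the motivation of avoiding the unique-solution problem at K = d.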
[KP2] Analysis of the weights
- Distribution of the weights (histograms for SUM1-LS vs. NNLS)
- Distance between the actual LR weights and the "ideal" HR weights
[KP3] Choice of the dictionary
Two possibilities
- Build an external dictionary from a set of training images (from the original HR images, generate the LR versions, and extract HR and LR patches, respectively)
- Learn the patch correspondences in a pyramid of recursively scaled images, starting from the LR input image, in the manner of [Glasner et al., 2009]
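A sketch of the first option; the degradation operator, the patch size, and the dense sampling stride are assumptions, since the slide does not fix them (HR dimensions are assumed divisible by the scale factor):

```python
import numpy as np

def build_external_dictionary(hr_images, scale, hr_patch, blur_and_downsample):
    """Collect co-located LR/HR patch pairs from a set of training images.

    hr_images : list of 2-D luminance arrays (original HR training images)
    scale     : upscaling factor; LR patches are (hr_patch // scale) wide
    blur_and_downsample(img, scale) : degradation producing the LR version
    """
    lr_patches, hr_patches = [], []
    lp = hr_patch // scale
    for hr in hr_images:
        lr = blur_and_downsample(hr, scale)        # generate the LR version
        for i in range(lr.shape[0] - lp + 1):
            for j in range(lr.shape[1] - lp + 1):
                lr_patches.append(lr[i:i + lp, j:j + lp].ravel())
                I, J = i * scale, j * scale        # co-located HR patch
                hr_patches.append(hr[I:I + hr_patch, J:J + hr_patch].ravel())
    return np.asarray(lr_patches), np.asarray(hr_patches)
```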
Image | Scale | Internal DB: PSNR, DB size | Ext DB "esa": PSNR, DB size | Ext DB "wiki": PSNR, DB size