Nonlinear Wavelet Image Processing: Variational Problems, Compression, and Noise Removal through Wavelet Shrinkage*

Antonin Chambolle¹, Ronald A. DeVore², Nam-yong Lee³, and Bradley J. Lucier⁴

Abstract

This paper examines the relationship between wavelet-based image processing algorithms and variational problems. Algorithms are derived as exact or approximate minimizers of variational problems; in particular, we show that wavelet shrinkage can be considered the exact minimizer of the following problem: given an image $F$ defined on a square $I$, minimize over all $g$ in the Besov space $B^1_1(L_1(I))$ the functional
$$\|F - g\|^2_{L_2(I)} + \lambda \|g\|_{B^1_1(L_1(I))}.$$
We use the theory of nonlinear wavelet image compression in $L_2(I)$ to derive accurate error bounds for noise removal through wavelet shrinkage applied to images corrupted with i.i.d., mean-zero Gaussian noise. A new signal-to-noise ratio, which we claim more accurately reflects the visual perception of noise in images, arises in this derivation. We present extensive computations that support the hypothesis that near-optimal shrinkage parameters can be derived if one knows (or can estimate) only two parameters about an image $F$: the largest $\alpha$ for which $F \in B^\alpha_q(L_q(I))$, $1/q = \alpha/2 + 1/2$, and the norm $\|F\|_{B^\alpha_q(L_q(I))}$. Both theoretical and experimental results indicate that our choice of shrinkage parameters yields uniformly better results than Donoho and Johnstone's VisuShrink procedure; an example suggests, however, that Donoho and Johnstone's SureShrink method, which uses a different shrinkage parameter for each dyadic level, achieves lower error than our procedure.

1. Introduction

This paper has several objectives. The first is to describe several families of variational problems that can be solved quickly using wavelets.
These variational problems take the form: given a positive parameter $\lambda$ and an image, a signal, or noisy data $f(x)$ defined for $x$ in some finite domain $I$, find a function $\tilde f$ that minimizes over all possible functions $g$ the functional
$$(1) \qquad \|f - g\|^2_{L_2(I)} + \lambda \|g\|_Y,$$
where
$$\|f - g\|_{L_2(I)} := \left( \int_I |f(x) - g(x)|^2 \, dx \right)^{1/2}$$
is the root-mean-square error (or, more generally, difference) between $f$ and $g$, and $\|g\|_Y$ is the norm of the approximation $g$ in a smoothness space $Y$. The original image $f$ could be noisy, or it could simply be "messy" (a medical image, for example), while $\tilde f$ would be a denoised, segmented, or compressed version of $f$. The amount of noise removal, compression, or segmentation is determined by the parameter $\lambda$: if $\lambda$ is large, then $\|g\|_Y$ must necessarily be smaller at the minimum, i.e., $g$ must be smoother, while if $\lambda$ is small, $g$ can be rough, with $\|g\|_Y$ large, and one achieves a small error at the minimum. These types of variational problems have become fairly common in image processing and statistics; see, e.g., [32].

---
* A shorter version of this paper appeared in the IEEE Transactions on Image Processing, v. 7, 1998, pp. 319–335.
¹ CEREMADE (CNRS URA 749), Université de Paris–Dauphine, 75775 Paris CEDEX 16, France, [email protected]. Supported by the CNRS.
² Department of Mathematics, University of South Carolina, Columbia, SC 29208, [email protected]. Supported in part by the Office of Naval Research, Contract N00014-91-J-1076.
³ Department of Mathematics, Purdue University, West Lafayette, IN 47907-1395, [email protected]. Supported in part by the Purdue Research Foundation.
⁴ Department of Mathematics, Purdue University, West Lafayette, IN 47907-1395, [email protected]. Supported in part by the Office of Naval Research, Contract N00014-91-J-1152. Part of this work was done while the author was a visiting scholar at CEREMADE, Université de Paris–Dauphine, Paris, France.
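To make the tradeoff in (1) concrete: when $\|g\|_Y$ is (equivalent to) the sum of the absolute values of $g$'s wavelet coefficients, as for the $B^1_1(L_1(I))$ norm discussed in the abstract, the functional decouples into independent scalar problems $\min_g (f-g)^2 + \lambda|g|$, one per coefficient, whose exact minimizer is soft thresholding at $\lambda/2$. The following sketch (not from the paper; function names are illustrative) checks this closed form against a brute-force grid minimization of the scalar functional.

```python
def soft_threshold(f, lam):
    """Exact minimizer of (f - g)**2 + lam*abs(g) over real g."""
    if f > lam / 2:
        return f - lam / 2
    if f < -lam / 2:
        return f + lam / 2
    return 0.0

def brute_force_min(f, lam, lo=-10.0, hi=10.0, step=1e-4):
    """Minimize the same scalar functional numerically on a fine grid."""
    best_g, best_val = 0.0, float("inf")
    g = lo
    while g <= hi:
        val = (f - g) ** 2 + lam * abs(g)
        if val < best_val:
            best_g, best_val = g, val
        g += step
    return best_g

# The two agree (up to grid resolution); e.g. soft_threshold(3.0, 1.0) is 2.5.
for f in (3.0, 0.4, -2.5):
    print(f, soft_threshold(f, 1.0), brute_force_min(f, 1.0))
```

The larger $\lambda$ is, the more coefficients are set exactly to zero, which is precisely the mechanism by which a large $\lambda$ forces $\|g\|_Y$ down in (1).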
For example, Rudin, Osher, and Fatemi [33] set $Y$ to the space of functions of bounded variation $\mathrm{BV}(I)$ for images (see also [1]), and non-parametric estimation sets $Y$ to be the Sobolev space $W^m(L_2(I))$ of functions all of whose $m$th derivatives are square-integrable; see the monograph by Wahba [34]. In fact, $Y$ can be very general; one could, for example, let $Y$ contain all piecewise constant functions, with $\|g\|_Y$ equal to the number of different pieces or segments of $g$; this would result in segmentation of the original image $f$. Indeed, Morel and Solimini [31] argue that almost any reasonable segmentation algorithm can be posed in this form. Techniques like this are also known as Tikhonov regularization; see [2]. In [12] we considered (1) in the context of interpolation of function spaces; in that theory, the infimum of (1) over all $g$ is $K(f, \lambda, L_2(I), Y)$, the $K$-functional of $f$ between $L_2(I)$ and $Y$.

A fast way of solving (1) is required for practical algorithms. In [12], we noted that the norms of $g$ in many function spaces $Y$ can be expressed in terms of the wavelet coefficients of $g$. In other words, if we choose an (orthogonal or biorthogonal) wavelet basis for $L_2(I)$, and we expand $g$ in terms of its wavelet coefficients, then the norm $\|g\|_Y$ is equivalent to a norm of the wavelet coefficients of $g$; see, for example, [30], [14], or [23].

In [12] we proposed that by choosing $\|g\|_Y$ to be one of these norms and by calculating approximate minimizers rather than exact minimizers, one can find efficient computational algorithms in terms of the wavelet coefficients of the data $f$. In particular, we showed how choosing $Y = W^m(L_2(I))$ and approximately minimizing (1) leads to wavelet algorithms that are analogous to well-known linear algorithms for compression and noise removal. In particular, we find that the wavelet coefficients of $\tilde f$ are simply all wavelet coefficients of $f$ with frequency below a fixed value, determined by $\lambda$.
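Combining the two ingredients above (an orthonormal wavelet expansion and coefficientwise soft thresholding) gives the shrinkage procedure in its simplest form. The following is a minimal sketch, not the authors' code, using the orthonormal Haar basis on a signal of length $2^k$; all names are illustrative.

```python
import math

def haar_fwd(x):
    """Full orthonormal Haar decomposition of a length-2^k sequence.

    Returns the single coarsest scaling coefficient and the detail
    coefficients, finest level first.
    """
    x = list(x)
    details = []
    while len(x) > 1:
        s = [(x[2*i] + x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
        d = [(x[2*i] - x[2*i+1]) / math.sqrt(2) for i in range(len(x) // 2)]
        details.append(d)
        x = s
    return x[0], details

def haar_inv(s0, details):
    """Invert haar_fwd exactly (the transform is orthonormal)."""
    x = [s0]
    for d in reversed(details):
        nxt = []
        for si, di in zip(x, d):
            nxt.append((si + di) / math.sqrt(2))
            nxt.append((si - di) / math.sqrt(2))
        x = nxt
    return x

def shrink(x, lam):
    """Soft-threshold every detail coefficient at lam/2, then invert."""
    def soft(c):
        return math.copysign(abs(c) - lam / 2, c) if abs(c) > lam / 2 else 0.0
    s0, details = haar_fwd(x)
    return haar_inv(s0, [[soft(c) for c in d] for d in details])
```

With `lam = 0` the signal is recovered exactly; increasing `lam` zeroes more detail coefficients, and since the Haar transform is orthonormal, shrinkage can only decrease the $\ell_2$ norm of the output. (The paper's contribution concerns how to choose $\lambda$, not this mechanical step.)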
Additionally, we proposed choosing Y from the family