Weighted and extended total variation for image restoration and decomposition

A. El Hamidi a,*, M. Ménard b,*, M. Lugiez a,b, C. Ghannam a,b
a MIA, Av. M. Crépeau, La Rochelle, France
b L3I, Av. M. Crépeau, La Rochelle, France

Article history: Received 2 April 2009; received in revised form 15 October 2009; accepted 17 October 2009.

Keywords: Convex and non-convex regularization; Texture decomposition; Chambolle's projection; Weighted total variation; Extended total variation

Abstract

In various information processing tasks, obtaining regularized versions of noisy or corrupted image data is often a prerequisite for the successful use of classical image analysis algorithms. Image restoration and decomposition methods need to be robust if they are to be useful in practice; this property is essential in engineering and scientific applications. By robustness, we mean that the performance of an algorithm should not be significantly affected by small deviations from the assumed model. In image processing, total variation (TV) is a powerful tool for increasing robustness. In this paper, we define several concepts that are useful for robust restoration and robust decomposition, and we propose two extended total variation models: weighted total variation (WTV) and extended total variation (ETV). Both are generic approaches whose idea is to replace the TV penalty term with more general terms. The motivation is to increase the robustness of the ROF (Rudin, Osher, Fatemi) model and to prevent the staircasing effect this method produces. Moreover, by rewriting the non-convex sublinear regularizing terms as a WTV, we provide a new approach to perform the minimization via the well-known Chambolle's algorithm; the implementation is then more straightforward than the half-quadratic algorithm. The behavior of image decomposition methods is also a challenging problem, which is closely related to anisotropic diffusion.
ETV leads to an anisotropic decomposition close to edges, improving robustness. It allows desired geometric properties to be respected during restoration and gives more precise control over the regularization process. We also discuss why compression algorithms can serve as an objective method for evaluating image decomposition quality. © 2009 Elsevier Ltd. All rights reserved.

1. Introduction

1.1. Motivation

In many problems of image analysis, we have an observed image f representing a real scene; f contains texture v and/or noise w. Texture is characterized as some repeated pattern of small-scale details, while noise is characterized as uncorrelated random patterns. The rest of the image, u, contains homogeneous regions and sharp edges. The image processing task is to extract the most meaningful information from f. Given a noisy sample of some true data, the goal of restoration is to recover the best possible estimate of the original true data using only the noisy sample. Restoration is usually formulated as an inverse problem. The most basic image restoration problem is denoising, which is a well-known ill-posed problem. In essence, to determine a single solution, one introduces the constraint that the solution must be smooth, in the intuitive sense that similar inputs must correspond to similar outputs. The problem is then cast as a variational problem in which the variational integral depends both on the data and on the smoothness constraint (regularization term). Denoising models can also be regarded as a decomposition of the image into a structural part, u (homogeneous regions and sharp edges), and a remainder, f − u, which can contain oscillating patterns such as noise and texture, v + w. Following the ideas of Meyer, image decomposition can also distinguish the texture, v, from the noise, w. The textured component is then completely represented using only two functions (g1, g2).
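As a concrete illustration of the variational denoising just described, the ROF model min_u TV(u) + (1/2λ)‖u − f‖² can be solved with Chambolle's projection algorithm, mentioned in the abstract. The sketch below is our own minimal NumPy implementation of the standard (2004) fixed-point iteration, not the authors' code; the parameter names `lam`, `tau`, and `n_iter` are our choices, with `tau = 1/8` the usual step size ensuring convergence:

```python
import numpy as np

def grad(u):
    """Forward-difference gradient with Neumann boundary conditions."""
    gx = np.zeros_like(u)
    gy = np.zeros_like(u)
    gx[:, :-1] = u[:, 1:] - u[:, :-1]
    gy[:-1, :] = u[1:, :] - u[:-1, :]
    return gx, gy

def div(px, py):
    """Discrete divergence, the negative adjoint of grad."""
    dx = np.zeros_like(px)
    dx[:, 0] = px[:, 0]
    dx[:, 1:-1] = px[:, 1:-1] - px[:, :-2]
    dx[:, -1] = -px[:, -2]
    dy = np.zeros_like(py)
    dy[0, :] = py[0, :]
    dy[1:-1, :] = py[1:-1, :] - py[:-2, :]
    dy[-1, :] = -py[-2, :]
    return dx + dy

def chambolle_tv(f, lam=0.3, tau=0.125, n_iter=100):
    """Chambolle's projection for the ROF model:
    min_u TV(u) + (1/(2*lam)) * ||u - f||^2, with u = f - lam * div(p)."""
    px = np.zeros_like(f)
    py = np.zeros_like(f)
    for _ in range(n_iter):
        # Fixed-point update of the dual field p, kept implicitly in the
        # unit ball by the normalization below.
        gx, gy = grad(div(px, py) - f / lam)
        norm = np.sqrt(gx ** 2 + gy ** 2)
        px = (px + tau * gx) / (1.0 + tau * norm)
        py = (py + tau * gy) / (1.0 + tau * norm)
    return f - lam * div(px, py)
```

The structural part u returned by `chambolle_tv` preserves the sharp edges of f, while the residual f − u captures the oscillating components (noise and texture) discussed above.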
Three main successful approaches are usually considered to solve the denoising problem: wavelet-based techniques, nonlinear partial differential equations, and image decomposition.

Wavelet-based techniques: Unlike classical filtering-based methods, wavelet-based methods can be viewed as transform-domain point processing. They can achieve a good tradeoff between noise reduction and feature preservation. Donoho and Johnstone [20] developed the method of wavelet shrinkage denoising. The method attempts to reject noise by thresholding

* Corresponding authors. E-mail addresses: aelhamid@univ-lr.fr (A. El Hamidi), michel.menard@univ-lr.fr (M. Ménard).
Pattern Recognition 43 (2010) 1564–1576. doi:10.1016/j.patcog.2009.10.011
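The wavelet shrinkage idea of Donoho and Johnstone can be sketched with a single-level 1-D Haar transform and soft thresholding at their universal threshold σ√(2 log n). This is an illustrative toy, not the reference implementation; the function names and the choice of a single decomposition level are ours:

```python
import numpy as np

def haar_1level(x):
    """Single-level orthonormal Haar transform of an even-length signal."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)  # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)  # detail coefficients
    return a, d

def inv_haar_1level(a, d):
    """Inverse of haar_1level."""
    x = np.empty(2 * len(a))
    x[0::2] = (a + d) / np.sqrt(2.0)
    x[1::2] = (a - d) / np.sqrt(2.0)
    return x

def soft_threshold(c, t):
    """Shrink coefficients toward zero by t (soft thresholding)."""
    return np.sign(c) * np.maximum(np.abs(c) - t, 0.0)

def wavelet_shrink(x, sigma):
    """Denoise by thresholding detail coefficients at the
    Donoho-Johnstone universal threshold sigma * sqrt(2 log n)."""
    a, d = haar_1level(x)
    t = sigma * np.sqrt(2.0 * np.log(len(x)))
    return inv_haar_1level(a, soft_threshold(d, t))
```

Because noise spreads evenly over all coefficients while signal energy concentrates in a few large ones, thresholding the small detail coefficients rejects noise with little damage to features; a practical implementation would apply the same shrinkage over several decomposition levels.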