Evolving wavelet and scaling numbers for optimized image compression: forward, inverse, or both? A comparative study

Brendan Babb a, Frank Moore a, Shawn Aldridge a, Michael Peterson b
a University of Alaska Anchorage, 3211 Providence Dr., Anchorage, AK 99508
b University of Hawaii at Hilo, 200 W. Kawili St., Hilo, HI 96720

ABSTRACT

The 9/7 wavelet is used for a wide variety of image compression tasks. Recent research, however, has established a methodology for using evolutionary computation to evolve wavelet and scaling numbers describing transforms that outperform the 9/7 under lossy conditions, such as those introduced by quantization or thresholding. This paper describes an investigation into which of three possible approaches to transform evolution produces the most effective transforms. The first approach uses an evolved forward transform for compression, but performs reconstruction with the 9/7 inverse transform; the second uses the 9/7 forward transform for compression, but performs reconstruction with an evolved inverse transform; the third uses simultaneously evolved forward and inverse transforms for both compression and reconstruction. Three image sets are used independently for training: digital photographs, fingerprints, and satellite images. Results strongly suggest that evolved transforms cannot substantially improve upon the performance of the 9/7 unless the inverse transform is evolved.

Keywords: Image compression, wavelets, genetic algorithms

1. INTRODUCTION

The biorthogonal 9/7 wavelet transform [6] has become the workhorse of the image compression community. The Joint Photographic Experts Group's JPEG2000 (J2K) image compression standard [12], for example, uses a five-level 9/7 wavelet to achieve substantially improved compression performance (in comparison to the JPEG standard) without introducing blocking artifacts.
J2K is widely used for compression of medical imagery [7]: the current Digital Imaging and Communications in Medicine (DICOM) standard provides direct support for J2K image compression [11], and this support has the potential to substantially reduce the massive storage and communication requirements of state-of-the-art Picture Archiving and Communication Systems (PACS). Other imaging standards based upon the 9/7 include the Federal Bureau of Investigation (FBI) fingerprint compression standard, which achieves at least 15:1 compression without noticeable loss of detail [5].

Wavelets [13] are defined using two sets of numbers, known as scaling and wavelet numbers. For the 9/7 wavelet, the scaling (h1) and wavelet (g1) numbers defining the low-pass and high-pass analysis filters, respectively, of the compression transform are (rounded to five decimal places):

h1 = [0.03783, -0.02385, -0.11062, 0.37740, 0.85270, 0.37740, -0.11062, -0.02385, 0.03783]
g1 = [0.06454, -0.04069, -0.41809, 0.78849, -0.41809, -0.04069, 0.06454]

Similarly, the scaling (h2) and wavelet (g2) numbers defining the low-pass and high-pass synthesis filters of the reconstruction transform of this biorthogonal wavelet are:

h2 = [-0.06454, -0.04069, 0.41809, 0.78849, 0.41809, -0.04069, -0.06454]
g2 = [0.03783, 0.02385, -0.11062, -0.37740, 0.85270, -0.37740, -0.11062, 0.02385, 0.03783]

Quantization [8] – the process of approximating a signal using a relatively small number of bits – introduces permanent, irreversible information loss. Since 2004, our research has focused upon using evolutionary computation (EC) to evolve sets of wavelet and scaling numbers describing new transforms capable of reducing the mean squared error (MSE) observed in reconstructed signals subjected to quantization error, while continuing to match or exceed the compression capabilities of wavelets.
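To make the analysis filters and the lossy quantization step concrete, the following is an illustrative sketch (not the authors' implementation): it applies one level of the 9/7 analysis filter bank to a 1-D signal, then applies uniform scalar quantization, the irreversible step whose error the evolved transforms are trained to reduce. The function names (`analyze`, `quantize`, `mse`) are invented for this sketch, and the periodic boundary handling is a simplification; JPEG2000 itself uses symmetric extension.

```python
# Illustrative sketch only: one level of 1-D 9/7 wavelet analysis,
# followed by uniform scalar quantization and an MSE measurement.

H1 = [0.03783, -0.02385, -0.11062, 0.37740, 0.85270,
      0.37740, -0.11062, -0.02385, 0.03783]   # low-pass analysis (scaling)
G1 = [0.06454, -0.04069, -0.41809, 0.78849,
      -0.41809, -0.04069, 0.06454]            # high-pass analysis (wavelet)

def analyze(signal, h=H1, g=G1):
    """One decomposition level: filter with h and g, downsample by 2.

    Uses periodic (circular) extension at the boundaries for brevity;
    J2K uses symmetric extension instead.
    """
    n = len(signal)
    def filt(taps):
        c = len(taps) // 2                     # center the filter
        return [sum(taps[k] * signal[(i + k - c) % n]
                    for k in range(len(taps)))
                for i in range(0, n, 2)]       # keep every other sample
    return filt(h), filt(g)                    # (approximation, detail)

def quantize(coeffs, step):
    """Uniform scalar quantization: round each coefficient to a
    multiple of `step` — this is where information is lost."""
    return [step * round(c / step) for c in coeffs]

def mse(a, b):
    """Mean squared error between two equal-length sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

# A constant signal ends up entirely in the approximation band:
approx, detail = analyze([1.0] * 16)
# sum(H1) is approximately sqrt(2), so each approximation coefficient
# is about 1.41422; each detail coefficient is essentially zero.

# Quantizing the coefficients introduces a measurable, irreversible error:
sig = [float(x % 7) for x in range(16)]
a, d = analyze(sig)
err = mse(a + d, quantize(a, 0.5) + quantize(d, 0.5))
```

With a coarser quantization step the MSE grows; the paper's premise is that evolved filter coefficients can lower the reconstruction-side MSE at a given quantization level.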
We began by using a genetic algorithm (GA) to optimize reconstruction transforms only, and demonstrated modest improvements over wavelet reconstruction transforms at various quantization levels [10]. We then expanded our GA to simultaneously evolve matched compression and reconstruction transform pairs [2], and