Ankita Hundet et al., Int. Journal of Engineering Research and Applications, www.ijera.com, ISSN: 2248-9622, Vol. 4, Issue 4 (Version 7), April 2014, pp. 103-107

Survey for Image Representation Using Block Compressive Sensing for Compression Applications

Ankita Hundet, Dr. R.C. Jain, Vivek Sharma
Department of Information Technology, SATI Vidisha, India

Abstract
Compressive sensing (CS) theory has proved favourable for developing data compression techniques, although it was originally proposed with the objective of achieving dimension-reduced sampling to save data sampling cost. In this paper, two sampling methods are explored for block CS (BCS) with discrete cosine transform (DCT) based image representation for compression applications: (a) coefficient random permutation (CRP) and (b) adaptive sampling (AS). The CRP method can balance the sparsity of the sampled vectors in the DCT domain of the image, thereby improving CS sampling efficiency. To attain AS, we design an adaptive measurement matrix for CS based on the energy distribution characteristics of the image in the DCT domain, which significantly improves CS performance. Our experimental results show that the proposed methods are effective in reducing the dimension of the BCS-based image representation and/or improving the recovered image quality. The proposed BCS-based image representation scheme could be an efficient alternative for applications of encrypted image compression and/or robust image compression.

Keywords: Block compressive sensing, coefficient random permutation, compressive sensing, random sampling, image representation

I.
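The coefficient random permutation idea described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the per-block DCT coefficients are already available as row vectors, pools them, applies one global random permutation so that each regrouped vector has roughly the same sparsity, and then samples each vector with a shared Gaussian measurement matrix. The function name `crp_sample` and all parameter choices are hypothetical.

```python
import numpy as np

def crp_sample(blocks_dct, m, seed=0):
    """Sketch of coefficient random permutation (CRP) before BCS sampling.

    blocks_dct: (n_blocks, n) array, one row of DCT coefficients per block.
    m: number of CS measurements taken per permuted vector.
    Returns (y, perm, phi): measurements, the permutation (needed for
    reconstruction), and the shared measurement matrix.
    """
    rng = np.random.default_rng(seed)
    n_blocks, n = blocks_dct.shape
    # Scatter all coefficients globally, then regroup into new vectors,
    # balancing the sparsity across the sampled vectors.
    flat = blocks_dct.ravel()
    perm = rng.permutation(flat.size)
    mixed = flat[perm].reshape(n_blocks, n)
    # One Gaussian measurement matrix reused for every permuted vector.
    phi = rng.standard_normal((m, n)) / np.sqrt(m)
    y = mixed @ phi.T
    return y, perm, phi
```

Because the permutation is invertible, a decoder that knows `perm` can undo the regrouping after recovering each permuted vector.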
INTRODUCTION
Recent years have seen significant interest in the paradigm of compressed sensing (CS), which permits, under certain conditions, signals to be sampled at sub-Nyquist rates via linear projection onto a random basis while still enabling exact reconstruction of the original signal. As applied to 2D images, however, CS faces some challenges, including a computationally expensive reconstruction process and the huge memory required to store the random sampling operator. In recent times, numerous fast algorithms have been developed for CS reconstruction, while the latter challenge was addressed in [1] using a block-based sampling operation. Additionally, projection-based Landweber iterations have been proposed to achieve fast CS reconstruction while simultaneously imposing smoothing, with the goal of improving the reconstructed-image quality by eliminating blocking artifacts. Sparse representation and compressive sensing establish a more rigorous mathematical framework for studying high-dimensional data and for discovering the structures of the data, giving rise to a wide range of efficient algorithms. When transmitting data over insecure, bandwidth-limited channels, data compression and encryption are always needed. An encryption algorithm converts data from a comprehensible to an incomprehensible form, thus making the encrypted data difficult to compress using any of the classical compression algorithms, which rely on the redundancy embedded in the data. Hence, traditionally, encryption always follows compression. While such a scheme is suitable for most applications, there are some applications which need encryption to be carried out before compression [2]. Consider, for example, an information owner and a network operator who do not trust each other. In such a case, to protect his content, the information owner encrypts his data before giving it to the network operator.
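The memory advantage of the block-based sampling operation cited from [1] can be made concrete with a quick back-of-the-envelope calculation. The sizes below (256×256 image, 32×32 blocks, 25% subsampling rate) are illustrative assumptions, not figures from the paper: a dense operator on the whole vectorized image needs (rate·N)·N entries, while one small operator reused for every block needs only (rate·n)·n.

```python
# Illustrative sizes (assumptions, not values from the paper):
N = 256 * 256        # pixels in the full vectorized image
B = 32               # block side, so n = B*B samples per block
rate = 0.25          # CS subsampling rate

n = B * B
full_entries = int(rate * N) * N     # dense operator over the whole image
block_entries = int(rate * n) * n    # one small operator shared by all blocks

reduction = full_entries // block_entries
print(reduction)  # -> 4096, i.e. (N/n)^2 times less operator storage
```

The reduction factor is (N/n)², so the saving grows quadratically as images get larger relative to the block size, which is exactly why block-based operators make CS of megapixel images tractable.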
Due to the bandwidth limitation, the network operator is forced to compress this encrypted data stream [3]-[5].

II. PROBLEM FORMULATION
The idea of CS is to represent the signal by non-adaptive random projections so as to reduce the sampling rate, which is considered an advantage of CS. However, the main challenges in applying CS in practice are how to reduce the measurement rate efficiently while preserving good recovered image quality and keeping the implementation complexity low. To address these problems, many studies [6] have proposed using prior knowledge of the sampled signal to enhance the performance of CS. Although most of them focus on decoder-based reconstruction optimization, encoder-based sampling optimization may be equally important for addressing these problems. Block-based sampling for fast CS of
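One form the encoder-based sampling optimization mentioned above can take is adaptive sampling: spending more measurements on blocks that carry more energy in the DCT domain. The sketch below is a hedged illustration under our own assumptions, not the paper's algorithm; the function name `allocate_measurements` and the floor parameter `m_min` are hypothetical.

```python
import numpy as np

def allocate_measurements(block_energies, total_m, m_min=4):
    """Sketch of energy-proportional adaptive sampling for BCS.

    block_energies: per-block energy (e.g. sum of squared DCT coefficients).
    total_m: total measurement budget across all blocks.
    m_min: floor so every block gets at least a few measurements.
    Returns an integer budget per block that sums exactly to total_m.
    """
    e = np.asarray(block_energies, dtype=float)
    n_blocks = e.size
    # Budget left after granting every block its floor of m_min.
    spare = total_m - m_min * n_blocks
    share = e / e.sum()
    m = m_min + np.floor(spare * share).astype(int)
    # Rounding leaves a few measurements over; give them to the
    # highest-energy blocks first.
    leftover = total_m - m.sum()
    for i in np.argsort(-e)[:leftover]:
        m[i] += 1
    return m
```

A measurement matrix can then be drawn per block with the allocated number of rows, so low-detail blocks cost few measurements while textured blocks keep good recovered quality.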