Abstract—In this paper, we propose a novel approach to image segmentation via fuzzification of the Rényi Entropy of Generalized Distributions (REGD). The fuzzy REGD is used to precisely measure the structural information of an image and to locate the optimal threshold for segmentation. The proposed approach rests on the postulate that the optimal threshold coincides with the maximum information content of the distribution. The contributions of the paper are as follows. First, the fuzzy REGD is introduced as a measure of the spatial structure of an image. Then, an efficient entropic segmentation approach using the fuzzy REGD is proposed. Although the proposed approach belongs to the family of entropic segmentation approaches, which are commonly applied to grayscale images, it is adapted to segment color images as well. Finally, diverse experiments on real images showing the superior performance of the proposed method are carried out.

Keywords—Entropy of generalized distributions, entropy fuzzification, entropic image segmentation.

I. INTRODUCTION

IMAGE segmentation is an elementary and significant component of many applications such as image analysis, pattern recognition, medical diagnosis and, more recently, robotic vision. However, it is one of the most difficult and challenging tasks in image processing, and it determines the quality of the final results of the image analysis. Intuitively, image segmentation is the process of dividing an image into different regions such that each region is homogeneous while the union of any two adjacent regions is not. An additional requirement is that these regions correspond to real homogeneous regions belonging to objects in the scene. Various algorithms using different approaches have been proposed for image segmentation. These approaches include local edge detection (e.g. [1]), deformable curves (e.g. [2]), morphological region-based approaches (e.g.
[3-5]), global optimization approaches on energy functions, and stochastic model-based methods (e.g. [6-8]).

Samy Sadek is with the Institute for Electronics, Signal Processing and Communications, Otto-von-Guericke University Magdeburg, 39106 Magdeburg, Germany (corresponding author; phone: +49-391-6711472; fax: +49-391-6711231; e-mail: Samy.Bakheet@ovgu.de). Ayoub Al-Hamadi, Axel Panning, and Bernd Michaelis are also with the Institute for Electronics, Signal Processing and Communications, Otto-von-Guericke University Magdeburg, 39106 Magdeburg, Germany (e-mail: Ayuob.Al-Hamadi@ovg.de). Usama Sayed is with the Electrical Engineering Department, Assiut University, 34565, Egypt (e-mail: usama@aun.edu.eg).

Recent developments in statistical mechanics based on the concept of nonextensive entropy have intensified interest in investigating a possible extension of Shannon's entropy within information theory [9]. This interest arises mainly from the similarities between the Shannon and Boltzmann/Gibbs entropy functions. Nonextensive entropy is a recent proposal to generalize the traditional Boltzmann/Gibbs entropy to nonextensive systems (strongly correlated systems are good candidates for nonextensivity). In this theory, a new parameter α is introduced as a real number associated with the nonextensivity of the system. In this paper, we propose a new approach to image segmentation which, for the first time, applies fuzzy concepts to the Rényi entropy of generalized distributions. Our approach performs better than the most recent entropic methods [10]. The remainder of the paper is organized as follows. In the next section, the essential concepts of the Rényi entropy of generalized distributions and nonextensive systems are addressed. Then, the proposed approach is described in detail in Section 3. Section 4 presents the experimental results that show the performance of the proposed approach.
Finally, Section 5 outlines the benefits of the proposed approach and concludes the paper.

II. ENTROPY OF GENERALIZED DISTRIBUTIONS

In 1948, Shannon [11] redefined the Boltzmann/Gibbs entropy as a measure of uncertainty regarding the information content of a system. He defined an expression for measuring the amount of information produced by a process. Let $P = (p_1, p_2, \ldots, p_n)$ be a finite discrete probability distribution, that is, suppose $p_k \geq 0$ for $k = 1, 2, \ldots, n$ and $\sum_{k=1}^{n} p_k = 1$. The amount of uncertainty of the distribution $P$, that is, the amount of uncertainty concerning the outcome of an experiment whose possible results have the probabilities $p_1, p_2, \ldots, p_n$, is called the entropy of the distribution and is usually measured by the quantity $H(P) = H(p_1, p_2, \ldots, p_n)$, introduced by Shannon and defined by

$$H(P) = -\sum_{k=1}^{n} p_k \log p_k \qquad (1)$$

It is easy to see that the Shannon entropy for the conjunction of two distributions $P$ and $Q$ satisfies

A New Approach to Image Segmentation via Fuzzification of Rényi Entropy of Generalized Distributions
Samy Sadek, Ayoub Al-Hamadi, Axel Panning, Bernd Michaelis, Usama Sayed
World Academy of Science, Engineering and Technology 56, 2009
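As a concrete illustration of the Shannon entropy in Eq. (1) and of the α-parameterized generalization discussed in the introduction, the following minimal Python sketch computes both quantities for a finite discrete distribution. The function names and the uniform example distribution are illustrative choices of ours, not part of the paper; the Rényi entropy formula used here is the standard one, $H_\alpha(P) = \frac{1}{1-\alpha}\log\sum_k p_k^\alpha$, which reduces to Eq. (1) as α → 1.

```python
import math

def shannon_entropy(p, base=2):
    """Shannon entropy H(P) = -sum_k p_k log p_k (Eq. (1)).
    Zero-probability outcomes contribute nothing, by the
    usual convention 0 log 0 = 0."""
    assert all(q >= 0 for q in p) and abs(sum(p) - 1.0) < 1e-9
    return -sum(q * math.log(q, base) for q in p if q > 0)

def renyi_entropy(p, alpha, base=2):
    """Standard Renyi entropy of order alpha (alpha > 0, alpha != 1):
    H_alpha(P) = log(sum_k p_k^alpha) / (1 - alpha).
    It tends to the Shannon entropy as alpha -> 1."""
    assert alpha > 0 and alpha != 1
    assert all(q >= 0 for q in p) and abs(sum(p) - 1.0) < 1e-9
    return math.log(sum(q ** alpha for q in p if q > 0), base) / (1.0 - alpha)

# A uniform 4-outcome distribution: maximal uncertainty for n = 4.
uniform = [0.25, 0.25, 0.25, 0.25]
print(shannon_entropy(uniform))           # 2.0 bits
print(renyi_entropy(uniform, alpha=0.5))  # 2.0 bits: all orders agree on a uniform P
```

One property worth checking numerically is additivity: for the product (conjunction) of two independent distributions, the Shannon entropies add, which is exactly the relation the text above is leading up to.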