New Measures of Entropy and Divergence for Fuzzy Distribution

G. S. BUTTAR
Department of Mathematics, DAV University, Jalandhar, Punjab, India.
gurcharanbuttar@gmail.com

Abstract- In this paper, new measures of fuzzy entropy and fuzzy directed divergence are developed, and their essential properties are studied to establish their validity. The measures contribute to the literature on fuzzy information measures and find applications in many distinct fields.

Keywords- Fuzzy set theory, Fuzzy entropy, Concavity, Fuzzy divergence, Convex function.

I. INTRODUCTION

In information theory, Shannon (1948) was the first to use the term "entropy" to measure the degree of uncertainty (randomness) in a probability distribution. De Luca and Termini (1972) proposed a non-probabilistic measure of entropy, analogous to Shannon's (1948) entropy, given by

H(A) = -\sum_{i=1}^{n} \left[ \mu_A(x_i) \log \mu_A(x_i) + (1 - \mu_A(x_i)) \log (1 - \mu_A(x_i)) \right]   (1)

A measure of divergence D(P:Q), usually known as cross entropy or directed divergence, is probabilistic in nature. It is defined as the discrepancy of a probability distribution P from another probability distribution Q; in short, it measures the distance of P from Q. The most significant and widely used measure of directed divergence in the information-theory literature is due to Kullback and Leibler (1951) and is defined by the expression

D(P:Q) = \sum_{i=1}^{n} p_i \ln \frac{p_i}{q_i}   (2)

With Zadeh's (1965) paper "Fuzzy Sets", the theory of fuzzy sets came into existence. Fuzzy set theory is a potent tool for modelling imprecise and vague situations where probability theory ceases to deliver.
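The two measures above can be illustrated with a short numerical sketch. The function names below are illustrative, not from the paper; the code assumes the standard convention 0 log 0 = 0 for equation (1) and strictly positive probabilities in equation (2).

```python
import math


def fuzzy_entropy(mu):
    """De Luca-Termini fuzzy entropy H(A) of a membership vector mu, eq. (1).

    Terms with mu = 0 or mu = 1 contribute zero, by the convention 0*log(0) = 0.
    """
    h = 0.0
    for m in mu:
        if 0.0 < m < 1.0:
            h -= m * math.log(m) + (1.0 - m) * math.log(1.0 - m)
    return h


def kl_divergence(p, q):
    """Kullback-Leibler directed divergence D(P:Q), eq. (2)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0.0)


# H(A) is maximal when every membership value equals 0.5 (most fuzzy),
# and zero when A is crisp; D(P:Q) vanishes exactly when P = Q.
print(fuzzy_entropy([0.5, 0.5]))          # maximal fuzziness: 2*log(2)
print(fuzzy_entropy([0.0, 1.0]))          # crisp set: 0.0
print(kl_divergence([0.9, 0.1], [0.5, 0.5]))  # positive distance of P from Q
```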
Inspired by the fuzzy sets introduced by Zadeh (1965), Bhandari and Pal (1993) developed a non-probabilistic measure of directed divergence parallel to Kullback and Leibler's (1951) divergence measure, given by

D(A:B) = \sum_{i=1}^{n} \left[ \mu_A(x_i) \log \frac{\mu_A(x_i)}{\mu_B(x_i)} + (1 - \mu_A(x_i)) \log \frac{1 - \mu_A(x_i)}{1 - \mu_B(x_i)} \right]   (3)

It has found wide application in areas such as signal and image processing, decision making, pattern recognition, etc. Corresponding to Havrda and Charvat's (1967) measure of directed divergence for probability distributions, Kapur (1997) presented the following fuzzy divergence measure:

D_{\alpha}(A:B) = \frac{1}{\alpha - 1} \sum_{i=1}^{n} \left[ \mu_A^{\alpha}(x_i)\, \mu_B^{1-\alpha}(x_i) + (1 - \mu_A(x_i))^{\alpha} (1 - \mu_B(x_i))^{1-\alpha} - 1 \right], \quad \alpha > 0, \ \alpha \neq 1   (4)

Many other non-probabilistic measures of entropy have been proposed by Kapur (1995), Yager (1979), Klir and Folger (1988), Singpurwalla and Booker (2004), Parkash (1998), Gurdial and Pessoa (1977), etc. In Section II, new

International Journal of Computer Science and Information Security (IJCSIS), Vol. 14, No. 10, October 2016, https://sites.google.com/site/ijcsis/ ISSN 1947-5500
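As a minimal sketch of the two fuzzy divergence measures (3) and (4): the function names are illustrative, not from the paper, and membership values are assumed to lie strictly inside (0, 1) so that the logarithms and the power 1 - alpha are finite.

```python
import math


def bhandari_pal_divergence(mu_a, mu_b):
    """Bhandari-Pal fuzzy directed divergence D(A:B), eq. (3).

    Assumes all membership values are strictly between 0 and 1.
    """
    d = 0.0
    for a, b in zip(mu_a, mu_b):
        d += a * math.log(a / b) + (1.0 - a) * math.log((1.0 - a) / (1.0 - b))
    return d


def kapur_alpha_divergence(mu_a, mu_b, alpha):
    """Kapur's fuzzy divergence of order alpha, eq. (4), for alpha > 0, alpha != 1."""
    if alpha <= 0.0 or alpha == 1.0:
        raise ValueError("alpha must be positive and different from 1")
    s = 0.0
    for a, b in zip(mu_a, mu_b):
        s += (a ** alpha) * (b ** (1.0 - alpha)) \
             + ((1.0 - a) ** alpha) * ((1.0 - b) ** (1.0 - alpha)) - 1.0
    return s / (alpha - 1.0)


# Both measures vanish when A = B and are positive otherwise,
# as required of a directed divergence.
print(bhandari_pal_divergence([0.3, 0.7], [0.3, 0.7]))   # A = B: 0.0
print(bhandari_pal_divergence([0.9, 0.2], [0.5, 0.5]))   # A != B: positive
print(kapur_alpha_divergence([0.9, 0.2], [0.5, 0.5], 2.0))
```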