246 Int. J. Applied Systemic Studies, Vol. 8, No. 3, 2018
Copyright © 2018 Inderscience Enterprises Ltd.
On generalised intuitionistic fuzzy divergence
Anjali Munde
Amity College of Commerce and Finance,
Amity University,
Uttar Pradesh, India
Email: anjalidhiman2006@gmail.com
Abstract: Atanassov (1986) defined the notion of intuitionistic fuzzy sets
(IFS), a generalisation of the concept of fuzzy sets introduced by Zadeh
(1965). Decision makers may be unable to express their views on a problem
accurately because they lack precise or sufficient knowledge of it, or because
they cannot explicitly discriminate the degree to which one alternative is
better than another. In such cases, a decision maker may state preferences for
alternatives to a certain degree without being entirely sure of them. It is
therefore very suitable to express the decision maker's preference values as
fuzzy or intuitionistic fuzzy values rather than as exact numerical values or
linguistic variables (Szmidt and Kacprzyk, 1997, 2000, 2001). In the present
communication, generalised measures of intuitionistic fuzzy divergence are
introduced, together with proofs of their validity.
Keywords: fuzzy sets; fuzzy entropy; intuitionistic fuzzy set; intuitionistic
fuzzy divergence.
Reference to this paper should be made as follows: Munde, A. (2018)
‘On generalised intuitionistic fuzzy divergence’, Int. J. Applied Systemic
Studies, Vol. 8, No. 3, pp.246–254.
Biographical notes: Anjali Munde works at Amity College of Commerce and
Finance, Amity University Uttar Pradesh. Her areas of interest are fuzzy
information measures, decision making and coding theory. She has published
various research papers in international journals of high repute.
1 Introduction
Although mathematics is the study of sets of objects, we commonly associate
numerous quantitative measures with such sets. Two essential measures are the
quantitative measure attached to each entity and the variance, or divergence,
between any two entities. Shannon (1948) defined the entropy of a probability
distribution in information theory. Kullback and Leibler (1951) introduced a
measure of divergence that quantifies the degree to which an expected
probability distribution diverges from the exact one. In analogy with probability theory, Zadeh