International Journal of Mathematical, Engineering and Management Sciences
Vol. 6, No. 3, 688-707, 2021
https://doi.org/10.33889/IJMEMS.2021.6.3.043

On Entropy-type Measures and Divergences with Applications in Engineering, Management and Applied Sciences

C. Koukoumis
Lab of Statistics and Data Analysis, Department of Statistics and Actuarial-Financial Mathematics, University of the Aegean, Samos, Greece.
E-mail: sasm19008@sas.aegean.gr

A. Karagrigoriou
Lab of Statistics and Data Analysis, Department of Statistics and Actuarial-Financial Mathematics, University of the Aegean, Samos, Greece.
Corresponding author: alex.karagrigoriou@aegean.gr

(Received on February 21, 2021; Accepted on March 12, 2021)

Abstract
In this work we review Entropy-type measures and Divergences, discuss their properties and unfold their diverse applicability. In addition, we compare distances between populations and distributions via weighted Entropy-type measures relying mainly on Relative Entropy and Jeffrey's Distance with weights. Finally, we introduce the Absolute Weighted Relative Entropy and the Absolute Weighted Jeffrey's Distance. Two applications are presented for illustration, one from Geosciences and one from Financial Mathematics.

Keywords- Entropy, Divergence measures, Weighted Entropy-type measures, Absolute Entropy-type measures, Financial mathematics.

1. Introduction
Information theory is a branch of pure and applied sciences that deals with the quantification of information. It has its roots in modern communication theory, where a communication system was first formulated as a stochastic process. Tuller (1950) and later Pierce (1956) observed the strong similarities between the underlying mechanisms of communication theory and information theory. The evolution of the field, as well as the mathematical rigor that governs it, is attributed to Fisher (1956), Shannon (1956) and Wiener (1956).
The most fundamental measure in information theory is entropy, which was first recognized, formulated and defined in statistical mechanics (Fisher, 1936; Shannon and Weaver, 1949) and consequently triggered the enormous development of the field. In this work we review Entropy-type measures and Divergences, discuss their properties and unfold their diverse applicability. It should be noted that the concept of entropy was first used in Physics, in the field of thermodynamics (Clausius, 1865), while its statistical definition was developed by Boltzmann (around 1870); its applications, however, go well beyond Physics. In the present work we attempt to approach entropy from a probabilistic or stochastic viewpoint and combine it with the concept of distance, which has numerous applications in Applied Sciences, Financial Mathematics, Engineering and Management Sciences. The concept of divergence is fundamental in data analysis since it quantifies the distance between two populations, two models
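To fix ideas, the two classical quantities underlying the measures reviewed here are Shannon entropy and the Relative Entropy (Kullback-Leibler divergence), whose symmetrized form is Jeffrey's Distance. The sketch below implements the standard, unweighted textbook definitions for discrete distributions (natural logarithm); it does not cover the weighted or absolute variants introduced later in this paper, and the function names are our own.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i log p_i (natural log, 0 log 0 := 0)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def relative_entropy(p, q):
    """Kullback-Leibler divergence D(p||q) = sum_i p_i log(p_i / q_i)."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jeffreys_distance(p, q):
    """Jeffrey's Distance: the symmetrized divergence J(p,q) = D(p||q) + D(q||p)."""
    return relative_entropy(p, q) + relative_entropy(q, p)
```

For example, the uniform distribution on two outcomes attains the maximal entropy log 2, the divergence between identical distributions is 0, and, unlike D itself, J is symmetric in its arguments.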