Journal of Statistical Planning and Inference 136 (2006) 3659–3674
www.elsevier.com/locate/jspi

Some results on generalized past entropy

Asok K. Nanda, Prasanta Paul
Department of Mathematics, Indian Institute of Technology, Kharagpur 721 302, India

Received 3 November 2003; accepted 12 January 2005
Available online 22 March 2005

Abstract

In the context of information theory, Shannon's entropy plays an important role. Since this entropy is not applicable to a system that has survived for some units of time, the concept of residual entropy has been developed in the literature. The dual concept of past entropy deals with a random variable truncated above some t, i.e. the support of the random variable is taken to be (0, t). In this paper, some ordering and aging properties are defined in terms of generalized past entropy, and their properties are studied. Quite a few results available in the literature are thereby generalized, and the uniform distribution is characterized through the generalized past entropy.
© 2005 Elsevier B.V. All rights reserved.

MSC: 94AXX; 94A17

Keywords: IUL; IUL(); Measure of information; Past entropy; Residual entropy; Reversed hazard rate function

1. Introduction

Let X be an absolutely continuous nonnegative random variable having distribution function F(t) = P(X ≤ t) and survival function F̄(t) = P(X > t). Suppose X denotes the lifetime of a component/system or of a living organism, and let f(t) = F′(t) denote the lifetime density function.

Shannon (1948) was the first to introduce entropy, known as Shannon's entropy or Shannon's information measure, into information theory. For an absolutely continuous random

Corresponding author.
E-mail addresses: asok@maths.iitkgp.ernet.in (A.K. Nanda), ppaul@maths.iitkgp.ernet.in (P. Paul).

0378-3758/$ - see front matter © 2005 Elsevier B.V. All rights reserved.
doi:10.1016/j.jspi.2005.01.006
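For the reader's orientation, the two entropies alluded to in the abstract can be written out in their standard forms (these expressions are the usual textbook definitions, not quoted from this excerpt). For an absolutely continuous random variable X with density f, Shannon's differential entropy is

\[
H(X) \;=\; -\int_0^{\infty} f(x)\,\log f(x)\,\mathrm{d}x,
\]

while the past entropy at time t, defined for a unit known to have failed by time t (so that the relevant density is f(x)/F(t) on the support (0, t)), is commonly written as

\[
\bar{H}(t) \;=\; -\int_0^{t} \frac{f(x)}{F(t)}\,\log\frac{f(x)}{F(t)}\,\mathrm{d}x .
\]

The generalized past entropy studied in the paper extends \(\bar{H}(t)\) in the same way that Rényi-type measures extend Shannon's entropy.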