Recognition of human actions using motion history information extracted from the compressed video

R. Venkatesh Babu a,*, K.R. Ramakrishnan b,1

a Centre for Quantifiable Quality of Service in Communication Systems, Norwegian University of Science and Technology, O.S. Bragstads plass 2E, N-7491 Trondheim, Norway
b Department of Electrical Engineering, Indian Institute of Science, Bangalore 560012, India

Received 27 May 2003; received in revised form 19 November 2003; accepted 20 November 2003

Abstract

Human motion analysis is a topic of growing interest in the computer vision and video processing community, motivated by a wide range of applications such as surveillance and monitoring systems. In this paper we describe a system for recognizing various human actions from compressed video based on motion history information. We introduce the notion of quantifying the motion involved through what we call the Motion Flow History (MFH). The encoded motion information readily available in the compressed MPEG stream is used to construct a coarse Motion History Image (MHI) and the corresponding MFH. The features extracted from the static MHI and the MFH compactly characterize the spatio-temporal and motion-vector information of the action. Since the features are extracted from partially decoded sparse motion data, the computational load is reduced to a great extent. The extracted features are used to train KNN, neural network, SVM and Bayes classifiers to recognize a set of seven human actions, and the performance of each feature set with respect to the various classifiers is analyzed. © 2003 Elsevier B.V. All rights reserved.

Keywords: Action recognition; Compressed domain; Content-based retrieval; Feature extraction; Motion history; Video indexing

1. Introduction

Event detection and human action recognition have gained increasing interest of late in the video processing community because of their wide range of applications: automatic surveillance and monitoring systems [2], video indexing and retrieval, robot motion, human-computer interaction and segmentation [28,30]. One important application of human action recognition is the automatic indexing of video sequences. Most of the multimedia documents available nowadays are in the MPEG [21] compressed form to facilitate easy storage and transmission, yet the majority of existing techniques for human action recognition are pixel-domain based [35,8,27,5,32,13,1,26] and computationally very expensive. Hence, it would be more efficient to perform the classification in the MPEG compressed domain, without having to completely decode the bit-stream and subsequently classify in the pixel domain. This calls for techniques that can use information available in the compressed domain, such as motion vectors and DCT coefficients. In the recent past, we reported a technique for recognizing human actions from compressed video using a Hidden Markov Model (HMM) [3], where the time-series features used for training the HMM are extracted directly from the motion vectors of each frame of the video. Though this approach has proven its ability to classify video sequences, the extracted time-series features are not suitable for other efficient classifiers such as K-nearest neighbors (KNN), neural networks, SVM and Bayes. In this paper we propose a technique for building a coarse Motion History Image (MHI) and Motion Flow History (MFH) from the compressed video and extracting features from them.
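The coarse MHI mentioned above can be sketched with the standard Bobick-Davis style recursive update, applied here on the macroblock grid rather than on pixels. This is only an illustrative reading of the idea, not the authors' implementation: the function names, the threshold, the duration parameter tau and the toy motion-vector field are all assumptions.

```python
import numpy as np

def update_mhi(mhi, motion_mask, tau=10):
    """One MHI update step on the macroblock grid:
    cells with motion are set to tau; all others decay by 1 (floored at 0)."""
    return np.where(motion_mask, tau, np.maximum(mhi - 1, 0))

# Toy example: a 4x4 macroblock grid with one moving block.
# In practice the motion vectors would come from the partially
# decoded MPEG P-frames; here they are fabricated for illustration.
mvs = np.zeros((4, 4, 2))
mvs[1, 2] = [3.0, -2.0]                      # one macroblock with motion
mask = np.linalg.norm(mvs, axis=2) > 1.0     # threshold out noise vectors

mhi = np.zeros((4, 4))
mhi = update_mhi(mhi, mask)                  # recent motion -> value tau
```

Repeating this update frame by frame yields a static image whose intensities encode how recently each region moved; the MFH proposed in the paper additionally retains the magnitude of the motion, which the binary mask above discards.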
doi:10.1016/j.imavis.2003.11.004

Image and Vision Computing 22 (2004) 597-607
www.elsevier.com/locate/imavis

An earlier, brief version of this paper appeared in the Proceedings of the IEEE International Conference on Acoustics, Speech, and Signal Processing 2003 (ICASSP-03) [4].
* Corresponding author. Tel.: +47-7359-2746; fax: +47-7359-6973.
1 Tel.: +91-80-293-2441; fax: +91-80-360-0444.
E-mail addresses: venkat@q2s.ntnu.no (R. Venkatesh Babu), krr@ee.iisc.ernet.in (K.R. Ramakrishnan).