Human Action Recognition in Realistic Scenes Based on Action Bank

Carlos Martínez, Marcos Baptista, Cristina Losada, Marta Marrón⋆, and Valeria Boggian

GEINTRA Group, University of Alcalá. http://www.geintra-uah.org
{carlos.martinez,marcos.baptista,losada,marta,valeria.boggian}@depeca.uah.es

Abstract. During the last decades, topics such as video analysis and image understanding have gained great importance due to their inclusion in applications such as security, intelligent spaces, assistive living and focused marketing. To validate the related work, different datasets have been distributed within the research community: CAVIAR, KTH, Weizmann, INRIA and MuHAVI are some of the best-known examples, but in most cases these datasets were not created for surveillance applications in the realistic scenes of interest. Within this context, we present a work that implements a solution for multiple-person action recognition in monocular video sequences, focused on surveillance applications. We also present a newly created dataset of realistic scenes specifically designed for commercial applications. The development of the proposed algorithm and its validation, both on well-known datasets such as CAVIAR and KTH and on the dataset generated ad hoc for the applications of interest, are discussed in the paper.

Keywords: Action Bank, activity recognition, video-surveillance, monocular RGB image processing

1 Introduction

The current development of audio and video processing technologies applied to cognitive systems allows automating more complex tasks in a more accurate way. Nowadays, several efforts have been made to build cognitive systems whose aim is to analyse the different events that happen to humans in their typical environments. This process is named scene understanding.
Scene understanding techniques are thus applied to accomplish human behaviour or activity analysis, normally based on sequences of human actions.

⋆ This work has been supported by the Spanish Ministry of Economy and Competitiveness under project SPACES-UAH (TIN2013-47630-C2-1-R), and by the University of Alcalá under projects DETECTOR (CCG2015/EXP-019) and ARMIS (CCG2015/EXP-054).