International Conference on New Media (CONMEDIA) 2015
ISBN 978-602-95532-9-1

Motion Detection Using Frame Differences Algorithm With The Implementation of Density

Adhi Kusnadi
Faculty of Information and Communication Technology
Multimedia Nusantara University, Tangerang, Indonesia
Adhi.kusnadi@umn.ac.id

Glen
Faculty of Information and Communication Technology
Multimedia Nusantara University, Tangerang, Indonesia
glen_lee@hotmail.co.uk

Abstract

There are several motion detection algorithms; the frame differences algorithm is one of them. Normally, the frame differences algorithm works by comparing every pixel between two images. Although motion can be detected this way, it is costly in terms of CPU usage. With a modified frame differences algorithm, motion can be detected on a low-end system without requiring much CPU. The modification is made through the implementation of density, an idea that comes from the concept of sequences and series. The modified frame differences algorithm does not need to check every pixel when comparing two images and is therefore faster. The tests were done using three scenarios: a human motion detection test, a speed test and an accuracy test. The results show that the implementation of density accelerates the frame differences algorithm while moving objects can still be detected: with density, the algorithm's speed increases by up to 97.78% and motion detection accuracy reaches up to 89.1%.

Keywords: frame differences, density, frames, web camera

I. INTRODUCTION

Motion detection is done by comparing one or more reference frames to the current frame. Several algorithms can compare images and detect motion, such as background subtraction, optical flow and frame differences.
According to [1], the background subtraction algorithm works by subtracting the current frame from a reference frame; the result of the subtraction is then segmented to produce a binary image that highlights the moving regions. Another algorithm, optical flow, works by estimating the motion vectors in each frame of the live video sequence; once determined, the motion vectors are drawn over the moving objects [2]. Optical flow provides complete motion information, but it is usually too complex for real-time usage and requires special hardware.

This study focuses on the frame differences algorithm. The study in [3] explains that the frame differences algorithm works by comparing the average RGB components of a reference frame to those of the current frame to find the difference value. This comparison is done for every pixel; each resulting value is then compared to a threshold value, and if it is bigger than the threshold, motion is detected in that pixel. There are several techniques for determining the reference frame to be compared with the current frame. In this study, the dynamic adaptive template matching technique proposed by [3] is used. The focus is to improve the frame differences algorithm even further by implementing density in the frame differences algorithm that uses dynamic adaptive template matching. The idea of density itself came from the concept of sequences and series. The modified frame differences algorithm was implemented as a desktop application.

II. UNDERLYING THEORY

A. Frame Differences Algorithm

Frame differencing is a technique in which the computer checks for differences between two images [4]. Unlike the background subtraction algorithm, the frame differences algorithm does not require a grayscale filtering step, even though it is possible to use one. The frame differences algorithm uses the RGB information that is already available in the images themselves.
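The per-pixel comparison described above can be sketched in a few lines of Python. This is an illustrative sketch, not the authors' implementation: the function and parameter names are invented, frames are represented as lists of rows of (R, G, B) tuples, and the threshold and sensitivity defaults are arbitrary placeholders.

```python
# Sketch of plain frame differencing: compare the average RGB value of
# every pixel between a reference frame and the current frame, and
# report motion when enough pixels differ. Names and default values
# are illustrative, not taken from the paper.

def detect_motion(ref_frame, cur_frame, threshold=25, sensitivity=10):
    """Return True if more than `sensitivity` pixels changed.

    Frames are lists of rows, each row a list of (R, G, B) tuples.
    """
    changed = 0
    for ref_row, cur_row in zip(ref_frame, cur_frame):
        for (r1, g1, b1), (r2, g2, b2) in zip(ref_row, cur_row):
            avg_ref = (r1 + g1 + b1) / 3  # average RGB, reference pixel
            avg_cur = (r2 + g2 + b2) / 3  # average RGB, current pixel
            # Per-pixel threshold test on the absolute difference.
            if abs(avg_cur - avg_ref) > threshold:
                changed += 1
    # Image-level sensitivity test over the count of changed pixels.
    return changed > sensitivity
```

Because every pixel is visited, the cost grows with the frame resolution, which is exactly the overhead the density modification is meant to reduce.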
In this algorithm, two values act as comparators to detect motion in individual pixels and in the overall image: the threshold and the sensitivity. Normally, all pixels between the two images are compared. Because the algorithm uses RGB information, it checks each pixel's RGB value. RGB has three components, red, green and blue, which store the colour information [5]. Formula (1) shows the calculation of the average RGB value for a pixel of the current frame, avg_cur(x, y), and formula (2) shows the calculation of the average RGB value for the corresponding pixel of the reference frame, avg_ref(x, y). Both formulas are explained by [3]:

avg_cur(x, y) = (R_cur(x, y) + G_cur(x, y) + B_cur(x, y)) / 3    (1)

avg_ref(x, y) = (R_ref(x, y) + G_ref(x, y) + B_ref(x, y)) / 3    (2)

The value of avg_cur(x, y) is then compared to avg_ref(x, y) to find the difference value. This value must be an absolute value
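The average-RGB calculation and the absolute-difference comparison can be expressed directly in code. This is a minimal sketch assuming the notation reconstructed above; the function names, pixel representation and threshold default are illustrative and not taken from the paper.

```python
# Sketch of the per-pixel average-RGB comparison: the mean of the three
# RGB components is computed for a pixel of each frame, and the absolute
# difference of the two means is tested against the threshold. Names and
# the default threshold are illustrative assumptions.

def avg_rgb(pixel):
    """Average of the R, G and B components of one pixel."""
    r, g, b = pixel
    return (r + g + b) / 3

def pixel_changed(ref_pixel, cur_pixel, threshold=25):
    """True if the absolute difference of the averages exceeds threshold."""
    diff = abs(avg_rgb(cur_pixel) - avg_rgb(ref_pixel))  # absolute value
    return diff > threshold
```

Taking the absolute value matters because a pixel can become either brighter or darker when an object moves through it; only the magnitude of the change is compared to the threshold.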