Neurocomputing 62 (2004) 197–223
www.elsevier.com/locate/neucom
doi:10.1016/j.neucom.2004.02.005

Optimizing motion and colour segmented images with neural networks

D. Amanatidis a, D. Tsaptsinos a,*, P. Giaccone b, G. Jones b

a School of Mathematics, Kingston University, Penrhyn Road, Kingston Upon Thames, Surrey KT1 2EE, UK
b School of Computing and Information Systems, Kingston University, UK

* Corresponding author. Tel.: +44-20-85477566; fax: +44-20-85477497. E-mail address: d.tsaptsinos@kingston.ac.uk (D. Tsaptsinos).

Received 1 June 2001; received in revised form 1 September 2003; accepted 4 February 2004

Abstract

Segmentation of independently moving foreground elements from the background is a very common procedure in digital postproduction. The conventional technique, known as rotoscoping, is carried out manually and is therefore too reliant on human effort. The industry is interested in an automated method that can correctly locate the boundary and remain robust under rapid motion and non-static backgrounds. A cellular neural network is presented that labels pixels by estimated motion, colour, and neighbouring and previous labels. The method is accurate, labour-saving and many times faster than manual rotoscoping. Moreover, owing to the inherent parallelism and the local nature of the network, the whole process can be implemented in hardware, boosting performance. © 2004 Elsevier B.V. All rights reserved.

Keywords: Parametric optical flow; Foreground-background segmentation; Spatiotemporal regularization; Neural network optimization

1. Introduction

Often it is too dangerous, too expensive or simply impossible to film actors or objects against the desired background. The most commonly used procedure is bluescreening, in which objects are shot against a blue screen which is then replaced electronically with an alternative real or computer-generated background sequence. The screen used is usually blue as this high-frequency colour diffracts little around the object edges,