The PGPGrid Project

Paul Cockshott, Lewis Mackenzie, Viktor Yarmolenko
Department of Computing Science, The University of Glasgow, UK

Abstract

The PGPGrid project aims to parallelise the process of extracting range data from an experimental 3D scanner, using the Grid as a vehicle for providing the necessary resources. The application is potentially highly parallel, but it has some unusual features, such as the rapid spawning of processes in real time and a dynamic inter-process network topology. These characteristics require enhancement of the usual task migration capabilities of the Globus toolkit. This paper describes, first, attempts to estimate the real parallelisability of the scanner application and, second, an effort to develop a Java API based on Milner's π-calculus that could be used to extend Globus to support systems with a dynamic parallel structure.

1. Introduction

The PGPGrid project has as a core aim the parallelisation of the process of extracting range data from the experimental 3D-Matic TV scanner at Glasgow University [1]. It is a joint project with Peppers Ghost Productions, a 3D computer animation company, and the Edinburgh Parallel Computing Centre. The scanner uses 24 synchronised video cameras, each with a resolution of 640x480 pixels, to image a subject from many directions at once. The data thus acquired are then used to build a dynamic 3D model with a spatial resolution of around 4 mm and a temporal resolution of 0.04 seconds. Software has been developed to allow an animator's model to be conformed to data captured from a human actor by the scanner. This provides the opportunity to transfer the detailed movement of a real human subject to the equivalent virtual movement of the model. The conformation software was originally developed to work with still models; the aim of the current work is to extend it to moving models of cartoon-like characters.
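The two features that motivate the π-calculus API, dynamic spawning of processes and a mutable inter-process topology, can be illustrated in ordinary Java. The sketch below is not the project's actual API; the class and method names (`Chan`, `send`, `receive`) are illustrative assumptions. It models a synchronous π-calculus channel with a rendezvous queue, and shows channel mobility: a channel is itself sent over another channel, so a freshly spawned process learns its communication partners at run time.

```java
import java.util.concurrent.SynchronousQueue;

// Hypothetical sketch of a pi-calculus-style channel, NOT the project's real API.
// SynchronousQueue gives the blocking rendezvous of synchronous pi-calculus:
// send() and receive() each wait until the other side is present.
final class Chan<T> {
    private final SynchronousQueue<T> q = new SynchronousQueue<>();
    void send(T v) throws InterruptedException { q.put(v); }
    T receive() throws InterruptedException { return q.take(); }
}

public class PiSketch {
    public static void main(String[] args) throws Exception {
        Chan<String> data = new Chan<>();
        Chan<Chan<String>> link = new Chan<>(); // a channel that carries channels

        // Spawn a process dynamically; it is not wired to 'data' in advance
        // but receives that channel over 'link' (channel mobility).
        Thread worker = new Thread(() -> {
            try {
                Chan<String> c = link.receive();
                c.send("range-data");
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        worker.start();

        link.send(data);                    // pass the channel itself
        System.out.println(data.receive()); // prints "range-data"
        worker.join();
    }
}
```

The same pattern, with channels migrating between Grid nodes rather than threads, is the kind of dynamic structure the Globus extension would need to support.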
The cameras are organised into groups of three, called pods, each consisting of a pair of 8-bit monochrome cameras and a single 16-bit colour camera. All the monochrome cameras are shuttered and synchronised by a 25 Hz master trigger signal. A separate 25 Hz trigger, with a controllable phase difference from the first, synchronises the colour cameras. The actor is illuminated with a speckle pattern projected from multiple projectors using an uninterrupted light source. The speckle pattern provides synthetic features on which the computer vision algorithms used in the ranging process can home in. Without the speckle, there may be insufficient contrast in parts of the human anatomy for effective ranging with the equipment currently in use. Strobe lamps synchronised with the colour cameras are set to 'overflash' and thereby suppress the speckle pattern during the exposure of the colour cameras.
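The 25 Hz trigger rate fixes the timing budget of the capture loop, and accounts for the 0.04 second temporal resolution quoted above. A minimal sketch of the arithmetic, with an illustrative (not quoted-from-the-paper) phase offset for the colour trigger:

```java
public class TriggerTiming {
    public static void main(String[] args) {
        double triggerHz = 25.0;               // master trigger rate from the text
        double periodMs = 1000.0 / triggerHz;  // 40 ms per frame = 0.04 s resolution

        // The colour trigger's phase difference is controllable; half a period
        // is a purely illustrative value, not a figure from the paper.
        double colourOffsetMs = periodMs / 2.0;

        System.out.printf("frame period: %.0f ms, colour trigger offset: %.0f ms%n",
                          periodMs, colourOffsetMs);
    }
}
```

Any processing pipeline keeping up with the scanner in real time must therefore consume one 24-camera frame set every 40 ms.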