Geophysical Prospecting, 2008, 56, 467–475 doi:10.1111/j.1365-2478.2008.00700.x

Iterative tomographic analysis based on automatic refined picking

Claudio Satriano¹, Aldo Zollo¹ and Charlotte Rowe²

¹Dipartimento di Scienze Fisiche (RISSC-Lab), Università di Napoli Federico II, Via Coroglio 156, Naples 80124, Italy, and ²Los Alamos National Laboratory, Los Alamos, NM 87545, USA

Received February 2007, revision accepted February 2008

ABSTRACT

The ever-growing size of data sets for active and passive seismic imaging makes the availability of automatic procedures for rapid analysis more and more valuable. Such procedures are especially important for time-critical applications like emergency decisions or re-orienting of ongoing seismic surveys. In this paper a new, iterative scheme for 3D traveltime tomography is presented. The technique, based on a tool originally developed for earthquake data, uses cross-correlation to examine waveform similarity and to adjust arrival times on seismic sections. A preliminary set of reference arrival times is first corrected by the cross-correlation lag and then used to build an initial 3D tomographic velocity model through a standard inversion code; traveltimes calculated from this model are then taken as new reference arrivals and the process of pick adjustment is repeated. The result is a tomographic image, upgraded and refined at each iteration of the procedure. The test performed on the waveform data set recorded during the 2001 SERAPIS active seismic survey in the gulfs of Naples and Pozzuoli (Southern Italy) shows that the 3D iterative tomography scheme produces a velocity image of the structure of the Campi Flegrei caldera which is consistent with the results from previous studies, employing just a fraction of the time needed by a human analyst to identify first breaks.
We believe that this technique can be effectively employed for rapid analysis of large data sets within time-critical or time-dependent tasks and for automatic 4D tomographic investigations.

E-mail: satriano@na.infn.it

INTRODUCTION

Modern studies and applications, both in seismology and in seismic exploration, are associated with vast data sets produced by dense seismic networks or by large experiments such as active seismic surveys. The development and use of automatic, reliable procedures to analyse large data sets is a primary requirement, especially in those cases for which rapid analysis is necessary to facilitate time-critical decisions or tasks. These include, but are not limited to, emergency management during volcanic crises, modification of the configuration of an active seismic survey, or 4D monitoring of an oil/gas/geothermal reservoir.

One of the crucial problems in analysing seismic waveforms is the correct identification of the different seismic phases and their arrival times (phase picking). The quality of these phase readings is a critical factor for both earthquake location and seismic exploration. Standard operations generally involve the manual measuring of P-wave and secondary arrivals or automatic, computer-aided identification of these phases using software autopickers (see for example Allen 1982; Baer and Kradolfer 1987; Sleeman and Van Eck 1999; Zhang, Thurber and Rowe 2003a).

In seismology, significant improvements in the quality of hypocentre location have been achieved through the use of quantitative cross-correlation for relative doublet and multiplet location (Fremont and Malone 1987; Got, Fréchet and Klein 1994; Waldhauser and Ellsworth 2000; Moriya, Niitsuma and Baria 2003) and correlation-based phase repicking (Dodge, Beroza and Ellsworth 1995; Shearer 1997; Rowe et al. 2002; Rowe, Thurber and White 2004).
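The core operation underlying correlation-based repicking, as used by the scheme described above, is the measurement of the lag that best aligns a trace with a reference waveform around a preliminary pick. The following is a minimal sketch of that step, not the authors' actual implementation; the function name, window length and data layout are hypothetical.

```python
import numpy as np

def refine_pick(trace, reference, pick_index, window=50):
    """Shift a preliminary pick by the cross-correlation lag that best
    aligns `trace` with `reference` in a window around the pick.
    (Illustrative sketch; names and defaults are hypothetical.)"""
    # Extract the same window around the preliminary pick on both traces.
    a = trace[pick_index - window : pick_index + window].astype(float)
    b = reference[pick_index - window : pick_index + window].astype(float)
    # Demean so the correlation is not dominated by DC offsets.
    a -= a.mean()
    b -= b.mean()
    # Full cross-correlation; the peak position gives the relative lag
    # (positive lag: the arrival on `trace` is later than on `reference`).
    xc = np.correlate(a, b, mode="full")
    lag = int(np.argmax(xc)) - (len(b) - 1)
    # Adjust the pick by the measured lag, in samples.
    return pick_index + lag
```

In the iterative scheme, the reference waveform (or reference time) comes first from the preliminary picks and then, at each subsequent iteration, from traveltimes predicted by the updated 3D velocity model.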
Starting from families of similar events, these methods have produced some impressive resolution of seismogenic structures. In

© 2008 European Association of Geoscientists & Engineers