in S. Fels, P. Poulin, Proc. Graphics Interface 2000, Montreal, Canada.

High-Quality Interactive Lumigraph Rendering Through Warping

Hartmut Schirmacher, Wolfgang Heidrich, and Hans-Peter Seidel
Max-Planck-Institut für Informatik, Saarbrücken, Germany
http://www.mpi-sb.mpg.de
email: {schirmacher,heidrich,hpseidel}@mpi-sb.mpg.de

Abstract

We introduce an algorithm for high-quality, interactive light field rendering from only a small number of input images with dense depth information. The algorithm bridges the gap between image warping and interpolation from image databases, which represent the two major approaches in image based rendering. By warping and blending only the necessary parts of each reference image, we are able to generate a single view-corrected texture for every output frame at interactive rates.

In contrast to previous light field rendering approaches, our warping-based algorithm is able to fully exploit per-pixel depth information in order to depth-correct the light field samples with maximum accuracy.

The complexity of the proposed algorithm is nearly independent of the number of stored reference images and of the final screen resolution. It performs with only small overhead and very few visible artifacts. We demonstrate the visual fidelity as well as the performance of our method through various examples.

Key words: computer graphics, image based rendering, light fields, Lumigraphs, image databases, image warping, blending

1 Introduction

Image based rendering has received a lot of attention during the last few years, since it provides a means to render realistic images without generating, storing, and processing complex models of the geometry, material, and light present in a scene.

Currently there are two major approaches for generating novel views from a set of reference images.
One such approach, best described by the term image databases, usually resamples and stores the reference images in some way that allows very efficient interpolation of arbitrary views of the scene. The main problem of these techniques is that, in order to obtain satisfactory results, they require enormous amounts of data for storage and display.

The second approach is called image warping. These algorithms usually store the input data as a scattered (and relatively sparse) set of images together with their arbitrary camera parameters. The lack of structure implies higher rendering costs, and also introduces a number of artifacts that are not easily overcome.

In this paper, we propose an algorithm which combines aspects of both image databases and warping. We use a light field data structure with quantized, per-pixel depth values. For reconstructing a novel view, we first estimate which region of which reference image will contribute to the final image. Then we forward-project all the pixels in these regions into the original image plane, but as observed from the novel viewpoint. We interpolate the final pixel color from all unoccluded pixels that have been reprojected into the same image plane location.

Our approach has several advantages over previous methods. Since only parts of each reference image are reprojected, the complexity of our algorithm is almost independent of the number of reference images. In addition, the reprojection into the reference image plane minimizes distortion and undersampling artifacts. Finally, we can exploit dense depth information in order to perform maximum-accuracy depth correction without reconstructing a 3D model. This is why the new algorithm can produce high-quality views at interactive rates from a relatively small set of images.

2 Previous Work

The work presented in this paper combines the light field and Lumigraph approaches with warping-based techniques.
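The per-pixel reconstruction outlined in the introduction — forward-projecting depth-augmented reference pixels and keeping only the unoccluded samples at each target location — can be illustrated by the following minimal sketch. It shows a generic forward warp with z-buffer occlusion handling for a single reference image under a pinhole camera model, not the exact reprojection-into-the-reference-plane and blending pipeline of our algorithm; all names (forward_warp, K_ref, K_new, R, t) are illustrative only.

```python
import numpy as np

def forward_warp(ref_rgb, ref_depth, K_ref, K_new, R, t):
    """Forward-warp one reference image with per-pixel depth into a
    novel view.  Occlusions are resolved with a z-buffer (nearest
    sample wins); no splatting, so small holes may remain."""
    h, w = ref_depth.shape
    # Back-project every reference pixel to a 3D point.
    ys, xs = np.mgrid[0:h, 0:w]
    pix = np.stack([xs.ravel(), ys.ravel(), np.ones(h * w)])   # 3 x N
    pts = (np.linalg.inv(K_ref) @ pix) * ref_depth.ravel()     # 3 x N
    # Transform into the novel camera frame and project.
    pts_new = R @ pts + t[:, None]
    proj = K_new @ pts_new
    z = proj[2]
    u = np.round(proj[0] / z).astype(int)
    v = np.round(proj[1] / z).astype(int)
    # Keep only pixels in front of the camera and inside the image.
    valid = (z > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
    out = np.zeros_like(ref_rgb)
    zbuf = np.full((h, w), np.inf)
    colors = ref_rgb.reshape(-1, 3)
    for i in np.flatnonzero(valid):
        if z[i] < zbuf[v[i], u[i]]:        # unoccluded sample wins
            zbuf[v[i], u[i]] = z[i]
            out[v[i], u[i]] = colors[i]
    return out
```

With identical cameras (K_new = K_ref, R = I, t = 0), the warp reduces to the identity, which makes the geometric steps easy to verify in isolation.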
In the following we briefly summarize both areas of image-based rendering.

2.1 Light Fields and Lumigraphs

Light fields and Lumigraphs are two related representations that have been independently introduced in [9] and [5]. Both approaches are based on storing samples of the so-called plenoptic function [1], describing the directional radiance distribution for every point in space. Since the radiance is constant along a ray in empty space, the