Eurographics Symposium on Rendering (2004)
H. W. Jensen, A. Keller (Editors)

Image-Based Stereoscopic Painterly Rendering

E. Stavrakis and M. Gelautz
Interactive Media Systems Group, Vienna University of Technology, Austria

Abstract

We present a new image-based stereoscopic painterly algorithm that we use to automatically generate stereoscopic paintings. Our work is motivated by contemporary painters who have explored the aesthetic implications of painting stereo pairs of canvases. We base our method on two real images, acquired from spatially displaced cameras. We derive a depth map by utilizing computer vision depth-from-stereo techniques and use this information to plan and render stereo paintings. These paintings can be viewed stereoscopically, in which case the pictorial medium is perceptually extended by the viewer to better suggest the sense of distance.

Categories and Subject Descriptors (according to ACM CCS): I.3.3 [Computer Graphics]: Picture/Image Generation; J.5 [Arts and Humanities]: Fine Arts

1. Introduction

Stereoscopic painting is a special painting technique used by some painters. An artist creates two paintings of his composition, instead of one. The two paintings differ in that they are depicted from two horizontally displaced viewpoints. This canvas pair is to be viewed stereoscopically, so that one painting is viewed by each of the viewer's eyes. The viewer's brain is stimulated into fusing the paintings into a final composition, according to the principles of binocular vision [Wan95]. The major advantage of stereoscopic painting is that the artwork no longer remains flat and restricted to the two-dimensionality of the canvas itself. Once fused, elements of the stereoscopic composition appear to protrude in front of the display surface while others recede, making the painting more immersive and, in many cases, also more realistic.

The purpose of our paper is to present the idea of image-based stereoscopic painterly rendering.
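The depth-from-stereo step mentioned in the abstract can be illustrated with the simplest dense matcher, winner-take-all SSD block matching on a rectified grayscale pair. This is only a sketch of the general technique, not the paper's actual pipeline; the function names, window size, and disparity range are our own illustrative choices, and practical systems use considerably more robust matching.

```python
import numpy as np

def box_sum(a, win):
    """Sum of a win x win neighborhood at every pixel, computed via an
    integral image (edge-padded so the output matches the input size)."""
    pad = win // 2
    ap = np.pad(a, pad, mode="edge")
    ii = np.pad(np.cumsum(np.cumsum(ap, axis=0), axis=1), ((1, 0), (1, 0)))
    h, w = a.shape
    return (ii[win:win + h, win:win + w] - ii[:h, win:win + w]
            - ii[win:win + h, :w] + ii[:h, :w])

def ssd_disparity(left, right, max_disp=16, win=5):
    """Winner-take-all block matching on a rectified pair: a point at
    column x in the left image is assumed to appear at column x - d in
    the right image; keep the d with the lowest windowed SSD per pixel."""
    left = left.astype(np.float64)
    right = right.astype(np.float64)
    best_cost = np.full(left.shape, np.inf)
    disp = np.zeros(left.shape, dtype=np.int32)
    for d in range(max_disp + 1):
        shifted = np.roll(right, d, axis=1)  # right[x - d] lands at column x
        cost = box_sum((left - shifted) ** 2, win)
        better = cost < best_cost
        disp[better] = d
        best_cost[better] = cost[better]
    return disp
```

On a synthetic pair where the left image is the right image shifted by a constant number of pixels, the recovered disparity map is constant at that shift; on real images the raw map is noisy and would be smoothed before driving stroke placement.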
Our approach aims to eliminate the limitation of flatness of the 2D pictorial medium [Dur02] by re-introducing binocular depth cues. We base our technique on images of real scenes that are content rich and enable us to produce more attractive and appealing paintings than those of 3D computer-generated environments. We show how the existence of corresponding information between the two slightly different images can be used to enhance the appearance of the final stereo painting, as well as to speed up the overall rendering process. We identify and discuss details that should be considered when devising painterly rendering algorithms which operate on images of real scenes.

Feature Correspondence. Stereo painting requires that depicted features correspond between the two canvases. [SK98] addressed the consistent editing of multiple views of the same scene in a plenoptic approach. In our case, painting elements, i.e. brush strokes, that cannot be matched in both views will inhibit stereo fusion, and viewers may experience discomfort. Large non-corresponding areas and deviations in paint color or style will also produce similar undesired effects. Even though the brain is able to tolerate a small percentage of inconsistency, which varies from person to person, algorithms should strive to provide the best possible correspondence between the two paintings.

Randomness. An issue related to feature correspondence between views is the use of randomness. In some non-photorealistic rendering techniques [Sal97] [Her98] [Lit97], random numbers are used to inject irregularity into the process of abstraction or stylization. When randomness is used in stereoscopic painterly rendering, it must be as consistent as possible across a stereo image pair, so that irregularity is modeled equally within both images.

Optimizations.
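One concrete way to satisfy the randomness-consistency requirement above, which doubles as an optimization because noise is computed only once per stroke pair, is to key each stroke's random parameters to its anchor pixel in the reference (left) view and reuse them for the disparity-shifted stroke in the other view. A minimal sketch, in which the function names, hash constants, and noise ranges are our own illustrative assumptions rather than the paper's method:

```python
import random

def stroke_rng(ax, ay, seed=0):
    """Per-stroke RNG keyed by the stroke's anchor pixel in the
    reference (left) view; the same anchor always yields the same
    noise sequence.  The multipliers are classic spatial-hash primes;
    any stable mixing of the coordinates would do."""
    return random.Random((ax * 73856093) ^ (ay * 19349663) ^ seed)

def stroke_jitter(ax, ay, seed=0):
    """Random perturbations for one stroke: an angle offset in degrees
    and a length scale factor."""
    rng = stroke_rng(ax, ay, seed)
    return rng.uniform(-10.0, 10.0), rng.uniform(0.8, 1.2)

# Left view: a stroke anchored at pixel (120, 45) draws its jitter.
jl = stroke_jitter(120, 45)
# Right view: the corresponding stroke is placed at (120 - disparity, 45)
# but is keyed by its LEFT-view anchor, so it reuses identical noise
# instead of drawing fresh numbers from a global sequence.
jr = stroke_jitter(120, 45)
assert jl == jr
```

Because the noise depends only on the left-view anchor and the seed, the stroke pair stays consistent no matter in which order, or on which view, strokes are rendered.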
Stereo pairs exhibit correlative information between their images that can be used to optimize a variety of algorithms [SGH*01] [ABC*91] (e.g. compression algorithms or rasterization of stereo pairs). Painterly algo-

© The Eurographics Association 2004.