In-Place 3D Sketching for Authoring and Augmenting Mechanical Systems

Oriel Bergig*, Nate Hagbi*, Jihad El-Sana*, Mark Billinghurst*
The Visual Media Lab, Ben-Gurion University, Israel
The HIT Lab NZ, University of Canterbury, New Zealand

ABSTRACT

We present a framework for authoring three-dimensional virtual scenes for Augmented Reality (AR) based on hand sketching. Sketches consisting of multiple components are used to construct a 3D virtual scene that is augmented on top of the real drawing. Model structure and properties can be modified by editing the sketch itself, and printed content can be combined with hand sketches to form a single scene. Authoring by sketching opens up new forms of interaction that have not previously been explored in Augmented Reality. To demonstrate the technology, we implemented an application that constructs 3D AR scenes of mechanical systems from freehand sketches and animates them using a physics engine. We provide examples of scenes composed of trihedral solid models, forces, and springs. Finally, we describe how sketch interaction can be used to author complicated physics experiments in a natural way.

KEYWORDS: In-Place Augmented Reality, freehand sketching, Augmented Reality, 3D content authoring, physical simulation, interaction by sketching, visual language, dual perception.

INDEX TERMS: H.5.1 [Multimedia Information Systems]: Artificial, augmented, and virtual realities; I.4.0 [Image Processing and Computer Vision]: Image processing software; K.3.0 [Computers and Education]: Computer Uses in Education.

1 INTRODUCTION

Freehand sketching is one of the most ancient human skills. It has served since our earliest days as a natural communication language. Sketching facilitates conveying visual information while encouraging creativity. It also serves as a natural trigger for visual thinking, which is essential in many domains.
Various applications have exploited the power of computerized sketching, beginning with Ivan Sutherland's pioneering Sketchpad [1]. Computerized sketching offers various advantages over freehand sketching and allows the authoring of complex objects.

The visual language of physics is well defined, and physics textbooks commonly include abstract diagrams of physical systems to explain the studied material. Teachers usually sketch physical systems on the class whiteboard, and students often sketch to solve physics problems. On the other hand, computer graphics and Augmented Reality (AR) enable three-dimensional visualization of, and interaction with, physical systems as if they were real systems in a lab. In this work, we explore the combination of freehand sketching input and AR visualization for authoring physical systems. This combination can assist learning in ways that have not been explored before.

Authoring three-dimensional scenes often requires extensive work. Several interaction techniques have been suggested for authoring physical scenes; nevertheless, this remains a complicated task for the untrained user. On the other hand, sketching physical systems is easier for most people. State-of-the-art methods now exist for interpreting hand sketches and reconstructing three-dimensional geometric structures from line drawings. It is possible to exploit this knowledge to make the authoring of physical scenes easier and faster by sketching them in two dimensions.

Interpretation of two-dimensional content for authoring three-dimensional scenes has recently been proposed in Augmented Reality. In-Place Augmented Reality (IPAR) [2] content is extracted from printed paper using a visual language. One of the key elements of IPAR is the dual perception property: the visual language is understandable to humans without any computerized system, yet it also encodes AR content that can be extracted using computer vision methods.
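To make the extraction step concrete, the following is a minimal, illustrative Python sketch (not the authors' implementation) of the kind of low-level operation such a pipeline might start with: isolating dark pen strokes in a grayscale camera frame by thresholding, then grouping the ink pixels into connected components. The threshold value and 4-connectivity are assumptions; a real system would also need perspective rectification and vectorization of the strokes.

```python
from collections import deque
import numpy as np

def extract_strokes(gray, ink_threshold=100):
    """Label connected components of dark 'ink' pixels in a grayscale
    image. Returns a list of pixel-coordinate lists, one per stroke."""
    ink = gray < ink_threshold            # boolean mask of pen pixels
    visited = np.zeros_like(ink, dtype=bool)
    components = []
    h, w = ink.shape
    for sy in range(h):
        for sx in range(w):
            if ink[sy, sx] and not visited[sy, sx]:
                # Breadth-first flood fill over 4-connected neighbors.
                queue = deque([(sy, sx)])
                visited[sy, sx] = True
                pixels = []
                while queue:
                    y, x = queue.popleft()
                    pixels.append((y, x))
                    for ny, nx in ((y-1, x), (y+1, x), (y, x-1), (y, x+1)):
                        if 0 <= ny < h and 0 <= nx < w \
                           and ink[ny, nx] and not visited[ny, nx]:
                            visited[ny, nx] = True
                            queue.append((ny, nx))
                components.append(pixels)
    return components

# Synthetic 'paper': white background with two separate dark strokes.
frame = np.full((20, 20), 255, dtype=np.uint8)
frame[5, 2:10] = 0      # horizontal stroke, 8 pixels
frame[10:18, 15] = 0    # vertical stroke, 8 pixels
print(len(extract_strokes(frame)))   # → 2
```

Each returned component would then be handed to a higher-level interpreter that classifies it as a line, a model edge, or a symbol such as a force arrow.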
Orthographic two-dimensional projections of solid models are inherently a dual perception representation. They encode information about the 3D geometry of a model, which can be extracted and used as AR content. This work combines the advantages of IPAR content authoring with the advantages of sketching.

Automatic interpretation of hand sketches suggests a new form of interaction with content, which we name sketch interaction. With sketch interaction, three-dimensional scenes can be authored gradually by adding the 3D representations of sketch elements into the simulation. It is also possible to modify model geometry and properties by sketching. Models in a scene can be manipulated by combining traditional interaction methods with sketching, for example, positioning models with fiducials and changing their geometry with an eraser and pencil. Augmenting the scene on top of its sketch makes sketch interaction intuitive, since users observe scene modifications in the same place they are made.

Figure 1. (a) Authoring a mechanical system by hand sketching on paper. The sketch is acquired by a webcam. (b) A virtual 3D scene is constructed, augmented, and simulated on top of the sketch. The figure was taken during physical simulation.

* {bergig, natios, el-sana}@cs.bgu.ac.il, mark.billinghurst@hitlabnz.org

IEEE International Symposium on Mixed and Augmented Reality 2009, Science and Technology Proceedings, 19-22 October, Orlando, Florida, USA. 978-1-4244-5419-8/09/$25.00 ©2009 IEEE
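The abstract mentions scenes of trihedral solids, forces, and springs animated by a physics engine. As a rough illustration of what one simulation step in such a scene computes (the specific engine used is not named here, and all constants below are made up for the example), this sketch integrates a single mass attached to a damped spring under a constant applied force using semi-implicit Euler integration:

```python
def simulate_spring(mass=1.0, k=10.0, damping=0.5,
                    external_force=2.0, dt=0.01, steps=2000):
    """Semi-implicit Euler integration of a damped spring with a
    constant external force. The displacement should settle near the
    static equilibrium external_force / k."""
    x, v = 0.0, 0.0
    for _ in range(steps):
        force = -k * x - damping * v + external_force
        v += (force / mass) * dt   # update velocity first...
        x += v * dt                # ...then position (semi-implicit)
    return x

print(simulate_spring())   # settles near 2.0 / 10.0 = 0.2
```

In an AR scene like Figure 1, each sketched spring or force arrow would contribute one such term to the equations of motion, and the resulting positions would drive the augmented 3D models rendered over the drawing.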