Long Paper
Copyright © 2004 by the Association for Computing Machinery, Inc.
Permission to make digital or hard copies of part or all of this work for personal or
classroom use is granted without fee provided that copies are not made or distributed
for commercial advantage and that copies bear this notice and the full citation on the
first page. Copyrights for components of this work owned by others than ACM must be
honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on
servers, or to redistribute to lists, requires prior specific permission and/or a fee.
Request permissions from Permissions Dept, ACM Inc., fax +1 (212) 869-0481 or e-mail
permissions@acm.org.
© 2004 ACM 1-58113-884-9/04/0006 $5.00
Occlusion based Interaction Methods for
Tangible Augmented Reality Environments
Gun A. Lee^α, Mark Billinghurst^β and Gerard Jounghyun Kim^α
^α Virtual Reality Laboratory, Dept. of CSE, POSTECH, Pohang, 790-784, Republic of Korea
^β HIT Lab NZ, University of Canterbury, Private Bag 4800, Christchurch, New Zealand
endovert@postech.ac.kr  mark.billinghurst@hitlabnz.org  gkim@postech.ac.kr
Abstract
Traditional Tangible Augmented Reality (Tangible AR) interfaces
combine tangible user interface and augmented reality technology;
the two complement each other to enable novel interaction methods
and visualization anchored in the real world.
However, well-known conventional one- and two-dimensional
interaction methods such as pressing buttons, changing slider
values, or making menu selections are often quite difficult to apply to
Tangible AR interfaces. In this paper we suggest a new approach,
occlusion based interaction, in which visual occlusion of physical
markers is used to provide intuitive two-dimensional interaction
in Tangible AR environments. We describe how to implement
occlusion based interfaces for Tangible AR environments, give
several examples of applications and describe results from
informal user studies.
Keywords: tangible augmented reality, user interface, occlusion,
augmented reality, computer human interaction
CCS Categories: H.5.2 [Information Interfaces and Presentation]:
User Interfaces – Interaction styles; I.3.6 [Computer Graphics]:
Methodology and Techniques – Interaction techniques
1 Introduction
Augmented Reality (AR) interfaces involve the overlay of virtual
imagery on the real world. Over the past decade there has been an
evolution in the types of AR interfaces being developed. The
earliest systems were used to view virtual models in a variety of
application domains such as medicine and machine maintenance.
These interfaces provided a very intuitive method for viewing
three-dimensional virtual information, but little support for
creating or modifying the AR content. More recently, researchers
have begun to address this deficiency. The AR modeler of
Kiyokawa et al. [1999] uses a magnetic tracker to allow people to
create AR content, while the Studierstube [Szalavári and
Gervautz 1997] project uses a pen and tablet for selecting and
modifying AR objects. However, there is still a need for more
intuitive interaction techniques.
We have been developing a new approach to designing AR
interfaces that we refer to as Tangible Augmented Reality [Kato
et al. 2001] (Tangible AR). Tangible AR interfaces are those in
which 1) each virtual object is registered to a physical object and
2) the user interacts with virtual objects by manipulating the
corresponding physical objects.
The physical objects and interactions are as important as
the virtual imagery and provide a very intuitive way to interact
with the AR interface. For example, in our Shared Space
[Billinghurst et al. 2000] collaborative AR interface, three-
dimensional virtual objects appear attached to real playing cards.
Several users could manipulate the cards at the same time, and
when they put related virtual objects next to each other a simple
animation was shown. The interface was tested by thousands of
users who reported that interaction with the virtual models was
very natural and intuitive, and that they could easily collaborate
with each other. In a later interface, VOMAR, Kato et al. [2001]
showed how more complicated physical interaction techniques
could be used to enable a person to arrange virtual furniture in a
3D scene assembly program. Once again, the use of Tangible AR
techniques made interaction with the virtual content natural and
intuitive.
In these interfaces a computer vision library, ARToolKit
[ARToolKit], is used to track the pose of a head worn camera
relative to physical markers. Real objects can be tagged by these
markers and used as interaction widgets in AR interfaces. This
allows the development of a wide range of interface objects, such
as books that have virtual imagery appearing from the real pages
[Billinghurst et al. 2001], maps that appear overlaid with virtual
terrains [Hedley et al. 2002] or tiles that support rapid prototyping
of aircraft cockpits [Poupyrev et al. 2002].
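The structure of such a marker-tagged interface can be sketched in a few lines. The sketch below is hypothetical and not from the paper: the dictionary of per-marker pose matrices stands in for the output of ARToolKit-style tracking (which in the actual C library is produced by marker detection and pose estimation each video frame), and `overlay_poses` is an invented helper name.

```python
def overlay_poses(detections, registered_objects):
    """Map each detected fiducial marker to the pose used to draw
    its registered virtual object.

    `detections` is assumed to be {marker_id: 4x4 pose matrix}
    (camera-from-marker transform), the kind of per-frame result an
    ARToolKit-style tracker would provide. `registered_objects` maps
    marker ids to virtual models, implementing the Tangible AR rule
    that each virtual object is registered to a physical object.
    """
    draw_list = []
    for marker_id, pose in detections.items():
        obj = registered_objects.get(marker_id)
        if obj is not None:
            # Render `obj` transformed by `pose` so it appears
            # attached to the physical marker.
            draw_list.append((obj, pose))
    return draw_list

# 4x4 identity as a plain nested list (placeholder pose).
IDENTITY = [[1.0 if r == c else 0.0 for c in range(4)] for r in range(4)]

frame = {7: IDENTITY, 9: IDENTITY}   # markers seen this frame
objects = {7: "virtual chair"}       # only marker 7 is tagged with a model
print(overlay_poses(frame, objects)) # marker 9 has no model, so one entry
```

Because interaction is driven entirely by moving the tagged physical objects, the 6-DOF pose in `detections` is both the rendering transform and the input signal.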
These interfaces provide very natural 3D interaction techniques
based on six degree of freedom manipulation of real objects.
However, there are times when 1D or 2D interaction techniques
are required, such as pushing buttons, moving sliders, or menu
and icon selections. This type of interaction has not been well
studied in a Tangible AR environment.
In this paper we suggest a new approach for one- and two-
dimensional interaction in Tangible AR interfaces. Our approach
is based on camera-based detection of occlusion of physical
markers. Occlusion based interaction is a low cost, easy to
implement method for 1D and 2D interactions in Tangible AR
environments.
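The core inference can be sketched as follows. This is a minimal illustration of the general idea, not the paper's implementation: `occluded_buttons` is a hypothetical function name, and the assumption is that when the marker board itself is tracked, any expected marker the camera fails to detect is being covered, e.g. by the user's finger, and can be treated as a pressed button.

```python
def occluded_buttons(board_visible, expected_ids, detected_ids):
    """Infer which marker 'buttons' the user is covering.

    board_visible: whether the board/page holding the markers was
        tracked this frame (distinguishes deliberate occlusion from
        the whole interface simply being out of view).
    expected_ids: ids of the button markers printed on the board.
    detected_ids: ids the vision tracker actually found this frame.
    """
    if not board_visible:
        return set()  # board out of view: report no presses
    # Markers that should be visible but were not detected are
    # assumed occluded, i.e. pressed.
    return set(expected_ids) - set(detected_ids)

# Markers 1-4 are buttons; this frame only 1, 2 and 4 are seen,
# so marker 3 is reported as pressed.
print(occluded_buttons(True, [1, 2, 3, 4], [1, 2, 4]))  # {3}
```

In practice a real system would debounce this signal over several frames, since a single missed detection can also be caused by motion blur or tracking failure rather than deliberate occlusion.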
In the remainder of this paper we first review related
methods for 1D and 2D interaction. We then describe our
approach to occlusion based interaction and present several
examples of occlusion based interfaces. Finally we describe
feedback from informal user studies and outline directions for
future work.