Adding Tactile Reaction to Hologram
I. INTRODUCTION
Midair displays, which project floating images in free
space, have been seen in SF movies for several decades
[1]. Recently, they have been attracting attention as promising
technologies in the field of digital signage and home TV, and
novel technologies have been developed to render images
hovering in air without special glasses. FogScreen [2] and
Heliodisplay [3] use a thin layer of fog as a projection screen.
Holo [4] provides floating images from an LCD by utilizing a
concave mirror. SeeReal Technologies is working on
real-time, computer-generated 3D holography [5]
that uses eye-tracking technology to reduce the
amount of computation. With these technologies, users can see a virtual object as
if it were really hovering in front of them. Furthermore, by
applying the vision-based, markerless hand-tracking
techniques demonstrated in Holovizio [6] or GrImage [7], users
can manipulate the projected images with their hands as if they really
existed. Tactile feedback is then the next demand. If
tactile feedback is added, the usability of such interaction
systems will be greatly improved.
There are three conventional strategies for tactile
feedback in free space. The first is attaching tactile devices to the
user's fingers and/or palms. Examples of such devices are
vibrotactile stimulators (CyberTouch [8]),
motor-driven belts (GhostGlove [9]), and pin-array units
(SaLT [10]). In this strategy, the skin and the device are
always in contact, which leads to undesired touch sensations.
The second is controlling the position of the tactile devices so that
they contact the skin only when tactile feedback is
required. In the master-slave system shown in [11],
encounter-type force feedback is realized by an exoskeleton
master hand.
master hand. The detailed tactile feedback for each finger is
Takayuki Hoshi and Hiroyuki Shinoda are with the Department of
Information Physics and Computing, the Graduate School of Information
Science and Technology, the University of Tokyo, Tokyo, Japan (phone and
fax: +81358416927; email: {star, shino}@alab.t. utokyo.ac.jp).
Daisu Abe is with the Department of Mathematical Engineering and
Information Physics, the Faculty of Engineering, the University of Tokyo,
Tokyo, Japan.
Fig. 1 Developed interaction system. An aerial imaging system, a
Wiimotebased handtracking system, and a noncontact tactile display are
combined. In this figure, the ultrasound is radiated from below. When the
user hits the floating virtual ball, he feels an impact on his palm.
provided by the electrotactile display attached on the finger
part of the master hand. The drawback of this strategy is that it
requires bulky robot arms. The last is providing tactile
feedback from a distance without any direct contact. For
example, airjets are utilized in [12] to realize noncontact
force feedback. Although airjet is effective for rough “force”
feedback, its spatial and temporal properties are quite limited
and it cannot provide detailed “tactile” feedback.
We have proposed a method for producing tactile sensation
with airborne ultrasound [13]. The method renders a desired
pressure pattern in free space by using wave field synthesis
with high spatial and temporal resolution, and users can feel the
pressure with their bare hands. In [13], a prototype
consisting of 91 ultrasound transducers was introduced and
the feasibility of the proposed method was discussed. It can
move a focal point along the Z axis, and the force generated at the
focal point is 0.8 gf.
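The focusing behind such wave field synthesis amounts to driving each transducer with a phase that cancels its propagation delay to the focal point, so that all waves arrive there in phase. The following is only a minimal illustration of that principle, not the authors' implementation; the 40 kHz drive frequency, the grid geometry, and all names are assumptions:

```python
import numpy as np

C = 340.0        # speed of sound in air [m/s]
F = 40e3         # drive frequency [Hz] (assumed; typical for airborne ultrasound)
WAVELEN = C / F  # wavelength, about 8.5 mm

def focusing_phases(positions, focus):
    """Phase advance for each transducer so that all emitted waves
    arrive at the focal point in phase (constructive interference)."""
    positions = np.asarray(positions, dtype=float)  # (N, 3) transducer centers [m]
    focus = np.asarray(focus, dtype=float)          # (3,) focal point [m]
    dist = np.linalg.norm(positions - focus, axis=1)
    # Advance each drive signal by the propagation delay it must cancel.
    return (2.0 * np.pi * dist / WAVELEN) % (2.0 * np.pi)

# Example: a 10x10 grid with 10 mm pitch, focusing 200 mm above its center.
grid = np.array([(x * 0.01, y * 0.01, 0.0)
                 for x in range(10) for y in range(10)])
phases = focusing_phases(grid, focus=(0.045, 0.045, 0.2))
```

Moving the focal point then only requires recomputing the phases for a new target position, which is what gives the method its high spatial and temporal resolution.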
In this paper, we present an interaction system that tracks the
user's hand and provides tactile feedback when a collision
occurs between the hand and a virtual object (Fig. 1). We then
attempt to increase the output force in order to represent
the feeling of impact, which is one of the primitive touch
sensations. As a result, the force is increased sixfold (i.e., 4.8 gf) for a
short-time output. We also attempt to reduce the air flow generated
around the focal point, to make the sensation at the focal point clearer. The methods and
experiments are described below.
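The interaction loop described above (track the hand, test for collision with the virtual object, fire a short high-force burst) can be sketched as follows. Apart from the 4.8 gf figure from the text, all names and numeric choices are illustrative assumptions, not the authors' implementation:

```python
import math

BALL_CENTER = (0.0, 0.0, 0.3)   # virtual ball position [m] (illustrative)
BALL_RADIUS = 0.05              # ball radius [m] (illustrative)
IMPACT_FORCE_GF = 4.8           # short-time boosted output force (from the paper)
IMPACT_MS = 30                  # burst duration [ms] (assumed)

def collides(hand_pos):
    """True if the tracked hand point is inside the virtual ball."""
    return math.dist(hand_pos, BALL_CENTER) <= BALL_RADIUS

def update(hand_pos, emit_burst):
    """One frame of the loop: when the hand penetrates the ball, focus
    a short high-force burst at the hand to render the feeling of impact."""
    if collides(hand_pos):
        emit_burst(focus=hand_pos,
                   force_gf=IMPACT_FORCE_GF,
                   duration_ms=IMPACT_MS)
        return True
    return False
```

In the real system, `emit_burst` would correspond to refocusing the ultrasound phased array at the tracked hand position and driving it at the boosted output level for a short time.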
Takayuki Hoshi, Daisuke Abe, and Hiroyuki Shinoda
The 18th IEEE International Symposium on Robot and Human Interactive
Communication, Toyama, Japan, Sept. 27-Oct. 2, 2009. Session TuIAH.2.
978-1-4244-5081-7/09/$26.00 ©2009 IEEE