HapticSphere: Physical Support To Enable Precision Touch Interaction in Mobile Mixed-Reality

Chiu-Hsuan Wang, Chen-Yuan Hsieh, Neng-Hao Yu†, Andrea Bianchi‡, Liwei Chan

Department of Computer Science, National Chiao Tung University, Taiwan
†Department of Industrial Design, NTUST, Taiwan
‡Department of Industrial Design, KAIST, Korea
ABSTRACT
This work presents HapticSphere, a wearable spherical surface enabled by bridging a finger and the head-mounted display (HMD) with a passive string. Users perceive physical support at the string-attached finger when they extend their arm and reach the string's maximum extension. This physical support assists users in precise touch interaction during stationary and walking virtual- or mixed-reality experiences. We propose three methods of attaching the haptic string (directly on the head or on the body) and illustrate a novel single-step calibration algorithm that supports these configurations by estimating a grand haptic sphere once a head-coordinated touch interaction is established. Two user studies were conducted to validate our approach and to compare touch performance with physical support in sitting and walking conditions in mobile mixed-reality scenarios. The results show that, in the walking condition, touch interaction with physical support significantly outperformed the visual-only condition.
Index Terms: Human-centered computing—Visualization—Visualization techniques—Treemaps; Human-centered computing—Visualization—Visualization design and evaluation methods
1 INTRODUCTION
Touchscreens provide physical support for fingers during touch interaction, implicitly helping users with haptic feedback and a form of finger stabilization for input on the go. Unfortunately, current mixed-reality interaction that employs in-air touch input does not offer such physical support, which may lead to lowered performance. Recent research on virtual reality has demonstrated a range of active haptic interfaces integrated into wearable devices [13, 16] or hand-held controllers [9, 22] to deliver force feedback directly onto the users' fingers. These interfaces allow for locally bound physical support at the finger but cannot limit the arm's motion, providing only localized kinesthetic feedback or tactile sensation.
This work presents HapticSphere, a system that integrates a finger-tracking device with a passive string attached to an HMD to provide physical support at the user's finger for mid-air touch interaction. The envisioned application domain is mobile mixed-reality interaction. We acquire physical support by linking the user's finger to his or her body with a passive string that constrains the finger's motion. Figure 1 illustrates an example of touch interaction constrained by the HapticSphere prototype attached to the user's head (i.e., the HMD). The user perceives physical support at the finger when reaching the maximum extension of the string, at which moment the physical support provides both input stabilization and haptic guidance for touch interaction. This support has the shape of a spherical wall surrounding the user, hence the name HapticSphere. This force feedback assists users in precise touch interaction during mixed-reality experiences in both static and mobile situations (e.g., walking), and it also informs the user that he or she has reached and clicked the in-air target.

e-mail: [chwang821014, liweichan]@cs.nctu.edu.tw
†e-mail: jonesyu@ntust.edu.tw
‡e-mail: andrea@kaist.ac.kr

Figure 1: The physical support enabled by HapticSphere allows users to perform precise touch input using in-air selections during mobile mixed-reality explorations.
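One concrete way to read the spherical-wall constraint: with the string's attachment point as center and the taut string's length as radius, in-air targets can be snapped onto the sphere so that the visual target and the physical stop coincide, and a touch can be registered when the fingertip reaches the surface. The following is a minimal sketch under those assumptions; the helper names (`project_to_sphere`, `on_sphere`) and the 1 cm tolerance are ours, not the paper's:

```python
import numpy as np

def project_to_sphere(target, center, radius):
    """Snap an in-air UI target onto the haptic sphere by projecting
    it radially from the sphere center (all coordinates in metres)."""
    v = np.asarray(target, float) - np.asarray(center, float)
    n = np.linalg.norm(v)
    if n == 0.0:
        raise ValueError("target coincides with the sphere center")
    return np.asarray(center, float) + (radius / n) * v

def on_sphere(fingertip, center, radius, tol=0.01):
    """True when the tracked fingertip rests on the string's spherical
    constraint surface, within a tolerance (assumed 1 cm here)."""
    d = np.linalg.norm(np.asarray(fingertip, float) - np.asarray(center, float))
    return abs(d - radius) <= tol
```

For example, a target floating at 2 m along the forward axis would be pulled back onto a 0.75 m sphere, and a selection would fire once the fingertip meets the taut string at that depth.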
Compared with previous work that employed flexible passive strings to constrain the motion of the entire arm [2, 3], in this paper we focus on providing force feedback at the fingertip for precise input interaction. Our ultimate goal is to support precise touch-input selections in mobile mixed-reality situations. To achieve this goal, we present the HapticSphere system and two studies that demonstrate its accuracy during selections of in-air targets. Specifically, we propose and study a novel single-step calibration procedure to acquire the grand haptic sphere, which is adaptable to various wearable placements of the string on the user's body. We compare three ways of binding the string to the user's body (by HMD, by neck, and by shoulder), and we investigate how physical support improves touch accuracy for targets of different sizes in both sitting and walking conditions. Study 1 demonstrated the effectiveness of the grand haptic sphere algorithm and revealed how physical support reduces overshooting errors during in-air target acquisition. Study 2 showed that, in walking conditions, touch interaction with physical support significantly outperformed the visual-only condition in terms of touch accuracy. Our main contribution is the idea of a wearable spherical surface that provides physical support for in-air target acquisition in the context of mobile mixed-reality.
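The excerpt does not spell out the calibration math, but a single-step estimate of the grand haptic sphere could be obtained by recording fingertip positions (in the head/HMD frame) while the user sweeps the taut string, then fitting a sphere to those samples. A minimal sketch using a standard linear least-squares sphere fit; the function name and the data shapes are our assumptions:

```python
import numpy as np

def fit_sphere(points):
    """Least-squares sphere fit to an (N, 3) array of fingertip
    samples recorded with the string taut. Returns (center, radius).

    Rewrites ||p - c||^2 = r^2 as a linear system in (c, t):
        2 p . c + t = ||p||^2,   where t = r^2 - ||c||^2
    """
    p = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * p, np.ones((len(p), 1))])
    b = (p ** 2).sum(axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    center, t = x[:3], x[3]
    radius = np.sqrt(t + center @ center)
    return center, radius
```

Because the system is linear, one batch of samples suffices (a "single step"), and the recovered center absorbs whatever offset the chosen attachment point (HMD, neck, or shoulder) introduces relative to the head frame.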
2 RELATED WORK
We review work that enabled wearable force (kinesthetic) feedback
on the hand/finger for mobile virtual and mixed-reality applications,
as well as string-based haptic interactions.
2.1 Wearable Force Feedback Interfaces
Bringing force feedback to virtual reality increases immersion and boosts user performance. Recent research has proposed a range of means of adding force feedback to the user's limbs [18, 24],
2019 IEEE Conference on Virtual Reality and 3D User Interfaces
23-27 March, Osaka, Japan
978-1-7281-1377-7/19/$31.00 ©2019 IEEE