Demo: Multi-Scale Gestural Interaction for Augmented Reality

Barrett Ens, University of South Australia, Adelaide, Australia, barrett.ens@unisa.edu.au
Aaron Quigley, University of St. Andrews, St. Andrews, Scotland, aquigley@st-andrews.ac.uk
Hui-Shyong Yeo, University of St. Andrews, St. Andrews, Scotland, hsy@st-andrews.ac.uk
Pourang Irani, University of Manitoba, Winnipeg, Canada, pourang.irani@cs.umanitoba.ca
Mark Billinghurst, University of South Australia, Adelaide, Australia, mark.billinghurst@unisa.edu.au

ABSTRACT
We present a multi-scale gestural interface for augmented reality applications. Gestural interactions with virtual objects, such as pointing and grasping, can be convenient and intuitive; however, they are imprecise, socially awkward, and susceptible to fatigue. Our prototype application uses multiple sensors to detect gestures from both arm and hand motions (macro-scale) and finger gestures (micro-scale). Micro-gestures can provide precise input through a belt-worn sensor configuration, with the hand in a relaxed posture. We present an application that combines direct manipulation with microgestures for precise interaction, beyond the capabilities of direct manipulation alone.

CCS CONCEPTS
· Human-centered computing → Mixed / augmented reality;

KEYWORDS
microgestures, gesture interaction, augmented reality

ACM Reference Format:
Barrett Ens, Aaron Quigley, Hui-Shyong Yeo, Pourang Irani, and Mark Billinghurst. 2017. Demo: Multi-Scale Gestural Interaction for Augmented Reality. In Proceedings of SA '17 Symposium on Mobile Graphics & Interactive Applications. ACM, New York, NY, USA, 2 pages. https://doi.org/10.1145/3132787.3132808

1 MULTI-SCALE GESTURES
Gestures, including gesticulation, language-like, pantomimic, or emblematic movements [Wu and Huang 1999], are a natural part of human communication. Interaction designers have long sought sensing technologies that allow hand gestures to be sensed and interpreted, eliminating the need for mechanical input devices altogether. Researchers have incorporated pointing, grasping, and waving gestures in numerous contexts. We present a prototype AR interface (Fig. 1) that combines interaction at multiple scales, using multiple wearable sensors. A Leap Motion [Leap Motion 2017] sensor mounted on a HoloLens [Microsoft 2017] allows macro-scale direct manipulation of virtual objects. A belt configuration, which combines a second Leap Motion sensor with a Google Soli [Google 2017; Lien et al. 2016], allows fine-scale object manipulation using microgestures while the arm is in a relaxed, low-fatigue posture. This work builds on previous research [Ens et al. 2016; Liu et al. 2015] that allows gesture input with a relaxed arm posture. Whereas the system by Ens et al. [Ens et al. 2016] relies on a ring device, our belt-worn sensor configuration leaves the hand unencumbered and allows richer interaction.

Figure 1: A head-mounted Leap sensor (a) along with a Leap+Soli configuration (b) worn on the user's belt (c). This configuration allows both direct manipulation and precise control in applications such as a docking task (d).
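The paragraph above pairs a head-mounted sensor for macro-scale direct manipulation with a belt-worn Leap+Soli pair for micro-gestures. The following is a minimal sketch of how such multi-scale dispatch could be structured; the hand-frame fields, confidence threshold, and belt-first priority rule are illustrative assumptions, not the demo's actual implementation.

```python
# Hypothetical sketch of multi-scale input dispatch. Sensor frame
# fields and thresholds are assumptions for illustration only.

from dataclasses import dataclass
from enum import Enum, auto


class Scale(Enum):
    MACRO = auto()  # head-mounted Leap: direct manipulation
    MICRO = auto()  # belt-worn Leap + Soli: precise microgestures
    NONE = auto()   # no hand currently tracked


@dataclass
class HandFrame:
    visible: bool      # sensor currently tracks a hand
    confidence: float  # tracking confidence in [0, 1]


def select_scale(head_frame: HandFrame, belt_frame: HandFrame,
                 min_conf: float = 0.6) -> Scale:
    """Pick the interaction scale from whichever sensor sees the hand.

    The belt configuration takes priority: a hand resting in the
    relaxed posture above the belt sensors indicates micro-gesturing,
    while a hand raised into the head-mounted sensor's field of view
    is treated as macro-scale direct manipulation.
    """
    if belt_frame.visible and belt_frame.confidence >= min_conf:
        return Scale.MICRO
    if head_frame.visible and head_frame.confidence >= min_conf:
        return Scale.MACRO
    return Scale.NONE
```

Under this rule, lowering the arm to the belt hands control over to the micro-scale sensors, which yields the smooth macro-to-micro transition described below.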
We will present several applications that provide precise, low-fatigue interaction with smooth transitions between macro- and micro-gesture scales. For instance, a docking task uses six virtual sliders, mapped onto the tips and sides of three different fingers, to precisely control the six degrees of freedom of a virtual object (Fig. 1d).
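To make the slider-to-pose mapping concrete, here is a minimal sketch of how six finger-mounted sliders could drive the six degrees of freedom in the docking task. The slider names, gain values, and pose representation are hypothetical; the paper does not specify the actual mapping.

```python
# Illustrative sketch: six virtual sliders (tips and sides of three
# fingers) each control one degree of freedom of a virtual object.
# Names and gains below are assumptions, not the authors' values.

SLIDER_TO_DOF = {
    "index_tip":  "tx",  "index_side":  "ty",
    "middle_tip": "tz",  "middle_side": "rx",
    "ring_tip":   "ry",  "ring_side":   "rz",
}

# Per-DoF gains: metres per unit slider delta for translation,
# radians per unit for rotation (illustrative values).
GAINS = {"tx": 0.05, "ty": 0.05, "tz": 0.05,
         "rx": 0.5,  "ry": 0.5,  "rz": 0.5}

ORDER = ["tx", "ty", "tz", "rx", "ry", "rz"]


def apply_slider_deltas(pose, deltas):
    """Return a new pose [tx, ty, tz, rx, ry, rz] updated by slider deltas."""
    pose = list(pose)
    for slider, delta in deltas.items():
        dof = SLIDER_TO_DOF[slider]
        pose[ORDER.index(dof)] += GAINS[dof] * delta
    return pose


# Example: a thumb swipe along the index fingertip nudges the object
# along x, while a swipe on the ring finger's side yaws it.
pose = apply_slider_deltas([0.0] * 6, {"index_tip": 0.2, "ring_side": -0.1})
print(pose)  # ~ [0.01, 0.0, 0.0, 0.0, 0.0, -0.05]
```

Because each slider maps to exactly one degree of freedom, the user can adjust position and orientation independently, which is what makes the docking task more precise than grasping the object directly.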