User Defined Conceptual Modeling Gestures
Bige Tunçer and Sumbul Khan
Abstract Gesture- and speech-based interaction offers designers a powerful technique for creating 3D CAD models. Previous studies on gesture-based modeling have employed author-defined gestures, which may not be very user friendly. The aim of this study was to collect a data set of user-generated gestures and accompanying voice commands for 3D modeling for form exploration in the conceptual architectural design phase. We conducted an experiment with 41 subjects to elicit their preferences in using gestures and speech for twelve 3D CAD modeling referents. In this paper we present the different types of gestures we found, and report user preferences for gestures and speech. Findings from this study will inform the design of a speech- and gesture-based CAD modeling interface.
Keywords Conceptual architectural design · Gesture-based modeling · Natural user interface · Gesture studies · Human-computer interaction
1 Introduction
This research aims to address the issue of computer support during the conceptual design stage, when problems are ill-defined and designers formulate the initial parameters for an artifact [1]. Conventional CAD systems use graphical user interfaces that rely on input devices such as the mouse and keyboard, which are seen to constrain human-computer dialogue [2]. Gesture- and speech-based interaction offers a natural and flexible interaction technique for designers for 3D modeling during conceptual
B. Tunçer (✉)
Architecture and Sustainable Design, Singapore University of Technology
and Design, Singapore, Singapore
e-mail: bige_tuncer@sutd.edu.sg
S. Khan
SUTD-MIT International Design Centre, Singapore University of Technology and Design,
Singapore, Singapore
e-mail: sumbul_khan@sutd.edu.sg
© Springer Nature Singapore Pte Ltd. 2018
J.-H. Lee (ed.), Computational Studies on Cultural Variation and Heredity,
KAIST Research Series, https://doi.org/10.1007/978-981-10-8189-7_10