SPECIAL FOCUS PAPER

Virtual Reality and Haptics for Product Assembly

http://dx.doi.org/10.3991/ijoe.v8iS1.1894

Pingjun Xia, António M. Lopes and Maria Teresa Restivo
IDMEC-Polo FEUP, Faculty of Engineering, Porto, Portugal

Abstract—Haptics can significantly enhance the user's sense of immersion and interactivity. This paper describes an industrial application of virtual reality and haptics for product assembly, which provides a new and low-cost approach to product assembly design, assembly task planning and assembly operation training. A demonstration of the system with haptic device interaction was available at the exp.at'11 session.

Index Terms—Haptics Interface, Virtual Reality, Assembly

I. INTRODUCTION

Virtual reality (VR) and haptics are innovative and promising technologies that have emerged in recent years and rapidly spread into a wide range of applications, from medicine to industry, from education to training, from entertainment to the military [1]. Virtual assembly is one of the most challenging applications of virtual reality in engineering. Its first objective is to test the feasibility of assembly operations at the design stage of the product. The second objective is to generate optimal assembly plans, including resource allocation, assembly time and cost estimation, assembly operation training and maintenance ergonomics [2]. Haptics is particularly important for virtual assembly because it increases the user's sense of immersion and interaction, helps users better understand virtual objects and feel more secure and confident about the real-world assembly process, and thus improves task efficiency [3]. In this paper, a typical virtual assembly system based on virtual reality and haptics is introduced for industrial products.

II. SYSTEM CHARACTERISTICS

Models and data can be integrated with commercial CAD systems.
The product, tools and fixtures are designed in a commercial CAD system such as Pro/Engineer or SolidWorks, and an automatic data integration interface can then be developed to transfer the data and information from CAD to VR, as shown in Fig. 1. Four types of data are taken into account: geometry data, topology data, assembly data and physics data. These models and data are then loaded into the virtual reality environment.

A multi-modal virtual environment must be constructed for virtual assembly simulation, including visual, audio and haptic feedback (Fig. 2). A hierarchical constraint-based data model is proposed to represent parts and objects, composed of a product layer, subassembly layer, part layer, feature layer, surface layer and polygon layer. Between elements in the same layer there are geometry constraint relationships, and between elements in different layers there are hierarchical mapping relationships.

Figure 1. Models and data integrated with the CAD system

Figure 2. Haptic interaction in the virtual environment

A hierarchical scene graph structure can also be generated. The user can select objects at different layers to operate on: for example, a single part, or a whole subassembly to assemble or disassemble. Because the virtual environment provides multi-modal feedback, a multi-thread mechanism is implemented. There are three separate threads in the system: the haptic rendering thread, the physical calculation thread and the graphical rendering thread. The haptic rendering thread is responsible for communicating with the PHANTOM device and runs at high priority and high frequency (about 1000 Hz). The physical calculation thread performs all the remaining work, including collision detection, physics computation, dynamic simulation of realistic part behaviour, geometry constraint recognition, etc., and runs at a lower priority and frequency (about 100 Hz).
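The hierarchical constraint-based data model above can be sketched as a simple tree whose nodes carry both the hierarchical mapping to the layer below and the geometry constraints to elements of the same layer. This is a minimal illustrative sketch only; all class, field and function names are assumptions, not the system's actual code:

```python
from dataclasses import dataclass, field

# Layer order of the hierarchical constraint-based data model.
LAYERS = ("product", "subassembly", "part", "feature", "surface", "polygon")

@dataclass
class Node:
    """One element of the model: `children` realize the hierarchical mapping
    to the next layer; `constraints` link elements of the same layer."""
    name: str
    layer: str
    children: list["Node"] = field(default_factory=list)
    constraints: list[str] = field(default_factory=list)  # e.g. "coaxial with gear"

def select(node: Node) -> list[str]:
    """Selecting an object at any layer operates on every part mapped below
    it, so a subassembly can be assembled or disassembled as a whole."""
    if node.layer == "part":
        return [node.name]
    return [name for child in node.children for name in select(child)]

# Hypothetical product: a gearbox with one subassembly of two parts.
gearbox = Node("gearbox", "product", [
    Node("gear_train", "subassembly", [
        Node("shaft", "part", constraints=["coaxial with gear"]),
        Node("gear", "part", constraints=["coaxial with shaft"]),
    ]),
])
```

With this structure, selecting the `gear_train` subassembly node yields both of its parts, matching the behaviour described above where the user may operate on a single part or on a subassembly as a whole.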
The graphic rendering thread is mainly responsible for visualizing the virtual scene.
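The three-thread mechanism can be sketched as follows. The rates mirror those given above, but the loop bodies are placeholders: in a real implementation the 1 kHz loop would poll the PHANTOM and send forces through the device's haptics SDK, which is not shown here.

```python
import threading
import time

# Sketch of the three-thread architecture: haptic rendering (~1000 Hz),
# physical calculation (~100 Hz) and graphical rendering. Work functions
# are placeholders for device I/O, physics stepping and redrawing.

class RateThread(threading.Thread):
    def __init__(self, name: str, hz: float, work, stop_event: threading.Event):
        super().__init__(name=name, daemon=True)
        self.period = 1.0 / hz
        self.work = work
        self.stop_event = stop_event
        self.ticks = 0

    def run(self):
        while not self.stop_event.is_set():
            self.work()              # device I/O, physics step, or redraw
            self.ticks += 1
            time.sleep(self.period)  # crude pacing; real loops use a scheduler

stop = threading.Event()
state = {"force": (0.0, 0.0, 0.0)}   # data shared between the threads

haptic = RateThread("haptic", 1000, lambda: state["force"], stop)
physics = RateThread("physics", 100, lambda: None, stop)
graphics = RateThread("graphics", 30, lambda: None, stop)

for t in (haptic, physics, graphics):
    t.start()
time.sleep(0.3)                      # let the loops run briefly
stop.set()
for t in (haptic, physics, graphics):
    t.join()
```

Running the faster loops in separate threads keeps the haptic update rate decoupled from graphics refresh, which is essential because stable force feedback requires roughly an order of magnitude higher update frequency than smooth rendering.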