I-TOUCH: A generic multimodal framework for industry virtual prototyping

Aurélien Pocheville
Laboratoire Systèmes Complexes - CNRS - UEVE
40, rue du Pelvoux, 91020 Evry, France
aurelien.pocheville@haptique.com

Abderrahmane Kheddar, Kazuhito Yokoi
AIST-CNRS Joint Robotics Laboratory JRL IS, Natl Inst of AIST, Tsukuba Central 2, Japan
abderrahmane.kheddar@aist.go.jp

Abstract— Simulations based on virtual reality techniques often make special arrangements for haptic rendering. In fact, in most cases, haptic rendering drives the design of the simulation engine. This work proposes an alternative software architecture for multimodal, human-centered interactive rendering, with particular emphasis on the computer haptics problem. Namely, the architecture handles both haptic device requirements, in terms of high refresh rates, and physically-based simulation requirements, in terms of CPU time. The I-TOUCH framework is designed to address these issues; at the same time, it provides an open architecture and powerful tools to benchmark the robustness of subsequent algorithms. All ongoing developments are being tested with actual industry virtual prototyping scenarios, the complexity of some of which highlights the extent of the fundamental problems to overcome.

Keywords— Haptic feedback, virtual prototyping, computer haptics, I-TOUCH

I. INTRODUCTION

In order to reduce costs while increasing production efficiency, virtual prototyping (VP), to be seen as a complementary tool to CADM software techniques, is considered a promising perspective. It is the front end of a product life management process that takes into account constraints related to manufacturing, utilization, and maintenance. To fulfill human-centered designs, the VP architecture should allow a “digital mock-up” to be interactively explored, manipulated, and tested in various usage scenarios.
This highlights the major importance of human sensory capabilities other than the visual one. Indeed, in most maintenance and assembling/disassembling instances, feeding back haptic information to the operator is as essential as vision or sound. Haptic interaction has two main components: the kinesthetic part, which reflects motions and forces, and the tactile part, which reflects touch (surface roughness, shape, thermal exchange, etc.). Our research focuses on many aspects of haptics foundations, both human-science related (e.g. haptic psychophysics) and technology related (e.g. interface design and computer haptics). In order to meet both research and industry transfer purposes, we conceived a multimodal haptic framework, called I-TOUCH, that allows fundamental developments and models to be benchmarked for specific or general use.

II. THE I-TOUCH FRAMEWORK

A software architecture that successfully merges high refresh rates with high-fidelity multimodal interaction rendering (that is to say visual, haptic, and auditory) does not exist at this very moment. The integration complexity, and the specific refresh rate of each modality, compel developers to account for one problem at a time and to make use of PC-based clusters to fulfill the computation time requirements. If we add the requirement that such software be open and flexible enough to serve as a research benchmarking tool, it becomes a definitely hard initiative: I-TOUCH is an alternative architecture built to satisfy both issues.

Fig. 1. I-TOUCH architecture: a simplified view. [The figure depicts the pluggable modules: custom or built-in collision detection, custom or built-in behavior model, sound, haptic devices, stereo graphics, custom imported data (sound, physical properties, ...), and utilities.]

As shown in figure 1, the I-TOUCH framework is designed to be modular: each of its components can easily be replaced. Modularity is achieved through an object-oriented design, implemented in C++.
For academic purposes, two behavior models have been implemented and can be switched. Note that the “physics engine loop” is considered the most important part of the framework, and that the real world is “interfaced” to it. One of the major features of this framework is its handling of haptic devices. Contrary to almost all available haptic libraries (including commercial ones), I-TOUCH treats a virtual object linked to any haptic device just like any other object of the simulation environment. The virtual object is simply “attached” to the device; then, thanks to its haptic proxies, I-TOUCH takes care of the proper interaction. This technique not only allows greater freedom but also makes interchanging haptic devices very simple. In order to use any kind of passive or