HugMe: Synchronous Haptic Teleconferencing

Jongeun Cha, Mohamad Eid, Ahmad Barghout, ASM Mahfujur Rahman, Abdulmotaleb El Saddik
School of Information Technology, University of Ottawa
800 King Edward, Ottawa, Canada
jcha@discover.uottawa.ca, {eid, abarghout, kafi, abed}@mcrlab.uottawa.ca

ABSTRACT

Traditional teleconferencing systems have enabled remote communication via audiovisual modalities. In real life, however, human touch, such as an encouraging pat, plays a fundamental role in physical and emotional communication between persons. This paper presents a synchronous haptic teleconferencing system with touch interaction to convey affection and intimacy. We present a preliminary prototype called HugMe, in which two remote users can see as well as touch each other.

Keywords

Teleconferencing, haptics

1. INTRODUCTION

With recent advances in video teleconferencing systems such as high-definition (HD) and 3D video, the limits of what can be achieved with video content alone are being reached. Fueled by several exciting discoveries, researchers have therefore turned their interest to incorporating the sense of touch into telecommunication systems [3]. Haptics is crucial for interpersonal communication as a means to express affection, intention, or emotion, as in a handshake, a hug, or other physical contact [1].

Incorporating force feedback into synchronous multimedia teleconferencing systems has been challenged by the high update rate of the haptic servo loop (typically 1 kHz), which is difficult to sustain over a network. On the other hand, asynchronous tactile playback does not provide real-time interaction. In this paper, we present a synchronous haptic teleconferencing system that enhances physical intimacy in remote interaction between users and operates at a tolerable update rate (30-60 Hz) for haptic data. In the system, an active user can see and touch a remote passive user, who is captured in 2.5D, using a 3-DOF force-feedback device.
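The gap between the 1 kHz servo loop and the 30-60 Hz network rate is typically bridged by rendering force against a locally cached copy of the remote surface, which the network thread refreshes at the video rate. The sketch below illustrates this idea in Python; the penalty-based (spring) force model, the stiffness value, and all names are illustrative assumptions, not details from the paper.

```python
# Sketch: decoupling the 1 kHz haptic servo loop from the 30-60 Hz
# network update rate. The haptic thread reads a locally cached
# depth sample at 1 kHz; the network thread overwrites it whenever
# a new depth frame arrives. All names and values are illustrative.

STIFFNESS = 500.0  # N/m, assumed spring constant for a penalty force


def contact_force(probe_depth, surface_depth, stiffness=STIFFNESS):
    """Penalty force along the viewing axis: push the haptic probe
    back when it moves beyond the cached surface depth."""
    penetration = probe_depth - surface_depth  # > 0 when inside
    if penetration <= 0.0:
        return 0.0
    return stiffness * penetration


class CachedSurface:
    """Most recent depth sample received over the network."""

    def __init__(self):
        self.depth = float('inf')  # no surface seen yet

    def update(self, depth):
        self.depth = depth


surface = CachedSurface()
surface.update(0.50)  # network frame: surface 0.50 m from the camera
print(contact_force(0.49, surface.depth))            # in front -> 0.0
print(round(contact_force(0.52, surface.depth), 3))  # 2 cm inside -> 10.0
```

In a real system the 1 kHz loop would also interpolate between successive frames to avoid force discontinuities when a new depth image arrives.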
The passive user feels the touch on the contacted skin through a haptic jacket composed of an array of vibrating motors.

ACM MultiMedia '09, Beijing, China.

Figure 1: System block diagram

2. SYSTEM DESCRIPTION

This section describes a one-way version of the HugMe system, in which an active user touches the passive user, who then feels the touch. The same concept can be duplicated in the opposite direction to enable mutual touch interaction. Figure 1 shows the system block diagram. In the following, we briefly describe the components of the system and its implementation.

2.1 Depth Video Camera

To enable touching the passive user in real time, we use a depth camera called ZCam^1 that captures a 2.5D scene as ordinary color images together with synchronized gray-scale depth images containing per-pixel depth, namely a Depth Image-Based Representation (DIBR). DIBR is considered a 2.5D representation in the sense that the depth image holds incomplete 3D geometric information describing the scene from the camera view, and thus the active user can

1 http://www.3dvsystems.com
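To make a DIBR scene touchable, each depth pixel must be lifted into a 3D point that the force-feedback device can collide with. A minimal back-projection sketch under a pinhole camera model follows; the intrinsic parameters (`fx`, `fy`, `cx`, `cy`) and image size are illustrative assumptions, since the ZCam's actual calibration is not given in the paper.

```python
# Sketch: back-projecting a DIBR depth-image pixel into a 3D point
# in camera space using a pinhole model. Intrinsics are assumed
# values for illustration, not the ZCam's real calibration.

def unproject(u, v, depth, fx, fy, cx, cy):
    """Convert pixel (u, v) with metric depth into a camera-space
    3D point (x, y, z); (cx, cy) is the principal point and
    (fx, fy) the focal length in pixels."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)


# Example: assumed 320x240 depth map, principal point at the center.
fx = fy = 300.0
cx, cy = 160.0, 120.0
print(unproject(160, 120, 1.0, fx, fy, cx, cy))  # on the optical axis -> (0.0, 0.0, 1.0)
print(unproject(220, 120, 1.5, fx, fy, cx, cy))  # 0.3 m to the right -> (0.3, 0.0, 1.5)
```

Applying this to every pixel yields a point cloud (or depth mesh) of the passive user's visible surface, which the haptic renderer on the active side can then test the device tip against.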