The SmartTool: A system for augmented reality of haptics

Takuya NOJIMA, Dairoku SEKIGUCHI, Masahiko INAMI and Susumu TACHI
The University of Tokyo
{tnojima, dairoku, minami, tachi}@star.t.u-tokyo.ac.jp

Abstract

Previous research on augmented reality has mainly focused on augmenting visual or acoustic information. However, humans can receive information not only through vision and hearing, but also through haptics. Haptic sensation is very intuitive, and some researchers are working on making use of haptics in augmented reality systems. However, most previous research on haptics is based on static data, such as data generated from CAD or CT, so these systems have difficulty responding to a changing real environment in real time. In this paper, we propose a new concept for the augmented reality of haptics: the SmartTool. The SmartTool responds to the real environment by using real-time sensor(s) and a haptic display. The sensor(s) on the SmartTool measure the real environment and convey that information to the user through haptic sensation. Furthermore, we describe the prototype system we have developed.

1. Introduction

Many augmented reality systems use vision to send information to the human[1]. For example, M. Bajura et al. proposed an augmented reality system for medical applications that uses an ultrasound scanner to scan and visualize the inside of a patient's body; the operator can see that image through an HMD[2]. M. Kanbara et al. proposed an augmented reality system that superimposes a virtual image onto the real environment by using a stereoscopic video see-through HMD[3]. They also proposed a new registration method between the virtual environment and the real environment, registration being another important problem for augmented reality systems.

Through vision and hearing, complex information can be sent to humans, especially when literal or verbal messages are used. However, a human needs to interpret that kind of information before he or she can act on it. This interpretation can be stressful in certain situations, such as surgical operations or other dangerous tasks. In these situations, it is often hard to read, hear, and understand the message. Humans can only pay attention to one thing at a time, and paying attention to a visual or acoustic message means paying less attention to the task itself. Thus, to relieve the need for stressful interpretation, a flashing message or an alarm is often used to send redundant information to the human. However, these messages have no effect on the task itself, and there is still the delay of human signal processing, because the human can only act after interpreting and judging the message.

To solve that problem, some research has focused on using haptic sensation in augmented reality. Touch is a very intuitive human sensation that does not need interpretation. Moreover, when force sensation is used, the force itself can support the human's task in a practical way. For example, in surgical operations there are many vital tissues in the human body that must not be damaged. When a surgical tool comes close to such tissue, a blinking message or alarm has no practical effect on avoiding accidental damage; only the human can do that. In such cases, using haptic sensation could be a solution. Hong et al. proposed an interactive navigation system that guides an endoscopic camera through the human colon[4]. They construct a potential field inside the colon to navigate the endoscopic camera. The force from that potential field prevents the camera from damaging the tissue and leads it to the target polyp. They built the potential field from static CT data of the patient's colon.
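As an illustration of this kind of potential-field guidance (a minimal sketch, not the actual implementation of [4]), the displayed force can be computed as the negative gradient of a summed potential: a repulsive term that grows near the colon wall plus an attractive term pulling toward the target polyp. The gains, influence distance, and function names below are our own assumed placeholders.

```python
import numpy as np

# Hypothetical potential-field guidance sketch (not the method of [4]).
# The tool tip feels F = -grad(U), where U is the sum of a repulsive
# potential near the wall and an attractive potential toward the target.

K_REP = 0.5   # repulsive gain (assumed)
K_ATT = 0.2   # attractive gain (assumed)
D0 = 5.0      # influence distance of the wall potential, in mm (assumed)

def guidance_force(tip, wall_point, target):
    """Return the haptic guidance force at the tool-tip position.

    tip, wall_point, target: 3-vectors (np.ndarray);
    wall_point is the closest point on the colon wall to the tip.
    """
    # Repulsive term (classic artificial-potential-field form):
    # active only within distance D0 of the wall, pushing the tip away.
    to_tip = tip - wall_point
    d = np.linalg.norm(to_tip)
    if 0.0 < d < D0:
        f_rep = K_REP * (1.0 / d - 1.0 / D0) / d**2 * (to_tip / d)
    else:
        f_rep = np.zeros(3)

    # Attractive term: a simple spring pulling the tip toward the target.
    f_att = K_ATT * (target - tip)

    return f_rep + f_att
```

In such a scheme, wall_point would be looked up at each servo update from a distance field precomputed on the CT-based colon model; the spring-like attractive term is merely the simplest choice, and the field in [4] may be shaped differently.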
However, such static data is not always appropriate, because the real environment changes dynamically. Simulations can be used to respond to a changing environment, but they are often hard to update in real time. Frank et al. proposed a real-time haptic simulation using a finite element method[5] and described the trade-off between the number of nodes and the speed of their algorithm. Mendoza et al. proposed a system for touching deformable virtual objects using physical simulation[6]. For real-time display of haptic sensation, they separate their system into a haptic display component and a physical simulation component; however, the servo loop of the haptic display component runs at 1 kHz, while the physical simulation component runs at only 10 Hz. In addition, registration between static data and the real environment would require great effort. Generally, such