3D Smart User Interactive System with Real-Time Responding Tele-Robotic Proprioceptive Information

Tiam Hee Tee, Kok Seng Eu, Kian Meng Yap
Faculty of Science Technology, Sunway University, Bandar Sunway, Malaysia. Email: {12056644, 12058889, kmyap}@sunway.edu.my

Alan Marshall
School of Electronics, Electrical Engineering and Computer Science, Queen's University Belfast, Belfast, Northern Ireland, UK. Email: a.marshall@qub.ac.uk

Tsung-Han Lee
Department of Computer & Information Science, National Taichung University, Taiwan. Email: thlee@ntcu.edu.tw

Abstract—Feedback of proprioceptive information is essential for many tele-robotic systems, especially those designed to undertake tasks in hazardous environments and for efficient out-of-sight remote control applications. Given the highly sensitive nature of these applications, even small errors (e.g. less than one degree of displacement in robot posture) can cause unnecessary risk. Thus, accurate feedback of proprioceptive information, as well as a technique to precisely interpret this information, is significant to the operator. In this paper, we introduce a framework that uses a pulse feedback mechanism to measure the proprioceptive information of a robot operating over real-time wireless communication and represents it in a 3D model user interface. The 3D user interface enhances the interpretation of proprioceptive information by helping the operator to visualize the real-time relative position of the robot. The paper also provides results that demonstrate how the framework allows synchronization between the 3D model and the tele-robot to be achieved in real time over wireless communications.

Keywords—Proprioceptive Information, Tele-robotics, Pulse Feedback, 3D User Interface, Real-time Synchronization

I. INTRODUCTION

Nowadays, tele-operation technologies have been widely applied in many areas, including military, medical, industrial and even household applications.
Tele-operation concerns the manipulation of remote entities, robots or machines by local machines or computers [1]. Tele-robotics involves the remote manipulation of a robot from a distance, which provides the operator with a sense of security and comfort by allowing them to remain at a command center. Tele-robotics has typically been used to deal with hazardous materials and in dangerous or remote areas, such as nuclear plants and deep-ocean exploration, where undertaking these tasks is often beyond the abilities of humans. An additional factor in these environments is that the robot is generally out of sight, so information about its position and status must also be provided to the operator.

Visual information in robotics generally refers to the First Person View (FPV) of the tele-robot itself; in other words, FPV shows what the robot sees. FPV can be easily implemented by attaching a camera to the robot, enabling the operator to visualize the surrounding remote environment. Unfortunately, with FPV visual information alone, the operator cannot perceive the proprioceptive information of the robot, i.e. the position of its body parts and the angles of each linkage joint. Proprioception is defined as the sense of the relative position and orientation of neighbouring parts of the body [2]. It is the feedback of the body's dimensional state, which aids the perception of motion [3]. Given the application of tele-robotics in highly sensitive and hazardous situations, even a single degree of displacement of the robot's body parts is highly significant to the operator, who relies on the proprioceptive information to make decisions on manipulation. The accuracy of the angular degree of each joint is therefore critical, as errors in the displacement of the robot may impair the ability to undertake the remote task and hence affect the safety of the operator or cause loss of revenue. To solve this problem, a Third Person View (TPV) of the robot is required.
TPV refers to a point of view external to the first person; in other words, TPV looks at the robot itself rather than its surrounding environment. With TPV, the operator can be fully aware of the proprioceptive information about the angles of each linkage joint in the robot. In this paper, a framework is proposed to obtain an accurate TPV of a robot through pulse feedback, which is generated from the encoder of each motor and transmitted over a short-range wireless personal area network to the operator. This paper shows that this method provides a much more accurate angular measurement and a stable, non-fluctuating degree value. Furthermore, the TPV is presented as a 3D model in a user interface that is updated in real time based on pulse feedback from the robot whenever its posture changes. An additional advantage of interpreting the proprioceptive information in a 3D model user interface is that it aids visualization of the current position of the robotic arm, allowing the operator to easily identify the angle of each linkage joint and hence give appropriate commands to control the robot, for example when gripping a highly dangerous object.
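To make the pulse-to-angle step above concrete, the following minimal Python sketch shows how an accumulated encoder pulse count might be converted into a joint angle that drives the corresponding joint of the 3D model. The constants PULSES_PER_REV and GEAR_RATIO, the class name JointState, and the handler on_pulse_feedback are illustrative assumptions, not values or APIs taken from the framework described here.

```python
# Illustrative sketch: converting encoder pulse counts into joint angles
# for updating a 3D model joint. PULSES_PER_REV and GEAR_RATIO are
# assumed values, not taken from the paper.

PULSES_PER_REV = 2048   # encoder pulses per motor revolution (assumed)
GEAR_RATIO = 64         # motor revolutions per joint revolution (assumed)

def pulses_to_degrees(pulse_count: int) -> float:
    """Convert an accumulated pulse count into a joint angle in degrees."""
    joint_revolutions = pulse_count / (PULSES_PER_REV * GEAR_RATIO)
    return joint_revolutions * 360.0

class JointState:
    """Tracks one linkage joint; updated only when pulse feedback arrives."""

    def __init__(self, name: str):
        self.name = name
        self.angle_deg = 0.0

    def on_pulse_feedback(self, pulse_count: int) -> float:
        # Called whenever a pulse-count packet arrives over the wireless
        # link; the returned angle would drive the matching 3D model joint.
        self.angle_deg = pulses_to_degrees(pulse_count)
        return self.angle_deg

elbow = JointState("elbow")
print(elbow.on_pulse_feedback(32768))   # 32768 pulses -> 90.0 degrees
```

Because the angle is derived from a discrete pulse count rather than a sampled analog value, repeated reads of an unchanged count yield an identical angle, which is one way the stable (non-fluctuating) readings described above could arise.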