A Step towards a Robotic System With Smartphone Working As Its Brain: An Assistive Technology

Pritesh Sankhe, Selvia Kuriakose, Uttama Lahiri
Electrical Engineering, IIT Gandhinagar

Abstract—This paper describes a novel smartphone-based navigation application. The smartphone-based robotic system is sensitive to both tactile and head-movement input commands from a user. Here we present the design used for developing a prototype of the robotic system as a proof of concept of an assistive technology that could help partially disabled people navigate effectively. Additionally, we designed a usability study in which our prototype was validated by seven healthy participants. Such an intelligent robotic system controlled by a smartphone can find a variety of applications, such as navigation systems for disabled persons, educational tools (especially for children with autism), surveillance, and social telepresence. The hardware prototype could be further extended through Application Program Interfaces (APIs) that allow the robot to be programmed for custom tasks.

Keywords—Human robot interaction, smartphones, navigation, assistive technology.

I. INTRODUCTION

Independent mobility is very important for individuals, both children and adults [1-2]. Independent mobility and navigation capability increase vocational and educational opportunities, reduce dependence on caregivers and family members, and promote feelings of self-reliance [3]. For elderly people, reduced navigational mobility is often linked with reduced participation and loss of social connections [4]. This in turn can lead to feelings of emotional loss, reduced self-esteem, isolation, stress, and fear of abandonment [4]. Given the importance of navigational ability, individuals with hemiplegia have been shown to prefer smart manual wheelchairs [3].
A literature review shows that several investigators have used alternative input methods, particularly voice [5], eye gaze [6], and joysticks [7]. Satisfactory operation of these interfaces depends to a considerable extent on the user's ability to speak input commands clearly and rapidly, on the calibration accuracy of the eye-gaze interface, and on the muscle strength needed to operate the joystick, all of which can impose constraints on complex maneuvering and effective navigation [3]. Thus, the development of user-friendly interfaces for effective navigation is critical. In our present research, we investigated the use of tactile and head-movement-based input commands for navigation. Unlike mouse-based interfaces, which require users to visually track an onscreen cursor away from the hand, direct-touch input is advantageous because the input and display are co-located, so users can touch the graphical elements they are interacting with [8]. Studies have also shown that head tracking is sometimes advantageous in reducing task time [9].

For developing the user interface, we used the multifaceted functionality of smartphones. Recently, there has been a tremendous increase in the spread and use of smartphones, and they have become prevalent in our daily life [10]. Almost everything from bill payments and video calls to locating a favorite restaurant is 'on the go' and just a click away. Smartphones are evolving extraordinarily and incorporate a growing variety of hardware sensors and technologies. They offer considerable computational power, being equipped with powerful processors and graphics processing units, and they come with a multitude of built-in sensors and interfaces such as accelerometers, cameras, microphones, wireless connectivity (Bluetooth, WiFi, 3G), and Global Positioning System (GPS) receivers. Most importantly, they run full operating systems at a relatively reasonable price.
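As an illustration of the head-movement input modality described above, the sketch below maps smartphone accelerometer readings to discrete navigation commands. This is not the paper's implementation: the axis conventions, command names, and the dead-zone threshold are all assumptions made for the example.

```python
# Illustrative sketch only: translate accelerometer tilt into
# discrete navigation commands. DEADZONE and axis conventions
# are assumptions, not taken from the paper's prototype.
DEADZONE = 2.0  # m/s^2; tilts smaller than this are treated as rest


def tilt_to_command(ax: float, ay: float) -> str:
    """Map accelerometer x/y readings (m/s^2) to a command string.

    A tilt inside the dead zone stops the robot; otherwise the
    dominant axis decides between turning and driving.
    """
    if abs(ax) < DEADZONE and abs(ay) < DEADZONE:
        return "STOP"
    if abs(ax) >= abs(ay):
        return "RIGHT" if ax > 0 else "LEFT"
    return "FORWARD" if ay > 0 else "REVERSE"
```

A dead zone of this kind is what makes head control practical: small involuntary head movements are ignored, and only deliberate tilts are translated into motion.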
A literature review indicates that considerable effort is being made to develop technology-assisted interfaces that promote human-robot interaction. Robotic agents with the ability to sense and respond to a user's feedback are of growing importance. Many complex autonomous robots with different form factors come with an onboard computer [11]. By employing a smartphone as the onboard computer, the size of a robot can be reduced considerably without sacrificing computational capability. Moreover, combining robotic agents with smartphones allows robots to learn, adapt, and change rapidly. An application store for robots, where a robot can quickly add functionality based on the application it is running, would change how current single-purpose systems are perceived [12]. Using a smartphone as the 'brain' of a robot is already an active research field with several opportunities and promising possibilities [13].

This type of robotic system could serve as assistive technology for people with physical disabilities, especially paraplegics. Paraplegics rely on power wheelchairs for mobility, but the hands-free controller systems currently available are obtrusive and expensive. In this paper, we present the design of an experimental prototype of a robotic wheelchair that accepts a user's tactile and head-movement-based input commands from a smartphone to carry out navigation tasks.

This paper is organized as follows: In Section II we present the system design. In Section III, we describe the design of our usability study. In Section IV, we discuss the results of our usability study and we conclude.
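When a smartphone acts as the robot's 'brain', its commands must reach the motor controller over some link (e.g. Bluetooth). The sketch below shows one hypothetical way to frame such commands as compact byte messages; the opcode values and two-byte frame layout are assumptions for illustration, not the paper's actual protocol.

```python
# Hypothetical command protocol between the smartphone 'brain' and
# the robot's motor controller. The opcodes and frame layout are
# assumptions made for this sketch.
OPCODES = {
    "STOP": 0x00,
    "FORWARD": 0x01,
    "REVERSE": 0x02,
    "LEFT": 0x03,
    "RIGHT": 0x04,
}


def encode_command(cmd: str, speed: int = 0) -> bytes:
    """Pack a command name and a 0-255 speed into a two-byte frame."""
    if cmd not in OPCODES or not 0 <= speed <= 255:
        raise ValueError("invalid command or speed")
    return bytes([OPCODES[cmd], speed])
```

Keeping the frame this small means even a modest microcontroller on the wheelchair side can parse commands with negligible latency, leaving all heavier computation on the smartphone.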