Journal of Intelligent and Robotic Systems 34: 301–314, 2002.
© 2002 Kluwer Academic Publishers. Printed in the Netherlands.

A Wheelchair Steered through Voice Commands and Assisted by a Reactive Fuzzy-Logic Controller

GABRIEL PIRES and URBANO NUNES
Institute of Systems and Robotics – Polo II, University of Coimbra, 3030 Coimbra, Portugal; e-mail: {gpires, urbano}@isr.uc.pt

Abstract. This paper describes new results with a reactive shared-control system that enables semi-autonomous navigation of a wheelchair in unknown and dynamic environments. The purpose of the reactive shared controller is to assist wheelchair users by providing easier and safer navigation. It is designed as a fuzzy-logic controller and follows a behaviour-based architecture. Three behaviours are implemented: intelligent obstacle avoidance, collision detection and contour following. Intelligent obstacle avoidance blends user commands, issued by voice or joystick, with an obstacle-avoidance behaviour, so that the user and the vehicle share control of the wheelchair. The reactive shared control was tested on the RobChair powered wheelchair prototype [6], equipped with a set of ranging sensors. Experimental results are presented demonstrating the effectiveness of the controller.

Key words: voice human–machine interface, shared control, fuzzy control, behaviour-based architecture.

1. Introduction

The RobChair project aims to apply robotic algorithms to a powered wheelchair in order to improve mobility and safety. Users with severe motor handicaps, such as tetraplegia and general muscle degeneration, are unable to steer their own wheelchair with a conventional joystick and often depend on other persons. By endowing the wheelchair with new Human–Machine Interfaces (HMI) and increasing the wheelchair's navigation autonomy [7], it is possible to contribute to the social independence of this group of wheelchair users.
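The paper realises this command blending as a fuzzy-logic controller, described later. As a rough illustration only of the shared-control idea, the following sketch blends a user-commanded heading with an obstacle-avoidance heading, with a weight that grows as obstacles get closer. The function name `blend_headings` and the single proximity weight are hypothetical simplifications, not the paper's controller.

```python
import math

def blend_headings(user_heading, avoid_heading, obstacle_proximity):
    """Blend user and avoidance headings (radians) for shared control.

    obstacle_proximity in [0, 1]: 0 means free space (follow the user),
    1 means an obstacle is very close (avoidance dominates).
    """
    w = max(0.0, min(1.0, obstacle_proximity))
    # Blend unit vectors rather than raw angles to handle angle wrap-around.
    x = (1.0 - w) * math.cos(user_heading) + w * math.cos(avoid_heading)
    y = (1.0 - w) * math.sin(user_heading) + w * math.sin(avoid_heading)
    return math.atan2(y, x)

# Example: user asks for straight ahead (0 rad), avoidance suggests a left
# turn (pi/2 rad); at mid proximity the blended heading splits the difference.
heading = blend_headings(0.0, math.pi / 2, 0.5)  # pi/4
```

A fuzzy controller replaces the single crisp weight with membership functions over sensor readings and rule-based inference, but the underlying trade-off between user intent and safety is the same.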
⋆ This research was partially supported by FCT (Portuguese Science and Technology Foundation) under contract POSI/1999/SRI/33594. The first author would like to thank the Polytechnic Institute of Tomar for all its support.

In order to extend RobChair's accessibility, a new interface has been incorporated on the wheelchair: a voice HMI. Voice/speech is a natural form of communication and is perfectly suited to users with severe motor limitations. However, this new interface does not completely solve the steering problem. Low-level voice commands are discrete and give only rough direction information. In domestic/office environments, usually with narrow spaces and dynamic obstacles, it is very