GyGSLA: A Portable Glove System for Learning Sign Language Alphabet

Luís Sousa¹, João M.F. Rodrigues¹(B), Jânio Monteiro², Pedro J.S. Cardoso¹, and Roberto Lam¹

¹ LARSyS and ISE, University of the Algarve, 8005-139 Faro, Portugal
luiscarlosrsousa@outlook.com, {jrodrig,pcardoso,rlam}@ualg.pt
² INEC-ID (Lisbon) and ISE, University of the Algarve, 8005-139 Faro, Portugal
jmmontei@ualg.pt

Abstract. Communication between people with normal hearing and those with hearing or speech impairments is difficult. Learning a new alphabet is not always easy, especially a sign language alphabet, which requires both hand skills and practice. This paper presents the GyGSLA system, a completely portable setup created to help inexperienced people in the process of learning a new sign language alphabet. To achieve this, a computer/mobile game-interface and a hardware device, a wearable glove, were developed. When interacting with the computer or mobile device through the wearable glove, the user is asked to represent alphabet letters and digits by replicating the hand and finger positions shown on a screen. The glove then sends the hand and finger positions to the computer/mobile device over a wireless interface, which interprets the letter or digit being formed by the user and assigns it a corresponding score. The system was tested with three completely inexperienced sign language subjects, achieving a 76% average recognition ratio for the Portuguese sign language alphabet.

Keywords: HCI · Gesture recognition · Sign Language · Assistive technologies

1 Introduction

Sign Language (SL) is a communication medium for deaf and mute people, and Natural User Interface (NUI) is a term used for Human-Computer Interaction (HCI) where the interface is invisible, or becomes invisible after successive user-immersion levels; it typically relies on natural human actions and elements.
Sign language uses manual communication and body language to convey meaning, which can involve simultaneously combining hand shapes, the orientation and movement of the hands, arms, or body, and facial expressions to fluidly express a speaker's thoughts.

In terms of NUI, there are currently several sensors with the ability to track and recognize body gestures, such as the Kinect [6], Leap Motion [8], and Structure Sensor [11]. All these sensors are of great importance to the industry of

© Springer International Publishing Switzerland 2016
M. Antona and C. Stephanidis (Eds.): UAHCI 2016, Part III, LNCS 9739, pp. 159–170, 2016.
DOI: 10.1007/978-3-319-40238-3_16