Using Mid-Air Gestures To Enhance Collaborative Experiences For Disabled Users On Multi-Touch Tables

Chris Creed, John Sear, Russell Beale

Digital Humanities Hub, European Research Institute, University of Birmingham, Birmingham (UK), B15 2TT
{creedcpl, j.a.sear}@bham.ac.uk, r.beale@cs.bham.ac.uk

Abstract. Multi-touch tables can help facilitate and improve collaborative interactions, yet many users with physical disabilities (e.g. wheelchair users) find it extremely difficult to interact with large tables. This paper presents work in progress exploring the potential to enhance multi-touch table accessibility so that disabled users can participate more effectively in collaborative interactions. An overview of a prototype that allows disabled users to manipulate digital content via mid-air finger gestures is provided, along with details of upcoming research studies.

1 Introduction

Multi-touch tables can help encourage collaborative and shared experiences by enabling multiple users to interact with digital content simultaneously. However, whilst they hold much potential for facilitating collaborative interactions, many users with physical disabilities still find it extremely difficult to interact with them. This is especially true for wheelchair users, who find it hard to position themselves close to tables and are consequently severely constrained in their ability to use a touch table. This, in turn, makes it particularly challenging for disabled users to participate effectively in collaborative interactions with non-disabled users.

We are currently exploring a low-cost solution to examine whether the use of mid-air finger gestures can make touch tables more accessible to wheelchair users, and the impact this has on their ability to contribute actively to collaborative tasks.
To investigate this we are making use of the new Leap Motion sensor, which enables users to control interfaces using their fingers and other "pointers" (e.g. a pencil or stick). The sensor can "see" a user's hands and can be programmed to detect different mid-air gestures performed in 3D space (e.g. grab, swipe, and wrist rotation). The Leap Motion sensor is also small and light enough to be comfortably incorporated into the border around a table, thus providing wheelchair users with an unobtrusive tool for accessing all areas of the screen.

The combination of multi-touch and mid-air gestures in a single interface holds much potential for enhancing the accessibility of multi-touch tables. In this paper we provide details of a prototype that enables wheelchair users to interact with multi-