Analysis of bit-rate definitions for Brain-Computer Interfaces

Julien Kronegg, Svyatoslav Voloshynovskiy, Thierry Pun
Computer Vision and Multimedia Laboratory, Computer Science Dept., University of Geneva, Switzerland
Email: {firstname.surname}@cui.unige.ch, phone: +41 22 3797628, fax: +41 22 3797780

Abstract - A comparison of different bit-rate definitions used in the Brain-Computer Interface (BCI) community is proposed; the assumptions underlying those definitions and their limitations are discussed. Capacity estimates using the Wolpaw and Nykopp bit-rates are computed for various published BCIs. It appears that only Nykopp's bit-rate is coherent with channel coding theory. Wolpaw's definition might lead to underestimating the real bit-rate and to drawing wrong conclusions about the optimal number of symbols; its use should be avoided. The usage of a proper bit-rate assessment is motivated and advocated. Finally, it is found that the typical signal-to-noise ratio of current BCIs lies around 0 dB.

Keywords: brain-computer interface, bit-rate, information transfer rate, number of classes, information theory.

1 Introduction

A Brain-Computer Interface (BCI) is an input device that allows a user to drive a specific application (e.g. a virtual keyboard [10], cursor control [28], robot control [24]) using EEG data induced by thinking of a specific notion or mental state (e.g. mental calculation, imagination of movement, mental rotation of objects). This mental state is then recognized by the machine using a classifier. The first objective BCI performance measure is due to Wolpaw et al. in 1998 [28], where the bit-rate, or information transfer rate, was defined based on Shannon's channel theory with some simplifying assumptions. Bit-rates commonly reported range from 5 to about 25 bits/minute [29]. In this article, we compare the bit-rate definitions used in the BCI domain and propose recommendations for optimizing the number of mental states in a BCI.
The article is organized as follows: first, a review of noisy channel theory is presented, as a support for the BCI model described. Existing bit-rate definitions used in the BCI domain are then presented and analyzed.

2 Noisy Channel Theory

A channel is a communication medium that allows the transmission of information from a sender A to a receiver B. Due to imperfections in that medium, the transmission process is subject to noise, and B might receive information differing from the one emitted by A. The simplest noisy channel is the additive noise channel, where the received signal Y is the sum of an emitted signal X and some independent noise Z, here assumed Gaussian. Since we deal with real, physical input signals, the input signal energy is limited (which also implies that X has zero mean in order to minimize its energy, E[X^2] <= sigma_X^2). The information channel capacity is the quantity of reliable information carried by one symbol transmitted through the channel. The channel capacity depends on the input signal distribution as well as on the signal-to-noise ratio (SNR) [5]. For a continuous input signal and using SNR = 10*log10(sigma_X^2 / sigma_Z^2), the capacity (in bits/symbol) is:

    C = 0.5 * log2(1 + 10^(SNR/10))    (1)

For a discrete Pulse Amplitude Modulated input with N symbols of a priori probability p(X = x_i) = 1/N (denoted p(x_i)), the capacity C_N is defined by Eq. 2:

    C_N = sum_{i=1..N} p(x_i) * integral_{-inf..+inf} p(y|x_i) * log2[ p(y|x_i) / p(y) ] dy    (2)

with p(y) = sum_{k=1..N} p(x_k) * p(y|x_k) and p(y|x_j) = (1 / sqrt(2*pi*sigma_Z^2)) * e^(-(y - x_j)^2 / (2*sigma_Z^2)).

Figure 1: Comparison of the capacity for Gaussian input C and for discrete equiprobable input C_N as a function of the number of symbols N (N = 2 to 512) and of the SNR; the plot also shows the continuous Gaussian source curve and the asymptotic capacity for a discrete equiprobable signal.

2005 Int. Conf. on Human-Computer Interaction (HCI'05), Las Vegas, Nevada, USA, June 20-23, 2005. http://vision.unige.ch/
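As an illustrative sketch (not part of the original paper), Eq. 1 and Eq. 2 can be evaluated numerically to reproduce the kind of curves shown in Figure 1. The helper names `capacity_gaussian` and `capacity_pam` are hypothetical; the sketch assumes unit noise variance sigma_Z^2 = 1 and equiprobable, power-normalized PAM levels.

```python
import numpy as np

def capacity_gaussian(snr_db):
    """Eq. 1: capacity of the Gaussian channel in bits/symbol."""
    return 0.5 * np.log2(1.0 + 10.0 ** (snr_db / 10.0))

def capacity_pam(n, snr_db, grid=4000):
    """Eq. 2: capacity C_N for n equiprobable PAM symbols in additive
    Gaussian noise, evaluated by numerical integration over y.
    Assumes sigma_Z = 1; levels are centred and scaled so that
    E[X^2] matches the requested SNR."""
    sigma_z = 1.0
    levels = np.arange(n) - (n - 1) / 2.0            # centred PAM levels
    levels = levels / np.sqrt(np.mean(levels ** 2))  # unit signal power
    levels = levels * np.sqrt(10.0 ** (snr_db / 10.0)) * sigma_z
    y = np.linspace(levels.min() - 8 * sigma_z,
                    levels.max() + 8 * sigma_z, grid)
    dy = y[1] - y[0]
    # p(y|x_i): one Gaussian centred on each symbol level (rows = symbols)
    pyx = np.exp(-(y[None, :] - levels[:, None]) ** 2 / (2 * sigma_z ** 2))
    pyx /= np.sqrt(2 * np.pi) * sigma_z
    py = pyx.mean(axis=0)                            # p(y), since p(x_i) = 1/n
    # integrand of Eq. 2; clamping avoids log(0) where densities underflow
    integrand = pyx * np.log2(np.maximum(pyx, 1e-300) / np.maximum(py, 1e-300))
    return (integrand.sum(axis=1) * dy).mean()       # sum_i p(x_i) * integral
```

For instance, at the roughly 0 dB SNR reported for current BCIs, Eq. 1 gives C = 0.5 bits/symbol, and C_2 is slightly below that, consistent with the discrete curves in Figure 1 lying under the continuous Gaussian one.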