A P300-based EEG-BCI for Spatial Navigation Control

Adrian Curtin, Hasan Ayaz, Yichuan Liu, Patricia A. Shewokis, Banu Onaral

Abstract— In this study, a Brain Computer Interface (BCI) based on the P300 oddball paradigm was developed for spatial navigation control in virtual environments. The functionality and efficacy of the system were analyzed with results from nine healthy volunteers. Each participant was asked to gaze at an individual target in a 3x3 P300 matrix containing symbolic navigational icons while EEG signals were collected. The resulting ERPs were processed online, and the classified commands were executed to control spatial movement within the MazeSuite virtual environment, with feedback presented to the user during the experiment. Subjects demonstrated, on average, ~89% online accuracy in simple mazes and ~82% online accuracy in longer, more complex mazes. These results suggest that this BCI setup enables guided free-form navigation in virtual 3D environments.

I. INTRODUCTION

Brain Computer Interface (BCI) systems translate brain-derived non-muscular signals into new pathways of communication and control [1]. These mechanisms provide a direct channel from brain activity to action, acting as a potential alternative to neuromuscular control in cases where such routes are compromised, or serving a supplementary role for healthy individuals. BCI research efforts principally focus on the restoration of communication for patients suffering from crippling neuromuscular diseases such as amyotrophic lateral sclerosis (ALS), and on neuroprosthetic control for amputees and spinal cord injury victims. However, BCI has recently been extended to non-disabled individuals, with applications in gaming, entertainment, and 3D virtual environments. BCIs designed for use in virtual environments provide several advantages to researchers. Interactive feedback from the task increases protocol engagement and subject motivation, two factors which have been associated with improved BCI performance [2] and with reduced training times [3].
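The reported online accuracies can be put in context with the standard Wolpaw information-transfer-rate formula for an N-choice selection task. This sketch is illustrative only and is not part of the paper's own analysis; the function name is mine:

```python
import math

def wolpaw_itr_bits_per_selection(n_choices: int, accuracy: float) -> float:
    """Wolpaw ITR in bits per selection:
    B = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1))
    """
    if accuracy <= 1.0 / n_choices:
        return 0.0  # at or below chance level: clamp to zero by convention
    if accuracy >= 1.0:
        return math.log2(n_choices)  # perfect accuracy transfers log2(N) bits
    p, n = accuracy, n_choices
    return (math.log2(n)
            + p * math.log2(p)
            + (1 - p) * math.log2((1 - p) / (n - 1)))

# A 3x3 navigation matrix offers 9 possible commands per selection.
simple_maze = wolpaw_itr_bits_per_selection(9, 0.89)   # ~2.34 bits/selection
complex_maze = wolpaw_itr_bits_per_selection(9, 0.82)  # ~1.95 bits/selection
print(f"{simple_maze:.2f} vs {complex_maze:.2f} bits per selection")
```

Bits per selection would still need to be divided by the time per selection (not given in this excerpt) to obtain bits per minute.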
Virtual reality BCI systems have been implemented to prototype control of physical systems such as wheelchairs and other robotic devices [4, 5], as well as to provide test-bed platforms for further BCI development [6].

Manuscript received March 15, 2012. A. Curtin, H. Ayaz, Y. Liu, and B. Onaral are with the Drexel University School of Biomedical Engineering, Science & Health Systems, and P. Shewokis is with the College of Nursing and Health Professions, Drexel University, Philadelphia, PA 19104, USA. Phone: 215-571-3709; Fax: 215-571-3718; e-mail: (abc48, ayaz, yl565, shewokis, banu.onaral)@drexel.edu.

The P300 spelling matrix BCI, first described by Farwell and Donchin [7], is considered one of the classic BCI systems. It relies on the elicitation of the P300 event-related potential (ERP) through an oddball paradigm of randomly intensified icon rows and columns. The P300 component of the ERP is associated with a positive peak that develops approximately 300 ms after a rare stimulus is presented. Because participants are instructed to focus on a particular target, intensification of that target produces a stronger P300 than non-target intensifications. In the original system, the classified results of online processing were used to drive a virtual alpha-numeric keyboard. P300-based BCI systems possess the advantages of rapid development, minimal training times, and relatively high information transfer rates [8]. The use of P300 for spatial navigation has not been fully explored, but virtual and physical control of wheelchair systems involving automated navigation has been reported in the literature [9, 10]. Another study used a P300 system featuring position selection as a component of a virtual environment control system, in which the subject's avatar was automatically moved to a specified virtual destination [11]. Additionally, a study adapting a four-direction P300 cursor control system to a four-door selection task has been reported [12].
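The row/column oddball scheme described above can be sketched as follows. The shuffled flash schedule and the intersection rule for picking the attended cell are standard for Farwell-Donchin-style matrices; the classifier scores in the example are simulated placeholders for real per-flash ERP scores, and the function names are illustrative:

```python
import random

def flash_sequence(n_rows=3, n_cols=3, repetitions=10, seed=0):
    """Oddball intensification schedule: within each repetition block,
    every row and every column flashes exactly once, in shuffled order."""
    rng = random.Random(seed)
    seq = []
    for _ in range(repetitions):
        block = ([("row", i) for i in range(n_rows)]
                 + [("col", j) for j in range(n_cols)])
        rng.shuffle(block)
        seq.extend(block)
    return seq

def choose_target(row_scores, col_scores):
    """Select the attended cell as the intersection of the row and the
    column whose averaged flash responses score highest (strongest P300)."""
    best_row = max(range(len(row_scores)), key=lambda i: row_scores[i])
    best_col = max(range(len(col_scores)), key=lambda j: col_scores[j])
    return best_row, best_col

# Simulated averaged scores for a 3x3 matrix; the subject attends the cell
# at row 2, column 0, so that row and that column score highest.
row_scores = [0.1, 0.2, 0.9]
col_scores = [0.8, 0.1, 0.3]
print(choose_target(row_scores, col_scores))  # -> (2, 0)
```

With a 3x3 matrix, each repetition block contains 6 flashes (3 rows + 3 columns), of which only 2 intensify the attended cell, preserving the rare-stimulus character of the oddball paradigm.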
In that study, each room in the virtual environment was identical, and additional icons next to the targets were used to identify the desired action, rather than information from the spatial task itself. This paper documents the development and demonstration of a P300-based BCI system that allows guided free-form spatial navigation in complex virtual 3D environments. A new P300 matrix was developed that controls navigation from a first-person view in the virtual environment. The system was created through the integration of two freely available software packages: the BCI2000 framework [13] and the MazeSuite virtual environment platform [14-16].

II. MATERIALS AND METHODS

A. Subjects

Ten healthy right-handed participants (9 male, 1 female; Edinburgh Handedness Inventory [17] LQ = 64±23.75) aged 19-23 (mean age = 22.1) volunteered for the experiment; however, one subject was excluded due to technical issues with the recording equipment (N = 9). Each individual gave written informed consent through documentation approved by the Drexel University IRB and answered demographic and survey questions related to the protocol. Participants were paid for their time, were self-selected based on exclusion criteria concerning drug usage and prescription medications known to have psychiatric effects, and all self-reported no prior experience in BCI use or research.

B. EEG Setup

Data acquisition was completed using a Neuroscan
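The epoching step implied by the online ERP analysis can be illustrated as follows. This is a sketch on simulated data only: the sampling rate, epoch window, and signal amplitudes are assumptions for demonstration, not the paper's actual Neuroscan acquisition settings:

```python
import numpy as np

FS = 250  # assumed sampling rate in Hz (illustrative, not the paper's setting)

def extract_epochs(eeg, onsets, fs=FS, tmax=0.8):
    """Cut fixed-length post-stimulus epochs from a continuous 1-D signal."""
    n = int(tmax * fs)
    return np.stack([eeg[s:s + n] for s in onsets])

# Simulated data: target flashes carry a positive deflection near 300 ms.
rng = np.random.default_rng(1)
t = np.arange(int(0.8 * FS)) / FS
p300 = 5.0 * np.exp(-((t - 0.3) ** 2) / (2 * 0.05 ** 2))  # bump at 300 ms
eeg = rng.normal(0.0, 1.0, 10 * FS)  # 10 s of background noise
target_onsets = [250, 750, 1250]
for s in target_onsets:
    eeg[s:s + len(t)] += p300  # inject the P300-like response
nontarget_onsets = [500, 1000, 1500]

# Averaging across epochs suppresses noise and leaves the evoked response.
avg_t = extract_epochs(eeg, target_onsets).mean(axis=0)
avg_nt = extract_epochs(eeg, nontarget_onsets).mean(axis=0)
peak_ms = 1000 * t[np.argmax(avg_t)]
print(f"target-average peak near {peak_ms:.0f} ms")  # close to 300 ms
```

In the actual system, such averaged target/non-target differences would feed the online classifier that issues the navigation commands.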