HSI 2013 Sopot, Poland, June 06-08, 2013
978-1-4673-5637-4/13/$31.00 ©2013 IEEE
Abstract. Brain-computer interface (BCI) systems allow
interaction with machines through a channel that does not
involve the traditional motor pathways of the human nervous
system. Thus they can be used by people with severe motor
disabilities or by those whose limbs are occupied with other
tasks. In the BCI systems that have recently attracted the
greatest interest of researchers, electrical brain activity is
measured on the scalp, so they are essentially noninvasive.
Using EEG measurements as the input to a BCI offers the
advantages of low cost and high temporal resolution. However,
due to the small amplitude of the signal components, the
relatively high noise power and the poor spatial resolution,
achieving high speed, high accuracy and a large number of
targets is a challenge. At present,
the steady-state visual evoked potential (SSVEP) BCI
paradigm is believed to provide the most promising way of
optimizing BCI performance in that sense. A review of
SSVEP BCI projects is presented, including studies of the
diversity of the human EEG response to visual stimulation, as
well as the design of techniques for visual stimulation, EEG
signal acquisition and analysis that yield the best BCI
performance. The review is based both on the literature and
on the results of the authors' own teamwork.
Keywords: Brain-computer interface (BCI), Steady-state
visual evoked potentials (SSVEP), Alternate half-field
stimulation.
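The tradeoff between speed, accuracy and the number of targets noted in the abstract is commonly quantified by the Wolpaw information transfer rate (ITR). As an illustration only (the numeric example is not taken from the paper), a minimal Python sketch of the standard Wolpaw formula:

```python
import math

def itr_bits_per_selection(n_targets: int, accuracy: float) -> float:
    """Wolpaw information transfer rate per selection, in bits.

    n_targets -- number of selectable targets N
    accuracy  -- probability P that a selection is correct
    """
    if n_targets < 2:
        raise ValueError("need at least two targets")
    if accuracy <= 0.0 or accuracy > 1.0:
        raise ValueError("accuracy must be in (0, 1]")
    bits = math.log2(n_targets)
    if accuracy < 1.0:
        bits += accuracy * math.log2(accuracy)
        bits += (1.0 - accuracy) * math.log2((1.0 - accuracy) / (n_targets - 1))
    return bits

def itr_bits_per_minute(n_targets: int, accuracy: float,
                        selection_time_s: float) -> float:
    """Scale the per-selection ITR by the selection rate."""
    return itr_bits_per_selection(n_targets, accuracy) * 60.0 / selection_time_s

# Hypothetical example: 8 targets, 90 % accuracy, one selection every 3 s.
print(round(itr_bits_per_minute(8, 0.90, 3.0), 1))  # → 45.0
```

The formula shows why all three factors matter at once: adding targets raises the log2(N) term, but only if accuracy and selection time do not degrade in return.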
I. INTRODUCTION
THE number of "smart" devices and appliances around us
has grown quickly in recent decades. It is not only computers,
tablets and cellular phones that comprise a processor running
a complex program. The operation and performance of cars and
of home appliances, such as the washing machine, the
microwave oven, the TV set, etc., strongly depend on the
computational power, and on the quality of the software, of
the digital electronic systems embedded in them. The number
of applications of computers to daily-life situations is
growing tremendously and continues beyond imagination.
Still, it seems that the rate of progress in the performance
of computational systems is not accompanied by an equally
fast development of the interfaces necessary for information
exchange between machines and their users.
As an example, for more than 40 years the standard
man-machine interfaces for a personal computer have been the
keyboard, the LCD text/graphics display and the mouse. The
keyboard in particular, whose principle of operation has not
changed much since the invention of the typewriter in the
mid-19th century, does not meet users' expectations.
Even in its touch-screen version, the keyboard requires
special training from its user (which in most cases is, in
fact, not undertaken), and the speed of transferring
information from human to computer through this channel has
not increased over the last 150 years. (There may be a small
increase in the speed of entering plain text or commands,
thanks to prompting dictionaries.) In any case, using the
keyboard requires unnatural, repetitive finger movements.
Computer use has been associated
with musculoskeletal problems of the upper extremities:
neck, shoulders, arms, etc. Several postures and behaviors
have been suggested as related to these disorders, including
the position in which the hands are held, neck and shoulder
position, wrist posture and hyperextension of the fingers [1].
Research on alternative man-computer interfaces that better
utilize the natural ways of human communication is therefore
well justified, if not urgently needed. The new approaches
are expected to provide better functionality and speed, to be
easier to use, and not to cause severe disorders of the
organs of the human body. Examples of devices of this type
are the joystick, the graphics tablet and the touchpad,
together with software that recognizes multi-touch hand input
and gestures. One should notice that almost all of the new
interfaces (except perhaps speech recognition and
text-to-speech software) require movements of the user's
limbs or fingers. However, there are groups of users who are
unable to make such movements.
These include fighter pilots or car drivers whose limbs are
occupied with other tasks, as well as motor-impaired people,
or persons paralyzed after accidents or neurological
diseases, who cannot move their limbs and cannot speak, but
whose minds operate normally and who need ways of
communicating with the external world.
Thus, there is a need to develop interfaces that would
allow users to enter data into computers without involving
the traditional motor pathways of the human nervous
system. A solution is a brain-computer interface (BCI) [2].
The operation of a BCI is based on the analysis of brain
activity and is independent of the activity of muscles or
other nerves. In these interfaces, the intention or will of
the user is not expressed by any movement, gesture or
command; rather, it is "guessed" from the analysis of
measured signals that reflect the brain activity.
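For the SSVEP paradigm reviewed in this paper, such "guessing" can be as simple as finding which candidate stimulation frequency dominates the spectrum of the recorded EEG. The following Python sketch is an illustration only: the frequencies, sampling rate and synthetic test signal are assumptions, not the authors' experimental setup.

```python
import numpy as np

def detect_ssvep_target(eeg, fs, stim_freqs, n_harmonics=2):
    """Pick the stimulation frequency with the highest spectral power.

    eeg        -- 1-D array, one EEG channel (e.g. an occipital electrode)
    fs         -- sampling rate in Hz
    stim_freqs -- candidate flicker frequencies, one per target
    Returns the index of the most likely attended target.
    """
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    scores = []
    for f in stim_freqs:
        score = 0.0
        for h in range(1, n_harmonics + 1):
            # sum the power in the FFT bin nearest to each harmonic of f
            k = int(np.argmin(np.abs(freqs - h * f)))
            score += spectrum[k]
        scores.append(score)
    return int(np.argmax(scores))

# Synthetic example: 2 s of "EEG" containing a 15 Hz SSVEP plus noise.
fs = 256
t = np.arange(2 * fs) / fs
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 15 * t) + 0.5 * rng.standard_normal(t.size)
print(detect_ssvep_target(eeg, fs, [8.0, 10.0, 15.0]))  # → 2
```

Practical SSVEP BCIs use more robust detectors (e.g. canonical correlation analysis over multiple channels), but the nearest-bin power comparison above conveys the principle: each target flickers at a distinct frequency, and attention to a target raises the EEG power at that frequency and its harmonics.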
Research projects aimed at the development of BCIs started
about 40 years ago. Some of them have already brought
spectacular results, such as the mind-controlled prosthesis
developed at Technical University Graz [3]. Still, BCI
functionality is far from expectations. This gives
High-Speed Noninvasive Brain-Computer Interfaces
A. Materka¹, P. Poryzała¹
¹Lodz University of Technology, Lodz, Poland,
andrzej.materka@p.lodz.pl, pawel.poryzala@p.lodz.pl