Tuesday, 1 November 2016

Brain-Computer Interfaces in Medicine


Article source: ncbi.nlm.nih.gov
Until recently, the dream of being able to control one's environment through thoughts had been in the realm of science fiction. However, the advance of technology has brought a new reality: Today, humans can use the electrical signals from brain activity to interact with, influence, or change their environments. The emerging field of brain-computer interface (BCI) technology may allow individuals unable to speak and/or use their limbs to once again communicate or operate assistive devices for walking and manipulating objects. Brain-computer interface research is an area of high public awareness. Videos on YouTube as well as news reports in the lay media indicate intense curiosity and interest in a field that hopefully one day soon will dramatically improve the lives of many disabled persons affected by a number of different disease processes.
This review seeks to provide the general medical community with an introduction to BCIs. We define BCI and then review some of the seminal discoveries in this rapidly emerging field, the brain signals used by BCIs, the essential components of a BCI system, current BCI systems, and the key issues now engaging researchers. Challenges are inherent in translating any new technology to practical and useful clinical applications, and BCIs are no exception. We discuss the potential uses and users of BCI systems and address some of the limitations and challenges facing the field. We also consider the advances that may be possible in the next several years. A detailed presentation of the basic principles, current state, and future prospects of BCI technology was recently published.

What is a BCI?

A BCI is a computer-based system that acquires brain signals, analyzes them, and translates them into commands that are relayed to an output device to carry out a desired action. Thus, BCIs do not use the brain's normal output pathways of peripheral nerves and muscles. This definition strictly limits the term BCI to systems that measure and use signals produced by the central nervous system (CNS). Thus, for example, a voice-activated or muscle-activated communication system is not a BCI. Furthermore, an electroencephalogram (EEG) machine alone is not a BCI because it only records brain signals but does not generate an output that acts on the user's environment. It is a misconception that BCIs are mind-reading devices. Brain-computer interfaces do not read minds in the sense of extracting information from unsuspecting or unwilling users but enable users to act on the world by using brain signals rather than muscles. The user and the BCI work together. The user, often after a period of training, generates brain signals that encode intention, and the BCI, also after training, decodes the signals and translates them into commands to an output device that accomplishes the user's intention.

Milestones in BCI Development

Can observable electrical brain signals be put to work as carriers of information in person-computer communication or for the purpose of controlling devices such as prostheses? That was the question posed by Vidal in 1973. His Brain-Computer Interface Project was an early attempt to evaluate the feasibility of using neuronal signals in a person-computer dialogue that enabled computers to be a prosthetic extension of the brain. Although work with monkeys in the late 1960s showed that signals from single cortical neurons can be used to control a meter needle, systematic investigations with humans really began in the 1970s. Initial progress in human BCI research was slow and limited by computer capabilities and our own knowledge of brain physiology. In 1980, Elbert et al demonstrated that persons given biofeedback sessions of slow cortical potentials in EEG activity could change those potentials to control the vertical movements of a rocket image traveling across a television screen. In 1988, Farwell and Donchin showed how the P300 event-related potential could be used to allow normal volunteers to spell words on a computer screen. The mu and beta rhythms (ie, sensorimotor rhythms) recorded over the sensorimotor cortex have been known since the 1950s to be associated with movement or movement imagery. In the late 1970s, Kuhlman showed that the mu rhythm can be enhanced by EEG feedback training. Starting from this information, Wolpaw et al trained volunteers to control sensorimotor rhythm amplitudes and use them to move a cursor on a computer screen accurately in 1 or 2 dimensions. In 2006, a microelectrode array was implanted in the primary motor cortex of a young man with complete tetraplegia after a C3-C4 cervical spinal cord injury. Using the signals obtained from this electrode array, a BCI system enabled the patient to open simulated e-mail, operate a television, open and close a prosthetic hand, and perform rudimentary actions with a robotic arm. In 2011, Krusienski and Shih demonstrated that signals recorded directly from the cortical surface (electrocorticography [ECoG]) can be translated by a BCI to allow a person to accurately spell words on a computer screen. Brain-computer interface research is growing at an extremely rapid rate, as evidenced by the number of peer-reviewed publications in this field over the past 10 years (Figure 1).
FIGURE 1
Brain-computer interface articles in the peer-reviewed scientific literature. Over the past 15 years, BCI research, which was previously confined to a few laboratories, has become an extremely active and rapidly growing scientific field. Most articles ...

Physiologic Signals Used By BCIs

In principle, any type of brain signal could be used to control a BCI system. The most commonly studied signals are the electrical signals produced mainly by neuronal postsynaptic membrane polarity changes that occur because of activation of voltage-gated or ion-gated channels. The scalp EEG, first described by Hans Berger in 1929, is largely a measure of these signals. Most of the early BCI work used scalp-recorded EEG signals, which have the advantages of being easy, safe, and inexpensive to acquire. The main disadvantage of scalp recordings is that the electrical signals are significantly attenuated in the process of passing through the dura, skull, and scalp. Thus, important information may be lost. The problem is not simply theoretical: epileptologists have long known that some seizures that are clearly identifiable during intracranial recordings are not seen on scalp EEG. Given this possible limitation, recent BCI work has also explored ways of recording intracranially.
Small intracortical microarrays like the one implanted in the previously mentioned case of tetraplegia may be embedded in the cortex. These intracortical microarray systems can record the action potentials of individual neurons and the local field potentials (essentially a micro-EEG) produced by a relatively limited population of nearby neurons and synapses. The disadvantages of such implants are the degree of invasiveness, with the need for craniotomy and neurosurgical implantation, the restricted area of recording, and the still unanswered question of the long-term functional stability of the recording electrodes. In addition to scalp EEG and intracortical BCIs, ECoG-based BCIs use another approach to record brain signals. These BCIs use signals acquired by grid or strip electrodes on the cortical surface or stereotactic depth macroelectrodes that record intraparenchymally or from within the ventricles. These electrode arrays have the advantage of recording intracranially and can record from larger areas of the brain than intracortical microarrays. However, these electrodes also need neurosurgical implantation, and the question of long-term electrode signal recording stability is as yet unanswered. Each of these methods has its own strengths and weaknesses. Which ones are best for which purposes and which user populations remains to be seen. As BCIs come into clinical use, the choice of the recording method is likely to depend in considerable measure on the needs of the individual BCI user and the technological support and resources available (Table).
TABLE
Brain Signal Recording Techniques to Control Brain-Computer Interface Systems
The advance of functional neuroimaging techniques with high spatiotemporal resolution now provides potential new methods for recording brain signals to control a BCI. Magnetoencephalography (MEG) measures mainly the magnetic fields generated by electrical currents moving along pyramidal cell axons. The mu rhythm as detected by MEG was used by a sensorimotor BCI system to control a computer cursor. Modulation of the posterior alpha rhythm as recorded by MEG was reported to provide satisfactory control of a 2-dimensional BCI task. Functional magnetic resonance imaging (fMRI) and functional near-infrared imaging (fNIR) measure the blood oxygenation of a cerebral region, which correlates with neural activity. Lee et al demonstrated that control of a robotic arm solely through the person's thought processes was possible using a real-time fMRI-based BCI. These BCI methods are in the early phases of research and development. MEG and fMRI are at present extremely expensive and cumbersome, and fMRI and fNIR have relatively slow response times. Thus, the potential value of these newer functional imaging methods for BCI purposes remains uncertain (although fMRI could prove valuable in locating appropriate sites for implantation of microelectrode arrays).

Components of a BCI System

The purpose of a BCI is to detect and quantify features of brain signals that indicate the user's intentions and to translate these features in real time into device commands that accomplish the user's intent (Figure 2). To achieve this, a BCI system consists of 4 sequential components: (1) signal acquisition, (2) feature extraction, (3) feature translation, and (4) device output. These 4 components are controlled by an operating protocol that defines the onset and timing of operation, the details of signal processing, the nature of the device commands, and the oversight of performance. An effective operating protocol allows a BCI system to be flexible and to serve the specific needs of each user.
FIGURE 2
Components of a BCI system. Electrical signals from brain activity are detected by recording electrodes located on the scalp, on the cortical surface, or within the brain. The brain signals are amplified and digitized. Pertinent signal characteristics ...
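To make the 4 sequential components concrete, the following sketch chains them as plain Python functions. It is a minimal illustration only: the function names, the synthetic sine-wave stand-in for a brain signal, and the single-feature threshold rule are assumptions made for this example, not elements of any BCI system described in this review.

```python
import numpy as np

FS = 250  # assumed sampling rate in Hz

def acquire_signal(duration_s=2.0):
    """Stage 1: signal acquisition (here, a synthetic 12 Hz oscillation plus noise)."""
    t = np.arange(0, duration_s, 1.0 / FS)
    return np.sin(2 * np.pi * 12 * t) + 0.5 * np.random.randn(t.size)

def extract_features(signal):
    """Stage 2: feature extraction (mean power in the 8-12 Hz mu band via FFT)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / FS)
    band = (freqs >= 8) & (freqs <= 12)
    return spectrum[band].mean()

def translate_features(mu_power, threshold=1000.0):
    """Stage 3: feature translation (a hypothetical threshold rule -> device command)."""
    return "MOVE_CURSOR_UP" if mu_power < threshold else "MOVE_CURSOR_DOWN"

def device_output(command):
    """Stage 4: device output (here just printed; a real system would drive hardware)."""
    print(f"Device command: {command}")

# One pass through the pipeline; a real BCI repeats this loop continuously
# under an operating protocol that also returns feedback to the user.
device_output(translate_features(extract_features(acquire_signal())))
```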

Signal Acquisition

Signal acquisition is the measurement of brain signals using a particular sensor modality (eg, scalp or intracranial electrodes for electrophysiologic activity, fMRI for metabolic activity). The signals are amplified to levels suitable for electronic processing (and they may also be subjected to filtering to remove electrical noise or other undesirable signal characteristics, such as 60-Hz power line interference). The signals are then digitized and transmitted to a computer.
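As a hedged illustration of this step, the snippet below applies the kind of preprocessing just described to a digitized channel: a 60-Hz notch filter for power line interference followed by a band-pass filter. The sampling rate, filter orders, and cutoff frequencies are assumed values chosen for the example rather than settings from any particular system.

```python
import numpy as np
from scipy.signal import iirnotch, butter, filtfilt

fs = 250.0  # assumed sampling rate (Hz)

# Synthetic stand-in for a digitized channel: 10 Hz activity plus 60 Hz line noise.
t = np.arange(0, 4.0, 1.0 / fs)
raw = (np.sin(2 * np.pi * 10 * t)
       + 0.8 * np.sin(2 * np.pi * 60 * t)
       + 0.2 * np.random.randn(t.size))

# 60-Hz notch filter to suppress power line interference.
b_notch, a_notch = iirnotch(w0=60.0, Q=30.0, fs=fs)
notched = filtfilt(b_notch, a_notch, raw)

# Band-pass filter (1-40 Hz here) to remove slow drift and high-frequency noise.
b_bp, a_bp = butter(N=4, Wn=[1.0, 40.0], btype="bandpass", fs=fs)
cleaned = filtfilt(b_bp, a_bp, notched)

print(f"Raw variance: {raw.var():.3f}  Filtered variance: {cleaned.var():.3f}")
```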

Feature Extraction

Feature extraction is the process of analyzing the digital signals to distinguish pertinent signal characteristics (ie, signal features related to the person's intent) from extraneous content and representing them in a compact form suitable for translation into output commands. These features should have strong correlations with the user's intent. Because much of the relevant (ie, most strongly correlated) brain activity is either transient or oscillatory, the most commonly extracted signal features in current BCI systems are time-triggered EEG or ECoG response amplitudes and latencies, power within specific EEG or ECoG frequency bands, or firing rates of individual cortical neurons. Environmental artifacts and physiologic artifacts such as electromyographic signals are avoided or removed to ensure accurate measurement of the brain signal features.
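For example, power within a specific EEG frequency band, one of the feature types listed above, can be estimated with a standard power spectral density routine. The sketch below uses Welch's method; the 8- to 12-Hz mu band, the window length, and the synthetic signal are illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, f_lo, f_hi, nperseg=256):
    """Estimate power in the [f_lo, f_hi] Hz band from Welch's power spectral density."""
    freqs, psd = welch(signal, fs=fs, nperseg=nperseg)
    mask = (freqs >= f_lo) & (freqs <= f_hi)
    # Integrate the PSD over the band (rectangle rule, uniform frequency spacing).
    return float(np.sum(psd[mask]) * (freqs[1] - freqs[0]))

# Example with a synthetic 10 Hz "mu rhythm" embedded in noise.
fs = 250.0
t = np.arange(0, 4.0, 1.0 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)

mu_power = band_power(eeg, fs, 8.0, 12.0)     # feature modulated by motor imagery
beta_power = band_power(eeg, fs, 18.0, 26.0)  # a second illustrative feature
print(f"mu: {mu_power:.4f}, beta: {beta_power:.4f}")
```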

Feature Translation

The resulting signal features are then passed to the feature translation algorithm, which converts the features into the appropriate commands for the output device (ie, commands that accomplish the user's intent). For example, a power decrease in a given frequency band could be translated into an upward displacement of a computer cursor, or a P300 potential could be translated into selection of the letter that evoked it. The translation algorithm should be dynamic to accommodate and adapt to spontaneous or learned changes in the signal features and to ensure that the user's possible range of feature values covers the full range of device control.
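A deliberately simple translation rule of the kind described here, in which a power decrease in a given band is mapped to an upward cursor displacement, might look like the sketch below. The baseline statistics and gain are hypothetical; practical translation algorithms are adaptive and typically learned from calibration data.

```python
def translate_to_cursor_velocity(mu_power, baseline_mean, baseline_std, gain=1.0):
    """Map a sensorimotor-rhythm power feature to a vertical cursor velocity.

    Power below the user's baseline (eg, during motor imagery) moves the cursor
    up; power above baseline moves it down. Normalizing by baseline statistics
    keeps the feature within a usable range of device control.
    """
    z = (mu_power - baseline_mean) / baseline_std
    return -gain * z  # negative z (power decrease) -> positive (upward) velocity

# Illustrative use with assumed baseline statistics from a calibration run.
velocity = translate_to_cursor_velocity(mu_power=0.8, baseline_mean=1.2, baseline_std=0.3)
print(f"Vertical cursor velocity: {velocity:+.2f} (arbitrary units)")
```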

Device Output

The commands from the feature translation algorithm operate the external device, providing functions such as letter selection, cursor control, robotic arm operation, and so forth. The device operation provides feedback to the user, thus closing the control loop.

Current Electrophysiologic BCI Systems

BCIs That Use Scalp-Recorded EEG

Noninvasive EEG-based BCIs are the most widely researched approach owing to the minimal risk involved and the relative convenience of conducting studies and recruiting participants. The applications to date are generally limited to low-degree-of-freedom continuous movement control and discrete selection. Sensorimotor rhythms have been used to control cursors in 1, 2, and 3 dimensions, a spelling device, conventional assistive devices, a hand orthosis, functional electrical stimulation (FES) of a patient's hand, robotic and prosthetic devices, and a wheelchair. Two-dimensional cursor control has also been achieved via attention modulation.
Because of its relative ease of implementation and performance, one of the most researched BCI paradigms is the visual P300 speller, which has been demonstrated successfully in both healthy and disabled persons for typing, Internet browsing, guidance of a wheelchair along predetermined paths, and other applications. Like the P300 evoked response, steady-state visual evoked potentials are innate and require no training, but they are capable of providing faster response times. On the other hand, P300-based BCIs are much less dependent on eye-movement control than are BCIs based on steady-state visual evoked potentials. Steady-state visual evoked potentials have been used for binary selection, both discrete and continuous control of a cursor in 2 dimensions, prosthesis control, FES, spelling, and environmental control. For patients with impaired vision, various auditory and tactile paradigms have been investigated. A few studies are now focused on the critical need to move BCI systems out of the laboratory and into patients' homes, which raises many complex patient, caregiver, and implementation issues.
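To indicate how a visual P300 speller selects a character, the sketch below scores averaged post-stimulus epochs for each flashed row and column of a letter matrix and picks the pair with the largest response. The 6x6 matrix, the 250- to 450-ms scoring window, and the synthetic epochs are illustrative assumptions; working systems rely on trained classifiers rather than a raw amplitude score.

```python
import numpy as np

FS = 250  # assumed sampling rate (Hz)
MATRIX = np.array([list("ABCDEF"), list("GHIJKL"), list("MNOPQR"),
                   list("STUVWX"), list("YZ1234"), list("56789_")])

def p300_score(epochs):
    """Score one row/column: mean amplitude 250-450 ms after the flash,
    averaged across repeated flashes (epochs: n_flashes x n_samples)."""
    avg = epochs.mean(axis=0)
    window = slice(int(0.25 * FS), int(0.45 * FS))
    return avg[window].mean()

def select_character(row_epochs, col_epochs):
    """row_epochs/col_epochs: lists of epoch arrays, one entry per row/column."""
    best_row = int(np.argmax([p300_score(e) for e in row_epochs]))
    best_col = int(np.argmax([p300_score(e) for e in col_epochs]))
    return MATRIX[best_row, best_col]

# Synthetic demonstration: the attended character sits in row 2, column 4 ("Q").
rng = np.random.default_rng(0)
def fake_epochs(is_target):
    e = rng.normal(0, 1, size=(10, FS))            # 10 flashes, 1-s epochs of noise
    if is_target:
        e[:, int(0.3 * FS):int(0.4 * FS)] += 2.0   # add a P300-like deflection
    return e

rows = [fake_epochs(i == 2) for i in range(6)]
cols = [fake_epochs(j == 4) for j in range(6)]
print("Selected character:", select_character(rows, cols))
```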
In addition, some researchers are exploring the use of BCIs in neurorehabilitation. The hypothesis is that BCIs can augment current rehabilitation therapies by reinforcing and thereby increasing more effective use of impaired brain areas and connections. Studies in stroke patients have shown that, with a motor relearning intervention, EEG features change in parallel with improvement in motor function and that sensorimotor rehabilitation using BCI training and motor imagery may improve motor function after CNS injury. It also appears that combining a BCI with FES or assistive robotics may aid motor relearning in stroke patients. Brain-computer interface-based therapy might provide a useful complement to standard neurorehabilitation methods and might lower cost by reducing the need for the constant presence of a rehabilitation therapist.

BCIs That Use ECoG Activity

ECoG activity is recorded from the cortical surface, and thus it requires the implantation of a subdural or epidural electrode array. ECoG records signals of higher amplitude than EEG and offers superior spatial resolution and spectral bandwidth. In addition to the lower-frequency (<40 Hz) activity that dominates the EEG, ECoG includes higher-frequency (ie, >40-Hz gamma band) activity up to 200 Hz and possibly higher. Gamma activity is important because it exhibits very precise functional localization; is highly correlated with specific aspects of motor, language, and cognitive function; and is linked to the firing rates of individual neurons and to blood-oxygen level–dependent signals detected by fMRI.
Individual finger, hand, and arm movements have been decoded successfully from ECoG. ECoG-based BCIs have controlled 1- or 2-dimensional cursor movements using motor or sensory imagery or working memory (dorsolateral prefrontal cortex). An ECoG-based BCI can enable users to control a prosthetic hand or to select characters using motor imagery or the P300 event-related potential. Most recently, ECoG signals measured over speech cortex during overt or imagined phoneme and word articulation were used for online cursor control and were also accurately decoded off-line for potential application to direct speech synthesis.
It has been shown that epidural ECoG can provide BCI control, that ECoG-based BCI performance with fixed parameters is stable over at least 5 days, and that motor imagery–based BCI control using locations over motor cortex can produce ECoG changes exceeding those produced by actual movements. A study in monkeys found that ECoG recordings and the performance of the model used to decode movement remained stable over several months. These results suggest that ECoG is likely to prove practical for long-term BCI use.

BCIs That Use Activity Recorded Within the Brain

Hochberg et al are continuing clinical trials using a 96-electrode microarray implanted in the right precentral gyrus of patients with tetraplegia (Figure 3). These trials have demonstrated control of a robotic arm, computer cursor, lights, and television, using imagined arm movements. They have recently demonstrated that accurate cursor control performance was still obtainable 1000 days after implantation. Current research is exploring the use of this system for the control of prosthetic limbs and brain-actuated FES of paralyzed muscles.
FIGURE 3
Intracortical microelectrode array and its placement in a patient with tetraplegia. A, The 100-microelectrode array on top of a US penny. B, The microelectrode array in a scanning electron micrograph. C, The preoperative axial T1-weighted magnetic resonance ...
Kennedy et al are continuing clinical trials of a system that uses intracortical microelectrodes encapsulated in glass cones into which neurites grow, providing stable and robust long-term recording. In 1998, this technology was implanted in a patient with locked-in syndrome after a brainstem stroke. During the 4-year trial, the patient learned to control a computer cursor. Current research seeks to restore speech by implanting the device in the speech motor area and decoding phonemes from imagined speech.
In recent studies, 2 patients with stereotactic depth electrodes implanted in the hippocampus before epilepsy surgery were able to use signals from these electrodes to accurately control a P300-based BCI speller.
Ongoing studies in a number of laboratories are working toward achieving natural control of devices such as a prosthetic arm using electrode microarrays implanted in the motor cortex or other cortical areas of nonhuman primates. Plans are under way in several centers to translate these studies into human trials.

The Current Status of BCI Research and Development

At present, the striking achievements of BCI research and development remain confined almost entirely to the laboratory, and the bulk of work to date comprises data gathered from able-bodied humans or animals. Studies in the ultimate target population of people with severe disabilities have been largely confined to a few limited trials closely overseen by research personnel. The translation of the exciting laboratory progress to clinical use, to BCI systems that actually improve the daily lives of people with disabilities, has barely begun.
This essential task is perhaps even more demanding than the laboratory research that produces a BCI system. It must show that a specific BCI system can be implemented in a form suitable for long-term independent home use, define the appropriate user population and establish that they can use the BCI, demonstrate that their home environments can support their use of the BCI and that they do use it, and establish that the BCI improves their lives. This work requires dedicated, well-supported, multidisciplinary research teams that have expertise in the full range of relevant disciplines, including engineering, computer science, basic and clinical neuroscience, assistive technology, and clinical rehabilitation.
There are several headsets with scalp sensors on the market that can be used in conjunction with a personal computer to create a system for controlling third-party software applications. These and similar headsets have been incorporated into several commercial games, some of which claim to enhance focus and concentration via EEG-based neurofeedback. The central issue with these devices is that the nature of the signals they record is not clear. It seems probable that almost all of these devices record mostly nonbrain signals such as electromyographic signals from cranial or facial muscles or electro-oculographic signals from eye movements and blinks. Thus, they are unlikely to be actual BCI systems. One actual BCI that is commercially available is IntendiX (Guger Technologies, Graz, Austria). It is an EEG-based BCI system that implements the classic P300 speller protocol to type messages, produce synthesized speech, or control external devices.

The Future of BCIs: Problems and Prospects

Brain-computer interface research and development generates tremendous excitement in scientists, engineers, clinicians, and the general public. This excitement reflects the rich promise of BCIs. They may eventually be used routinely to replace or restore useful function for people severely disabled by neuromuscular disorders; they might also improve rehabilitation for people with strokes, head trauma, and other disorders.
At the same time, this exciting future can come about only if BCI researchers and developers engage and solve problems in 3 critical areas: signal-acquisition hardware, BCI validation and dissemination, and reliability.

Signal-Acquisition Hardware

All BCI systems depend on the sensors and associated hardware that acquire the brain signals. Improvements in this hardware are critical to the future of BCIs. Ideally, EEG-based (noninvasive) BCIs should have electrodes that do not require skin abrasion or conductive gel (ie, so-called dry electrodes); be small and fully portable; have comfortable, convenient, and cosmetically acceptable mountings; be easy to set up; function for many hours without maintenance; perform well in all environments; operate by telemetry instead of requiring wiring; and interface easily with a wide range of applications. In principle, many of these needs could be met with current technology, and dry electrode options are beginning to become available (eg, from g.tec Medical Engineering, Schiedlberg, Austria). The achievement of good performance in all environments may prove to be the most difficult requirement.
Brain-computer interfaces that use implanted electrodes face a range of complex issues. These systems need hardware that is safe and fully implantable; remains intact, functional, and reliable for decades; records stable signals over many years; conveys the recorded signals by telemetry; can be recharged in situ (or has batteries that last for years or decades); has external elements that are robust, comfortable, convenient, and unobtrusive; and interfaces easily with high-performance applications. Although great strides have been made in recent years and in individual cases microelectrode implants have continued to function over years, it is not clear which solutions will be most successful. ECoG- or local field potential-based BCIs might provide more consistently stable performance than BCIs that rely on neuronal action potentials. Nevertheless, it is possible that major as yet undefined innovations in sensor technology will be required for invasive BCIs to realize their full promise. Much of the necessary research will continue to rely primarily on animal studies before the initiation of human trials.

Validation and Dissemination

As work progresses and BCIs begin to enter actual clinical use, 2 important questions arise: how good a given BCI can get (eg, how capable and reliable) and which BCIs are best for which purposes. To answer the first question, each promising BCI should be optimized and the limits on users' capabilities with it should be defined. Addressing the second question will require consensus among research groups in regard to which applications should be used for comparing BCIs and how performance should be assessed. The most obvious example is the question of whether the performance of BCIs that use intracortical signals is greatly superior to that of BCIs that use ECoG signals, or even EEG signals. For many prospective users, invasive BCIs will need to provide much better performance to be preferable to noninvasive BCIs. It is not yet certain that they can do so. The data to date do not give a clear answer to this key question. On the one hand, it may turn out that noninvasive EEG- or fNIR-based BCIs are used primarily for basic communication, while ECoG- or neuron-based BCIs are used for complex movement control. On the other hand, noninvasive BCIs may prove nearly or equally capable of such complex uses, while invasive BCIs that are fully implantable (and thus very convenient to use) might be preferred by some people even for basic communication purposes. At this point, many different outcomes are possible, and the studies and discussions necessary to select among them have just begun.
The development of BCIs for people with disabilities requires clear validation of their real-life value in terms of efficacy, practicality (including cost-effectiveness), and impact on quality of life. This depends on multidisciplinary groups able and willing to undertake lengthy studies of real-life use in complicated and often difficult environments. Such studies, which are just beginning (eg, by Sellers et al), are an essential step if BCIs are to realize their promise. The validation of BCIs for rehabilitation after strokes or in other disorders will also be demanding and will require careful comparisons with the results of conventional methods alone.
Current BCIs, with their limited capabilities, are potentially useful mainly for people with very severe disabilities. Because this user population is relatively small, these BCIs are essentially an orphan technology: there is not yet adequate incentive for commercial interests to produce them or to promote their widespread dissemination. Invasive BCIs entail substantial costs for initial implantation, plus the cost of ongoing technical support. Although the initial costs of noninvasive BCI systems are relatively modest (eg, $5,000-$10,000), they too require some measure of ongoing technical support. The future commercial practicality of all BCIs will depend on reducing the amount and sophistication of the long-term support required, on increasing the numbers of users, and on ensuring reimbursement from insurance companies and government agencies.
Clear evidence that BCIs can improve motor rehabilitation could greatly increase the potential user population. In any case, if and when further work improves functionality of BCIs and renders them commercially attractive, their dissemination will require viable business models that give both financial incentive for the commercial company and adequate reimbursement to the clinical and technical personnel who will deploy and support the BCIs. The optimal scenario could be one in which BCIs for people with severe disabilities develop synergistically with BCIs for the general population.

Reliability

Although the future of BCI technology certainly depends on improvements in signal acquisition and on clear validation studies and viable dissemination models, these issues pale next to the problem of reliability. Regardless of the recording method, the signal type, or the signal-processing algorithm used, BCI reliability for all but the simplest applications remains poor. Brain-computer interfaces suitable for real-life use must be as reliable as natural muscle-based actions. Without major improvements, the real-life usefulness of BCIs will, at best, remain limited to only the most basic communication functions for those with the most severe disabilities.
Solving this problem depends on recognizing and engaging 3 fundamental issues: the central role of adaptive interactions in BCI operation; the desirability of designing BCIs that imitate the distributed functioning of the normal CNS; and the importance of incorporating additional brain signals and providing additional sensory feedback.
Brain-computer interfaces allow the CNS to acquire new skills in which brain signals take the place of the spinal motor neurons that produce natural muscle-based skills. Muscle-based skills depend for their acquisition and long-term maintenance on continual activity-dependent plasticity throughout the CNS, from the cortex to the spinal cord. This plasticity, which generally requires practice over months or years, enables babies to walk and talk; children to learn reading, writing, and arithmetic; and adults to acquire athletic and intellectual skills.
The acquisition and maintenance of BCI-based skills like reliable multidimensional movement control require comparable plasticity (eg, as described by various investigators). Brain-computer interface operation rests on the effective interaction of 2 adaptive controllers, the CNS and the BCI. The BCI must adapt so that its outputs correspond to the user's intent. At the same time, the BCI should encourage and facilitate CNS plasticity that improves the precision and reliability with which the brain signals encode the user's intent. In sum, the BCI and CNS must work together to acquire and maintain a reliable partnership under all circumstances. The work needed to achieve this partnership has just begun. It involves fundamental neuroscientific questions and may yield important insights into CNS function in general.
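One generic way to make the BCI half of this partnership adaptive is to update a linear decoder continuously from the error between its output and the user's inferred or instructed intent. The least-mean-squares-style update below is a sketch under that assumption; it does not describe the algorithm of any specific system cited here, and the learning rate and "ideal" mapping are hypothetical.

```python
import numpy as np

def lms_update(weights, features, intended_output, learning_rate=0.01):
    """One adaptive step for a linear decoder: output = weights . features.

    The decoder nudges its weights to reduce the error between what the user
    intended (eg, the instructed cursor velocity in a calibration trial) and
    what the BCI actually decoded, so the two adaptive controllers can converge.
    """
    decoded = np.dot(weights, features)
    error = intended_output - decoded
    return weights + learning_rate * error * features

# Illustrative calibration loop with synthetic features and a known target mapping.
rng = np.random.default_rng(1)
true_mapping = np.array([0.5, -1.0, 0.25])   # hypothetical "ideal" decoder
w = np.zeros(3)
for _ in range(2000):
    x = rng.normal(size=3)                   # one feature vector (eg, band powers)
    w = lms_update(w, x, intended_output=true_mapping @ x)
print("Learned weights:", np.round(w, 2))
```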
The principles that govern how the CNS acquires, improves, and maintains its natural muscle-based functions may be the best guide for designing BCIs. Central nervous system control of motor actions is normally distributed across multiple areas. Cortical areas may define the goal and the overall course of an action; however, the details (particularly high-speed sensorimotor interactions) are often handled at subcortical levels.
Brain-computer interface performance is also likely to benefit from distributed control. For BCIs, the distribution would be between the BCI's output commands (ie, the user's intent) and the application device that receives the commands and converts them into action. The optimal distribution will presumably vary from BCI to BCI and from application to application. Realization of reliable BCI performance may be facilitated by incorporating into the application itself as much control as is consistent with the action to be produced, just as the distribution of control within the CNS normally adapts to suit each neuromuscular action.
The natural muscle-based outputs of the CNS reflect the combined contributions of many brain areas from the cortex to the spinal cord. This suggests that BCI performance might be improved and maintained by using signals from multiple brain areas and by using brain signal features that reflect relationships among areas (eg, coherences). By allowing the CNS to function more as it does in producing muscle-based skills, this approach could improve BCI reliability.
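Coherence, the between-area feature mentioned above as an example, can be computed from 2 simultaneously recorded channels with a standard routine. The snippet below uses scipy.signal.coherence; the sampling rate, the shared 20-Hz component, and the segment length are assumptions made for the demonstration.

```python
import numpy as np
from scipy.signal import coherence

fs = 250.0
t = np.arange(0, 10.0, 1.0 / fs)
rng = np.random.default_rng(2)

# Two synthetic channels from "different areas" that share a 20 Hz component.
shared = np.sin(2 * np.pi * 20 * t)
ch_a = shared + 0.5 * rng.normal(size=t.size)
ch_b = 0.8 * shared + 0.5 * rng.normal(size=t.size)

freqs, coh = coherence(ch_a, ch_b, fs=fs, nperseg=512)
band = (freqs >= 18) & (freqs <= 22)
print(f"Mean 18-22 Hz coherence between channels: {coh[band].mean():.2f}")
```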
Using signals from multiple cortical and/or subcortical areas might also resolve another obstacle to fully practical BCIs. In current BCIs, the BCI rather than the user typically determines when output is produced. Ideally, BCIs should be self-paced, so that the BCI is always available and the user's brain signals alone control when BCI output is produced. Brain-computer interfaces that use signals from multiple areas are more likely to be sensitive to the current context and thus may be better able to recognize when their output is or is not appropriate.
Finally, current BCIs provide mainly visual feedback, which is relatively slow and often imprecise. In contrast, natural muscle-based skills rely on numerous kinds of sensory input (eg, proprioceptive, cutaneous, visual, auditory). Brain-computer interfaces that control applications involving high-speed complex movements (eg, limb movement) are likely to benefit from sensory feedback that is faster and more precise than vision. Efforts to provide such feedback via stimulators in cortex or elsewhere have begun. The optimal methods will presumably vary with the BCI, the application, and the user's disability (eg, peripheral inputs may often be ineffective in people with spinal cord injuries).

Conclusion

Many researchers throughout the world are developing BCI systems that a few years ago were in the realm of science fiction. These systems use different brain signals, recording methods, and signal-processing algorithms. They can operate many different devices, from cursors on computer screens to wheelchairs to robotic arms. A few people with severe disabilities are already using a BCI for basic communication and control in their daily lives. With better signal-acquisition hardware, clear clinical validation, viable dissemination models, and, probably most important, increased reliability, BCIs may become a major new communication and control technology for people with disabilities—and possibly for the general population also.

Article Highlights

  • A brain-computer interface (BCI) is a computer-based system that acquires brain signals, analyzes them, and translates them into commands that are relayed to an output device to carry out a desired action.
  • In principle, any type of brain signal could be used to control a BCI system. The most commonly studied signals are electrical signals from brain activity measured from electrodes on the scalp, on the cortical surface, or in the cortex.
  • A BCI system consists of 4 sequential components: (1) signal acquisition, (2) feature extraction, (3) feature translation, and (4) device output. These 4 components are controlled by an operating protocol that defines the onset and timing of operation, the details of signal processing, the nature of the device commands, and the oversight of performance.
  • At present, the striking achievements of BCI research and development remain confined almost entirely to the research laboratory. Studies that seek to demonstrate BCI practicality and efficacy for long-term home use by people with disabilities are just beginning.
  • Brain-computer interfaces may eventually be used routinely to replace or restore useful function for people severely disabled by neuromuscular disorders and to augment natural motor outputs for pilots, surgeons, and other highly skilled professionals. Brain-computer interfaces might also improve rehabilitation for people with strokes, head trauma, and other disorders.
  • The future of BCIs depends on progress in 3 critical areas: development of comfortable, convenient, and stable signal-acquisition hardware; BCI validation and dissemination; and proven BCI reliability and value for many different user populations.

References

1. Wolpaw J.R., Wolpaw E.W., editors. Brain-Computer Interfaces: Principles and Practice. Oxford University Press; New York, NY: 2012.
2. Vidal J.J. Toward direct brain-computer communication. Annu Rev Biophys Bioeng. 1973;2:157–180.[PubMed]
3. Fetz E.E. Operant conditioning of cortical unit activity. Science. 1969;163(3870):955–958. [PubMed]
4. Elbert T., Rockstroh B., Lutzenberger W., Birbaumer N. Biofeedback of slow cortical potentials. I. Electroencephalogr Clin Neurophysiol. 1980;48(3):293–301. [PubMed]
5. Farwell L.A., Donchin E. Talking off the top of your head: toward a mental prosthesis utilizing event-related brain potentials. Electroencephalogr Clin Neurophysiol. 1988;70(6):510–523. [PubMed]
6. Gastaut H. Electrocorticographic study of the reactivity of rolandic rhythm. Rev Neurol (Paris)1952;87:176–182. [PubMed]
7. Kuhlman W.N. EEG feedback training: enhancement of somatosensory cortical activity. Electroencephalogr Clin Neurophysiol. 1978;45(2):290–294. [PubMed]
8. Wolpaw J.R., McFarland D.J. Multichannel EEG-based brain-computer communication. Electroencephalogr Clin Neurophysiol. 1994;90(6):444–449. [PubMed]
9. Wolpaw J.R., McFarland D.J., Neat G.W., Forneris C.A. An EEG-based brain-computer interface for cursor control. Electroencephalogr Clin Neurophysiol. 1991;78(3):252–259. [PubMed]
10. Wolpaw J.R., McFarland D.J. Control of a two-dimensional movement signal by a noninvasive brain-computer interface in humans. Proc Natl Acad Sci U S A. 2004;101(51):17849–17854. [PMC free article][PubMed]
11. Hochberg L.R., Serruya M.D., Friehs G.M. Neuronal ensemble control of prosthetic devices by a human with tetraplegia. Nature. 2006;442(7099):164–171. [PubMed]
12. Krusienski D.J., Shih J.J. Control of a visual keyboard using an electrocorticographic brain-computer interface. Neurorehabil Neural Repair. 2011;25(4):323–331. [PMC free article] [PubMed]
13. Vaughan T.M., Wolpaw J.R. The Third International Meeting on Brain-Computer Interface Technology: making a difference. IEEE Trans Neural Syst Rehabil Eng. 2006;14(2):126–127. [PubMed]
14. Berger H. Uber das electrenkephalogramm des menchen. Arch Psychiatr Nervenkr. 1929;87:527–570.
15. Akhtari M., Bryant H.C., Mamelak A.N. Conductivities of three-layer human skull. Brain Topogr.2000;13(1):29–42. [PubMed]
16. Krusienski D.J., Shih J.J. Control of a brain-computer interface using stereotactic depth electrodes in and adjacent to the hippocampus. J Neural Eng. 2011;8(2):025006. [PMC free article] [PubMed]
17. Shih J.J., Krusienski D.J. Signals from intraventricular depth electrodes can control a brain-computer interface. J Neurosci Methods. 2012;203(2):311–314. [PMC free article] [PubMed]
18. Mellinger J., Schalk G., Braun C. An MEG-based brain-computer interface (BCI) Neuroimage.2007;36(3):581–593. [PMC free article] [PubMed]
19. van Gerven M., Jensen O. Attention modulations of posterior alpha as a control signal for two-dimensional brain-computer interfaces. J Neurosci Methods. 2009;179(1):78–84. [PubMed]
20. Lee J.H., Ryu J., Jolesz F.A., Cho Z.H., Yoo S.S. Brain-machine interface via real-time fMRI: preliminary study on thought-controlled robotic arm. Neurosci Lett. 2009;450(1):1–6. [PMC free article][PubMed]
21. Mak J.N., Wolpaw J.R. Clinical applications of brain-computer interfaces: current state and future prospects. IEEE Rev Biomed Eng. 2009;2:187–199. [PMC free article] [PubMed]
22. Wolpaw J.R., Birbaumer N., McFarland D.J., Pfurtscheller G., Vaughan T.M. Brain-computer interfaces for communication and control. Clin Neurophysiol. 2002;113(6):767–791. [PubMed]
23. McFarland D.J., Krusienski D.J., Sarnacki W.A., Wolpaw J.R. Emulation of computer mouse control with a noninvasive brain-computer interface. J Neural Eng. 2008;5(2):101–110. [PMC free article] [PubMed]
24. Kayagil T.A., Bai O., Henriquez C.S. A binary method for simple and accurate two-dimensional cursor control from EEG with minimal subject training. J Neuroeng Rehabil. 2009;6(May 6):14. [PMC free article][PubMed]
25. McFarland D.J., Sarnacki W.A., Wolpaw J.R. Electroencephalographic (EEG) control of three-dimensional movement. J Neural Eng. 2010;7(3):036007. [PMC free article] [PubMed]
26. Doud A.J., Lucas J.P., Pisansky M.T., He B. Continuous three-dimensional control of a virtual helicopter using a motor imagery based brain-computer interface. PLoS One. 2011;6(10):e26322. [PMC free article][PubMed]
27. Neuper C., Muller-Putz G.R., Scherer R., Pfurtscheller G. Motor imagery and EEG-based control of spelling devices and neuroprostheses. Prog Brain Res. 2006;159:393–409. [PubMed]
28. Cincotti F., Mattia D., Aloise F. Non-invasive brain-computer interface system: towards its application as assistive technology. Brain Res Bull. 2008;75(6):796–803. [PMC free article] [PubMed]
29. Pfurtscheller G., Guger C., Muller G., Krausz G., Neuper C. Brain oscillations control hand orthosis in a tetraplegic. Neurosci Lett. 2000;292(3):211–214. [PubMed]
30. Pfurtscheller G., Muller G.R., Pfurtscheller J., Gerner H.J., Rupp R. 'Thought'-control of functional electrical stimulation to restore hand grasp in a patient with tetraplegia. Neurosci Lett. 2003;351(1):33–36.[PubMed]
31. McFarland D.J., Wolpaw J.R. Brain-computer interface operation of robotic and prosthetic devices. Computer. 2010;41:52–56.
32. Muller-Putz G.R., Scherer R., Pfurtscheller G., Rupp R. EEG-based neuroprosthesis control: a step towards clinical practice. Neurosci Lett. 2005;382(1-2):169–174. [PubMed]
33. Galan F., Nuttin M., Lew E. A brain-actuated wheelchair: asynchronous and non-invasive Brain-computer interfaces for continuous control of robots. Clin Neurophysiol. 2008;119(9):2159–2169. [PubMed]
34. Tanaka K., Matsunaga K., Wang H.O. Electroencephalogram-based control of an electric wheelchair. IEEE Trans Robotics. 2005;21:762–766.
35. Hoffmann U., Vesin J.M., Ebrahimi T., Diserens K. An efficient P300-based brain-computer interface for disabled subjects. J Neurosci Methods. 2008;167(1):115–125. [PubMed]
36. Krusienski D.J., Sellers E.W., McFarland D.J., Vaughan T.M., Wolpaw J.R. Toward enhanced P300 speller performance. J Neurosci Methods. 2008;167(1):15–21. [PMC free article] [PubMed]
37. Nijboer F., Sellers E.W., Mellinger J. A P300-based brain-computer interface for people with amyotrophic lateral sclerosis. Clin Neurophysiol. 2008;119(8):1909–1916. [PMC free article] [PubMed]
38. Piccione F., Giorgi F., Tonin P. P300-based brain computer interface: reliability and performance in healthy and paralysed participants. Clin Neurophysiol. 2006;117(3):531–537. [PubMed]
39. Sellers E.W., Donchin E. A P300-based brain-computer interface: initial tests by ALS patients. Clin Neurophysiol. 2006;117(3):538–548. [PubMed]
40. Sellers E.W., Kubler A., Donchin E. Brain-computer interface research at the University of South Florida Cognitive Psychophysiology Laboratory: the P300 Speller. IEEE Trans Neural Syst Rehabil Eng.2006;14(2):221–224. [PubMed]
41. Sellers E.W., Vaughan T.M., Wolpaw J.R. A brain-computer interface for long-term independent home use. Amyotroph Lateral Scler. 2010;11(5):449–455. [PubMed]
42. Vaughan T.M., McFarland D.J., Schalk G. The Wadsworth BCI Research and Development Program: at home with BCI. IEEE Trans Neural Syst Rehabil Eng. 2006;14(2):229–233. [PubMed]
43. Mugler E.M., Ruf C.A., Halder S., Bensch M., Kubler A. Design and implementation of a P300-based brain-computer interface for controlling an internet browser. IEEE Trans Neural Syst Rehabil Eng.2010;18(6):599–609. [PubMed]
44. Pires G., Castelo-Branco M., Nunes U. Visual P300-based BCI to steer a wheelchair: a Bayesian approach. Conf Proc IEEE Eng Med Biol Soc. 2008;2008:658–661. [PubMed]
45. Rebsamen B., Burdet E., Guan C. Controlling a wheelchair indoors using thought. IEEE Intelligent Systems. 2007;22:18–24.
46. Rebsamen B., Guan C., Zhang H. A brain controlled wheelchair to navigate in familiar environments. IEEE Trans Neural Syst Rehabil Eng. 2010;18(6):590–598. [PubMed]
47. Vanacker G., del R Millan J., Lew E. Context-based filtering for assisted brain-actuated wheelchair driving. Comput Intell Neurosci. 2007:25130. [PMC free article] [PubMed]
48. Allison B.Z., McFarland D.J., Schalk G., Zheng S.D., Jackson M.M., Wolpaw J.R. Towards an independent brain-computer interface using steady state visual evoked potentials. Clin Neurophysiol.2008;119(2):399–408. [PMC free article] [PubMed]
49. Kelly S.P., Lalor E.C., Finucane C., McDarby G., Reilly R.B. Visual spatial attention control in an independent brain-computer interface. IEEE Trans Biomed Eng. 2005;52(9):1588–1596. [PubMed]
50. Kelly S.P., Lalor E.C., Reilly R.B., Foxe J.J. Visual spatial attention tracking using high-density SSVEP data for independent brain-computer communication. IEEE Trans Neural Syst Rehabil Eng. 2005;13(2):172–178. [PubMed]
51. Sutter E.E. The brain response interface: communication through visually-induced electrical brain responses. J Microcomputer Applications. 1992;15:31–45.
52. Trejo L.J., Rosipal R., Matthews B. Brain-computer interfaces for 1-D and 2-D cursor control: designs using volitional control of the EEG spectrum or steady-state visual evoked potentials. IEEE Trans Neural Syst Rehabil Eng. 2006;14(2):225–229. [PubMed]
53. Muller-Putz G.R., Pfurtscheller G. Control of an electrical prosthesis with an SSVEP-based BCI. IEEE Trans Biomed Eng. 2008;55(1):361–364. [PubMed]
54. Gollee H., Volosyak I., McLachlan A.J., Hunt K.J., Graser A. An SSVEP-based brain-computer interface for the control of functional electrical stimulation. IEEE Trans Biomed Eng. 2010;57(8):1847–1855.[PubMed]
55. Middendorf M., McMillan G., Calhoun G., Jones K.S. Brain-computer interfaces based on the steady-state visual-evoked response. IEEE Trans Rehabil Eng. 2000;8(2):211–214. [PubMed]
56. Cecotti H. A self-paced and calibration-less SSVEP-based brain-computer interface speller. IEEE Trans Neural Syst Rehabil Eng. 2010;18(2):127–133. [PubMed]
57. Gao X., Xu D., Cheng M., Gao S. A BCI-based environmental controller for the motion-disabled. IEEE Trans Neural Syst Rehabil Eng. 2003;11(2):137–140. [PubMed]
58. Furdea A., Halder S., Krusienski D.J. An auditory oddball (P300) spelling system for brain-computer interfaces. Psychophysiology. 2009;46(3):617–625. [PubMed]
59. Hinterberger T., Neumann N., Pham M. A multimodal brain-based feedback and communication system. Exp Brain Res. 2004;154(4):521–526. [PubMed]
60. Kubler A., Furdea A., Halder S., Hammer E.M., Nijboer F., Kotchoubey B. A brain-computer interface controlled auditory event-related potential (p300) spelling system for locked-in patients. Ann N Y Acad Sci.2009;1157:90–100. [PubMed]
61. Schreuder M., Blankertz B., Tangermann M. A new auditory multi-class brain-computer interface paradigm: spatial hearing as an informative cue. PLoS One. 2010;5(4):e9813. [PMC free article] [PubMed]
62. Brouwer A.M., Erp J.B. A tactile P300 brain-computer interface. Front Neurosci. 2010;4:19.[PMC free article] [PubMed]
63. Chatterjee A., Aggarwal V., Ramos A., Acharya S., Thakor N.V. A brain-computer interface with vibrotactile biofeedback for haptic information. J Neuroeng Rehabil. 2007;4:40. [PMC free article] [PubMed]
64. Cincotti F., Kauhanen L., Aloise F. Vibrotactile feedback for brain-computer interface operation. Comput Intell Neurosci. 2007:48937. [PMC free article] [PubMed]
65. Muller-Putz G.R., Scherer R., Neuper C., Pfurtscheller G. Steady-state somatosensory evoked potentials: suitable brain signals for brain-computer interfaces? IEEE Trans Neural Syst Rehabil Eng. 2006;14(1):30–37. [PubMed]
66. Neuper C., Muller G.R., Kubler A., Birbaumer N., Pfurtscheller G. Clinical application of an EEG-based brain-computer interface: a case study in a patient with severe motor impairment. Clin Neurophysiol.2003;114(3):399–409. [PubMed]
67. Vaughan T.M., Sellers E.W., Wolpaw J.R. Clinical evaluation of BCIs. In: Wolpaw J.R., Wolpaw E.W., editors. Brain-Computer Interfaces: Principles and Practice. Oxford University Press; New York, NY: 2012.
68. Daly J.J., Wolpaw J.R. Brain-computer interfaces in neurological rehabilitation. Lancet Neurol.2008;7(11):1032–1043. [PubMed]
69. Leuthardt E.C., Schalk G., Roland J., Rouse A., Moran D.W. Evolution of brain-computer interfaces: going beyond classic motor physiology. Neurosurg Focus. 2009;27(1):E4. [PMC free article] [PubMed]
70. Murase N., Duque J., Mazzocchio R., Cohen L.G. Influence of interhemispheric interactions on motor function in chronic stroke. Ann Neurol. 2004;55(3):400–409. [PubMed]
71. Daly J.J., Fang Y., Perepezko E.M., Siemionow V., Yue G.H. Prolonged cognitive planning time, elevated cognitive effort, and relationship to coordination and motor control following stroke. IEEE Trans Neural Syst Rehabil Eng. 2006;14(2):168–171. [PubMed]
72. Enzinger C., Ropele S., Fazekas F. Brain motor system function in a patient with complete spinal cord injury following extensive brain-computer interface training. Exp Brain Res. 2008;190(2):215–223.[PubMed]
73. Daly J.J., Cheng R.C., Hrovat K., Litinas K.H., Rogers J.M., Dohring M.E. Development and testing of non-invasive BCI + FES/robot system for use in motor re-learning after stroke. Proc 13th International Functional Electrical Stimulation Society. 2008:166–168.
74. Hermes D., Miller K.J., Vansteensel M.J., Aarnoutse E.J., Leijten F.S., Ramsey N.F. Neurophysiologic correlates of fMRI in human motor cortex. Hum Brain Mapp. 2011 [PubMed]
75. Lachaux J.P., Fonlupt P., Kahane P. Relationship between task-related gamma oscillations and BOLD signal: new insights from combined fMRI and intracranial EEG. Hum Brain Mapp. 2007;28(12):1368–1375.[PubMed]
76. Manning J.R., Jacobs J., Fried I., Kahana M.J. Broadband shifts in local field potential power spectra are correlated with single-neuron spiking in humans. J Neurosci. 2009;29(43):13613–13620. [PMC free article][PubMed]
77. Miller K.J. Broadband spectral change: evidence for a macroscale correlate of population firing rate? J Neurosci. 2010;30(19):6477–6479. [PubMed]
78. Niessing J., Ebisch B., Schmidt K.E., Niessing M., Singer W., Galuske R.A. Hemodynamic signals correlate tightly with synchronized gamma oscillations. Science. 2005;309(5736):948–951. [PubMed]
79. Ray S., Crone N.E., Niebur E., Franaszczuk P.J., Hsiao S.S. Neural correlates of high-gamma oscillations (60-200 Hz) in macaque local field potentials and their potential implications in electrocorticography. J Neurosci. 2008;28(45):11526–11536. [PMC free article] [PubMed]
80. Acharya S., Fifer M.S., Benz H.L., Crone N.E., Thakor N.V. Electrocorticographic amplitude predicts finger positions during slow grasping motions of the hand. J Neural Eng. 2010;7(4):046002.[PMC free article] [PubMed]
81. Kubanek J., Miller K.J., Ojemann J.G., Wolpaw J.R., Schalk G. Decoding flexion of individual fingers using electrocorticographic signals in humans. J Neural Eng. 2009;6(6):066001. [PMC free article][PubMed]
82. Miller K.J., Zanos S., Fetz E.E., der Nijs M., Ojemann J.G. Decoupling the cortical power spectrum reveals real-time representation of individual finger movements in humans. J Neurosci. 2009;29(10):3132–3137. [PubMed]
83. Scherer R., Zanos S.P., Miller K.J., Rao R.P., Ojemann J.G. Classification of contralateral and ipsilateral finger movements for electrocorticographic brain-computer interfaces. Neurosurg Focus. 2009;27(1):E12.[PubMed]
84. Gunduz A., Sanchez J.C., Carney P.R., Principe J.C. Mapping broadband electrocorticographic recordings to two-dimensional hand trajectories in humans: motor control features. Neural Netw.2009;22(9):1257–1270. [PubMed]
85. Schalk G., Kubanek J., Miller K.J. Decoding two-dimensional movement trajectories using electrocorticographic signals in humans. J Neural Eng. 2007;4(3):264–275. [PubMed]
86. Pistohl T., Ball T., Schulze-Bonhage A., Aertsen A., Mehring C. Prediction of arm movement trajectories from ECoG-recordings in humans. J Neurosci Methods. 2008;167(1):105–114. [PubMed]
87. Felton E.A., Wilson J.A., Williams J.C., Garell P.C. Electrocorticographically controlled brain-computer interfaces using motor and sensory imagery in patients with temporary subdural electrode implants: report of four cases. J Neurosurg. 2007;106(3):495–500. [PubMed]
88. Leuthardt E.C., Schalk G., Wolpaw J.R., Ojemann J.G., Moran D.W. A brain-computer interface using electrocorticographic signals in humans. J Neural Eng. 2004;1(2):63–71. [PubMed]
89. Ramsey N.F., van de Heuvel M.P., Kho K.H., Leijten F.S. Towards human BCI applications based on cognitive brain systems: an investigation of neural signals recorded from the dorsolateral prefrontal cortex. IEEE Trans Neural Syst Rehabil Eng. 2006;14(2):214–217. [PubMed]
90. Schalk G., Miller K.J., Anderson N.R. Two-dimensional movement control using electrocorticographic signals in humans. J Neural Eng. 2008;5(1):75–84. [PMC free article] [PubMed]
91. Wilson J.A., Felton E.A., Garell P.C., Schalk G., Williams J.C. ECoG factors underlying multimodal control of a brain-computer interface. IEEE Trans Neural Syst Rehabil Eng. 2006;14(2):246–250. [PubMed]
92. Brunner P., Ritaccio A.L., Emrich J.F., Bischof H., Schalk G. Rapid communication with a ”P300” matrix speller using electrocorticographic signals (ECoG) Front Neurosci. 2011;5:5. [PMC free article][PubMed]
93. Hinterberger T., Widman G., Lal T.N. Voluntary brain regulation and communication with electrocorticogram signals. Epilepsy Behav. 2008;13(2):300–306. [PubMed]
94. Yanagisawa T., Hirata M., Saitoh Y. Real-time control of a prosthetic hand using human electrocorticography signals. J Neurosurg. 2011;114(6):1715–1722. [PubMed]
95. Leuthardt E.C., Gaona C., Sharma M. Using the electrocorticographic speech network to control a brain-computer interface in humans. J Neural Eng. 2011;8(3):036004. [PMC free article] [PubMed]
96. Canolty T., Soltani M., Dalal S.S. Spatiotemporal dynamics of word processing in the human brain. Front Neurosci. 2007;1(1):185–196. [PMC free article] [PubMed]
97. Pei X., Leuthardt E.C., Gaona C.M., Brunner P., Wolpaw J.R., Schalk G. Spatiotemporal dynamics of electrocorticographic high gamma activity during overt and covert word repetition. Neuroimage.2011;54(4):2960–2972. [PMC free article] [PubMed]
98. Leuthardt E.C., Miller K.J., Schalk G., Rao R.P., Ojemann J.G. Electrocorticography-based brain computer interface—the Seattle experience. IEEE Trans Neural Syst Rehabil Eng. 2006;14(2):194–198.[PubMed]
99. Blakely T., Miller K.J., Zanos S.P., Rao R.P., Ojemann J.G. Robust, long-term control of an electrocorticographic brain-computer interface with fixed parameters. Neurosurg Focus. 2009;27(1):E13.[PubMed]
100. Miller K.J., Schalk G., Fetz E.E., den Nijs M., Ojemann J.G., Rao P. Cortical activity during motor execution, motor imagery, and imagery-based online feedback. Proc Natl Acad Sci U S A.2010;107(9):4430–4435. [PMC free article] [PubMed]
101. Chao Z.C., Nagasaka Y., Fujii N. Long-term asynchronous decoding of arm motion using electrocorticographic signals in monkeys. Front Neuroeng. 2010;3:3. [PMC free article] [PubMed]
102. Schalk G. Can electrocorticography (ECoG) support robust and powerful brain-computer interfaces? Front Neuroeng. 2010;3:9. [PMC free article] [PubMed]
103. Simeral J.D., Kim S.P., Black M.J., Donoghue J.P., Hochberg L.R. Neural control of cursor trajectory and click by a human with tetraplegia 1000 days after implant of an intracortical microelectrode array. J Neural Eng. 2011;8(2):025027. [PMC free article] [PubMed]
104. Pohlmeyer E.A., Oby E.R., Perreault E.J. Toward the restoration of hand use to a paralyzed monkey: brain-controlled functional electrical stimulation of forearm muscles. PLoS One. 2009;4(6):e5924.[PMC free article] [PubMed]
105. Kennedy P.R., Bakay R.A., Moore M.M., Adams K., Goldwaithe J. Direct control of a computer from the human central nervous system. IEEE Trans Rehabil Eng. 2000;8(2):198–202. [PubMed]
106. Kennedy P.R., Bakay R.A. Restoration of neural output from a paralyzed patient by a direct brain connection. Neuroreport. 1998;9(8):1707–1711. [PubMed]
107. Bartels J., Andreasen D., Ehirim P. Neurotrophic electrode: method of assembly and implantation into human motor speech cortex. J Neurosci Methods. 2008;174(2):168–176. [PMC free article] [PubMed]
108. Guenther F.H., Brumberg J.S., Wright E.J. A wireless brain-machine interface for real-time speech synthesis. PLoS One. 2009;4(12):e8218. [PMC free article] [PubMed]
109. Chapin J.K., Moxon K.A., Markowitz R.S., Nicolelis M.A. Real-time control of a robot arm using simultaneously recorded neurons in the motor cortex. Nat Neurosci. 1999;2:664–670. [PubMed]
110. Ganguly K., Carmena J.M. Emergence of a stable cortical map for neuroprosthetic control. PLoS Biol.2009;7(7):e1000153. [PMC free article] [PubMed]
111. Musallam S., Corneil B.D., Greger B., Scherberger H., Andersen R.A. Cognitive control signals for neural prosthetics. Science. 2004;305(5681):258–262. [PubMed]
112. Santhanam G., Ryu S.I., Yu B.M., Afshar A., Shenoy K.V. A high-performance brain-computer interface. Nature. 2006;442(7099):195–198. [PubMed]
113. Taylor D.M., Tillery S.I., Schwartz A.B. Direct cortical control of 3D neuroprosthetic devices. Science.2002;296(5574):1829–1832. [PubMed]
114. Velliste M., Perel S., Spalding M.C., Whitford A.S., Schwartz A.B. Cortical control of a prosthetic arm for self-feeding. Nature. 2008;453(7198):1098–1101. [PubMed]
115. OCZ Technology Web site. http://gear.ocztechnology.com/products/description/OCZ_Neural_Impulse_Actuator/index.html Accessed September 5, 2011.
116. Brainfingers Web site. http://www.brainfingers.com/ Accessed September 5, 2011.
117. Emotiv Web site. http://www.emotiv.com/ Accessed September 5, 2011.
118. NeuroSky, Inc Web site. http://www.neurosky.com/ Accessed September 5, 2011.
119. Mattel Mind Flex Web site. http://shop.mattel.com/product/index.jsp?productId=11695206 Accessed January 27, 2012.
120. Uncle Milton Force Trainer Web site. http://unclemilton.com/star_wars_science/ Accessed September 5, 2011.
121. BrainMaster Technologies, Inc Web site. http://www.brainmaster.com/ Accessed September 5, 2011.
122. EEG Info Web site. http://www.eeginfo.com/ Accessed September 5, 2011.
123. EEG Spectrum International Web site. http://www.eegspectrum.com/ Accessed September 5, 2011.
124. Interactive Productline Mindball Web site. http://www.mindball.se/ Accessed September 5, 2011.
125. Guger Technologies Web site. http://www.gtec.at/ Accessed September 5, 2011.
126. Wolpaw J.R. Brain-computer interface research comes of age: traditional assumptions meet emerging realities. J Mot Behav. 2010;42(6):351–353. [PubMed]

Articles from Mayo Clinic Proceedings are provided here courtesy of The Mayo Foundation for Medical Education and Research
