|The BrainGate is an example of an invasive BCI. (Photo by Michael Edwards)|
In this final entry in the series, I will expand on a topic I touched on last time: a specific category of neuroprosthetic that allows people to manipulate computer devices with their minds. This technology is known as a brain-computer interface (BCI), and it offers enormous potential for disability support.
Broadly speaking, a brain-computer interface is defined as "a method of communication based on neural activity generated by the brain" -- in other words, a means of connecting the brain to an external device. It differs from a neuroprosthetic in that neuroprosthetics can be connected to any part of the nervous system, whereas brain-computer interfaces must be connected to the brain or the central nervous system. Additionally, in most cases a neuroprosthetic connects to only a single device, while a brain-computer interface connects to a more extensive computer system. The boundary between the two fields is not sharp, though, and the terms are often used interchangeably.
Brain-computer interfaces can be categorised as either invasive or non-invasive. Invasive BCIs are essentially what we covered in the last entry -- electrodes are surgically implanted inside the grey matter of the brain. These devices are capable of sending electrical signals directly to and from the brain; applications in the former case include restoration of sight. As for the latter case, devices such as the currently in-development BrainGate are designed to allow people to control a wheelchair, robotic device or computer by reading their brainwaves.
By comparison, non-invasive BCIs consist of a set of electrodes, called an electroencephalograph (EEG), placed onto the scalp, which read brain signals and transmit them to a computer device. However, the skull blocks many of these signals, reducing the amount of detail in the information obtained, and EEGs are not capable of transmitting signals to the brain. Brain activity can also be measured via magnetic resonance imaging, which produces much higher-resolution images of brain activity; but since an MRI machine is far too large for everyday use, it is more useful for research purposes, such as determining the key areas of brain activity when a user thinks about performing a certain action.
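To give a rough idea of what "reading brain signals" involves in practice, EEG software often reduces the raw voltage trace from each electrode to its power in a few frequency bands (for example the 8-13 Hz alpha band, which is prominent when a person is relaxed). The sketch below is purely illustrative: it uses a synthetic signal in place of real EEG data, and the sampling rate and band edges are assumptions, not values from any particular device.

```python
import numpy as np

def band_power(signal, fs, low, high):
    """Average spectral power of `signal` between `low` and `high` Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return power[mask].mean()

fs = 256                      # assumed sampling rate in Hz
t = np.arange(fs * 2) / fs    # two seconds of samples
# Synthetic "EEG": a 10 Hz alpha rhythm buried in noise
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(len(t))

alpha = band_power(signal, fs, 8, 13)   # band containing the 10 Hz rhythm
beta = band_power(signal, fs, 13, 30)   # band containing only noise
print(alpha > beta)  # the alpha band dominates
```

Real systems use more careful spectral estimation and artefact rejection, but the principle -- turning a noisy voltage trace into a small set of numbers a computer can act on -- is the same.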
In the case of non-invasive BCIs, a common application is moving a robotic arm via thought. Machine learning can be applied here -- the user visualises a certain movement, and over time the software learns to associate the resulting brain signals with that movement -- but, as is often the case with machine learning, this requires a significant training period. Where brain-computer interfaces differ from other neuroprosthetics is that their applications can be more abstract. For example, a person could visualise a cursor moving on a computer screen, and the software would carry out that action; the same principle could be applied to a wide variety of tasks, such as typing. The key challenge of brain-computer interfaces is properly interpreting signals from the brain, but once this obstacle is overcome the technology has enormous potential.
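The training loop described above can be caricatured in a few lines: collect labelled feature vectors recorded while the user imagines each movement, then classify new signals by their similarity to what was learned. This is a toy nearest-centroid classifier on synthetic features, not a real BCI pipeline; the "left"/"right" classes, the feature values, and the cluster locations are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic training data: 2-D feature vectors (e.g. band powers from two
# electrode sites) recorded while the user imagines "left" or "right".
left = rng.normal(loc=[1.0, 3.0], scale=0.3, size=(50, 2))
right = rng.normal(loc=[3.0, 1.0], scale=0.3, size=(50, 2))

# "Training": learn one centroid (average feature vector) per movement.
centroids = {"left": left.mean(axis=0), "right": right.mean(axis=0)}

def classify(features):
    """Map a new feature vector to the nearest learned movement."""
    return min(centroids, key=lambda m: np.linalg.norm(features - centroids[m]))

print(classify(np.array([1.1, 2.9])))  # falls near the "left" cluster
print(classify(np.array([2.8, 0.9])))  # falls near the "right" cluster
```

The "significant training period" mentioned above corresponds to gathering enough labelled examples that the learned clusters are reliable; production systems use far more sophisticated classifiers, but the collect-label-classify structure is the same.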
|At the recent Cybathlon event, competitors used non-invasive BCIs to race in a computer game. (Photo by Reuters)|
Vallabhaneni A, Wang T, He B. Brain-computer interface. In: Neural engineering. Springer US; 2005. pp. 85-121.
BrainGate: Turning Thoughts into Action [Internet]. Cyberkineticsinc.com. [cited 4 November 2016]. Available from: http://www.cyberkineticsinc.com/
Grabianowski E. How Brain-computer Interfaces Work [Internet]. HowStuffWorks. 2007 [cited 4 November 2016]. Available from: http://computer.howstuffworks.com/brain-computer-interface1.htm
Lewington L. Cybathlon: Battle of the bionic athletes [Internet]. BBC News. 2016 [cited 4 November 2016]. Available from: http://www.bbc.co.uk/news/technology-37605984