This past week, Elon Musk’s new venture Neuralink made headlines by showing a video of a monkey playing Pong with its mind, using a surgically implanted wireless device that can directly read brain signals and interpret the intended commands.
The Neuralink example and other similar technologies are designed to read and decode neural signals from individual neurons in selected parts of the brain to understand the brain’s outputs.
Specially designed electrodes are surgically implanted into a target region of the brain where the neural signals need to be recorded.
There is a tremendous amount of engineering that goes into developing brain-machine interfaces (BMIs).
The integration of state-of-the-art machine learning to achieve optimized near real-time functionality in BMIs – in other words, adaptive and autonomous ‘smart’ BMIs – is still in its earliest stages.
Such BMIs will be able to adjust their outputs and functions in near real-time to accommodate changing cognitive and physical demands.
In one study, researchers demonstrated a proof-of-concept wireless BMI system that combined state-of-the-art flexible electronics with convolutional neural networks, one of the most successful approaches in machine learning, to allow implanted patients to control a wheelchair.
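To give a feel for the decoding step, here is a minimal, purely illustrative sketch in Python/NumPy of how a convolutional decoder can map a window of multichannel neural signals to a discrete command. This is not the study's actual model: the channel counts, kernel sizes, command set, and random weights are all invented for illustration, and a real system would train these weights on recorded neural data.

```python
import numpy as np

def conv1d(signal, kernels):
    """Valid-mode 1-D convolution of a (channels, time) signal with
    (n_filters, channels, width) kernels -> (n_filters, time_out)."""
    n_filters, n_channels, width = kernels.shape
    t_out = signal.shape[1] - width + 1
    out = np.zeros((n_filters, t_out))
    for f in range(n_filters):
        for t in range(t_out):
            out[f, t] = np.sum(kernels[f] * signal[:, t:t + width])
    return out

def decode_command(window, kernels, readout):
    """Map one signal window to a command index:
    conv -> ReLU -> global average pool -> linear readout -> argmax."""
    feats = np.maximum(conv1d(window, kernels), 0.0)  # ReLU nonlinearity
    pooled = feats.mean(axis=1)                       # (n_filters,)
    scores = readout @ pooled                         # (n_commands,)
    return int(np.argmax(scores))

# Hypothetical command set for a wheelchair controller
COMMANDS = ["stop", "forward", "left", "right"]

rng = np.random.default_rng(0)
window = rng.standard_normal((8, 64))        # 8 channels, 64 samples (made up)
kernels = rng.standard_normal((4, 8, 5)) * 0.1   # untrained, for illustration
readout = rng.standard_normal((4, 4)) * 0.1

print(COMMANDS[decode_command(window, kernels, readout)])
```

In practice the convolution filters and readout weights would be learned from labeled recordings, and the loop-based convolution would be replaced by an optimized library implementation, but the pipeline shape (filter, rectify, pool, classify) is the core idea.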
And now, converging with advances taking place in machine learning, BMIs are on the verge of entirely new capabilities.