Honeywell Aerospace is developing technologies that could let pilots execute simple commands using only their thoughts. The company’s neurobotic control research is an offshoot of its augmented-cognition work for the military, which aims to detect when soldiers or pilots become task-saturated.
To sense what the brain is up to, scientists use electroencephalography (EEG) to monitor the small voltage fluctuations that occur when neurons fire in particular areas of the brain. Honeywell is not so much interested in the sensors themselves, which the consumer electronics industry will likely perfect, as in the software and algorithms that assess cognitive state.
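The article does not describe Honeywell’s actual algorithms, but a common starting point for EEG-based workload assessment is comparing power in standard frequency bands — rising theta (4–8 Hz) and falling alpha (8–13 Hz) power are often associated with higher mental workload. The sketch below is a minimal, hypothetical illustration of that idea, not Honeywell’s method:

```python
import numpy as np

def band_power(signal, fs, lo, hi):
    """Power of an EEG segment within a frequency band, via the FFT."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs < hi)
    return psd[mask].sum()

def workload_index(eeg, fs=256):
    """Hypothetical workload score: theta-band power relative to
    alpha-band power. A real system would use many channels,
    artifact rejection and a trained classifier."""
    theta = band_power(eeg, fs, 4.0, 8.0)
    alpha = band_power(eeg, fs, 8.0, 13.0)
    return theta / (alpha + 1e-12)  # epsilon avoids division by zero
```

A segment dominated by theta activity scores high; one dominated by alpha scores low. Real cognitive-state software layers far more signal cleaning and machine learning on top of features like these.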
Once you can accurately sense that neural activity and the "states" it represents, you can put the thoughts to work.
Santosh Mathan, principal scientist at Honeywell, says one input option is “imagined motor movements”. Close your eyes and imagine you are lifting your left arm and pointing, and you are generating what Mathan calls “imagined movements”, which could be used to control something. That approach is problematic, however, as it “takes a lot of training to do it right”, and not everyone will get good results.
More reliable is a method based on the steady-state visual evoked potential. The idea is that you look at external stimuli, for example geometric shapes oscillating on a display, with different shapes representing different commands.
Mathan says the technology could be used for non-safety-critical functions, although two years ago Honeywell used the evoked-potential approach to fly a Boeing 737 in the simulator, with shapes for left, right, up and down commands. While interesting, Honeywell has no plans to market the idea, at least in the near term.
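The article doesn’t say how the system maps brain signals to commands, but in a typical evoked-potential setup each shape flickers at its own frequency, and the decoder picks the command whose frequency carries the most power in the EEG. The sketch below illustrates that principle; the frequencies and command mapping are made up for the example:

```python
import numpy as np

# Illustrative flicker frequencies (Hz); the article does not give
# the values Honeywell actually used.
COMMANDS = {"left": 8.0, "right": 10.0, "up": 12.0, "down": 15.0}

def decode_ssvep(eeg, fs, commands=COMMANDS):
    """Return the command whose flicker frequency carries the most
    spectral power in the EEG segment (a minimal decoder)."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(eeg)) ** 2

    def power_at(f):
        # Power in the FFT bin nearest the stimulus frequency.
        return psd[np.argmin(np.abs(freqs - f))]

    return max(commands, key=lambda cmd: power_at(commands[cmd]))
```

With this sketch, a couple of seconds of EEG recorded while the pilot stares at the 12 Hz shape should decode as “up”. A fielded system would also need confidence thresholds so that glancing around the cockpit doesn’t issue spurious commands.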