Computers remain limited when it comes to human interaction: they cannot read signals beyond the words people speak or type. But experiments by MIT and Tufts University with a system known as ‘Brainput’ could change all that. Brainput detects when a person is becoming overloaded with work and automatically modifies the computer interface to lighten the load. It does this using a light, portable brain-monitoring device based on a technology called functional near-infrared spectroscopy (fNIRS). When the device detects that a person is multitasking, the data is transmitted to the system, which then adjusts the user’s workload.
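The feedback loop described above can be sketched in a few lines. Everything here is illustrative: the normalised sensor readings, the threshold, and the interface modes are assumptions for demonstration, not the actual MIT/Tufts implementation.

```python
# Hypothetical sketch of a Brainput-style feedback loop.
# Threshold, readings, and interface modes are illustrative assumptions.

def classify_workload(fnirs_signal, threshold=0.7):
    """Label a (hypothetical) normalised fNIRS activation level."""
    return "overloaded" if fnirs_signal > threshold else "normal"

def adapt_interface(state):
    """Switch to a simplified interface mode when the user is overloaded."""
    return "reduced" if state == "overloaded" else "full"

# Simulated stream of normalised brain-activity readings
readings = [0.3, 0.5, 0.85, 0.9, 0.4]
modes = [adapt_interface(classify_workload(r)) for r in readings]
print(modes)  # ['full', 'full', 'reduced', 'reduced', 'full']
```

The point of the design is that the adaptation is continuous and automatic: the interface relaxes as soon as the signal drops back below the threshold, with no explicit action from the user.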
There are already ways for a computer to detect when a person is struggling with excessive workload. It could pick up changes in typing-error patterns and keystroke speed, or use a camera to read facial expressions. “Brainput tries to get closer to the source, by looking directly at brain activity,” says Erin Treacy Solovey, a postdoctoral researcher at MIT, who presented the results earlier this month at the Computer Human Interaction Conference in Austin, Texas.
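The keystroke-based signals mentioned above amount to simple heuristics: a rising error rate or a falling typing speed both hint at overload. A minimal sketch, with assumed metric names and thresholds chosen purely for illustration:

```python
# Illustrative overload check from typing behaviour.
# The metrics and thresholds are assumptions, not a published method.

def overload_from_typing(error_rate, keys_per_minute,
                         max_error_rate=0.1, min_speed=150):
    """Flag possible overload when errors rise or typing slows."""
    return error_rate > max_error_rate or keys_per_minute < min_speed

print(overload_from_typing(0.02, 220))  # False: normal typing
print(overload_from_typing(0.15, 220))  # True: error rate has spiked
print(overload_from_typing(0.02, 90))   # True: typing has slowed down
```

Such proxies are indirect, which is exactly Solovey’s point: fNIRS skips the behavioural middleman and reads the workload signal at its source.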
Such a system could potentially help drivers, pilots, and supervisors of unmanned aerial vehicles, roles in which excessive workload can put human lives at risk.
Perhaps we can also take a holiday while the fNIRS system keeps the show on the road.