
[Photo] Mental load: A user tries the Brainput system. Credit: Erin Treacy Solovey
A Computer Interface that Takes a Load Off Your Mind
A wearable brain scanner could give computers insight into how hard you’re thinking.
Monday, May 14, 2012
By Kate Greene
Conversations between people include a lot more than just words. All sorts of visual and aural cues indicate each party’s state of mind and make for a productive interaction.
But a furrowed brow, a gesticulating hand, and a beaming smile are all lost on computers. Now, researchers at MIT and Tufts are experimenting with a way for computers to gain a little insight into our inner world.
Their system, called Brainput, is designed to recognize when a person’s workload is excessive and then automatically modify a computer interface to make it easier. The researchers used a lightweight, portable brain monitoring technology, called functional near-infrared spectroscopy (fNIRS), that determines when a person is multitasking. Analysis of the brain scan data was then fed into a system that adjusted the user’s workload at those times. A computing system with Brainput could, in other words, learn to give you a break.
There are other ways a computer could detect when a person's mental workload is becoming overwhelming. It could, for example, log typing errors or keystroke speed, or use computer vision to read facial expressions. "Brainput tries to get closer to the source, by looking directly at brain activity," says Erin Treacy Solovey, a postdoctoral researcher at MIT. She presented the results last Wednesday at the Computer Human Interaction Conference in Austin, Texas.
To test Brainput, the researchers had subjects steer robots in search of the strongest Wi-Fi signal while wearing fNIRS sensors, which transmitted information about the drivers' mental state to the robots. The robots, for their part, were programmed to watch for a state of mind called branching, in which a person is simultaneously working on two goals that each require attention. (Previous studies have correlated certain fNIRS signals with this sort of mental state.) When the robots sensed that the driver was branching, they took on more of the navigation themselves.
The researchers found that when the robots’ autonomous mode kicked in, the overall performance of the human-robot team improved. The drivers didn’t seem to notice or get frustrated by the autonomous behavior of the robot when they were multitasking. The researchers also tried increasing the autonomy of the robots when Brainput did not indicate that users were mentally overloaded. When they did this, they found that overall performance decreased. In other words, increased autonomy only helped when users were struggling to cope.
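The adaptive policy described above can be sketched as a simple control loop. This is a minimal illustration, not the published implementation: the classifier, the threshold, and the names (`classify_branching`, `Robot`) are all assumptions made for the sake of the example, and a real fNIRS classifier would be trained on calibrated brain-signal data.

```python
def classify_branching(fnirs_window, threshold=0.7):
    """Stand-in classifier: flags 'branching' (multitasking) when the
    mean of a window of simulated fNIRS readings exceeds a threshold.
    A real system would use a trained model on calibrated signals."""
    return sum(fnirs_window) / len(fnirs_window) > threshold

class Robot:
    """Toy robot with two autonomy levels."""
    def __init__(self):
        self.autonomy = "low"  # driver handles most of the navigation

    def update(self, driver_overloaded):
        # Raise autonomy only when the driver is multitasking; the study
        # found that blanket autonomy actually hurt overall performance.
        self.autonomy = "high" if driver_overloaded else "low"

robot = Robot()
calm_window = [0.2, 0.3, 0.25]   # simulated low-workload readings
busy_window = [0.8, 0.9, 0.85]   # simulated branching-state readings

robot.update(classify_branching(calm_window))
print(robot.autonomy)  # low
robot.update(classify_branching(busy_window))
print(robot.autonomy)  # high
```

The key design point mirrors the study's finding: autonomy is conditional on the detected mental state rather than always on, since extra autonomy only helped when users were genuinely overloaded.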
“A good chunk of computer and human-computer interaction research these days is focused on giving computers better senses so they can either implicitly or explicitly augment our intellect and assist with our tasks,” says Desney Tan, a researcher at Microsoft Research. “This work is a wonderful first step toward understanding our changing mental state and designing interfaces that dynamically tailor themselves so that the human-computer system can be as effective as possible.”
Treacy Solovey suggests that such a system could potentially be used to help drivers, pilots, and supervisors of unmanned aerial vehicles. She says future work will investigate other cognitive states that can be reliably measured using fNIRS.