
Thursday, February 24, 2011

Computers Get In Touch with Your Emotions


Computers could be a lot more useful if they paid attention to how you felt. With the emergence of new tools that can measure a person's biological state, computer interfaces are starting to do exactly that: take users' feelings into account. So claim several speakers at Blur, a conference this week in Orlando, Florida, that focused on human-computer interaction.
Kay Stanney, owner of Design Interactive, an engineering and consulting firm that works with the Defense Advanced Research Projects Agency and the Office of Naval Research, says that a lot of information about a user's mental and physiological state can be measured, and that this data can help computers cater to that user's needs.
Design Interactive is prototyping Next Generation Interactive Systems, or NexIS, a system that will place biological sensors on soldiers. If a sensor detects that a soldier's pulse is weakening, or detects another problem with her physical state, the system might call for help or administer adrenaline. Similar technology could prove useful in civilian settings, Stanney says. For example, sensors on air traffic controllers or baggage screeners could help prevent errors or poor performance, she says.
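The underlying pattern is a monitoring loop that maps sensor readings to actions. Below is a minimal sketch of that idea in Python; the vital-sign names, thresholds, and responses are assumptions made for illustration, not details of the actual NexIS system.

```python
# Hypothetical sketch: the sensor fields, thresholds, and responses below are
# illustrative assumptions, not details of the actual NexIS system.

from dataclasses import dataclass


@dataclass
class VitalSigns:
    pulse_bpm: float     # heart rate reported by a wearable sensor
    systolic_bp: float   # blood pressure in mmHg


def assess_soldier(vitals: VitalSigns) -> str:
    """Return a coarse action based on simple physiological thresholds."""
    if vitals.pulse_bpm < 40 or vitals.systolic_bp < 90:
        return "call_for_help"        # vitals suggest the soldier is in danger
    if vitals.pulse_bpm > 150:
        return "flag_for_monitoring"  # elevated stress or exertion
    return "no_action"


if __name__ == "__main__":
    print(assess_soldier(VitalSigns(pulse_bpm=38, systolic_bp=85)))  # -> call_for_help
```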
Design Interactive is working on another project called Auto-Diagnostic Adaptive Precision Training for Baggage Screeners (Screen-ADAPT), which would aid in training by using measurements including electroencephalography, eye tracking, and heart-rate monitoring to assess the performance of baggage screeners. The idea is to learn how successful screeners scan an image so others can apply similar techniques.
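In outline, a system like this reduces several physiological streams to a single measure that can be compared across trainees. The sketch below shows one way such a combination could look; the features, weights, and scoring rule are assumptions for illustration, not the actual Screen-ADAPT model.

```python
# Illustrative only: the feature names and weights are assumptions, not the
# actual Screen-ADAPT model described by Design Interactive.

def screening_score(eeg_engagement: float,
                    gaze_coverage: float,
                    heart_rate_variability: float) -> float:
    """Combine normalized (0-1) physiological features into one score.

    A trainer could use the score to compare a trainee's scanning behavior
    with that of consistently successful screeners.
    """
    weights = {"eeg": 0.5, "gaze": 0.3, "hrv": 0.2}  # assumed weighting
    return (weights["eeg"] * eeg_engagement
            + weights["gaze"] * gaze_coverage
            + weights["hrv"] * heart_rate_variability)


print(round(screening_score(0.8, 0.6, 0.7), 2))  # -> 0.72
```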
Stanney admits this is challenging, because not every successful baggage screener does the job in exactly the same way. "This will really come down to the art of the algorithm—what it is that we're trying to optimize," Stanney says. Sensors can already detect when a person is drowsy, distracted, overloaded, or engaged. But it would be ideal to be able to determine other states such as frustration, or even to distinguish between different types of frustration, she says.
Some companies are already applying these ideas. Mercedes, for example, has developed algorithms that watch how a driver operates the steering wheel to identify when he might be drowsy. Stanney says the approach could also make personal computers more useful. For example, a computer might eventually be able to detect when a user is overloaded and then suggest that she focus on one application.
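One common heuristic for this kind of drowsiness detection is to look for long stretches with almost no steering input punctuated by sudden large corrections. The sketch below illustrates that heuristic; the thresholds and window handling are assumptions for the example, not Mercedes' actual algorithm.

```python
# A sketch of the general idea (not Mercedes' actual algorithm): drowsy
# drivers tend to make few small steering corrections, then sudden large ones.

def looks_drowsy(steering_angles: list[float],
                 calm_threshold: float = 0.5,
                 jerk_threshold: float = 8.0) -> bool:
    """Flag a window of steering-wheel angles (degrees) as possibly drowsy.

    Heuristic: mostly tiny changes between samples, plus at least one abrupt
    correction. Both thresholds are assumed values for the example.
    """
    if len(steering_angles) < 2:
        return False
    deltas = [abs(b - a) for a, b in zip(steering_angles, steering_angles[1:])]
    idle_fraction = sum(1 for d in deltas if d < calm_threshold) / len(deltas)
    abrupt_correction = max(deltas) > jerk_threshold
    return idle_fraction > 0.9 and abrupt_correction


calm_then_jerk = [0.0] * 30 + [0.1, 10.0]  # long idle stretch, then a sharp correction
print(looks_drowsy(calm_then_jerk))        # -> True
```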
Hans Lee, chief technical officer of EmSense, a San Francisco company that measures users' cognitive and emotional states for market research, says there are plenty of potential applications for a computer that can read a human's mood. "No matter what you do, emotion matters," Lee says.
Lee says studies suggest that 40 percent of people verbally abuse their computers. A device capable of recognizing a user's frustration and addressing it could make workers more efficient, and mean fewer broken monitors. "What if your computer could apologize to you?" Lee says.

