Device by MIT Grad Can Almost Read Minds
They say that only a small fraction of human communication comes from the words themselves; roughly 60% is body language and another 30% is tone of voice. The irony is that most of us are not particularly good at picking up on these nonverbal cues. That may be why Rana El Kaliouby, an MIT grad student, developed the software behind the Emotional Social Intelligence Prosthetic, a device that helps people interpret body language and facial expressions. When the device detects boredom or inattention in the listener, it vibrates to alert the user.
"The video data is used to decide whether the listener is agreeing, disagreeing, thinking, concentrating, interested, or unsure."
"To train her software El Kaliouby used video of actors able to very clearly define an emotion on film. Now, the system is able to pick out the right emotion 90% of the time when using actor footage, and 64% of the time with video clips of everyday people."
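The article doesn't describe El Kaliouby's actual algorithm, but the pipeline it hints at (learn each mental state from labeled actor footage, then classify new feature vectors) can be illustrated with a minimal sketch. This is purely hypothetical: the nearest-centroid approach, the two-dimensional "facial feature" vectors, and the mapping from states to vibration alerts are all invented for illustration.

```python
# Hypothetical sketch of a mental-state classifier like the one described:
# learn a prototype (centroid) per state from labeled training clips,
# then assign new footage to the nearest prototype. All features and the
# "vibrate on disengagement" mapping below are invented for illustration.
import math

# The six states the article says the system distinguishes.
STATES = ["agreeing", "disagreeing", "thinking",
          "concentrating", "interested", "unsure"]

def train_centroids(samples):
    """samples: dict mapping state -> list of feature vectors
    (in the real system, features extracted from actor footage)."""
    centroids = {}
    for state, vecs in samples.items():
        n = len(vecs)
        centroids[state] = [sum(v[i] for v in vecs) / n
                            for i in range(len(vecs[0]))]
    return centroids

def classify(centroids, features):
    """Return the state whose centroid is closest to `features`."""
    return min(centroids, key=lambda s: math.dist(centroids[s], features))

def should_vibrate(state):
    # Invented mapping: alert the wearer when the listener seems
    # disengaged, as the device does for boredom/inattention.
    return state in {"disagreeing", "unsure"}
```

A quick usage example with two made-up states: training on a couple of clips per state, then classifying an ambiguous new clip.

```python
training = {
    "interested": [[0.9, 0.8], [0.8, 0.9]],
    "unsure":     [[0.1, 0.2], [0.2, 0.1]],
}
centroids = train_centroids(training)
state = classify(centroids, [0.15, 0.15])   # nearest to the "unsure" centroid
alert = should_vibrate(state)
```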