Monday, June 24, 2013

Computer correctly identifies emotions for the first time



By Ben Popper | The Verge 



The field of neuroscience has been animated recently by the use of Functional Magnetic Resonance Imaging, or fMRI. When a person lies in an fMRI machine, scientists can see their brain activity in real time. It’s a species of mind reading that promises to unlock the still mysterious workings of our grey matter.

In April, a team in Japan announced that they could identify when a subject was dreaming about different types of objects like a house, a clock, or a husband. Last November, another group of researchers using this technique was able to predict if gadget columnist David Pogue was thinking about a skyscraper or a strawberry.

What earlier studies couldn’t determine, however, was how the subjects were actually feeling. A new study released today by Carnegie Mellon University, which also draws on fMRI, represents the first time researchers have been able to map people’s emotional state based on their neural activity.

"Emotion is a critical part of our lives, but scientifically speaking, it’s been very hard to pin down," said Karim Kassam, an assistant professor of social and decision sciences and the lead author of the study. The gold standard for understanding how people feel has been, quite simply, to ask them. "But if someone is embarrassed by sexually exciting stimulus or knows their views on racial matters are outside the norm, then this kind of self reporting breaks down."

Led by researchers in CMU’s Dietrich College of Humanities and Social Sciences, the study had a group of actors look at words like "anger," "disgust," "envy," "fear," "happiness," "lust," "pride," "sadness," and "shame." As they did so, the actors tried to bring themselves into each emotional state. Their brains were monitored by fMRI, and a computer modeled the results.

Based on these scans, the computer model could then correctly guess the emotion of the actors when they were shown a series of evocative photos. Each emotion essentially had a neural signature. Notably, the patterns of brain activity the computer learned were not limited to those individuals: using only the scans of the actors' brains, the model could correctly identify the emotions of a new test subject who had not participated in the earlier trials.
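The "neural signature" idea can be illustrated with a toy sketch: treat each scan as a vector of voxel activations, average the training scans for each emotion into a template, and label a new scan with whichever template it most closely matches. This is only an illustrative nearest-centroid model on synthetic data — the signatures, noise levels, and voxel count below are all made up, and the CMU team's actual modeling pipeline is not described in this excerpt.

```python
import numpy as np

rng = np.random.default_rng(0)

emotions = ["anger", "happiness", "sadness"]
n_voxels = 50

# Hypothetical ground-truth "neural signatures": one activity pattern per emotion.
signatures = {e: rng.normal(size=n_voxels) for e in emotions}

def simulate_scan(emotion, noise=0.5):
    """Simulate a noisy fMRI voxel pattern for a subject feeling `emotion`."""
    return signatures[emotion] + rng.normal(scale=noise, size=n_voxels)

# "Training": average several scans per emotion into a learned template.
centroids = {e: np.mean([simulate_scan(e) for _ in range(20)], axis=0)
             for e in emotions}

def classify(scan):
    """Label a scan with the emotion whose learned template is nearest."""
    return min(centroids, key=lambda e: np.linalg.norm(scan - centroids[e]))

# A fresh scan (standing in for a new test subject) is matched to a template.
print(classify(simulate_scan("happiness")))
```

The cross-subject result in the study corresponds to the templates generalizing beyond the people they were learned from — something this toy model trivially assumes but which is the hard, surprising part in real neuroimaging data.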

[...]

Read the full article at: theverge.com