Training Computers To Recognize Emotion

Body language has long been an object of study and fascination. The ability to read the natural signals people emit, and thereby interpret their underlying thoughts and emotions, is widely regarded as a powerful tool and has found use in many fields.

In recent years, however, driven by advances in computer technology, there has been a growing effort to develop methods of detecting emotion that do not rely on human analysis.

Nao, a humanoid robot that expresses and detects emotions, was launched in 2010. It hunches its shoulders when it feels sad and raises its arms for a hug when it feels happy. Designed to mimic the emotional skills of a one-year-old child, it can form bonds with people, detect basic human emotions, sense how close a person comes and remember faces. These abilities, however, are confined to a single robot; how can we harness emotion detection more broadly?

The Affective Computing Research Group, based at the Massachusetts Institute of Technology, is working on computers that can read facial expressions and track basic states such as confusion, liking or disliking. Remarkably, this can be done with an ordinary webcam or smartphone camera. The system identifies the main feature points on the face, such as the eyes, eyebrows and nose, along with head movements, and tracks how they change over time. As an extension, the group has developed wearable devices, such as electronic bracelets, that can detect stress or excitement by measuring minute changes in sweat levels.
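The feature-point tracking idea can be sketched in a few lines. The landmark names and coordinates below are purely illustrative, not taken from any real system (an actual pipeline would obtain feature points from a face-tracking library and use far richer models); the sketch only shows how a change in the gap between an eyebrow point and an eye point across frames could signal a raised brow.

```python
# Illustrative sketch: inferring a simple expression cue from facial
# feature points tracked over time. Coordinates are synthetic and use
# a convention where y increases upward.

def eyebrow_raise_score(frames):
    """Average change in the brow-to-eye gap relative to the first
    frame; a positive score suggests the eyebrows are rising."""
    baseline = frames[0]["brow_y"] - frames[0]["eye_y"]
    deltas = [(f["brow_y"] - f["eye_y"]) - baseline for f in frames[1:]]
    return sum(deltas) / len(deltas)

# Synthetic sequence: the eyebrow point drifts upward while the eye
# point stays fixed, as it might during a look of surprise.
frames = [
    {"brow_y": 105.0, "eye_y": 100.0},  # neutral expression
    {"brow_y": 108.0, "eye_y": 100.0},  # brow beginning to rise
    {"brow_y": 110.0, "eye_y": 100.0},  # brow clearly raised
]

score = eyebrow_raise_score(frames)
print(f"eyebrow raise score: {score}")  # positive score: brows raised
```

Real systems track dozens of such points and feed their trajectories into trained classifiers, but the underlying signal is the same: how feature points move relative to one another over time.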

The group claims that “emotion measurement technology will soon be ubiquitous. It will allow people to communicate in new and different ways. It’s a kind of very sophisticated version of the ‘Like’ button on Facebook, potentially taking the experience to a whole new level.” If mobile devices gain the ability to decipher body language, people will end up sharing far more than they expect when they sign up to Facebook or any other social network. The desire to shine and show the best of themselves may be compromised if their machines can read what is really going on in their minds.

The applications of this technology are far-reaching and go beyond social fears. Medically, for instance, it could help people with autism spectrum disorders to read emotion. Commercially, it could be used to evaluate the effect of an advert by tracking the viewer’s emotional response. How about a speech by a political leader – are they telling the truth? Do they actually believe in what they are communicating? What is the emotional response of the audience?

The advances being made in this area are phenomenal and the benefits far-reaching. But there is an underlying sense of nervousness in realizing how fast the gap between humans and machines is narrowing. Machines with human-like emotions? How long will it be before machines are intelligent enough to manipulate us through our emotions?

We suspect not long at all.

Learn more about the author of this post:

Daniel Moeller
I was born in Berlin, Germany, studied Engineering in London and wrote my thesis on emergent computer technology. I now work as an engineer and freelance writer for London-based firms. In my spare time I blog about technology, computer science, social media and design.