The computer that knows what you're thinking

[Image: man with a happy face balloon in front of his face. Caption: Can computers learn to unmask us? (Source: Thinkstock)]

We have all done it: put on a rictus grin while inside we are burning with anger.

It is one of the advantages of being human that we can, when the time is appropriate, conceal our innermost feelings.

But that could be about to change as computers get better not just at recognising faces but also at understanding what the person behind the face is actually thinking.

That is the project Daniel McDuff has been working on at the Massachusetts Institute of Technology's Media Lab, where he is developing emotion-reading computers.

It could eventually lead to machines that have emotional intelligence, or even everyday objects that are able to empathise with our moods - a mirror that knows how you feel about the way you look, a fridge that can offer you food that matches your state of mind or a car that recognises when you are stressed.

Joy and fear

The system Dr McDuff is developing works via a basic webcam that detects a range of facial movements, from frowning to smiling.

It translates that into seven of the most commonly recognised emotional states - sadness, amusement, surprise, fear, joy, disgust and contempt.
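
As a rough illustration of that pipeline (not Affectiva's actual code), the sketch below uses OpenCV to grab a webcam frame and detect a face, with a hypothetical classify_emotion function standing in for the trained model that maps facial movements to the seven states:

```python
# A minimal sketch of the described pipeline: webcam -> face detection
# -> emotion scores. classify_emotion is a hypothetical stand-in.
import cv2

EMOTIONS = ["sadness", "amusement", "surprise", "fear", "joy", "disgust", "contempt"]

# OpenCV's bundled Haar cascade handles the face detection step.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_emotion(face_pixels):
    """Hypothetical placeholder for the trained model, which would map
    facial movements in the cropped face to scores over EMOTIONS."""
    return {emotion: 0.0 for emotion in EMOTIONS}

capture = cv2.VideoCapture(0)                     # the "basic webcam"
ok, frame = capture.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        scores = classify_emotion(gray[y:y + h, x:x + w])
        print("dominant state:", max(scores, key=scores.get))
capture.release()
```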

[Image caption: Human emotions are complex, so can a computer really read them? (Source: Thinkstock)]

The computer learns from a huge database of four million videos of volunteers and paid market-research participants in various emotional states, and the algorithms are constantly updated and tested against real-world scenarios.

The next stage is to integrate voice analysis and other physical signals such as heart rate and hand gestures.
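
One way such multimodal signals might be combined is a simple weighted "late fusion" of per-modality scores; the sketch below is an illustration under that assumption, not the lab's published design:

```python
def fuse_scores(face, voice, physiology, weights=(0.6, 0.25, 0.15)):
    """Blend per-emotion scores from each modality into one estimate.
    The weights are illustrative assumptions."""
    return {emotion: weights[0] * face[emotion]
                     + weights[1] * voice.get(emotion, 0.0)
                     + weights[2] * physiology.get(emotion, 0.0)
            for emotion in face}

# Toy inputs: scores a face model, a voice model and a heart-rate
# model might emit for two of the seven states.
face = {"joy": 0.7, "contempt": 0.1}
voice = {"joy": 0.4, "contempt": 0.3}
physiology = {"joy": 0.5, "contempt": 0.2}

fused = fuse_scores(face, voice, physiology)
print(max(fused, key=fused.get))   # -> joy
```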

Cultural differences

Already the data has revealed that there are big differences in emotional responses between men and women and between different age groups and demographics.

"There are significant differences in different countries as to how people express themselves," Dr McDuff told the BBC.

"In collectivist cultures where your family group is more important, people are most expressive in these small groups, whereas in more individualistic cultures like Western Europe the emphasis is more about building relationships with strangers and people tend to be more positive around people who they are less familiar with."

He also found that when people mask their true feelings, the computer can recognise subtle differences.

[Image caption: The system captures facial expressions (Source: Affectiva)]

"If they are frustrated they will often smile but that smile is different from when someone is genuinely amused," he said.

He acknowledges that "emotions are very complex and the ways people express emotions change in different settings".

"We won't ever build a computer that is better than an expert at interpreting emotions and, at the moment, the technology is at the level of a young child," he said.

[Image caption: Mood-measuring computers could be useful in online education (Source: Thinkstock)]

But the system could still be a valuable tool in a range of situations.

"Children with autism have difficulty processing human emotion and they need to practise communication skills over and over again," said Dr McDuff, who has a family member with the condition.

In fact it was part of his motivation to work in the area of affective computing, as the field is known.

"Imagine if a computer could provide feedback on their communication skills."

Another area where the system could be used is online education, where it could show tutors how well students are coping with and understanding the work.

Mental health

In 2012, researchers at North Carolina State University developed facial recognition software that could "read" emotions. When tested on students during tutorial sessions, the software was able to accurately identify the emotions they were feeling and could tell when students were struggling or when they were bored.

The system Dr McDuff and his team have developed has also been tested with the BBC's audience measurement group to offer insights into how people respond to different shows.

A primetime comedy show was among those tested and it went down very differently with people from different demographics, ethnicities and ages, he revealed.

[Image caption: Do we want machines to know what we are thinking? (Source: Thinkstock)]

He is now planning to use the technology in the mental health arena, in partnership with Affectiva, an MIT spin-off of which he is research director.

"Previously there has been no way for a clinician to monitor patients between appointments. They can prescribe drugs but they don't know how the patient is doing.

"With this system they could start to track how people respond to treatment."

These scenarios may sound a little Big Brother-like, and Dr McDuff recognises that.

"It is scary to think that someone could measure my emotions without me realising it and so it is crucial that we think about the social impact of such technology," he said.

"It is vital that everyone actively opts in to sharing their data."

[Image caption: Pepper has been programmed to respond to human emotion (Source: Getty Images)]

Affective computing is a growing area and firms such as Creative Virtual are developing customer service tools that allow a computer to tell the difference between a customer who is upset and one who is not.

Machines are also being programmed to understand feelings, with Nao robots often used to help communicate with autistic children and Pepper - a robot that can respond to emotions - selling out in Japan.

Google's director of engineering Ray Kurzweil predicts that computers will have emotional intelligence by 2029, by which time he thinks that machines will "be funny, get the joke and understand human emotion".

Neural networks

Tom Austin, lead analyst in artificial intelligence at research firm Gartner, thinks that may be optimistic but is also convinced that affective computing has a lot to offer.

"I've been excited by this area since I read the first paper on it in 1998 but we are only just beginning to understand the level of complexity involved."

Firms such as IPsoft are already offering virtual assistants, like Amelia, that can contextualise queries, while others, such as Arria, IBM Watson and Narrative Science, can identify different tones in text and speech.

"There is a lot of progress at a research level in identifying emotional states but it is not easy for computers to figure out which one and what the response should be," Mr Austin told the BBC.

To do it well will require a "large body of photos, text, speech to feed the neural networks [artificial computer brains] in order to allow them to identify patterns", he thinks.
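
In miniature, that training process looks something like the toy sketch below, which feeds fabricated "facial movement" features to a small network in PyTorch; all sizes and data are illustrative assumptions:

```python
# A toy version of feeding labelled examples to a neural network so
# it can identify patterns. The 128 input features, network sizes
# and random data are all fabricated for illustration.
import torch
import torch.nn as nn

NUM_EMOTIONS = 7                          # the seven states named above

model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(),
                      nn.Linear(64, NUM_EMOTIONS))

features = torch.randn(256, 128)          # fabricated training examples
labels = torch.randint(0, NUM_EMOTIONS, (256,))

optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
for step in range(100):                   # "constantly updated", in miniature
    optimiser.zero_grad()
    loss = loss_fn(model(features), labels)
    loss.backward()
    optimiser.step()
print(f"final training loss: {loss.item():.3f}")
```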

And there remain big ethical questions about a future when a mirror can tell when you are sad.

"People have the right to wear a mask in public and these technologies have to give you the ability to control what the computer sees, records and shares."
