A computer that understands how you feel may sound like something out of a dystopian sci-fi film, but with the ever-advancing capabilities of artificial intelligence, it may soon become a reality.

Neuroscientists at the University of Colorado Boulder, US, have developed a neural network (a computer system modelled on the human brain and nervous system) that can do just that. Dubbed EmoNet, it has been trained to identify the emotion evoked by a horror film, a scene from nature, or a picture of a puppy.

Tor Wager, professor of psychology and neuroscience at CU Boulder and senior author of the study, said:

“Machine learning technology is getting really good at recognising the content of images – of deciphering what kind of object it is. We wanted to ask: Could it do the same with emotions? The answer is yes.”

Researchers believe this could be an important step in the study of emotion.

Philip Kragel, lead author and a postdoctoral research associate at the Institute of Cognitive Science, explains that the research also sheds light on how images are represented in the human brain:

“A lot of people assume that humans evaluate their environment in a certain way and emotions follow from specific, ancestrally older brain systems like the limbic system. We found that the visual cortex itself also plays an important role in the processing and perception of emotion.”

Starting with an existing neural network called AlexNet, which enables computers to recognise objects, the team adapted it to predict how someone would react to a certain image.

The new network, named EmoNet, was then shown 25,000 images and asked to categorise them into different groups such as horror, awe and surprise.
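The adaptation described above is a form of transfer learning: the pretrained object-recognition layers are kept fixed, and only a new final layer is trained to map image features to emotion categories. Below is a minimal numpy sketch of that idea. The feature vectors, labels, layer sizes, and the 20-category count are illustrative assumptions, not the study's actual data or architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

N_FEATURES = 64   # stand-in for the width of AlexNet's penultimate layer
N_EMOTIONS = 20   # illustrative number of candidate emotion categories

# Hypothetical frozen "AlexNet" features for 200 training images.
# In the real pipeline these would come from the pretrained network;
# here they are random vectors, purely for illustration.
X = rng.standard_normal((200, N_FEATURES))
y = rng.integers(0, N_EMOTIONS, size=200)  # hypothetical emotion labels

# Only the new final layer is trained: a linear map from features to
# emotion scores, fit with softmax cross-entropy and gradient descent.
W = np.zeros((N_FEATURES, N_EMOTIONS))
for _ in range(300):
    logits = X @ W
    logits -= logits.max(axis=1, keepdims=True)       # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
    probs[np.arange(len(y)), y] -= 1.0                # cross-entropy gradient
    W -= 0.1 * (X.T @ probs) / len(y)

def predict_emotion(features):
    """Return the index of the most likely emotion category."""
    return int(np.argmax(features @ W))
```

Because the feature extractor stays frozen, only the small matrix `W` is learned, which is why a network built for object recognition can be repurposed for emotion prediction with relatively little training.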

EmoNet reliably categorised 11 of the emotion types, but it recognised some better than others and had trouble with more nuanced emotions such as confusion and surprise.

The neural network was also able to reliably rate the intensity of images, identifying not only the emotion an image might elicit but how strong that emotion might be.

Testing the neural network

Researchers also showed EmoNet movie clips for it to categorise as romantic comedies, action films or horror movies, which it could do accurately three-quarters of the time.


To further test its accuracy, researchers then brought in 18 human subjects.

The volunteers were shown 4-second flashes of 112 images while a functional magnetic resonance imaging (fMRI) machine measured their brain activity. EmoNet saw the same pictures, and its responses, when compared with the volunteers’ brain-activity patterns, were similar.

Kragel explains that EmoNet is able to represent emotions in a human-like way:

“We found a correspondence between patterns of brain activity in the occipital lobe and units in EmoNet that code for specific emotions. This means that EmoNet learned to represent emotions in a way that is biologically plausible, even though we did not explicitly train it to do so.”
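One simple way to quantify the kind of correspondence Kragel describes is to correlate, across images, the activation of an EmoNet emotion unit with the brain response measured in the occipital lobe. The sketch below uses simulated data as a stand-in for both signals; the 112-image count comes from the experiment, but the signals themselves and the noise level are illustrative assumptions (the study's actual analysis was more sophisticated than a single correlation).

```python
import numpy as np

rng = np.random.default_rng(1)

N_IMAGES = 112  # the 112 images shown to both EmoNet and the volunteers

# Hypothetical per-image signals: one EmoNet emotion-unit activation,
# and a simulated occipital response that partly tracks it, plus noise.
emonet_unit = rng.standard_normal(N_IMAGES)
occipital_response = 0.7 * emonet_unit + rng.standard_normal(N_IMAGES)

# Pearson correlation across images as a measure of "correspondence"
# between the network unit and the brain signal.
r = np.corrcoef(emonet_unit, occipital_response)[0, 1]
```

A unit whose activations correlate strongly with occipital activity across many images is, in this sense, coding for emotion in a biologically plausible way, even though nothing in training forced it to.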

In the future, researchers hope to use neural networks to help people digitally screen out negative or harmful images. Currently, human content moderators have to manually trawl through violent and disturbing images to tag them to be removed. It could also be applied to improve computer-human interactions and “help advance emotion research”.


Read more: Machine learning could soon be capable of tracking human emotion