Despite the many benefits of AI, its inability to understand the complexities of human emotion has long been highlighted as a reason why human intervention remains vital: soft skills are something machines have yet to master.
However, it may now be possible to use machine learning to track how someone feels.
Researchers from Rotterdam School of Management, Erasmus University, have developed a new method to track and measure emotions in the brain using electroencephalography (EEG), a technique for monitoring the brain's electrical activity.
Using AI to measure emotions
As part of the research, they collected EEG data from 40 students while the students watched a series of short video clips. Each clip was intended to trigger one of four specific emotions: happiness, sadness, fear or disgust, and participants viewed five short clips for each emotion.
Based on the frequency and topography of the EEG signal, the machine learning algorithm was able to predict which emotion the clip being watched was intended to evoke. The researchers also found that the method could be applied on a moment-by-moment basis, so it could be used to track changes in emotion over time.
Researchers achieved a mean accuracy of 58% when differentiating between all four emotions, well above the 25% chance level, and between 71% and 82% when differentiating between specific emotions.
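The study's exact pipeline is not described here, but the general approach of classifying emotions from EEG band-power features can be sketched as below. This is a minimal illustration on simulated data, not the researchers' method: the sampling rate, frequency bands, trial counts and the choice of logistic regression are all assumptions, and the "EEG" is synthetic noise with an emotion-dependent alpha component injected so the classifier has something to learn.

```python
# Illustrative sketch (not the study's actual pipeline): classify simulated
# EEG trials into four emotions using band-power features per channel.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
FS = 128                                         # sampling rate (Hz), assumed
N_TRIALS, N_CHANNELS, N_SAMPLES = 80, 8, FS * 2  # 2-second trials, assumed
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}  # assumed bands
EMOTIONS = ["happy", "sad", "fear", "disgust"]

# Toy data: each emotion class gets a different 10 Hz (alpha) amplitude
# on top of noise, so the band-power features carry a learnable signal.
labels = rng.integers(0, 4, N_TRIALS)
t = np.arange(N_SAMPLES) / FS
trials = rng.normal(0.0, 1.0, (N_TRIALS, N_CHANNELS, N_SAMPLES))
for i, y in enumerate(labels):
    trials[i] += (1 + y) * np.sin(2 * np.pi * 10 * t)

def band_power(trial):
    """Mean spectral power per channel in each frequency band."""
    freqs = np.fft.rfftfreq(N_SAMPLES, 1 / FS)
    psd = np.abs(np.fft.rfft(trial, axis=-1)) ** 2
    feats = [psd[:, (freqs >= lo) & (freqs < hi)].mean(axis=-1)
             for lo, hi in BANDS.values()]
    return np.concatenate(feats)       # one feature vector per trial

X = np.array([band_power(tr) for tr in trials])
y = labels

# 5-fold cross-validated accuracy; chance level for four classes is 0.25.
acc = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5).mean()
print(f"mean CV accuracy: {acc:.2f}")
```

Because the synthetic classes are strongly separated, this toy example scores far above the accuracies reported in the study; with real EEG, feature choice and artifact handling dominate the result.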
Because emotions are subjective by nature, measuring them accurately and unobtrusively has proven difficult. Artificial intelligence capable of predicting human emotion may seem unnerving to some, but accurately measuring emotional experiences could be extremely useful for businesses, as it offers a valuable tool for tracking customer experience.
It means businesses can understand the emotional effect their product, commercial, website or customer service has on consumers. The method also has potential applications in clinical settings.
Researcher Esther Eijlers said that the technology makes it possible to measure emotions accurately and to track emotional experience over time:
“The results showed that the algorithm we used can successfully predict which emotions were experienced during the viewing of the video clips.
“The ultimate validation that the classification algorithm worked came when we showed the participants a clip from the animated Pixar movie ‘Up’. The clip tells the story of the lives of a man and woman who get married, and grow old together.
“We estimated the average happy and sad responses across participants, second-by-second during the movie clip. It appeared that the emotional response, which was estimated based on the EEG data, was able to accurately track the main ups and downs of the story.”