December 13, 2019 (updated 17 Dec 2019, 12:51pm)

AI Now: Affect recognition should be banned from “important decisions”

By Ellen Daniel

Emotion detecting AI should not be allowed to make important decisions, AI research institute AI Now has warned.

This is one of the recommendations made in the New York University institute’s annual report intended to “ensure that AI systems are accountable to the communities and contexts they are meant to serve”.

Emotional AI, also known as affect recognition, uses artificial intelligence to analyse micro-expressions with the aim of identifying human emotions.

A number of companies are commercialising affect recognition technology for a range of applications including recruitment, monitoring students in the classroom, customer services and criminal justice.

For example, HireVue offers software that screens job candidates for different qualities, and BrainCo is developing headbands that claim to detect students’ attention levels.

According to the report, the emotion-detection and recognition market was worth $12bn in 2018, and could grow to over $90bn by 2024. However, despite growing interest in the technology, AI Now warns that it is based on “markedly shaky foundations”.

AI Now warns of affect recognition

Although these technologies have potentially useful applications, the report says that they are “at best incomplete and at worst entirely lack validity”, failing to reliably identify emotions without considering context and often detecting facial movements that can be misinterpreted.

There is also evidence to suggest that the technology can show bias. For example, a study by Dr Lauren Rhue found that two emotion recognition programmes assigned more negative emotional scores to black individuals in a data set of 400 photos of NBA players.

The rapid growth of the technology is particularly concerning in circumstances such as criminal justice, with the institute calling for those deploying affect recognition to “scrutinise why entities are using faulty technology to make assessments about character”.

As well as calling for more stringent regulation, the report states that affect recognition should not play a role in decisions such as “who is interviewed or hired for a job, the price of insurance, patient pain assessments, or student performance in school”, with governments prohibiting its use in “high-stakes decision making processes”.
