The Information Commissioner’s Office (ICO) has warned organisations against using biometric-based emotion analysis technologies. These technologies process data on behavioural and physical characteristics such as facial expressions and heartbeats.

The ICO is concerned about the risks of using emotion tech to make impactful decisions about people. The data privacy watchdog worries that organisations are making critical decisions about individuals without appreciating that there is no scientific evidence the technology works, the BBC reported.


“The inability of algorithms which are not sufficiently developed, to detect emotional cues, means there’s a risk of systemic bias, inaccuracy and even discrimination,” the ICO said in a statement.

Artificial intelligence (AI) is central to emotion tech, but it is only as good as the underlying data.

“AI can contribute to perpetuate inherent biases, for example associating speech and facial expressions to personality traits,” Laura Petrone, principal analyst at GlobalData, tells Verdict. “The inaccuracy of AI models is still a thorny issue for businesses and regulators. In the future further evidence is likely to push for more regulation [like the UK’s new AI rulebook].”

“Humans aren’t able to make robust links between inner emotions and biometric markers, it is improbable AI could do this,” Caroline Carruthers, CEO and co-founder of data consultancy, Carruthers and Jackson, tells Verdict.


Sceptics say emotion tech is still in its infancy and should be used cautiously.

“It can be very tempting for businesses to believe there’s a silver bullet which will let them use this technology to ‘peek behind the curtain’ to see, for example, what potential hires think,” Alistair Dent, chief strategy officer at AI company Profusion, tells Verdict.

“Not only does this approach severely undermine trust – it’s like asking someone to take a polygraph – it’s also incredibly difficult to interpret results.”

Dent says there is a place for emotion tech in assessing welfare or security, but that using it to monitor staff is risky, and companies should be mindful of the ethical ramifications.

The ICO said it will issue further guidance in spring 2023.

GlobalData is the parent company of Verdict and its sister publications.