The Centre for Data Ethics and Innovation, the government’s advisory board on the ethical use of AI, has called for changes to be made to social media regulation.

The call follows a year-long review, published today, of social media targeting: the practice of using individuals’ data to determine what content they are shown online.

Although the report acknowledges the positive aspects of social media targeting, it warns that it often operates without enough “transparency and accountability” and that it can lead to the “erosion of autonomy and the exploitation of people’s vulnerabilities”.

It also warns that online targeting has resulted in big tech companies having immense power to influence users’ behaviour, as well as potential impacts on political outcomes and users’ mental health.

The report also examined public attitudes to social media targeting, and found that while many appreciate the convenience it can offer, and do not believe targeting should be stopped entirely, they do want “meaningful control over how they are targeted” and are concerned by the increased prevalence and sophistication of such systems.


Analysis of public attitudes, conducted with Ipsos MORI, found that only 29% of people trust platforms to target them in a responsible way.

Roger Taylor, Chair of the Centre for Data Ethics and Innovation, said:

“Most people do not want targeting stopped. But they do want to know that it is being done safely and responsibly. And they want more control. Tech platforms’ ability to decide what information people see puts them in a position of real power. To build public trust over the long-term it is vital for the Government to ensure that the new online harms regulator looks at how platforms recommend content, establishing robust processes to protect vulnerable people.”

Based on these findings, the CDEI recommends that the government work to increase the accountability of social media platforms, that regulation be updated to promote “responsibility and transparency and safeguard human rights”, and that regulation keep pace with advances in technology.

It also calls for a code of practice for online targeting, and for independent researchers to have greater access to the data used by online platforms so that compliance can be verified.

Finally, platforms should be required to store online advertising archives for certain types of targeted ads that “pose particular societal risks”, such as political ads.

Dr Bernadka Dubicka, Chair of the Child and Adolescent Faculty at the Royal College of Psychiatrists, said:


“We completely agree that there needs to be greater accountability, transparency and control in the online world. It is fantastic to see the Centre for Data Ethics and Innovation join our call for the regulator to be able to compel social media companies to give independent researchers secure access to their data.”


Read more: Social media advertising: Ban targeted ads not political ads.