Social media companies could face a raft of tougher regulations to ensure that they protect their users from harmful content online under new proposed UK laws. Punitive actions could include blocks, fines and holding social media executives personally liable.
As part of a long-awaited white paper released by the UK government today, online services have been told they need to take more responsibility for illegal content, terrorist propaganda, child sex abuse and cyberbullying that takes place on their platforms.
Currently, content is largely self-regulated by online platforms. Today’s white paper, titled ‘Online Harms’, states that the government will “establish a new statutory duty of care” for online services to ensure companies “take more responsibility for the safety of their users”.
It follows growing pressure on social media companies to better tackle harmful content on their platforms.
Recently, social media firms including Facebook and YouTube came under criticism for failing to remove footage of the Christchurch terrorist attack, in which 50 people were killed and another 50 injured in New Zealand.
Platforms such as Instagram have faced controversy for allowing graphic images of self-harm to reach children.
The Online Harms white paper also highlights the spreading of harmful inaccurate information, such as anti-vaccination posts, as another area to address.
The UK’s digital secretary Jeremy Wright said: “The era of self-regulation for online companies is over. Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough.
“Tech can be an incredible force for good and we want the sector to be part of the solution in protecting their users.”
He added that those failing to protect their users will “face tough action”.
How will it be enforced?
A UK government white paper’s purpose is to inform people of the government’s intentions on an issue – not to outline the law itself. As such, the exact details of the new regulations are not yet formulated and are open for consultation.
However, the 102-page Online Harms white paper does suggest how new regulations could work. The document states – somewhat vaguely – that social media firms must take “reasonable steps” to keep users safe.
Failing to do so will result in punitive measures by an independent regulator. Those could include “substantial” fines and other disruptive measures, such as blocking non-compliant platforms in the UK. Perhaps most controversially, it could also see social media executives held personally liable for failing to protect their users.
However, online services will have limited liability until they become aware of illegal activity and then fail to remove it in “good time”.
Disciplinary action will also be proportionate to the seriousness of the platform’s failure to control harmful content. The UK is also looking into ways to enforce regulations against non-UK firms – most social media firms are US-based – to ensure a “level playing field”.
An as-yet-unknown regulator will enforce penalties against firms for non-compliance. Initially, it is likely to be Ofcom. A merger between Ofcom and the UK’s data protection watchdog, the Information Commissioner’s Office, has also been suggested. Alternatively, it could be a newly created independent regulatory body.
Ollie Whitehouse, global chief technical officer at NCC Group, a global cybersecurity firm, said he welcomed the proposed measures.
“By appointing an independent regulator, the government has struck the right balance between enforcing a duty of care that may not otherwise have been adopted, and enabling regulation to respond and keep pace with the constantly evolving online environment,” he said.
Online Harms white paper: Who will be affected?
The new regulations will apply to any companies that “allow users to share or discover user-generated content or interact with each other online”.
That includes file hosting sites, messaging services, public discussion forums and search engines, in addition to social media platforms.
There will be a different approach to public and private communications, with “any requirements to scan or monitor content for tightly defined categories of illegal content” not applying to private channels, such as WhatsApp.
The Online Harms white paper proposals have so far received wide support, particularly among child protection groups.
NSPCC CEO Peter Wanless said:
“This is a hugely significant commitment by the Government that once enacted, can make the UK a world pioneer in protecting children online.
“For too long social networks have failed to prioritise children’s safety and left them exposed to grooming, abuse, and harmful content. So it’s high time they were forced to act through this legally binding duty to protect children, backed up with hefty punishments if they fail to do so.”
Barnardo’s Chief Executive, Javed Khan said:
“The Government’s announcement today is a very important step in the right direction. We particularly welcome proposals for a new independent regulator, which should ensure internet bosses make the UK one of the safest places in the world for children to be online.”
Critics have raised concerns that the Online Harms regulations will infringe on freedom of speech and stifle innovation.
Joy Hyvarinen, head of advocacy at the Index on Censorship, a campaign group, said:
“The online harms white paper will set the direction for future internet regulation. Index is concerned that protecting freedom of expression is less important than the government wanting to be seen as ‘doing something’ in response to public pressure. Internet regulation needs a calm, evidence-based approach that safeguards freedom of expression rather than undermining it.”
The government is consulting on its proposals for the next 12 weeks. The deadline to submit responses to the paper is 1 July 2019.