The UK government has published its Online Harms Bill which could see social media companies face hefty fines for the spread of harmful content.

First proposed in 2019 by the previous government, the bill sets out guidelines designed to keep UK internet users, particularly children, safer online in a move to create a “new age of accountability”.

The introduction of the bill was prompted by the death of teenager Molly Russell, who took her own life after viewing distressing images of self-harm on social media.

Overseen by Ofcom, the bill will require social media companies such as Facebook and Twitter to remove illegal content related to child abuse, terrorism and suicide, as well as adhere to a new code of conduct that sets out their responsibilities to protect young users. Companies will also have to introduce measures to ensure children are not accessing platforms which are not suitable for them.

The government is also working with the Law Commission on whether the promotion of self-harm should be made illegal.

The largest social media sites will also be required to address the handling of content that is legal “but could cause significant physical or psychological harm to adults”, such as misinformation. It will not cover online scams or fraud.

Ofcom will be able to issue fines of up to 10% of a company’s annual global turnover or £18m, whichever is higher.

The regulator will also have the power to block online services in the UK if they do not take adequate action to prevent the dissemination of harmful content, as well as to publish an audit detailing how they are tackling the issue. Companies will not currently face criminal sanctions for breaching the legislation, but it includes “provisions to impose criminal sanctions on senior managers”.

The bill will cover any platform publishing user-generated content online accessible from the UK, as well as private messaging tools. The law will not affect articles and comments sections on news websites.

When it comes to private messaging, the government has said that the legislation will enable Ofcom to “require companies to use technology to monitor, identify and remove tightly defined categories of illegal material relating to child sexual exploitation and abuse” but that this should only be used “as a last resort where alternative measures are not working” in order to protect user privacy.

The UK government has said that the bill will include “safeguards for freedom of expression and pluralism online”.

The Online Harms Bill is due to be introduced in 2021.

Stephen Kelly, chair of Tech Nation, said:

“Given our leadership in the application of ethics and integrity in IT, it should be no surprise that the UK is moving decisively to tackle online harms, one of the biggest and most complex digital challenges of our time. Equally, it offers the UK the opportunity to lead a new category of tech, such as “safetech”, building on our heritage of regtech and compliance, which already assure global markets and economies.”
