Comment
April 28, 2022

UK’s Online Safety Bill is a bold law that will come at a cost

The UK’s new Online Safety Bill, which recently passed its second reading in Parliament, sets the democratic world’s strictest internet safety legislation, aimed at holding websites accountable if they fail to tackle harmful online content. It represents a bold attempt to introduce a duty of care to limit the spread of illegal content, such as child sexual abuse images, while giving the communication regulator Ofcom the power to impose hefty fines on companies that breach the new rules.

Bill compliance costs will hit smaller companies the most

However, experts have warned that the bill also introduces a number of compliance costs that risk being especially prohibitive for smaller companies. Unlike tech giants, they are neither as well equipped nor as diversified, leaving them less able to navigate the uncertainty brought about by the upcoming regulation.

According to the latest impact assessment on the bill, published in March, micro, small, medium, and large businesses that fall within the scope of “user-to-user services and search services” will have to bear a number of costs to adhere to the new regulations. The assessment estimates that 180,000 platforms could fall within this scope, and that these businesses will likely need to spend between GBP9.6 million ($11.94 million) and GBP17.5 million ($21.78 million) simply to familiarize themselves with the new bill.

Struggling with moderation

The bill applies to companies that host user-generated content and requires every organization within its scope to have systems and processes in place that take down illegal content as soon as they spot it. The law could lead to higher costs for content moderation, as well as a fine of up to 10% of a company’s global turnover for breaking the rules.

While the UK government has said that fewer than 5% of all UK businesses will be affected by the new rules, smaller companies, such as retailers and ecommerce sites, would have to recruit extra staff to monitor the content posted on their sites, burdening them with additional costs. There is also a risk that these costs will deter new entrants to the social media market, potentially hindering competition in that sector.

Bill puts algorithms in the spotlight

Under the UK’s new regulatory regime, Ofcom will be able to audit algorithms that control users’ experience online, putting companies under pressure to deal with algorithms that steer users towards harmful content. Ofcom said it would require GBP44 million ($55 million) next year, as well as 300 extra staff, to fulfill its duties as set out in the bill.

While algorithmic systems are often opaque and tend to skew towards extreme and toxic material, smaller companies will be less equipped than Big Tech companies to ensure compliance. Still, the move is in line with increasing scrutiny of algorithms worldwide. China is leading the effort; its powerful cybersecurity regulator has already released draft rules governing the use of recommendation algorithms. EU lawmakers, meanwhile, are updating their digital rulebook to require companies to be more transparent with users about how algorithms decide what shows up on their feeds.

Fragmented regulation is an additional burden for companies

The new bill also comes at a time when adtech companies, including small companies, have been forced to rethink strategies due to a shifting regulatory environment underpinned by data privacy regulations like the General Data Protection Regulation (GDPR).

In September 2021, the UK government published a consultation setting out proposals to relax data privacy rules post-Brexit to encourage data-driven economic growth and innovation. Such deregulation would also increase legal uncertainty and risk for businesses: companies that have already adapted their operations to comply with the GDPR risk being burdened with a fresh set of compliance rules.
