The UK government’s Digital, Culture, Media and Sport select committee has published its long-awaited report on fake news and disinformation, calling for greater regulation of “digital gangsters” such as Facebook.

The investigation began in 2017, but was brought into the public eye by the revelation that Facebook gave political consultancy Cambridge Analytica access to the data of millions of Facebook users. As a result, the final report focuses both on fake news and its ability to influence political opinion, and on the misuse of user data, with the aim of ensuring that “people stay in charge of the machines”.

Unsurprisingly, Facebook is the main focus, with the committee criticising the company for being “unwilling to be accountable to regulators around the world” after Mark Zuckerberg refused to appear before parliament, and for failing to act quickly to prevent the spread of misinformation on the platform.

What does the fake news report recommend?

The report has made a series of recommendations as to how the UK government should protect citizens from fake news and control the rate at which it spreads.

Although the report may influence government policy, it is not binding, meaning its recommendations may not come to fruition. The government is due to publish a white paper on online harms this year.

Following “Russian activity on Facebook and knowledge of Russian advertisements that ran during the presidential election in America in 2016”, the report calls on the government to reform current electoral communications laws and rules on overseas involvement in UK elections due to fears that fake news could be used to influence election results.

It also recommends that social media companies should have an obligation to take down known sources of harmful content, including proven sources of disinformation or fake news.

This follows recent statements by ministers indicating that the government is prepared to regulate social media companies following the death of teenager Molly Russell, who took her own life after viewing images depicting self-harm on Instagram.

However, perhaps the most meaningful recommendation is the call for greater regulation in the face of “malign forces intent on causing disruption and confusion”. The 108-page report calls for a compulsory Code of Ethics defining what constitutes harmful online content. It recommends that an independent regulator should be responsible for monitoring tech companies, backed by statutory powers to launch legal action against companies that breach the code, including the power to issue hefty fines.

It also repeats its recommendation that this new independent regulator be funded by a levy on tech companies operating in the UK.

Damian Collins MP, Chair of the DCMS Committee, believes that the actions of social media companies should be more closely policed:

“It’s time for a radical overhaul of our relationship with big tech companies. For too long we’ve let them live in a world of self-regulation and non-compliance and it’s citizens who are the victims of that. That’s why in our report we’re recommending the most robust system of regulation on content online we’ve ever seen. I believe the UK can be a world-leader in content regulation. We should have statutory powers to act against companies like Facebook when they fail to act against harmful content.”

What could a code of ethics mean?

The fake news report makes it clear that there is an “urgent need” for regulation to prevent tech companies from expanding “exponentially”. But what could UK-specific regulations mean?

If social media companies are required to adhere to additional regulations in the UK, it may deter them from investing in the country, or lead some companies to withdraw their sites from the UK altogether to avoid additional regulation or hefty fines.

A less extreme scenario could also result in UK-specific versions of certain sites, similar to those that exist in countries that already have their own rules governing online content. As was the case after the introduction of the General Data Protection Regulation (GDPR) in the EU, this could mean that access to some websites is temporarily disrupted as they work to comply.

In the case of GDPR, the impact of losing access to the European market was enough that many sites have adopted GDPR compliance as standard. However, with the UK representing a much smaller market, it risks missing out if companies choose to make their sites inaccessible in the region rather than comply.

However, James Monckton, Strategic Communication Director at Verbalisation, believes that the risk of this is small:

“I doubt the websites that exist now will be blocked in a Russia-esque move of control. The British public would not allow it. It is possible that the existing Social Media companies will have to rebrand to a degree and deal with the backlash. As to UK-specific domains, I again doubt that this will be a move they try and make. If they did so, they would be saying ethical standards in the UK are different to those globally, which goes against fundamental principles of ethics.”

He believes that it may instead prompt marketing campaigns from social media platforms:

“It is more likely that they will re-brand and launch new campaigns that speak to a global ethical standard (they are all about community building after all!).”

Rafael Laguna, CEO of Open-Xchange, believes that a code of ethics is not the best solution, and that other approaches should be considered:

“This is not the first time we have heard calls for a code of ethics for tech companies. As recently as last year, Tim Berners-Lee – inventor of the World Wide Web – endorsed a “Magna Carta” for the Internet, which was immediately jumped on by Google and Facebook – the two outstanding creators of the problems outlined in his paper. But I fail to see how it is possible for an independent body to regulate the tech giants while they continue to operate in closed, proprietary systems.

“Trust is the biggest threat facing these platforms today, and with consumer awareness at an all-time high, there is now increasing societal pressure for companies to treat data responsibly and ethically. Only a fully open source-based model could provide the transparency necessary for the regulation proposed by UK MPs – and this would also go a long way in re-establishing trust with users. After all, privacy is not a business model, but a fundamental human right.”