UK MPs are calling for an additional tax on the profits of social media companies to combat the “detrimental effect on young people’s mental health and wellbeing”, in the latest call for social media regulation.
This is the result of a year-long inquiry into the impact of social media by the All Party Parliamentary Group (APPG) on Social Media and Young People’s Mental Health and Wellbeing. Drawing on consultations with charities, academics, clinicians, industry representatives and young people, the report highlights the potential harm of excessive social media use.
The issue hit headlines last year following the news that 14-year-old Molly Russell took her life after viewing distressing content related to self-harm on Instagram, leading many to call for social media companies to do more to protect vulnerable users. The report highlights this, saying that through social media, young people are at high risk of being exposed to graphic content that may be detrimental to mental health.
Although there is not yet a proven link between social media use and poor mental health, there is a growing body of evidence to suggest that it is linked to “feelings of anxiety and depression, negative body image, cyber bullying, poor sleep and a fear of missing out”. According to the report, 46% of girls said that social media has a negative impact on their self-esteem.
To help tackle this, the report calls for a 0.5% levy on the profits of social media companies to fund the creation of a Social Media Health Alliance, intended to review the evidence for the impact of social media on health and wellbeing and “establish clearer guidance for the public”.
According to the BBC, the government is due to publish its own proposals in the next few weeks.
What the report recommends
As well as the introduction of a new levy, the APPG makes a number of recommendations so that appropriate measures are taken to ensure social media is used as a positive tool.
It calls for social media companies to have a duty of care for users aged 24 and under in the form of a “statutory code of conduct” regulated by Ofcom.
It also recommends that the UK Government publish clear guidance advising those under 24 of the dangers of excessive social media use, and commission research into the impact of social media on young people’s mental health, particularly focusing on the addictive nature of social media and whether this should be officially classified as a disease.
Social media giants are under scrutiny
The report is the latest in a series of recommendations for greater control over the negative aspects of social media. Last week, The House of Lords Communications Committee published a report proposing a set of 10 principles to “shape and frame all regulation of the internet”, and a new Digital Authority to oversee this regulation.
In February, the UK Government’s Digital, Culture, Media and Sport Committee published its report on fake news and misinformation, calling for an online code of conduct.
According to a survey by One Poll, 83% of UK consumers believe that Facebook should be regulated, suggesting that greater regulation would be welcomed by many users. But is it possible to regulate the social media “wild west”?
Could social media regulation work?
The pressure on social media companies to take action against the spread of harmful content, particularly content affecting young people, has led Instagram to ban images of self-harm from its site. Facebook has also taken steps to tackle the problem of false or misleading scientific information spread on its site, confirming last month that it is cracking down on anti-vaccination content. Although this is a step in the right direction, self-governance is not the same as legislation. It is not yet clear what form tighter social media regulation could take, or whether it would have a meaningful impact on the way online giants conduct themselves.
Speaking on the Andrew Marr Show, health secretary Matt Hancock expressed confidence that the UK Government has the power to rein in the likes of Facebook, YouTube and Instagram:
“I think a lot of people feel powerless in this situation, but of course we can act. We are a nation state, parliament is sovereign…we can legislate if we need to. It would be far better to do it in concert with the social media companies, but if we think they need to do things they are refusing to do then we can and we must legislate.”
However, if social media companies are required to adhere to additional regulations in the UK that do not exist elsewhere, this may deter them from investing in the country, or lead to some sites being blocked altogether to avoid the additional regulation. A less extreme scenario could see UK-specific versions of certain sites. Combined with the difficulty of defining what constitutes harmful content, and the fears some have of over-regulation, it is clear that creating social media regulation that limits the spread of harm, without limiting access to sites altogether, is complex.
In 2017, Sharon White, the chief executive of Ofcom, told The Guardian that social media regulation is a “complex issue”:
“We feel strongly that the platforms as publishers have got more responsibility to ensure the right content. I don’t think it’s a question of regulation, which I think has a fuzzy boundary with censorship, but I think we feel strongly that the platforms ought to be doing more to ensure their content can be trusted.
“I think it’s a very, very complex issue, where it is easier to identify some of the problems about the lack of trust and I think it’s much harder to see this as a very straight regulatory question.”
However, the sheer scale of social media giants, and their influence over both the online and offline world, means that greater social media regulation looks likely to be on the horizon.
“We welcome this recommendation of measures for social media companies – more needs to be done by these massive online corporations to protect young people. We live in a digital world, where children can access social networks from a young age. However, we must bear in mind that there are risks that come with this freedom; children can be exposed to cyber-bullying, self-harm and feelings of low self-esteem. Precautionary measures should be taken to minimise this potential harm – and there’s much more that parents can do, too. That includes limiting how much time young children spend online – a huge 87 per cent of parents admit that they currently don’t restrict this.
“Tools and regulations to protect children from inappropriate web content are important, but so is the monitoring of children’s online activities by parents – as well as establishing a duty of care on all social media companies, so that any threats to young people’s wellbeing are minimised.”