There is mounting evidence that social media platforms like Instagram can be harmful to teenagers. Worse still, Facebook knows and has shown little sign of plans to protect users of any age on its social media platforms. This stands in stark contrast to China-based Douyin, which announced in the same week the introduction of a 'youth mode' on its app to protect its youngest users.

A report in the Wall Street Journal uncovered in-house research by Facebook with shocking findings. As early as 2019, Facebook's research showed that body image issues were made worse for 1 in 3 teenage girls, that the app increased anxiety and depression in teenagers, and that some teenagers traced a desire to kill themselves to Instagram. Facebook did not dispute these findings.

These revelations make the tech giant's plans to release Instagram Kids even more problematic, seemingly placing company profits over the safety and wellbeing of its users. In the same week, the CDC's National Center for Injury Prevention and Control released a study carried out in collaboration with online safety company Bark Technologies. The study tracks how youths' exposure to harmful online behaviours, such as bullying, violence, drug-related content, and hate speech, can predict the risk of future suicide- and self-harm-related behaviour.

The 13-month study of middle- and high-school-aged children demonstrated the link between eight online risk factors and youth wellbeing. This adds to the growing body of evidence demonstrating that teen wellbeing is linked to how teenagers spend their time on social media platforms.

China to wean its next generation off ‘spiritual opium’ of social media

Long-held concerns about the impacts of technology and social media on the next generation are spurring a wave of regulation in China, both by state authorities and by tech giants.

Young Douyin users now face limits both on the amount of time they can spend on the platform and on when they can access it. Children under 14 will be limited to 40 minutes per day on the short-form video app, and only between 6am and 10pm. The company has also launched new educational content as part of its Youth Mode, including science experiments, museum exhibitions, and history explainers.

This self-regulation by ByteDance, the parent company of both TikTok and Douyin, fits into the grander picture being painted by Chinese state authorities. In February, the Chinese Ministry of Education banned children from using their mobile phones in school; students cannot even bring their devices to school without written parental consent.

Authorities state that this is to protect youths' eyesight, improve their concentration, and prevent internet addiction. Online gaming for those under 18 was also limited to 3 hours per week in August 2021, after state media branded it 'spiritual opium'. Gamers can play for one hour, between 8pm and 9pm, on Fridays, weekends, and holidays. This tightens earlier rules, under which children could play 90 minutes per day, and 3 hours on holidays.

Who should be responsible for regulating the internet?

Do Douyin’s current measures mark a healthy middle ground between lacklustre self-awareness and harsh, retaliatory governmental restrictions?

Despite legislation such as the UK's Online Safety Bill, which aims to give the regulator Ofcom the power to fine companies that fail to act on potentially harmful online content, it is becoming increasingly clear that such after-the-fact regulation is already too late.

Safety issues have been embedded in the internet since its inception, and it is unclear who will adequately protect the next generation online. Western social media companies are not doing enough to create healthy limits, boundaries, and safe social spaces online for their users. In this respect, it is China that is leading the way.