The Cinnamon Challenge, the Ice Bucket Challenge, the Harlem Shake – these are all social media challenges that generated both hype and cringe, often in equal measure.

Internet challenges have been around since the early 2000s, with each subsequent year bringing more absurd iterations. The high frequency of these challenges makes them easy to overlook. They happen in the background, another phenomenon to attribute to ‘generation hashtag’.

While scrolling through countless stumbling attempts at the ‘yoga challenge’, however, you might come across something more shocking. Dangerous challenges easily go viral and pose a safety risk, especially for social media’s younger users. TikTok, in particular, has come under fire for hosting them. One example was the ‘blackout challenge’, which circulated on the app and showed teens restricting their breathing until they passed out for a momentary high.

TikTok announced new plans to keep its users safe

On November 17, TikTok went from downplaying its role in these challenges to addressing them head-on with the launch of a new safety center, an internal project, an independent report commissioned from the safeguarding agency Praesidio, and a partnership with child clinical psychiatrist Dr. Richard Graham, among other experts.

The announcement of these new steps was lengthy, but it failed to target the primary issue: the platform’s algorithms, which amplify viral challenges. Its response follows similar moves by other social media companies, for example, Meta’s launch of safety and mental health resources on its platforms following allegations from whistleblower Frances Haugen.

So, what is different about TikTok’s approach? It was pre-emptive. Unlike Meta, TikTok willingly released internal documents discussing the platform’s potential harms. In this way, the company is attempting to differentiate itself from other social media leaders as the transparent option, one that is striving to improve user safety.


The social media future needs to be transparent

Whether TikTok’s steps will eventually lead to fewer dangerous challenges remains to be seen. Because the company only highlights the issues instead of addressing their main cause (namely, the algorithm), its safety-first approach seems hollow. What is clear from this move is that transparency is now the way forward.

Social media companies will have to be transparent about their shadow world of content moderation, algorithms, and dangerous content to remain trustworthy in the eyes of users and regulators. This is especially important as more social media leaders attempt to cross over to ‘super-app’ status in the West: using the platforms for mobile commerce and peer-to-peer payments will require even more trust. As TikTok has shown, now is the time for transparency.