In its short life, TikTok has had tremendous success. The social media platform, launched in China in 2017, had already reached half a billion users by mid-2018. The popular app specialises in short-form video creation, with personalised “For You” feed pages.
The addictive nature of the platform is attributed to its machine learning algorithm, which recommends videos based on each user’s interests. Engage with a video showing healthy weight loss tips, for example, and your feed can quickly become flooded with weight loss content. The result is a highly addictive platform that is difficult to stop scrolling, and there is growing evidence of digital echo chambers forming. These can warp the views of impressionable teens, which is especially worrying given that, according to NewsGuard, some users are as young as nine.
The success of the algorithm
Amid concerns regarding the involvement of the Chinese government, TikTok provided details of its powerful algorithm to improve transparency. Initially, the algorithm gains familiarity with a user’s preferences by showing popular trends on the app. It then tags videos the user engages with and supplies more of that type.
It can even predict what content a user will like at certain times of day. You no longer need to waste time finding people to follow: the algorithm shows you what you want, when you want it, with the goal of keeping you engaged.
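The feedback loop described above can be sketched as a toy tag-matching recommender. This is purely illustrative: the function, data and scoring rule here are invented for the example, and TikTok’s real system is far more sophisticated. But it shows how ranking candidates by overlap with past engagement naturally floods a feed with more of the same.

```python
from collections import Counter

def recommend(engagement_history, candidates, k=2):
    """Toy recommender: rank candidate videos by how often their tags
    appear in the videos the user has already engaged with."""
    # Weight each tag by its frequency in the engagement history
    tag_weights = Counter(tag for video in engagement_history for tag in video["tags"])
    # Score each candidate by the summed weight of its tags
    scored = sorted(
        candidates,
        key=lambda v: sum(tag_weights[t] for t in v["tags"]),
        reverse=True,
    )
    return [v["id"] for v in scored[:k]]

# A user who engaged mostly with weight-loss videos...
history = [
    {"id": 1, "tags": ["weightloss", "fitness"]},
    {"id": 2, "tags": ["weightloss", "diet"]},
    {"id": 3, "tags": ["comedy"]},
]
candidates = [
    {"id": 10, "tags": ["diet", "weightloss"]},  # strong overlap with history
    {"id": 11, "tags": ["comedy", "pets"]},      # weak overlap
    {"id": 12, "tags": ["travel"]},              # no overlap
]
print(recommend(history, candidates))  # → [10, 11]
```

Because the highest-scoring candidates are always the ones most similar to past engagement, each recommendation cycle narrows the feed further, which is exactly the homogenisation the next section describes.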
When people interact with certain content, they inevitably see more, and their feeds become homogenised. When users only encounter a certain opinion or type of information, a digital echo chamber is produced, distorting perspectives. A concerning trend on TikTok is the growing volume of content related to eating disorders.
Although TikTok has banned some eating disorder hashtags and provides helpful resources for users searching for those terms, pro-anorexia content is still present on the app. Individuals can get past the bans by using slight misspellings of hashtags. Content can also be subtle, such as “What I eat in a day” videos showcasing diets of under 800 calories and compulsive exercise routines. When users interact with this content, the algorithm pushes more of it, and impressionable teens come to normalise these behaviours.
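The misspelling loophole is easy to see in code. A moderation filter that blocks only exact hashtag matches is trivially evaded by changing a single character; the blocklist and function below are invented for illustration and are not TikTok’s actual moderation system.

```python
# Illustrative blocklist only, not TikTok's real one
BANNED_HASHTAGS = {"anorexia", "thinspo"}

def exact_match_blocked(hashtag: str) -> bool:
    """Naive moderation: block a hashtag only if it exactly matches the list."""
    return hashtag.lower() in BANNED_HASHTAGS

print(exact_match_blocked("thinspo"))   # True: the exact tag is caught
print(exact_match_blocked("th1nspo"))   # False: a one-character misspelling slips through
```

Catching such variants requires fuzzy matching or human review, which is part of why purely automated, exact-match bans leave harmful content reachable.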
TikTok is not the only culprit
These issues are not limited to TikTok. A Facebook whistle-blower revealed a report showing that the company was aware that Instagram was pushing pro-anorexia content and worsening the mental health of users. According to the report, 32% of teen girls felt that using Instagram worsened their body image issues. Although both Facebook and TikTok have addressed these concerns by providing well-being resources, this is not enough. More needs to be done to prevent damaging content reaching platforms in the first place.
TikTok uses a moderating service, a step forward in preventing misinformation and dangerous content. Unfortunately, TikTok’s fact checking is outsourced to a third-party company, loosening its oversight. Fact checking should be conducted in-house, as certain videos do not directly break community guidelines but are still harmful. TikTok has also emphasised the importance of “putting controls into the hands of [their] users”, asking users to report content themselves. This fails to recognise the competitive and engrossing nature of eating disorders.
There is no doubt that TikTok’s algorithm is key to the success of the platform, but if it wants to avoid the reputation that other social media platforms have earned, TikTok will need to take steps to moderate its content more effectively.