A graphic video of a man committing suicide that circulated on TikTok was uploaded as part of a “coordinated attack”, the video-sharing company has said.
On 31 August, a 33-year-old man live-streamed his own death on Facebook. The video was then uploaded to TikTok and other social media platforms, where many young people came across it and were traumatised by the content.
Today Theo Bertram, director of government relations and public policy, EMEA, at TikTok, said the company’s investigations showed that groups operating on the dark web had conspired to circulate the video across social media. The clip was also spliced into other videos to catch users unawares.
“There is evidence of a coordinated attack and through our investigations, we learned that groups operating on the dark web made plans to raid social media platforms, including TikTok, in order to spread the video across the internet,” said Bertram in an appearance before the UK government’s sub-committee into online harms and disinformation.
“What we saw was a group of users who were repeatedly attempting to upload the video to our platform, splicing it, editing it, cutting it in different ways, and joining the platform in order to try and drive that.”
He said there was a “huge spike in volume” of the clip being uploaded to TikTok on 6 September, and that the way the video was viewed showed “quite an unusual pattern”. Usually, users discover videos through their suggested feed or by clicking on hashtags; in this case, he said, people were navigating to the clip more directly.
TikTok: “We know we have to do better”
Bertram said that TikTok’s “emergency machine learning services kicked in, and they detected the videos, and we quickly removed them”. TikTok said it has banned all the accounts that uploaded the video.
However, this was not in time to prevent many children from viewing the distressing content. One woman told the BBC that her 14-year-old daughter was “in such a state, shaking like a leaf and properly sobbing” after viewing it.
Last night, TikTok wrote to the CEOs of Facebook, Instagram, Google, YouTube, Twitter, Twitch, Snapchat, Pinterest and Reddit to “establish a partnership for dealing with this type of content”.
TikTok, which has 3.7 million active users in the UK and 100 million in the US, said it will look to improve its machine learning systems and human moderation processes to prevent something similar from happening again.
In its transparency report, published today, TikTok said it removed nearly 105 million videos for policy violations in the first half of 2020.
“We know we have to do better, and our hearts go out to the victim in this case, but we do believe we can do better in future,” said Bertram.