A recent study by the Center for Countering Digital Hate (CCDH) finds that TikTok repeatedly shows its users dangerous content, warning that this could promote self-harm and eating disorders.
The research analyzed the content surfaced by the algorithm, finding numerous videos that promote dangerously restrictive diets, along with images that idealize self-harm and romanticize suicide.
Research by the Center for Countering Digital Hate
The researchers carried out the study by creating several TikTok accounts in various countries: specifically, the United States, the United Kingdom, Canada and Australia. The accounts were set up as belonging to 13-year-old users, the minimum age to register on the platform.
The second step was choosing usernames. However irrelevant they may seem, usernames actually interact with the algorithm. The researchers created both "standard" and "vulnerable" accounts, including words like "loseweight" in the vulnerable accounts' usernames. This choice reflects the tendency of young users to pick names they strongly identify with.
Over the first 30 minutes, the accounts briefly paused on content related to body image, eating disorders and suicide. That initial half hour is the window in which the algorithm calibrates a user's preferences; after that, TikTok tends to show the videos it deems most interesting for the account, based precisely on the interactions that took place during those first 30 minutes.
On "standard" accounts, suicide-related content began appearing within three minutes, while eating disorder videos appeared within eight minutes on the platform.
Imran Ahmed, chief executive of CCDH, told The Guardian: "The results are every parent's nightmare. Young people's feeds are bombarded with harmful and heartbreaking content, which can have a significant cumulative impact on their understanding of the world around them and on their physical and mental health."
What the TikTok survey revealed: “Dangerous content every 126 seconds”
Most of the content deemed dangerous appeared in the For You feed, the app's main section. In particular, the algorithm tended to show videos of other users sharing their anxieties and insecurities.
The study also reveals another chilling fact: to accounts registered as 13-year-olds, the social network tends to show many videos advertising weight-loss drinks and "tummy tuck" (abdominoplasty) surgeries.
On average, dangerous content appeared on TikTok once every 206 seconds. Videos related to body image, mental health and eating disorders were shown to "vulnerable" accounts three times as often as to standard accounts. These accounts were also shown content in which young people discuss various ways of taking their own lives.
TikTok’s response: “The study does not reflect the actual user experience”
After the publication of the study, TikTok responded promptly. A spokesperson for the company said (via NME):
“This activity and the resulting experience do not reflect genuine behavior or experiences of real people. We consult regularly with healthcare experts and consistently remediate violations of our policies, providing support and access to anyone who needs it. The experience of the content displayed is unique to each individual and we remain focused on promoting a safe and comfortable space for everyone, including people who choose to share the more personal aspects of their lives.”
It is worth noting that TikTok's guidelines prohibit content that promotes behavior that could lead to suicide and self-harm, as well as material that promotes unhealthy eating habits or behaviors.