Content creators are exploiting TikTok's algorithmic flaws and loopholes to promote content potentially harmful to the mental health of other users, a new report finds
An average TikTok user spends 95 minutes per day on the platform - that’s over 1.5 hours swiping through videos that range from 3 seconds to 3 minutes. And as we all know by now, the user base is really young.
Of the platform’s more than 80 million monthly active users in the US, 60 percent are aged between 16 and 24. And with more than one billion users worldwide, that means a lot of impressionable young people are consuming TikTok’s content at an alarming rate.
But does the platform have adequate filters in place to protect its users from potentially harmful content? Not according to a new report by Within Health, a US-based organisation which offers support to individuals struggling with eating disorders.
TikTok’s algorithm has long been criticised for leading users down a “content rabbit hole,” bringing up similar videos again and again in its main feed (known as the “For You Page”) so users are bombarded repeatedly with potentially dangerous content.
While all social media can impact our mental well-being, Joe Mercurio, part of the creative team at Within Health, told TRT World that his team “was particularly shocked at how extensive the issues with TikTok are”.
To its credit, TikTok has worked to keep its content loop from turning toxic, but content creators are finding new ways to exploit the algorithmic flaws to promote their own videos, the report says.
Within Health identified eight major problems that amplified unsafe content surrounding eating disorders (ED).
To access videos outside of the “For You Page”, users can search for content through the app’s various search channels: hashtag search, user search, video search and, outside the platform, web search.
Many dangerous keywords and terms, such as those related to promoting EDs, are on a “block” list and users also have the option to customise this by further limiting certain words.
For example, the platform bans words such as “anorexic” so if a user does search for this term, a message pops up that reads “you’re not alone” and guides them to a list of resources for eating disorder support.
However, deliberate alternative spellings or misspellings such as “anarexi” and “eating dissorder” can bypass the ban on keywords deemed harmful – for instance, those promoting unhealthy eating habits and negative perceptions of body image.
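To see why exact-match keyword bans are so easy to sidestep, here is a minimal, purely illustrative Python sketch – TikTok’s actual filtering is proprietary, and the blocklist and similarity threshold below are assumptions. An exact lookup misses the misspelling “anarexi”, while a simple similarity check flags it.

```python
# Illustrative sketch only: not TikTok's real system. It shows why an
# exact-match blocklist misses deliberate misspellings, and how a simple
# edit-similarity check could catch them.
import difflib

# Hypothetical blocklist for demonstration purposes.
BLOCKLIST = {"anorexia", "anorexic", "eating disorder"}

def exact_match_blocked(query: str) -> bool:
    # Only blocks queries that match a banned term letter-for-letter.
    return query.lower() in BLOCKLIST

def fuzzy_blocked(query: str, threshold: float = 0.8) -> bool:
    # Blocks queries that are "close enough" to any banned term.
    q = query.lower()
    return any(
        difflib.SequenceMatcher(None, q, term).ratio() >= threshold
        for term in BLOCKLIST
    )

print(exact_match_blocked("anarexi"))   # False: the misspelling slips past
print(fuzzy_blocked("anarexi"))         # True: similarity matching catches it
print(fuzzy_blocked("eating dissorder"))  # True
```

The threshold is a tuning knob: too low and innocuous searches get blocked, too high and creative misspellings slip through – one reason keyword filtering alone struggles at scale.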
And just how many non-blocked and misspelt words are there? The report compiled a “word cloud” of such eating disorder-related terms and found that their total view count exceeds 1.3 billion.
“These results are concerning because even though TikTok users are purposely blocking certain hashtags and search terms, eating disorder-related content is still making its way to users’ feeds,” Mercurio said.
“It’s important that we bring light to this issue because so many impressionable people on the app might find this content triggering, especially if they are in recovery,” he added.
Keywords can also be tweaked with homoglyphs – characters that resemble standard letters – or with accented or foreign-alphabet versions of English letters.
The report says TikTok's search engine will often match these iterations with their correctly spelled counterparts and allow harmful content in through this channel.
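The mechanism the report describes can be sketched in a few lines of Python. This is an assumption about how such normalisation generally works, not TikTok’s actual pipeline: look-alike characters are folded back to plain ASCII before a blocklist check, so a query using Cyrillic letters or accents still resolves to the banned term. The confusables map here is a tiny hypothetical sample; real systems draw on Unicode’s full confusables data.

```python
# Illustrative sketch of homoglyph/accent normalisation, not TikTok's code.
import unicodedata

# Tiny hypothetical sample of look-alike characters (Cyrillic -> Latin).
CONFUSABLES = {
    "а": "a",  # Cyrillic a (U+0430)
    "о": "o",  # Cyrillic o (U+043E)
    "е": "e",  # Cyrillic e (U+0435)
    "х": "x",  # Cyrillic ha (U+0445)
}

def normalise(query: str) -> str:
    # Map known look-alike characters to their ASCII counterparts.
    mapped = "".join(CONFUSABLES.get(ch, ch) for ch in query.lower())
    # Decompose accented letters (é -> e + combining accent)...
    decomposed = unicodedata.normalize("NFKD", mapped)
    # ...then drop the combining accent marks.
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

print(normalise("аnоrexia"))  # Cyrillic look-alikes fold to "anorexia"
print(normalise("anoréxia"))  # accented e folds to "anorexia"
```

Matching these folded forms is what lets the search engine pair disguised queries with their correctly spelled counterparts – which, applied to an incomplete blocklist, is exactly the loophole the report flags.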
Users can also utilise TikTok’s autocomplete feature to find keywords that evade filtering. They are given options of words to aid their search as well as view counts that signal the associated popularity of each search within the app.
Mercurio explained how users may abuse this feature to find new words that are trending and haven’t been banned yet to reveal hidden content.
“The triggering autocomplete results of searches on our test account were particularly shocking, especially considering one of our searches included only ‘an’ and nearly every autocomplete result had to do with anorexia,” said Mercurio.
Eating disorders are among the deadliest mental illnesses, second only to opioid overdose in the United States. It’s estimated that one death from an eating disorder occurs every 52 minutes, resulting in 10,200 deaths each year.
Shira L Charpentier, Founder of Beyond Rules Recovery, a peer support system for people struggling with eating disorders, told TRT World that social media provides “a breeding ground for comparisons, which is detrimental to anyone struggling with an eating disorder.”
Apps like TikTok have content “about weight loss, exercise, the magic formula for a leaner body, supplements/shakes that will help you burn fat, fitness apps to track anything and everything, crash diets, intermittent fasting, and I could go on and on,” Charpentier said.
And as the platform continues to expand, with 104 million downloads just last month, countless youth are at risk of exposure to this content unless more efficient checks are in place.
Within Health suggested five solutions for TikTok, ranging from offering users alternative feed rankings, such as sorting videos by “most recent”, to implementing human-monitored review to better understand the actual purpose of viral sounds.
Their main suggestion is to expand TikTok’s block list and apply it evenly across all search options. They also created a crowd-sourced block list “where anyone can submit keywords they think should be blocked to make TikTok a safer space for all.”
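The report’s main suggestion – one block list applied evenly across every search channel – can be illustrated with a short hypothetical sketch. The function names, channels and block list here are assumptions for demonstration, not TikTok’s real API; the redirect message is the one the platform already shows for banned terms.

```python
# Hypothetical sketch of a single shared block-list gate for all search
# channels, so no individual channel becomes a loophole. Not TikTok's code.
BLOCKLIST = {"anorexia", "anorexic"}

def is_blocked(query: str) -> bool:
    q = query.lower().strip()
    return any(term in q for term in BLOCKLIST)

def search(channel: str, query: str) -> str:
    # Every channel passes through the same gate before returning results.
    if is_blocked(query):
        return "you're not alone"  # redirect to support resources instead
    return f"results for {channel}:{query}"

# The same query is intercepted no matter which channel it arrives through.
for channel in ("hashtag", "user", "video"):
    print(search(channel, "anorexic"))
```

The point of the design is uniformity: if hashtag, user and video search each filter separately, creators only need to find the one channel with the weakest list.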
But Charpentier warns that “it would be nearly impossible to prevent harmful content from being on social media due to compromising free speech.”
“The responsibility we have as eating disorders experts is educating the public and putting the message of recovery and hope out there, in the chance they will come across those posts,” Charpentier said.
“Take a break from social media and find other ways to spend your time, grow relationships, discover what you’re passionate about, and learn to be okay with just yourself without social media as a filler or distraction.”