Researchers at the nonprofit Center for Countering Digital Hate created TikTok accounts for fictitious teen personas in the U.S., U.K., Canada, and Australia. They then "liked" videos about eating disorders and self-harm to observe how TikTok's algorithm would respond.

Within minutes, the hugely popular platform began promoting videos about weight loss and self-harm, including discussions of suicide, images of razor blades, and pictures of models with idealized body types.

According to a report released on Wednesday (Dec. 14), TikTok's algorithms are pushing videos about self-harm and eating disorders to vulnerable teens, raising fresh concerns about social media's effects on young people's mental health.

The accounts were fed even more harmful content when the researchers created profiles whose usernames suggested a particular vulnerability to eating disorders, such as names that included the words "lose weight."

Social media algorithms work by identifying topics and content of interest to a user, then serving more of the same to maximize the user's time on the platform. But social media critics say the same algorithms that promote content about a particular sports team, hobby, or dance craze can send users down a rabbit hole of harmful content.

Josh Golin, executive director of Fairplay, a non-profit that advocates for stronger online protections for kids, says it's a particular issue for teenagers and young children because they spend more time online and are more susceptible to bullying, peer pressure, and harmful content about eating disorders or suicide.

TikTok is not the only platform that fails to shield young users from harmful content and intrusive data collection, he added.

TikTok disputed the findings in a statement from a company spokesperson, arguing that the results were skewed because the researchers did not use the platform the way typical users do. The company added that a person's account name should not influence the kind of content they receive.

Users under the age of 13 are not permitted on TikTok, and videos that promote eating disorders or suicide are prohibited per the platform's official guidelines. TikTok users in the U.S. who look for content about eating disorders are presented with a prompt with links to mental health websites and the National Eating Disorder Association's contact details.

Despite the platform's efforts, researchers at the Center for Countering Digital Hate found that videos about eating disorders had amassed billions of views on TikTok. They also found that adolescent TikTok users sometimes used coded language to discuss eating disorders in an effort to evade TikTok's content moderation.