TikTok algorithm pushes videos that ‘romanticise or normalise’ self-harm - Study

The Amnesty International research claims TikTok’s business model is “inherently abusive".
Ellen Kenny

07.56 7 Nov 2023

TikTok begins showing users videos that ‘romanticise, normalise or encourage’ suicide within minutes of scrolling, according to a new study.

The Amnesty International research claims TikTok’s business model is “inherently abusive and privileges engagement to keep users hooked on the platform” to collect data from them.

The two studies, carried out in partnership with the Algorithmic Transparency Institute and AI Forensics, used automated accounts posing as 13-year-olds to measure the effects of TikTok’s ‘For You’ recommendation system on young users.

It found that after five to six hours on TikTok, almost one in two videos shown were mental health-related and “potentially harmful”. 

Some 73% of content found was "not relevant" to mental health, while 13% covered "lived experience" of mental health and the remaining 14% were posts related to categories such as "emotional distress" and "dramatic description of trauma".

“Between three and 20 minutes into our manual research, more than half of the videos in the ‘For You’ feed were related to mental health struggles with multiple recommended videos in a single hour romanticising, normalising or encouraging suicide,” Amnesty said. 

“There was an even faster ‘rabbit hole’ effect when researchers manually rewatched mental health-related videos suggested to research accounts mimicking 13-year-old users in Kenya, the Philippines and the USA.” 

‘Addictive by Design’ 

Another study, ‘Addictive by Design’, used focus groups and simulations of children’s TikTok habits to reveal how “TikTok’s platform design encourages the unhealthy use of the app”. 

One focus group member described the ‘For You’ page as a “rabbit hole”. 

“If one video is able to catch your attention, even if you don’t like it, it gets bumped to you the next time you open TikTok and because it seems familiar to you, you watch it again and then the frequency of it in your feed rises exponentially,” they said.

Mood altering

Another teenager said TikTok’s algorithm can affect their mood. 

“The content I see makes me overthink [even] more, like videos in which someone is sick or self-diagnosing,” they said. 

“It affects my mentality and makes me feel like I have the same symptoms and worsens my anxiety - I don’t even look them up, they just appear in my feed.” 

TikTok data harvesting 

Amnesty International said TikTok’s “rights-abusing data collection practices” are “sustained” by making the app more addictive. 

Amnesty Tech Deputy Programme Director Lauren Armistead said TikTok targets places with weaker data protection regulations. 

“Children living in countries with weak regulation, including many countries of the Global Majority are subject to the worst abuses of their right to privacy,” she said. 

'Removing content that violates policies'

In a statement to Newstalk, TikTok noted that the Amnesty study found 73% of content on the platform "was not related to mental health at all".

"Of the videos that did reference mental health, most focused on sharing lived experiences without romanticising it," it said.

"This demonstrates how TikTok effectively works to create diverse viewing experiences while removing content that violates our policies."

Main image: Person about to click on the TikTok app. Image: Wachiwit / Alamy Stock Photo
