Saturday, 27 Apr 2024

TikTok accused of failing to act on rising views of eating disorder content

TikTok has been urged to strengthen its content moderation policies to better address harmful eating disorder and suicide content.

A coalition of charities and advocacy groups from the US and the UK has written to Eric Han, TikTok's Head of Safety, to demand action.

In December 2022, the Center for Countering Digital Hate (CCDH) published a report suggesting the app's algorithm showed teens 'eating disorder and self-harm content within minutes'.

CCDH re-evaluated the hashtags in January 2023 and found that there had been an additional 1.6 billion views of the eating disorder content since November 2022.

The groups claim that TikTok has not acted swiftly enough since the publication of the research, which suggested the app's algorithm pushes self-harm and eating disorder content to teenagers within minutes of them expressing interest in the topics.

They also found that TikTok had removed just 7 of the 56 eating disorder hashtags highlighted by the research.

‘Our report, Deadly by Design, showed that hashtags are used to bind together content relating to eating disorders, much of which was dangerous to young people, and that had garnered over 13 billion views,’ said Imran Ahmed, CEO of the Center for Countering Digital Hate.

‘Despite an outcry from parents, politicians and the general public, 3 months later this content continues to grow and spread unchecked on TikTok, with a further 1.5 billion views of eating disorder content on just 49 of the hashtags we analyzed,’ Ahmed added.

The letter also claimed that TikTok was failing to put adequate protective measures in place for users attempting to access harmful content.

‘We discovered that in the US, 66% of the eating disorder hashtags carry a health warning and advice, but in the UK this number drops to a paltry 5%,’ said the CCDH.

In the letter to TikTok’s head of safety, the organisations urged the platform to take ‘meaningful action’, including: improving moderation of eating disorder and suicide content; working with experts to develop a ‘comprehensive’ approach to removing harmful content; supporting users who may be struggling with eating disorders or suicidal thoughts; and reporting regularly on the steps being taken to address those issues.

‘TikTok’s algorithm is the social media equivalent of crack cocaine: it’s refined, highly addictive and leaves a trail of damage in its wake that its producers do not appear to care about,’ said Ahmed.

TikTok responded by saying that many people struggling with eating disorders, or on a recovery journey, come to the platform for support.

‘Our Community Guidelines are clear that we do not allow the promotion, normalisation or glorification of eating disorders, and we have removed content mentioned in this report that violates these rules,’ said a TikTok spokesperson.

According to TikTok’s Community Guidelines Enforcement Report covering January to September 2022, 85% of the videos removed for violating its disordered eating policy were taken down proactively, 72% received no views, and 81% were removed within 24 hours.

In September 2021, TikTok introduced permanent public service announcements (PSAs) on certain hashtags, such as #whatIeatinaday.

It also introduced search interventions, so that when someone searches for words or phrases such as #anorexia, they are redirected to local support resources, such as the Beat helpline in the UK.
