Australasian Science: Australia's authority on science since 1938

'Anorexia coach': sexual predators online are targeting teens wanting to lose weight. Platforms are looking the other way

By Suku Sukunesan, Senior Lecturer in Information Systems, Swinburne University of Technology


There’s no shortage of people online looking to exploit and manipulate the vulnerable among us. One such group is anorexia coaches, or “anacoaches”.

They are typically middle-aged, male sexual predators who go online to find impressionable young people to exploit under the guise of providing weight-loss “coaching”.

I have been researching how anacoaches operate. I’ve found they are facilitated by flaws within social media algorithms, as well as large numbers of young people seeking weight-loss help online.



An anacoach message on Tumblr.
Author provided

My ongoing research, coupled with other media reports, indicates the opportunity for anacoaches has grown in recent years. My analysis shows that on Twitter alone there are about 300 unique requests for anacoaches worldwide each day.

Anacoaches operate on numerous channels, including established social platforms such as Twitter, TikTok, Tumblr and Kik. Despite this, these platforms haven’t addressed the problem.

Targeting teens

An estimated 4% of Australians, or roughly one million people, are affected by eating disorders. And almost two-thirds (63%) of these people are thought to be female.



Screenshot from TikTok.
Author provided

Teenagers with eating disorders are more likely to experience poor mental health and impaired functioning in social environments — which leaves them more vulnerable to the influence of anacoaches.

Also, research has shown social media use can exacerbate the extent to which teenagers and young adults chase a “thin” ideal.

One study published by a Dutch human rights law group on the predatory behaviours of anacoaches found self-reporting victims had been sexually assaulted and even raped.

And with anacoaching comes the potential for other forms of criminal abuse, such as paedophilia, forced prostitution and even human trafficking.






Social media provides the platform

With the rise of online platforms there has been an emergence of communities pursuing a thin ideal. These networks tend to share content that endorses extreme thinness.

Group identity is formed through interactions and hashtag sharing, with a focus on terms used regularly in the context of eating disorders. Common hashtags include #proana (pro-anorexia), #bonespo (bone inspiration), #edtw (eating disorder trigger warning), #promia (pro bulimia), #bulimia, #thighgap, #uw (ultimate weight), #cw (current weight), #gw (goal weight) and #tw (trigger warning).

As highlighted in my previous research, communication in these communities includes exchanging weight-loss tips, diet plans, extreme exercise plans, imagery of thin bodies and emotional “support”.

Anacoaches lurk in chat forums focused on thin ideals. Each coach will tend to be present in numerous chatrooms, luring teenagers with stories of their past “successes” from coaching.

They market themselves with dubious claims. Some will assign themselves labels such as “strict coach” or “mean coach”. The screenshots below show messages posted on the app Kik.



Screenshot from Kik.
Author provided



Screenshot from Kik.
Author provided

The coaching predominantly involves sharing pictures and videos for nude body checks (or in undergarments), weekly weigh-ins, and enforcing strict rules on what foods to eat and avoid.



Screenshot from Kik.
Author provided

While there’s currently no way to know how long coaching lasts on average, the harms are extensive. TikTok, which has a massive young following, will start recommending accounts centred on eating disorders once a user has sought out such content, because of the way its recommendation algorithms work.



Screenshot from TikTok.
Author provided

What is being done?

There are currently not enough regulations in place by platforms to prevent anacoaches from operating, despite an array of reports highlighting the issue.

Best efforts so far have involved Instagram, TikTok and Pinterest filtering out selected words such as “proana” or “thinspo” and banning searches for content that promotes extreme thinness.

A TikTok spokesperson told The Conversation the platform does not allow content depicting, promoting or glorifying eating disorders.

“When a user searches for terms related to eating disorders, we don’t return results and instead we direct them to the Butterfly Foundation and provide them with helpful and appropriate advice. We’ve also introduced permanent public service announcements (PSAs) on related hashtags to help provide support for our community,” the spokesperson said.



Screenshot from TikTok.
Author provided

The spokesperson said accounts found to be engaging in sexual harassment may be banned. Platforms will ban users if they violate user guidelines, but anacoaches will often reappear under a new account name.

According to Twitter, evading account bans is against the rules. Earlier this year Twitter announced it would enable a safety mode that will allow users to turn on the proactive screening of spammy and abusive content. It remains to be seen what role this will play in curbing targeted attacks from anacoaches.

A research-based report released this month by the 5Rights Foundation has detailed how minors online are targeted with sexual and suicide-related content. It references platforms including Twitter, TikTok, Instagram, Snapchat, Facebook, Discord, Twitch, Yubo, YouTube and Omegle.

The research showed children as young as 13 are directly targeted with harmful content within 24 hours of creating an account.



Screenshot from TikTok.
Author provided

They may receive unsolicited messages from adults offering pornography, as well as recommendations for eating disorder content, extreme diets, self-harm, suicide and sexualised or distorted body images.

Australia’s policies involving platforms need to be overhauled to ensure platforms adhere to community guidelines and are held accountable when violations occur.

The government should prescribe set rules, informed by the eSafety Commissioner, on how vulnerable young people online should be helped.

A nuanced intervention approach would generate better outcomes for users with eating disorders as each user would have a different set of circumstances and a different mental health state.

Anacoaches on social media should be treated as criminals. Platforms that fail to act against them should face fines for failing to provide a safe environment for vulnerable users.

The European Union has previously fined platforms for allowing terrorist content. Social media giants have also hired contract workers to screen content for terrorism, paedophilia and abuse. This effort should be extended to include anacoaches.



Screenshot taken from Kik.
Author provided

The Conversation approached Tumblr for comment but did not receive a reply before deadline. Popular messaging app Kik was acquired by MediaLab in 2019; The Conversation approached MediaLab for comment but likewise did not receive a response in time.


Suku Sukunesan receives funding from NHMRC-MRFF examining social media content involving Eating Disorders.


Originally published in The Conversation.