TikTok admits early anti-bullying efforts were ‘blunt’ and ‘wrong’
The Chinese-owned app tried to limit the reach of specific videos, leaked guidelines suggest.
TikTok has admitted it was “wrong” to reduce the visibility of disabled, overweight and LGBTQ+ users on its platform in its early days in an effort to tackle bullying.
German site NetzPolitik.org obtained moderation guidelines for the Chinese-owned app, apparently aimed at protecting those it deemed “highly vulnerable to cyberbullying” if exposed to a wide audience.
The move would limit the reach of specific videos from accounts belonging to people it believed were “susceptible to harassment or cyberbullying based on their physical or mental condition”, the report says.
These reportedly included users with facial disfigurements, autism or Down syndrome.
In some cases, this meant videos were only visible in the country where they were uploaded. Users deemed particularly vulnerable were placed in a “not recommended” category, so their videos would not appear in an algorithmically compiled section of the app where others can discover content.
The report claims the company also kept a list of “special users” it considered at risk of bullying, including those who were “fat and self-confident”, those with a rainbow flag in their biographies, and those who identified as lesbian, gay or non-binary.
TikTok said its approach to tackling bullying was “blunt” in its early days, but added that the policy was never designed to be permanent.
“Early on, in response to an increase in bullying on the app, we implemented a blunt and temporary policy,” a spokesman said.
“This was never designed to be a long-term solution, and while the intention was good, it became clear that the approach was wrong.
“We want TikTok to be a space where everyone can safely and freely express themselves, and we have long since changed the policy in favour of more nuanced anti-bullying policies.”