TikTok suppressed disabled users’ videos


Videos made by disabled users were deliberately prevented from going viral on TikTok by the firm’s moderators, the app has acknowledged.

The social network said the policy was introduced to reduce the amount of cyber-bullying on its platform, but added that it now recognised the approach had been flawed.

The measure was exposed by the German digital rights news site Netzpolitik.

Disability rights campaigners said the strategy had been “bizarre”.

A leaked extract from TikTok’s rulebook gave examples of what its moderators were instructed to be on the lookout for:

  • disabled people
  • people with facial disfigurements
  • people with other “facial problems” such as a birthmark or squint
  • Down’s syndrome
  • autism

Such users were “susceptible to bullying or harassment based on their physical or mental condition”, the guidelines added.

According to an unnamed TikTok source quoted by Netzpolitik, the moderators were told to limit viewership of affected users’ videos to the country where they were uploaded.

And in cases where the creators were judged to be particularly vulnerable, it reported that the moderators were ordered to prevent the clips from appearing in the app’s main video feed once they had reached between 6,000 and 10,000 views.

This video feed is auto-generated and personalised for each member. It is where most people spend their time watching others’ content.

Netzpolitik reporter Chris Koever suggested the result was that the Chinese-owned firm had further victimised those affected “instead of policing the perpetrators”.

A spokesman for TikTok admitted it had made the wrong choice.

“Early on, in response to an increase in bullying on the app, we implemented a blunt and temporary policy,” he told the BBC.

“This was never designed to be a long-term solution, and while the intention was good, it became clear that the approach was wrong.

“We have long since removed the policy in favour of more nuanced anti-bullying policies.”

TikTok has not confirmed when it abandoned the measure, but Netzpolitik reported that it was still in force in September.

‘Actively excluded’

UK-based organisations have condemned the discovery.

“It’s good that TikTok has ended this bizarre policy,” Ceri Smith from the disability equality charity Scope said.

“Social media platforms must do more to tackle cyber-bullying, but hastily hiding away a group of users under the guise of protecting them is not the right approach at all.”

Anti-bullying charity Ditch the Label added that it hoped valuable lessons had been learned.

“It is concerning that young people with disabilities have been actively excluded from participating on a platform that prides itself as being fun and inclusive,” said chief executive Liam Hackett.

“This approach is discriminatory and further demonises disability, which we already know attracts a huge amount of abuse and intolerance.”

‘Consult users’

This is the latest in a series of controversies to affect the short-form video app in recent weeks.

In September, the Guardian reported that the app used to restrict or ban political content that could be used to criticise the Chinese government, including footage of the Tiananmen Square protests.

The firm’s parent company Bytedance subsequently declined to testify to US Congress about its ties to China, saying it had not been given enough warning.

Then last Wednesday, TikTok apologised to a US teenager for removing a video in which she had accused China of mistreating its Uighur Muslim population.

Media caption: Feroza Aziz rejected TikTok’s explanations for blocking her from its app

And it has since emerged that it is being sued by a US student who alleges that the firm surreptitiously transferred “vast quantities of private and personally identifiable user data” from the international version of its app to China. TikTok maintains it only stores US user data in the United States and Singapore.

For its part, Netzpolitik said it hoped the latest revelations would encourage the app to consult users before imposing potentially discriminatory changes.

“Basically any time you try to make a policy, ask those who will be affected by it,” said Ms Koever.

“That would be a good start.”