TikTok deleted 49 million 'rule-breaking' videos
TikTok says it deleted more than 49 million videos that broke its rules between July and December 2019.
About a quarter of those videos were deleted for containing adult nudity or sexual activity, the business said in its latest transparency report.
The video-sharing app also revealed it had received about 500 requests for data from governments and police, and had complied with about 480 of them.
The US has suggested it is "looking at" whether to ban the Chinese-owned app.
On Monday, US Secretary of State Mike Pompeo suggested that downloading TikTok would put citizens' "private information in the hands of the Chinese Communist Party".
He added that the US government was considering whether to ban Chinese-owned apps: "We are taking this very seriously. We are certainly looking at it," he said in a Fox News interview.
The government in India has already banned the app, citing cyber-security concerns.
TikTok is owned by Chinese firm ByteDance. The app is not available in China, but ByteDance operates a similar app there, called Douyin.
TikTok said it had not received any government or police data requests from China, or any requests from the Chinese government to delete content.
On Thursday, the Wall Street Journal published a report suggesting the business was considering setting up a new headquarters outside China.
TikTok told the BBC in a statement: "As we consider the best path forward, ByteDance is evaluating changes to the corporate structure of its TikTok business. We remain fully committed to protecting our users' privacy and security as we build a platform that inspires creativity and brings joy for hundreds of millions of people around the world."
Privacy features
US authorities are examining whether TikTok complied with a 2019 agreement aimed at protecting the privacy of under-13s.
TikTok says it offers under-13s a limited app experience, with additional safety and privacy features.
According to TikTok's transparency report:
25.5% of the deleted videos contained adult nudity or sexual acts
24.8% broke its child-protection policies, for example by implicating a child in a crime or showing harmful imitative behaviour
21.5% showed illegal activities or "regulated goods"
3% were removed for harassment or bullying
Less than 1% were removed for hate speech or "inauthentic behaviour"
TikTok's transparency report also revealed:
The 49 million deleted videos represented less than 1% of videos uploaded between July and December 2019
98.2% of the deleted videos were spotted by machine learning or moderators before being reported by users
TikTok was only released in 2017, and because it is so new, we know much less about the platform than we do about Facebook, for example.
This report offers at least a little detail about the kind of content it takes down.
There has been much recent focus on hate and extremism on platforms such as TikTok, but fewer column inches devoted to sexual content or the safety of minors.
Yet around half the videos taken down were in those two categories.
What we don't know, of course, is how much harmful content has been missed by its moderators and machines.