TikTok reveals it took down over 100m clips in the first half of the year in transparency report

Short video platform TikTok has revealed it removed more than 104m videos in the first half of 2020, and received nearly 1,800 legal requests and 10,600 copyright takedown notices.

The figures come from the ByteDance-owned platform’s first transparency report of 2020, which showed a remarkable jump from its previous transparency report, covering the final six months of 2019. Just 49m videos were deleted in that period, a rise that reflects the growing size of the platform’s user base.

A post written by Brent Thomas, director of public policy, Australia and New Zealand, and Arjun Narayan Bettadapur Manjunath, head of trust and safety for APAC, said the report showed how important safety was to the business, which is currently in the process of being sold in the US.

“Hundreds of millions of people around the world come to TikTok for entertainment, self-expression, and connection. We have no higher priority than promoting a safe app experience that fosters joy and belonging among our growing global community,” read the post.

“We know how vital it is to build trust by being transparent with our community. It’s why we regularly release these reports to hold ourselves accountable to our community and provide insight into the actions we take to help keep TikTok safe for everyone.”

TikTok says the videos it deleted during the six months make up less than 1% of all content uploaded to the platform. A third of those deleted contained nudity or sexual activity, while 22.3% fell into the category of ‘minor safety’. A further 19.6% showed ‘illegal activities’ and 13.4% included ‘suicide, self-harm and other dangerous acts’. TikTok recently had to remove thousands of videos showing a graphic suicide from across its platform, an event it believes is attributable to ‘the dark web’.

TikTok has more than 700m users globally, and it claims that 90% of the videos removed from the platform were taken down before they received any views. The platform uses automated systems to monitor its videos, particularly in India, Brazil and Pakistan, alongside human moderation in other markets.

Over the last six months, TikTok has introduced fact-checking programs in nine markets, including the US, Canada, the UK, Australia, France, Italy, Spain, Japan and India, to help it verify misleading content about the novel coronavirus, elections and more. The program will eventually be rolled out to other markets.

TikTok also experienced an increased volume of legal requests, receiving 1,768 requests for user information from 42 countries, along with 135 requests from government agencies.

India was the market where the most videos were removed, over 37m, while almost 10m videos were removed in the US.

The announcement coincides with the platform proposing a ‘global coalition’ of social media companies, with interim head Vanessa Pappas writing to nine social media platforms to propose a Memorandum of Understanding (MOU) that would see companies warn one another of violent, graphic content appearing on their platforms. TikTok didn’t say which platforms had been approached, but it has been clear in the past about which platforms it feels have work to do in the privacy and security space.

The US threatened to remove TikTok from app stores last weekend after a deal with Microsoft fell through, but that move was delayed when a partnership between TikTok, Oracle and Walmart was approved. The full details of the new deal have not been released, but it appears TikTok will create a US subsidiary which will be partially owned by Oracle and Walmart.
