TikTok will start automating video removals for nudity and more in the US
Some videos will no longer be flagged for human review
By Jacob Kastrenakes (@jake_k) | Jul 9, 2021, 10:49am EDT
Illustration by Alex Castro / The Verge
TikTok is trying to speed up its moderation process by removing human reviewers where its automated systems can do the job about as well.
In the US and Canada, the company is going to start using automated reviewing systems to weed out videos with nudity, sex, violence, graphic content, illegal activities, and violations of its minor safety policy. If the system catches a video in those categories, it’ll be pulled right away, and the creator will be given the opportunity to appeal to a human moderator.
Until now, TikTok has run all videos by human moderators in the US before pulling them, a spokesperson tells The Verge. The change is meant in part to limit the “volume of distressing videos” moderators are required to watch and to give them more time for trickier clips, such as misinformation, that require context to review properly. Moderators at other companies, like Facebook, have developed PTSD-like symptoms from the videos they were required to watch. The announcement is also part of an effort to provide more transparency around moderation, according to Axios, which first reported the news.
The problem TikTok will face is that automated systems are never perfect, and some communities may be hit hardest by errors in automated takedowns. The app has a history of discriminatory moderation and was recently criticized for pulling the intersex hashtag twice. TikTok says it’s starting to use automation only where it’s most reliable; it’s been testing the tech in other countries, including Brazil and Pakistan, and says only 5 percent of the videos its systems removed actually should have been allowed up.
That’s a low number, but put in the context of how many videos TikTok removes (8,540,088 in the US during the first three months of 2021), it could still mean tens or hundreds of thousands of videos pulled by mistake. The vast majority of removed videos fall under the categories for which TikTok is rolling out automated moderation. That said, not all of those videos will be routed around human reviewers: a spokesperson says human moderators will still check community reports, appeals, and other videos flagged by its automated systems.
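For a rough sense of scale, here is a back-of-the-envelope sketch of that estimate. It assumes the 5 percent false-removal rate TikTok reported from its tests abroad would apply to every US removal, which overstates things, since many videos will still pass through human review first:

```python
# Back-of-the-envelope estimate of mistaken automated removals.
# Assumption: the 5% false-removal rate from TikTok's tests applies
# to all US removals; in practice many removals will still be
# human-reviewed, so this is closer to an upper bound.

us_removals_q1_2021 = 8_540_088   # videos removed in the US, Jan-Mar 2021
false_removal_rate = 0.05         # share of automated removals that should have stayed up

mistaken_removals = us_removals_q1_2021 * false_removal_rate
print(f"Roughly {mistaken_removals:,.0f} videos could be pulled in error per quarter")
# -> Roughly 427,004 videos could be pulled in error per quarter
```

That ballpark figure is what puts the removals in the "tens or hundreds of thousands" range the article describes.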
TikTok says it’ll be rolling out the automated review system “over the next few weeks.”