YouTube has taken down 220 million videos since 2018
Now it’s getting a little less strict on what’s allowed on the site, per The New York Times.
YouTube quietly relaxed its content moderation policies in mid-December, when it introduced more lenient guidelines for its reviewers, the Times reported on Monday.
Alphabet’s video platform, which reportedly supported 490,000 jobs and contributed $55 billion to the US economy last year, joins Meta’s platforms and Elon Musk’s X in revising its approach to content moderation. Unlike those two, however, YouTube hasn’t made any public statement about the move, though a spokesperson told the Times that it continuously updates its guidance for moderators.
New rules
YouTube moderators are now advised to leave policy-violating content up if it’s in the public interest and less than half of the video breaks the site’s code of conduct; previously, that threshold was a quarter of the video. Videos qualify as being in the public interest if creators “discuss or debate elections, ideologies, movements, race, gender, sexuality, abortion, immigration, censorship and other issues,” according to training materials the Times reviewed.
Google’s quarterly transparency report shows that millions of videos are still taken off YouTube every single month.
In Q1 2025, the first quarter after YouTube’s new policies came into effect, the site took down some 8.6 million videos for breaching its community guidelines. Worryingly, over half of those removals were for violations of YouTube’s child safety rules, though almost 55% of the flagged content was taken down before it received a single view, thanks to the platform’s automated flagging system.
Indeed, take automated flagging out of the YouTube moderation picture and there’s not much left: of the roughly 220 million videos removed since the start of 2018, just ~19.9 million were caught by human detection, with more than 90% of removals stemming from the site’s AI-powered moderation system, which has been accused of being a little trigger-happy in the past.