YouTube Will Make It Harder for “Bad Actors” to Exploit Its Platform

The company wants to expand its workforce to review content violations and more.


YouTube is working harder every day toward one goal: making its platform safer from “bad actors” who exploit its openness. In a blog post, YouTube CEO Susan Wojcicki addressed several issues the platform has faced over the past months regarding user trust and safety. According to Wojcicki, YouTube has become a force for good over the last decade, and the company wants to stay on that path – helping people, creating content for everyone, and keeping users safe. Nevertheless, the CEO conceded that lately she has seen “up-close that there can be another, more troubling, side of YouTube’s openness”. She is referring to a wide range of troubling videos by extremists, child exploitation schemes on the platform, and other disturbing content that should have no place on YouTube.


“In the last year, we took actions to protect our community against violent or extremist content, testing new systems to combat emerging and evolving threats. We tightened our policies on what content can appear on our platform, or earn revenue for creators. We increased our enforcement teams. And we invested in powerful new machine learning technology to scale the efforts of our human moderators to take down videos and comments that violate our policies.”

YouTube will hire new moderators to review content

Wojcicki says that YouTube has learned its lesson and will fight violent extremist content and other problematic videos with the full force of the company. To do that, YouTube will bring on more human reviewers to vet content, making it harder to violate its policies. The company intends to be ruthless in tackling the most disturbing content, flagging and removing everything that breaches its policies. This includes extremist videos, content inappropriate for children on YouTube Kids, and much more.
