Google-owned YouTube is to hire thousands more moderators to try to clean up its content.

The new staff will be responsible for removing videos that are offensive, violent, terrorism-related, or simply inappropriate for children.

YouTube’s new policy comes after months of sustained accusations that children have routinely been viewing content completely unsuitable for their age group.

Recent examples of such videos include Disney characters firing machine guns at other popular characters, and the cartoon character Peppa Pig “basically being tortured” at the dentist.

Money talks

Despite the public outcry, Google only seemed to react after companies started pulling their advertising from the video sharing site. But either way, the company says it is now recruiting thousands of reviewers to reduce the amount of “problematic content” on its video platform.


“We need an approach that does a better job determining which channels and videos should be eligible for advertising,” YouTube CEO Susan Wojcicki admitted in a blog post.

“As the threats to our platform will evolve and change, our enforcement methods must and will evolve to respond to them,” she continued. “But no matter what challenges emerge, our commitment to combat them will be sustained and unwavering. We will take the steps necessary to protect our community and ensure that YouTube continues to be a place where creators, advertisers and viewers can thrive.”

The company said its renewed efforts would further protect children from dangerous and abusive content and block hate speech.

Humans and AI

Wojcicki said that in addition to employing human reviewers, YouTube would continue to develop AI to automatically find and flag contentious, offensive, and inappropriate content for removal.

While the hiring of thousands more real people to make YouTube a safer place has been widely welcomed by the media, Wojcicki also said YouTube’s machine learning algorithms were responsible for blocking and deleting 98% of videos related to violent extremism.

“Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content,” Wojcicki said. Human moderators have reviewed nearly two million videos for violent extremist content in the past six months alone. By way of comparison, Wojcicki said it would have taken 180,000 people working 40 hours a week to assess the same amount of content.

One day, perhaps, machines will be able to filter out all inappropriate content; until then, YouTube’s new army of moderators will hopefully be able to take up the slack.
