
Google expanding moderation team to 10,000 amid concerns over 'inappropriate' YouTube videos

Newstalk

11.37 5 Dec 2017
Google has announced plans to increase its moderation team to 10,000 people next year, amid concerns over inappropriate content and child exploitation on YouTube.

Susan Wojcicki, CEO of YouTube, made the announcement while laying out the company's plans to stop 'abuse of our platform' - saying it will look to apply the lessons it has learned from tackling violent extremist content to other areas of concern.

The decision comes after the popular video platform - which is owned by Google - faced increasing pressure in recent weeks over content targeting or involving children.


Reuters reported last month that several major advertisers - such as Lidl and Mars - had pulled ads from the platform over 'clips of scantily clad children'.

Other reports over the last month have also highlighted disturbing or inappropriate videos aimed at young audiences.

In a blog post, Ms Wojcicki explained: "Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualised decisions on content."

The company says it is increasing the "network of academics, industry groups and subject matter experts" it consults over such issues.

She added: "We’re also taking actions to protect advertisers and creators from inappropriate content. We want advertisers to have peace of mind that their ads are running alongside content that reflects their brand’s values.

"Equally, we want to give creators confidence that their revenue won’t be hurt by the actions of bad actors."

Previous efforts to tackle 'problematic' content have seen regular YouTube content creators complain about ads being removed from their videos. Major companies pulling their ads is also likely to have a knock-on effect on smaller channels looking to earn advertising revenue.

Ms Wojcicki moved to reassure video-makers that they won't be adversely affected by any changes, saying: "We’ve heard loud and clear from creators that we have to be more accurate when it comes to reviewing content, so we don’t demonetise videos by mistake.

"We are planning to apply stricter criteria and conduct more manual curation, while also significantly ramping up our team of ad reviewers to ensure ads are only running where they should."

The effort to ramp up the number of human moderators comes as major tech giants also attempt to develop ways to automatically target inappropriate content - with YouTube saying 98% of the videos it removes for violent extremism are flagged by a 'machine-learning algorithm'.
