Newstalk

11.37 5 Dec 2017


Google has announced plans to increase its moderator team to 10,000 people next year, amid concerns over inappropriate content and child exploitation on YouTube.

Susan Wojcicki, CEO of YouTube, made the announcement while laying out the company's plans to stop 'abuse of our platform', saying it will look to apply lessons learned from tackling violent extremist content to other areas of concern.

The decision comes after the popular video platform - which is owned by Google - faced increasing pressure in recent weeks over content targeting or involving children.

Reuters reported last month that several major advertisers - such as Lidl and Mars - had pulled ads from the platform over 'clips of scantily clad children'.

Other reports over the last month have also highlighted disturbing or inappropriate videos aimed at young audiences.

In a blog post, Ms Wojcicki explained: "Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualised decisions on content."

The company says it is increasing the "network of academics, industry groups and subject matter experts" it consults over such issues.

She added: "We’re also taking actions to protect advertisers and creators from inappropriate content. We want advertisers to have peace of mind that their ads are running alongside content that reflects their brand’s values.

"Equally, we want to give creators confidence that their revenue won’t be hurt by the actions of bad actors."

Previous efforts to tackle 'problematic' content have prompted complaints from regular YouTube creators about ads being removed from their videos, while major companies pulling their ads is also likely to have a knock-on effect on channels that rely on advertising revenue.

Ms Wojcicki moved to reassure video-makers that they won't be adversely affected by any changes, saying: "We’ve heard loud and clear from creators that we have to be more accurate when it comes to reviewing content, so we don’t demonetise videos by mistake.

"We are planning to apply stricter criteria and conduct more manual curation, while also significantly ramping up our team of ad reviewers to ensure ads are only running where they should."

The effort to ramp up the number of human moderators comes as major tech companies also attempt to develop ways of automatically targeting inappropriate content - with YouTube saying 98% of the videos it removes for violent extremism are flagged by a 'machine-learning algorithm'.

