Content on YouTube's most popular channels to be manually reviewed

Channels will need to reach a higher threshold to monetise their videos



Video-streaming site YouTube has announced changes to both its monetisation requirements and its Google Preferred programme.

The company said among the "big changes" is a higher threshold before users can monetise their videos - i.e. run adverts on YouTube.

Previously, channels had to reach 10,000 total views to be eligible.

But the company announced on Tuesday that new channels will need to have 1,000 subscribers and 4,000 hours of watch time within the past 12 months to be eligible for ads.

It will also begin enforcing these new requirements for existing channels from February 20th.

In a statement, YouTube said: "It's been clear over the last few months that we need the right requirements and better signals to identify the channels that have earned the right to run ads.

"Instead of basing acceptance purely on views, we want to take channel size, audience engagement, and creator behavior into consideration to determine eligibility for ads."

YouTube will also manually review its Google Preferred channels.

Google Preferred was created to showcase YouTube's most engaging channels and to help advertisers easily reach bigger audiences.

YouTube added: "We’re changing Google Preferred so that it not only offers the most popular content on YouTube, but also the most vetted.

"Moving forward, the channels included in Google Preferred will be manually reviewed and ads will only run on videos that have been verified to meet our ad-friendly guidelines.

"We expect to complete manual reviews of Google Preferred channels and videos by mid-February in the US and by the end of March in all other markets where Google Preferred is offered."

"Aggressive action"

Susan Wojcicki, CEO of YouTube, said: "Human reviewers remain essential to both removing content and training machine learning systems because human judgment is critical to making contextualized decisions on content.

"Since June, our trust and safety teams have manually reviewed nearly two million videos for violent extremist content, helping train our machine-learning technology to identify similar videos in the future.

"We are also taking aggressive action on comments, launching new comment moderation tools and in some cases shutting down comments altogether.

"In the last few weeks we’ve used machine learning to help human reviewers find and terminate hundreds of accounts and shut down hundreds of thousands of comments."

The actions come after YouTube faced controversies over some of its most popular stars.

Most recently, US vlogger Logan Paul was removed from Google Preferred after he posted a video that included footage of a person who had apparently taken their own life in Japan.

Mr Paul apologised for the video and deleted it.

YouTube said in a series of tweets: "Suicide is not a joke, nor should it ever be a driving force for views."