YouTube is to limit data collection for anyone watching children’s content, regardless of the viewer’s age.
The social media giant also intends to ban personalised adverts aimed at children.
The move is part of accelerated efforts by YouTube, a subsidiary of Google, to address harmful and borderline content being posted on the video-sharing website.
It follows Google being forced to pay a record $170 million fine and make changes to protect children’s privacy on YouTube, as US regulators said the site had knowingly and illegally harvested personal information from children and used it to profit by targeting them with advertisements.
In a blog post, YouTube said it had redoubled efforts to live up to its responsibility while preserving an open platform.
It said that involves removing content that violates YouTube’s policy as quickly as possible; rewarding “trusted, eligible creators and artists”; reducing the spread of content that “brushes up” against its policy line; and “raising up authoritative voices” when people are looking for breaking news and information.
In the past month alone, YouTube has removed nearly 30,000 videos for breaching its rules on hate speech.
Between April and June this year, it removed or suspended 4,069,349 channels, resulting in the termination of 77,460,820 videos.
While the fine imposed by the US Federal Trade Commission amounts to only 1% of YouTube’s annual revenue, the terms and conditions the company is issuing in response could have a far greater impact on content providers who earn a living through the advertisements that play before, during and after their videos.
Features like comments and notifications won’t be available on videos “that have an emphasis on kids characters, themes, toys, or games,” YouTube CEO Susan Wojcicki said in a blog post.
Most children’s videos will not be able to run targeted ads, which could affect the profitability of content providers.
“This won’t be easy for some creators,” Wojcicki said in her post, adding that YouTube will be establishing a $100 million fund, disbursed over three years, to help offset any lost advertising revenue for creators who make video content for children.
YouTube instituted new policies in June to address hateful content as it tries to toughen its stance on borderline content.
“Over the next several months, we’ll provide more detail on the work supporting each of these principles,” said YouTube.
“This first installment will focus on ‘Remove’.
“We’ve been removing harmful content since YouTube started, but our investment in this work has accelerated in recent years.
“Because of this ongoing work, over the last 18 months we’ve reduced views on videos that are later removed for violating our policies by 80%, and we’re continuously working to reduce this number further.”