Internet searches for child abuse images will be blocked for the first time by Microsoft and Google after months of mounting pressure.
New software is to be introduced that will automatically block 100,000 “unambiguous” search terms which lead to illegal content, Google executive chairman Eric Schmidt told the Daily Mail.
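In essence, blocking a fixed set of terms amounts to checking each query against a large blocklist before returning results. A minimal sketch, assuming a simple normalise-and-look-up approach (the term sets, function name, and return values are illustrative, not Google's actual system):

```python
# Illustrative sketch of blocklist-style query filtering.
# The terms and behaviour are assumptions, not Google's implementation.

BLOCKED_TERMS = {"example blocked term a", "example blocked term b"}
WARNING_TERMS = {"example warning term"}

def handle_query(query: str) -> str:
    normalised = query.strip().lower()
    if normalised in BLOCKED_TERMS:
        return "blocked"   # return no results at all
    if normalised in WARNING_TERMS:
        return "warn"      # show results alongside a deterrence warning
    return "ok"
```

A real system would also need to handle misspellings, phrase variants, and multiple languages, which is presumably where the expansion to other languages comes in.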
The restrictions will be launched in English-speaking countries first, before being expanded to 158 other languages in the next six months.
A further 13,000 search terms linked with child sex abuse will trigger warnings from Google and charities telling the user that the content could be illegal and pointing them towards help.
Calls for the internet companies to take action against searching for illegal content reached boiling point following the trials in the UK of child killers Mark Bridger and Stuart Hazell earlier this year.
Bridger, who murdered five-year-old April Jones, and Hazell, who killed 12-year-old Tia Sharp, both used the internet to search for child abuse images before the killings.
Google's new technology will also be able to remove thousands of copies of an illegal video in one hit.
When a child abuse video is discovered, the software can tag it with a unique code, which is then used to identify and remove all copies from the web.
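The “unique code” described here is, in effect, a digital fingerprint of the file. A minimal sketch of fingerprint-based copy detection, using a plain SHA-256 hash purely for illustration — real systems use perceptual fingerprints that survive re-encoding and cropping, which a byte-level hash does not:

```python
import hashlib

# Illustrative sketch: fingerprint a known illegal file, then flag any
# copy whose fingerprint matches. A plain SHA-256 only catches exact
# byte-for-byte copies; production systems use perceptual hashes.

known_fingerprints: set[str] = set()

def fingerprint(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def register_known_video(data: bytes) -> None:
    known_fingerprints.add(fingerprint(data))

def is_known_copy(data: bytes) -> bool:
    return fingerprint(data) in known_fingerprints
```

Once a video's fingerprint is in the set, every uploaded or indexed copy with the same fingerprint can be matched and removed in one pass, which is what makes bulk takedowns possible.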
The system is also designed to identify new code words or terms paedophiles start to use and can block search results for these too.
However, the vast majority of this material exists beyond the scope of search engines, according to technology journalist Adrian Weckler.
"The vast majority of it resides in what they call the 'dark web' or the 'deep web' - in other words, hidden networks that are used by people who explot children … very little of it actually turns up on what we know as the world wide web."