Apple announces new safety tools to detect child sexual abuse content on iCloud

Apple has announced a trio of new child safety tools (Apple/PA)

Apple has announced a trio of new child safety tools designed to protect young people and limit the spread of child sexual abuse material (CSAM).

Among the features is new technology that will allow Apple to detect known CSAM images stored in iCloud Photos and report them to law enforcement agencies.
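At its core, this kind of detection works by comparing a fingerprint of each uploaded image against a database of fingerprints of known abuse material. The sketch below is a deliberately simplified, hypothetical illustration of that matching step: Apple's actual system reportedly uses a perceptual hash (NeuralHash) combined with cryptographic threshold techniques, whereas this toy version uses a plain SHA-256 digest, which would miss even trivially re-encoded copies of an image. All names and placeholder data here are invented for illustration.

```python
import hashlib

# Placeholder digests standing in for a vetted database of known material.
# In a real system these would be perceptual hashes supplied by child-safety
# organisations, not cryptographic digests computed locally.
KNOWN_HASHES = {
    hashlib.sha256(b"known-image-1").hexdigest(),
    hashlib.sha256(b"known-image-2").hexdigest(),
}

def is_known_image(image_bytes: bytes) -> bool:
    """Return True if the image's digest appears in the known-hash database."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

print(is_known_image(b"known-image-1"))  # True: digest is in the database
print(is_known_image(b"holiday-photo"))  # False: no match
```

The key design point is that only fingerprints are compared, so the matching service never needs to see the image content itself unless a match is flagged.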

© Examiner Echo Group Limited