Government backs EU proposals to scan personal messages for child-sex-abuse images

The Government has signalled strong support for radical EU proposals that would force tech companies to scan people’s private online communications, including encrypted messages, for child-sexual-abuse imagery and grooming behaviour.

As previously reported in the Irish Examiner, the plans, drafted by the European Commission, have been welcomed by children's groups, but opposed by civil-rights advocates.

Now the Government has said that it backs the proposals, saying it is "right" that member states move to a system of "mandatory" detection and reporting of child-sexual-abuse material (CSAM).

In an information note on the proposed EU regulation, the Department of Justice said the regulation stemmed from an EU strategy on tackling child sexual abuse adopted in July 2020.

It said the proposal introduces a legal structure whereby the providers of online messaging services will be responsible for assessing the risk of CSAM circulating on their sites and will be obliged to "detect, report and remove" such material.

"This represents a major change from the current system, where some online service providers voluntarily detect and report CSAM, but are not obliged to do so," the department said.

The note said the regulation further proposes to establish a European centre to prevent and counter child sexual abuse and support implementation of the regulation, including by setting up databases of indicators of CSAM, which services will have to use to meet their obligations.

"Ireland's initial view is very supportive of what this proposal seeks to achieve," the department said.

"Child sexual abuse and the creation and propagation of CSAM are extremely serious crimes, which are increasing in scale. It is right that we move to a system of mandatory detection and reporting." 

It said that given the cross-border nature of online services, the Government believes it is necessary from both a criminal-justice and internal-market perspective that action be taken on an EU basis.

It said the country recognises that there are "genuine privacy concerns" around the implications of some of the measures.

The department said the proposal places extensive responsibilities on national authorities and added: "Given that the European headquarters of many large tech companies are located in Ireland, this will mean a particular responsibility on Ireland, in terms of national-implementation measures."

It said there is a "self-evident public good" behind the proposal, particularly for victims, in getting CSAM depicting them removed from the internet.

"Concerns have been expressed that the public may be impacted by measures that online service providers may be required to take, on foot of a detection order, in respect of CSAM hosted on their platforms," it said.

The department welcomed the European Commission's recognition that a "fair balance must be struck" between the protection of children and their fundamental rights and the fundamental rights of users of the services.

It said the department has begun initial consultations with interested government bodies and expects to seek the views of other parties, including victims and industry, in the course of the negotiations.
