Tech firms to be forced to scan people's private messages for child abuse under EU plan

In the first months of the Covid-19 pandemic, the demand for child sexual abuse material rose by up to 25% in some member states, said the European Commission.

Child protection groups have welcomed radical EU proposals which will force tech companies to scan people’s private online communications, including encrypted messages, for child sexual abuse imagery and grooming.

However, civil rights advocates say the proposals amount to “indiscriminate mass surveillance” which would “destroy the right to privacy” — and open encrypted communications to attack by cyber gangs and totalitarian states.

The European Commission has announced detailed proposals which, it said, are aimed at responding to an “overwhelming increase” in child sexual abuse material (CSAM) online and in solicitation of children into sexually abusing themselves or even meeting perpetrators offline.

It said that in the first months of the Covid-19 pandemic, the demand for CSAM rose by up to 25% in some member states and that reports of grooming increased by 16% from 2020 to 2021.

It said groomers were contacting children on social media, gaming platforms, and chats, with US figures showing a three-fold increase in “self-generated” imagery of seven- to 10-year-olds.

The commission said that, currently, certain online service providers detect such material on a voluntary basis, but that many companies take no action.

“Voluntary action is, therefore, insufficient,” the commission said.

Under the proposals:

  • Companies will be obliged to “prevent” CSAM by assessing the risk of their service being used to share this imagery and taking action to reduce that risk;
  • Member states must set up national authorities to review those risk assessments and, where a significant risk remains, issue a detection order to address that risk;
  • Encrypted communications will be addressed, with the commission saying that a “large portion” of child sexual abuse takes place on them;
  • A new independent EU centre on child sexual abuse will be set up, which, among other things, will create a database of indicators allowing for the reliable identification of CSAM;
  • Member states will need to set out rules on “dissuasive penalties”, and fines “should not exceed 6% of the provider's annual income or global turnover”.

The commission said the new EU centre will facilitate access to “reliable detection technologies”.

On the key issue of whether it is technically possible to scan encrypted communications, the commission said a separate consultative process had shown that “solutions exist”, but added that they “have not been tested on a wide-scale basis”.

The commission said the proposals do not amount to mass surveillance as they are “very tightly ringfenced” and “limited to what is strictly necessary”.

John Church, chief executive of the ISPCC, said it welcomes the European Commission’s proposals.

“For too long the rules of engagement have been written by the companies who have been shown to champion profits over the safety of children,” said Mr Church.

“The rules are now being rewritten with children rightly being at the heart.”

He said the proposed detection technologies for CSAM online have been dismissed by privacy advocates as an exercise in mass surveillance, a characterisation he said is incorrect.

“A lot of work will be required to correctly inform people on how these technologies work to allay any fears,” said Mr Church.

“And, most importantly, how such tools will lessen the harm caused to children in the images by detecting them and removing them and, in some circumstances, rescuing these children and getting them the support and justice they deserve.”

TJ McIntyre, associate professor at UCD Sutherland School of Law and chair of Digital Rights Ireland, said the proposals will “fundamentally compromise” the security of communications.

“It involves looking at the contents of literally every message — it is indiscriminate mass surveillance,” he said.

Mr McIntyre said that Europe, including Ireland, “has been here before” with the European Court of Justice repeatedly ruling that such surveillance was “unacceptable”.

“This proposal is very much in the same vein, it treats everyone as a suspect and under permanent surveillance.”

Mr McIntyre said prevention of crime is a legitimate goal of society, but added: “You don’t allow people into your home without a warrant; likewise, to monitor people’s private communications, you need a warrant from a judge.

“This is not just allowing police to enter a suspect’s home without a warrant, it is allowing the police into everyone’s home indiscriminately without a warrant. It destroys the right to privacy.” 

In addition, he said it requires tech companies to compromise the security of their own systems, including end-to-end encryption.

“That opens it up to any attacker, it creates a vulnerability,” he said. That vulnerability could, in turn, be exploited not only by criminals but also by states such as China and Russia.

  • Helplines: Childline 1800 66 66 66 or text 50101; hotline.ie
