Watchdog finds 50% rise in reports of child sexual abuse material last year

Reports of child sexual abuse material surged by over 50% last year, to almost 45,000 cases, Ireland’s online watchdog has said.
Hotline.ie also reports an “alarming” increase in 2024 in “self-generated” material — described as intimate or sexually explicit content created by and featuring minors, shared either voluntarily or through coercion, grooming or blackmail.
Its 2024 annual report said the total number of child sexual abuse material (CSAM) reports increased from about 29,200 in 2023 to almost 45,000 in 2024.
In relation to self-generated material, the report documents a 166% rise, from 4,322 reports in 2023 to 11,505 in 2024.
It said the sharp increase in this material required a “fundamental shift” in society’s approach, towards “targeted prevention and detection”.
A strategic plan accompanying the annual report said that, under the current model, people, including children, see the material before they can report it.
It said this operating model was “no longer fit for purpose” and called for more “proactive” models, such as that operated by the Internet Watch Foundation in the UK.
It said these services “combine advanced technologies with human expertise to identify threats before or early in public exposure”.
Hotline.ie is co-funded by the EU and its members. Its new chief executive is former garda sergeant Mick Moran. Key statistics show:
- 53,441 total reports in 2024, compared to 40,543 in 2023, and 10,583 in 2020;
- 45,639 reports in 2024 were categorised as “illegal material” — 85% of all reports;
- illegal content was removed in 97% of cases;
- 44,955 reports last year were CSAM, representing a 55% increase on 2023 (29,197) and close to a 15-fold jump on 2020 (3,048);
- 64% of CSAM is hosted on forums (online message or image boards covering a range of topics);
- 11,505 reports of self-generated material in 2024, a 166% increase on the 4,322 cases in 2023, which was itself a 280% rise on 2022.
The report said the “surge in pre-teen and teen victimisation” correlated directly with the growth of forums and of self-generated material.
“We’re seeing exploitation content organised and monetised at unprecedented scale,” it said. “Our analysis reveals forums have become the primary vehicle for distributing self-generated CSAM.”
It said that even when content was removed, it “frequently reappears”, and that these forums use previews or screenshots to “promote material that is then sold through private payment channels”.
Hotline.ie set up a pilot in 2024 to extend its takedown procedures to cover child sexual exploitation material — defined as content that is morally concerning but short of the threshold for illegality, such as sexualised poses.
It issued 941 takedown notices, and in 885 cases the content was successfully removed, most within three days.
Hotline.ie reported 519 separate cases of intimate image abuse, with 60% of victims female and 40% male.
It received 62 cases of sexual extortion — 83% from male victims, with young men aged 18-24 particularly affected. It said 15% of cases involved victims under 18.
There were 908 reports of suspected racism, a 600% increase on 2023. Almost 90% concerned content on X, but these reports were “misdirected as hate speech”: they related more to religion and were not illegal under Irish law.
A further 11% involved potentially harmful content related to hate speech but did not meet the legal threshold for racism.