Site promises 'zero-tolerance' after non-consensual images of Irish women found on server 

The messaging platform Discord has promised a "zero-tolerance approach" to image-based sexual abuse after one of its servers was used to share explicit content of Irish women without their consent.

The server, which functions like a group chat, was shut down and more than 500 users involved were banned by the company, which confirmed to the Irish Examiner that it will co-operate with gardaí on the matter.

“Discord has a zero-tolerance approach to nonconsensual pornography and child sexual abuse material, and we work aggressively and proactively to keep it off of our service,” said a spokesperson for the instant messaging platform.

“We are continuing to proactively monitor the situation in Ireland and will cooperate on this matter with Irish authorities subject to applicable law,” it added.

The platform, which reaches more than 100m monthly active users, has been one of the fastest-growing social media services of 2020, with that user base growing by 47% from February to July.

“One of the major areas of focus for our proactive efforts on Discord is exploitative content, which includes non-consensual pornography (NCP). We have a zero-tolerance policy for this activity and when we find it — either through reactive or proactive means — we remove it immediately, along with the servers sharing the content,” said a spokesperson.

According to Discord, on average 70% of the servers it removes are "deleted proactively", meaning they are taken down before being reported to moderators. This proactive detection includes automated search tools that scan photos and videos for exploitative content.

“Discord proactively scans 100% of images and GIFs using PhotoDNA to detect child sexual abuse material (CSAM), and immediately reports any content and perpetrators to the National Center for Missing & Exploited Children (NCMEC) in the US or the proper authorities in each corresponding country,” it added.
