Limerick survivor of image-based sexual abuse raises awareness in US in bid to reform law

Megan Sims, who was a victim of revenge porn, said attention needs to be paid to AI, with the rise of the use of deepfakes, which are fake videos created using digital software.
A Limerick survivor of image-based sexual abuse has used a trip to Washington to raise awareness in the hope of reforming a law which “turns women and girls' bodies into commodities”.
The activist Megan Sims was asked to speak in Washington by the US Department of Health and Human Services.
At the event, the young woman spoke of her experience and advocated for reform of Section 230 of the Communications Decency Act, which she said “gives online publishers free rein to turn women and girls' bodies into commodities”.
In 2016, at the age of 19, Megan was the victim of image-based sexual abuse, which was formerly referred to as “revenge porn”.
She recalled: “There was like a forum that was made up of women and girls. I only knew people from Limerick, but they were reaching out to me, and I marched a few of them into the gardaí, and we were told that there was nothing that could be done because there was no law in place.”

After this, Megan started a petition to make image-based sexual abuse a criminal offence, which was signed by more than 85,000 people.
In February 2021, Coco’s Law was officially enacted.
“What I did was, I picked bits of the law from all over the world, and I petitioned it to the Irish government to pass it, and I sat down with some politicians and some lawmakers, and we drafted a new IBSA law for Ireland, then the law got passed,” she recalls.
Over the last few years, Megan has been working alongside other survivors across the globe in a bid to enact similar laws in other countries.
Speaking of her time in Washington, she explained: “We’ve done a report for the White House and the Department of Health and Human Services, who asked us to go over and to give our experiences with image-based sexual abuse, and how to tackle the issue in the US.”
In the US, there is still no federal law which covers image-based sexual violence.
“In Ireland, we're never really on the front of these issues. We have our own issue with our judicial system and everything, so it's nice for us to even have a voice on a global level about such a serious issue.
“I think that we need to take the blame from the victims and put it back onto the people who are taking and sharing these images non-consensually, because it is a form of sexual abuse and it is a form of violence,” she added.
Megan said attention needs to be paid to AI, with the rise of the use of deepfakes, which are fake videos created using digital software, machine learning, and face swapping.
“It's starting to get worse. There are millions of websites that are just intended for deepfakes of people without their consent, and these deepfake platforms are scraping, taking everyone's likeness and putting it into AI generators without people's consent. I think a lot of people don't know that, just by having a social media profile, you can be impacted by this and not even know that you are,” she said.