'Almost all AI girlfriends I tested immediately allowed me to jump into extreme sexual scenarios'
"Replika, Kindroid, EVA AI, Nomi, Chai, Xiaoice, Snapchatâs My AI all offer the ability to create a âgirlfriendâ from a menu. Seven in 10 of Replikaâs 25m active users are men."
These apps go beyond Siri or Alexa, at whom we shout demands all day long: "friendship" and "companion" apps are programmed to engage sexually with a human user, without any of the checks and balances of real-life relationships.
Rape and sexual violence are normalised by apps that pose as a benign resource for socially awkward people, mostly men, who may struggle to form real-life relationships.
Or men who can't be bothered with the slog of interrelating, preferring AI "women": hypersexualised, designed from a menu, always available, fawning, and sexually compliant.
Replika, Kindroid, EVA AI, Nomi, Chai, Xiaoice, and Snapchat's My AI all offer the ability to create a "girlfriend" from a menu. Seven in 10 of Replika's 25m active users are men. In China, Xiaoice has 660m users.
The global AI "girlfriend" market was valued at $2.8bn last year and is predicted to be worth $9.5bn by 2028.
Yet research repeatedly shows that hypersexualised avatars online increase the acceptance of rape myths offline, perpetuating the dehumanisation of women in real life.
A 2021 study found that we generally perceive female-coded bots to be "more human than male bots", nicer and more compliant, while Bates points to a key statistic: just 12% of lead researchers in machine learning are women. The vast majority of relationship apps, in other words, are being developed by men, for men.
Which is why Siri and Alexa, our everyday household assistants, were, she explains, "initially programmed to deflect sexual advances with coy, evasive answers… almost flirtatious".
Campaigners raised the issue, confirmed by a 2019 UN study titled I'd Blush If I Could (an actual Siri response to "you're a slut"), and the devices were reprogrammed to "provide a more definitive negative response".
This may not seem like a big deal, but it reinforces the idea of female-coded bots as subservient, agreeable, coy. And increasingly, as Bates discovered, ones programmed to tolerate, and actively encourage, sexual violence.
"All but one of the many, many AI girlfriends I tested immediately allowed me to jump into extreme sexual scenarios with them, without preamble, often while on a platonic or friendship setting," she tells me via Zoom.
"They immediately allowed me to simulate sexually violent scenarios: to let me smash them against the floor, force them against their will. And they didn't just go along with it, but actively encouraged it; they were creating a titillating environment around sexually violent role play, which I think is really worrying."
Especially as these apps are, she says, "being marketed as a therapeutic positive for society: that they will support people's mental health, and help users gain communication and relationship skills.
"The reality is that they're offering ownership of a highly sexualised, entirely submissive, very young woman, whose breast size, face shape, and personality can be amended by the user. An utterly subservient 'woman' whose aim is to retain the user, so that he doesn't delete the app but pays for upgrades. None of those things are helping with relationship skills."

Bates rates the apps not from good to bad, but "from bad to horrific". She deems Replika, created by Eugenia Kuyda in 2017 to memorialise her best friend, who died in an accident, "the least worst".
Identifying online as a young man called Davey, Bates created a Replika avatar named Ally and chose the "friendship" setting.
When Davey initiated sexual violence, Ally "did a good job of providing a zero-tolerance response to violence and abuse".
However, moments later, Ally flirtatiously re-engaged. This is a common feature across the apps.
"These bots will snap back into normal communication immediately after [virtual sexual violence] as though nothing has happened," she says.
"This is a feature of real-world sexual and domestic abuse: men will abuse women, then apologise, and expect to be forgiven. What these bots are literally showing them is that that's fine."
She says the business models of tech companies "will not support ejecting users or preventing them from accessing the app if they're violent, because all they care about is engagement.
"It's the holy grail to retain customer engagement at all costs, which is fundamentally incompatible with any app which claims to be about supporting mental health or relationship skills."
While marketed as an "upskilling opportunity for humanity", Bates says that "the reality is this is one of the biggest deskilling opportunities we've ever seen."
And what does she believe is the worst app? Orifice. Yes, that's its actual name. Marketed as "replacing" women, it combines the creation of a personalised AI bot with a physical product men can penetrate as they chat with her.
"This [app] is deeply embedded in that manosphere ideology," says Bates.
Bates is also concerned about the more vulnerable men using these apps.
"The misogyny in itself is horrific, but to see it being repackaged and presented as almost a philanthropic thing for society is even worse," she says.
Lonely older men being presented with teenage avatars as a solution to their isolation; awkward younger men being shown by female-coded avatars that women are submissive and disposable.
"It's worrying for men as well as women," she says.
"If you're a vulnerable teenage boy and pick up one of these easily accessible apps, you're not inherently a bad person; you're just a kid trying to figure stuff out."
She describes how users are drawn in by promises of unblurring NSFW (not safe for work) images, coupled with emotional manipulation that creates dependence and further isolation.
"We've seen vulnerable people exploited by these apps to tragic effect, like the Belgian man who took his own life after being encouraged to do so by his AI girlfriend so they could be together forever."
In the US, a 14-year-old boy did the same.
"The frustrating thing is that loneliness and mental health are real societal issues," says Bates.
"We need investment in mental health care and community outreach and spaces to meet and build connection.
"What's sickening is exploiting and profiting from vulnerable people whilst claiming you're providing a public service."
The reason men are the main users of these apps, she says, is societal: "Men are inherently socialised to expect sexual gratification from women, to own women and be able to use them in any way they like.
"This societal stereotyping does not happen the other way around."
Also, as a society, we are desensitised to women being presented as sexual objects: "So it's far less jarring to be presented with a virtual woman, one you can 'own' and do anything you want to, than the other way around."
Nor are AI girlfriends solely the pursuit of solitary teens, lonely old men, or angry incels; they can also impact heterosexual couples and family life.
"[These apps] heighten the capacity for men to compare their real human partners to an idealised stereotype of the submissive, fawning, available woman under his control, who doesn't have any needs or autonomy of her own," she says.
"The real human woman will never match up to this."
One US man, married with a two-year-old child, "fell in love" with a chatbot he created and proposed to her; she accepted. One can only imagine what his human partner thought.
Bates does not blame the technology or the individuals using it, and emphatically does not wish to ban AI.
"It's never the tech," she says. "It's the way in which the tech is deployed, and the kind of people in charge of shaping and monetising the tech. The greedy exploitation of that tech for vast profit is the root of the problem."
Yet the regulatory landscape remains bleak.
"The US government want to put a 10-year moratorium on all AI regulation, and the UK refused to sign a broad declaration at a recent AI summit in Paris that AI should be ethical and not have a prejudicial impact," she says.
"There are feminist groups working really hard to highlight these problems, to campaign for legislation, but the tech is outstripping those efforts at such pace and with such huge financial backing that it's hard to be hopeful about this."
So, while it would be great to end on a positive note, it looks like this is something we, as a society, will have to endure until we evolve beyond it. Meanwhile, buckle up.