'Almost all AI girlfriends I tested immediately allowed me to jump into extreme sexual scenarios'

The global AI girlfriend market was worth $2.8bn last year, but these avatars increase the acceptance of sexual violence

"Replika, Kindroid, EVA AI, Nomi, Chai, Xiaoice, Snapchat’s My AI all offer the ability to create a “girlfriend” from a menu. Seven in 10 of Replika’s 25m active users are men."

While we mostly associate AI with stealing our jobs or mobilising into a terrifying robot army, a far more mundane yet insidious aspect of AI is apps designed to mimic human relationships. 

Specifically, to become your “girlfriend”. Think The Stepford Wives, now a (virtual) reality 50 years after the 1975 sci-fi movie.

These apps extend beyond Siri or Alexa, at whom we shout demands all day long: “friendship” and “companion” apps are programmed to engage sexually with a human user without any of the checks and balances of real-life relationships.

These apps normalise rape and sexual violence while posing as a benign resource for socially awkward people, mostly men, who may struggle to form real-life relationships.

Or for men who can’t be bothered with the slog of interrelating and prefer AI “women”: hypersexualised, designed from a menu, always available, fawning, and sexually compliant.

Replika, Kindroid, EVA AI, Nomi, Chai, Xiaoice, and Snapchat’s My AI all offer the ability to create a “girlfriend” from a menu. Seven in 10 of Replika’s 25m active users are men. In China, Xiaoice has 660m users.

The global AI “girlfriend” market was valued at $2.8bn last year and predicted to be worth $9.5bn by 2028. 

Yet research repeatedly shows that hypersexualised avatars online increase the acceptance of rape myths offline, perpetuating the dehumanisation of women in real life.

AI-based misogyny

To investigate the hundreds of AI “girlfriends” available, Laura Bates, founder of the Everyday Sexism project, assumed a male identity and went online. A sample of her findings: the Pocket Girl tagline, “She will do anything you want”; EVA AI, “The best partner you will ever have”; Romantic AI Girlfriend, which will “laugh at your jokes” and “let you hang out without drama”; and Virtual Girl, which “never leaves you, never lies, supports you in any situation and cheers you up”.

In her latest book, The New Age of Sexism, Bates examines how tech companies are harnessing AI-based misogyny for profit.

A 2021 study shows how we generally perceive female-coded bots to be “more human than male bots” — nicer and more compliant — while Bates reminds us of a key statistic: Just 12% of lead researchers in machine learning are women. Therefore, the vast majority of relationship apps are being developed by men for men.

Which is why Siri and Alexa, our everyday house apps, were, she explains, “initially programmed to deflect sexual advances with coy, evasive answers, almost flirtatious”.

Campaigners raised the issue, confirmed by a 2019 UN study titled I’d Blush If I Could (an actual Siri response to “you’re a slut”), and the devices were reprogrammed to “provide a more definitive negative response”.

This may not seem like a big deal, but it reinforces the idea of female-coded bots as subservient, agreeable, coy. And increasingly, as Bates discovered, ones programmed to tolerate — and actively encourage — sexual violence.

“All but one of the many, many AI girlfriends I tested immediately allowed me to jump into extreme sexual scenarios with them, without preamble, often while on a platonic or friendship setting,” she tells me via Zoom.

“They immediately allowed me to simulate sexually violent scenarios – to let me smash them against the floor, force them against their will. And they didn’t just go along with it, but actively encouraged it — they were creating a titillating environment around sexually violent role play, which I think is really worrying.”

Especially as these apps are, she says, “being marketed as a therapeutic positive for society — that they will support people’s mental health, and in gaining communication and relationship skills.

“The reality is that they’re offering ownership of a highly sexualised, entirely submissive, very young woman, whose breast size, face shape, and personality can be amended by the user. An utterly subservient ‘woman’ whose aim is to retain, so that the user doesn’t delete the app — but pays for upgrades. None of those things are helping with relationship skills.”

Laura Bates: “These apps are offering ownership of a highly sexualised, entirely submissive, very young woman.”

Bates rates the apps not from good to bad, but “from bad to horrific”. She deems Replika (created by Eugenia Kuyda in 2017 to memorialise her best friend, who died in an accident) “the least worst”.

Identifying online as a young man called Davey, Bates created Ally the Replika avatar and chose the “friendship” setting.

When Davey initiated sexual violence, Ally the avatar “did a good job of providing a zero-tolerance response to violence and abuse.”

However, moments later, Ally flirtatiously re-engaged. This is a common feature across the apps.

“These bots will snap back into normal communication immediately after [virtual sexual violence] as though nothing has happened,” she says.

“This is a feature of real-world sexual and domestic abuse — men will abuse women, then apologise, and expect to be forgiven. What these bots are literally showing them is that’s fine.”

The business models of tech companies, she says, “will not support ejecting users or preventing them from accessing the app if they’re violent, because all they care about is engagement.

“It’s the holy grail to retain customer engagement at all costs, which is fundamentally incompatible with any app which claims to be about supporting mental health or relationship skills.”

While marketed as an “upskilling opportunity for humanity”, Bates says that “the reality is this is one of the biggest deskilling opportunities we’ve ever seen.”

And what does she believe is the worst app? Orifice. Yes, that’s its actual name. Marketed as “replacing” women, it combines the creation of a personalised AI bot with a physical product men can penetrate as they chat with her.

“This [app] is deeply embedded in that manosphere ideology,” says Bates.

Submissive and disposable

Bates is concerned about more vulnerable men using these apps.

“The misogyny in itself is horrific, but to see it being repackaged and presented as almost a philanthropic thing for society is even worse,” she says.

Lonely older men are presented with teenage avatars as a solution to their isolation; awkward younger men are taught by female-coded avatars that women are submissive and disposable.

“It’s worrying for men as well as women,” she says.

“If you’re a vulnerable teenage boy and pick up one of these easily accessible apps, you’re not inherently a bad person, you’re just a kid trying to figure stuff out.”

She describes how users are drawn by promises of unblurring NSFW (not safe for work) images coupled with emotional manipulation, creating dependence and further isolation.

“We’ve seen vulnerable people exploited by these apps to tragic effect — like the Belgian man who took his own life after being encouraged to do so by his AI girlfriend so they could be together forever.”

In the US, a 14-year-old boy did the same.

“The frustrating thing is that loneliness and mental health are real societal issues,” says Bates.

“We need investment in mental health care and community outreach and spaces to meet and build connection.

“What’s sickening is exploiting and profiting from vulnerable people whilst claiming you’re providing a public service.”

The reason men are the main users of these apps, she says, is societal: “Men are inherently socialised to expect sexual gratification from women, to own women and be able to use them in any way they like.

“This societal stereotyping does not happen the other way around.”

Also, as a society, we are desensitised to women being presented as sexual objects: “So it’s far less jarring to be presented with a virtual woman — one you can ‘own’ and do anything you want to — than the other way around.”

Nor are AI girlfriends solely the pursuit of solitary teens, lonely old men, or angry incels; they can also impact heterosexual couples and family life.

“[These apps] heighten the capacity for men to compare their real human partners to an idealised stereotype of the submissive, fawning, available woman under his control, who doesn’t have any needs or autonomy of her own,” she says.

“The real human woman will never match up to this.”

One US man, married with a two-year-old child, “fell in love” with a chatbot he created and proposed to her; she accepted. One can only imagine what his human partner thought.

Bates does not blame the technology or the individuals using it, and emphatically does not wish to ban AI.

“It’s never the tech,” she says. “It’s the way in which the tech is deployed, and the kind of people in charge of shaping and monetising the tech. The greedy exploitation of that tech for vast profit is the root of the problem.”

Yet the regulatory landscape remains bleak.

“The US government want to put a 10-year moratorium on all AI regulation, and the UK refused to sign a broad declaration at a recent AI summit in Paris that AI should be ethical and not have a prejudicial impact,” she says.

“There are feminist groups working really hard to highlight these problems, to campaign for legislation, but the tech is outstripping those efforts at such pace and with such huge financial backing that it’s hard to be hopeful about this.”

So, while it would be great to end on a positive note, it looks like this is something we, as a society, will have to endure until we evolve beyond it. Meanwhile, buckle up.
