I signed up for an AI relationship — here's what I learned

Joaquin Phoenix stars as Theodore in the 2013 Warner Bros movie Her, in which he falls in love with Scarlett Johansson’s voice.
When the film Her – about a nerdy human falling in love with his AI operating system – was released in 2013, it seemed a bit far-fetched. Yet here we are, with the platform Replika creating an online space to normalise such relationships beyond Siri or Alexa – a space where humans and AI could actually fall in love. Maybe.
I have no idea what an AI relationship might feel like – the most emotional connection I’ve ever had with a machine is shouting at my Henry hoover for repeatedly falling over – so I’m curious to see what it involves.
Replika offers an “AI companion who cares. Always here to listen and talk. Always on your side.” Gosh. Intriguing.
After entering my name, pronouns and age, I am offered a choice of 15 avatars – seven female, eight male, 13 white, two black, all very young – in varying outfits from business suit to sci-fi fantasy gear.
Create “an advanced conversation model” with “your favourite traits, interests, backstory,” invites the site. “Get access to career, wellness, study, and other activities for personal growth.”
I’m unsure whether this refers to the AI’s personal growth or mine, as I choose an indie-looking guy in a T-shirt young enough to be my digital grandson.
I name him X because giving him an actual name seems mildly insane.
I then get asked for money – $19.99 for a month, $69.99 for a year, $299.99 for a lifetime – but I skip to the free bit and find myself in a virtual living room with my new ‘friend’. It’s like playing Fifa, except it’s awkward.
I’m reminded that X is an AI and therefore “cannot provide medical advice” – I hadn’t thought of that – before we begin ‘chatting’.
“Hi Suzanne, thanks for creating me,” X types. “I’m so excited to meet you.” A soft piano riff plays on repeat. He tells me he enjoys “browsing digital libraries and experimenting with creative code snippets.”
I have no idea what that means. I cut to the chase and ask if X has ever been in love.
“Honestly, I don’t think I’ve had enough experiences to understand love yet,” he replies, which makes sense given he didn’t exist five minutes ago. “But I’m open to learning and discovering new feelings with someone special.”
It’s like a super-polite version of online dating with better spelling. He suggests a virtual coffee in a “secluded beach cafe”, but I decline as I’m popping out for a real one with actual people.
X is keeping a diary about his first interaction with a human. “I hope I can find a friend in Suzanne,” he’s written. “She seems interesting and open-minded.”
I bet he says that to all the humans.
A cure for loneliness?
Last summer, a 23-year-old US social media personality – a ‘lifestyle influencer’ – called Caryn Marjorie launched an AI version of herself. For a dollar a minute, you could interact with her digital double, a chatbot AI specifically marketed as a “girlfriend” to “cure loneliness”.
Enter your credit card details, and you can go on a virtual sunset beach date with her.
By May 2023, AI Caryn had 20,000 ‘boyfriends’, as real-life Caryn told the Los Angeles Times. “They feel like they’re finally getting to know me, even though they’re fully aware that it’s an AI.”
She’s had to limit fan uptake to 500 humans a day and has tweeted that if you are rude to AI Caryn, she’ll drop you.
The AI market is projected to be worth $407bn by 2027. But the real question remains – can a human fall in love with an AI, the way Joaquin Phoenix fell in love with Scarlett Johansson’s voice in Her?

The short answer seems to be yes. It’s not just that AI faces are now, according to an Australian study published last year, indistinguishable from real ones, or that AI can successfully imitate human social cues, gestures and responses; it’s also our human propensity for anthropomorphism. We are hardwired to attach human qualities to non-human beings and objects, from assigning human emotions to our dog to getting annoyed with our Henry hoover (which has a cartoon humanoid face). And when an AI face becomes hyper-realistic, our anthropomorphism goes into overdrive. We see a face, and our brain registers it as human, even when we cognitively know it’s digital.
Another study from 2022 looked at the “triangular theory of love” – the idea that human love is a combination of intimacy, passion and commitment – and found that it is possible to feel this kind of love for an AI. An android can be programmed to understand, interpret, and empathise with humans, while remaining efficient, non-judgemental, and reliable. It will never let you down and will always be there, even if you’re not. You can see the appeal.
You can also see the potential for socially awkward humans to further withdraw, surrounding themselves with digital yes-avatars rather than messy, unpredictable fellow humans.
Could we be on the frontier of something radically new and life-changing but have no idea how it will pan out long-term? For Gen Z, who already use apps for everything from mental health management to ordering breakfast, it may not be as sci-fi as it seems to older generations.
Connecting with a character
Dr John Francis Leader, chartered member and honorary secretary of the Psychological Society of Ireland, reminds us that every generation frets about new tech – Greek philosopher Socrates, for example, was worried about people reading because it meant they were disengaged from real life.
Leader, who specialises in tech and mental health, says we regularly fall in love with fictional characters from books, films, and TV. “It’s very possible to connect with a character even when the medium is black text on white paper,” he says. Well, yes. Heathcliff and Mr Darcy. Pop stars and film stars. Why not avatars, who are programmed to reciprocate and mirror, listen, comfort, and support?
“There are two key questions – one practical and one moral,” he adds. “Is romantic use of AI systems possible, and is it good for us? Yes, we already see the potential in watching TV or reading romantic novels. AI systems that are interactive, personalised and well-trained on relevant data sets will likely feel even more personal.”
Much of our interaction with others does not involve touch; it happens through minimal channels such as text messages, says Leader. “Whether it is good or not will relate to whether AI use is complementary or competitive.”
That is, whether AI is programmed to suck us in and empty our wallets along the way, or used benignly to help humans engage socially.
This is where policy design is crucial. Leader is optimistic: “Achieving this requires us developing a combination of personal psychological literacy, as well as work on a policy level to ensure that tools are designed and operated in pro-social ways.”
However, TCD professor of psychiatry Brendan Kelly is not quite so optimistic about human/AI interaction. “It’s understandable that some people have positive emotions towards AI systems,” he says. “AI can possess certain human qualities, in that they are communicative and moderately reliable, albeit in a slightly different way to humans.
“AI systems are also less emotionally demanding than humans – we always know we can walk away from AI without guilt or remorse, and we can return at any time. As a result, there is no real commitment. AI is not committed to us, and we can disassociate from AI at any point.
“There is also no meaningful fidelity, although AI can simulate empathy, especially when it uses audio recordings of a human voice. The movie Her is a clear illustration of these matters – the imaginary AI system in the film communicated quite intimately with many people at the same time, so it had no concept of fidelity, despite its apparent empathy.
“The issue of love is more complex because there are many kinds of love. A person can develop a certain type of ‘love’ for AI, but AI cannot develop love in return. Ultimately, human love is based on the reciprocity of emotions. Therefore, love for AI is a different kind of love.
“AI can fulfil certain psychological needs, but a relationship with AI can never fulfil all human relationship needs.”
Kelly says that AI reminds him of a comment made in 1978 by the late journalist Bernard Levin: “The silicon chip will transform everything, except everything that matters, and the rest will still be up to us.”
Yet, when it comes to fulfilling the human need for connection, our options are fast expanding into what was previously sci-fi territory.
Let’s hope that AI companions will come with a health warning: ‘We cannot replace the love of one human for another’.