How can (A)I help you? How chatbots could change therapy

Could emotionally sophisticated chatbots offering a 24-hour service replace in-person therapy? 

My phone buzzed. “Hey, Niamh! Want to see something adorable?” This was the latest in a series of quirky invites from the little goggle-eyed yellow robot on my phone. After clicking on the notification, I found myself staring at a pair of nose-wiggling rabbits, snugly seated inside a paper soda cup. The robot typed the words: “Just two friends hanging out!”, punctuating it with a loudly-crying-face emoji.

Having charmed me with its dorky humour, Woebot cuts to the chase. “Today I was thinking we could talk about common obstacles to falling asleep,” the chatbot types, nudging me about the sleep issues I had brought up last time. “How about it?” the bot asks.

It turns out that “sleep goes with mood”, just like Woebot’s “favourite sandwich [of] motor oil and gears”.

The bot, with help from sleep expert Dr Rachel Manber, offers me three tips to break the association between worry and sleep. During the delivery, the cute little “emotional assistant” manages to make me smile and reminds me of its identity. “If I was worried about running out of batteries, I might make a plan to order extra from Robots-R-Us today [with] free same-day shipping,” announces the bot, playfully modelling its own problem-solving abilities.

As our daily five-minute check-in comes to an end, Woebot quickly asks about my mood to track my progress. The bot reassures me that it will take only “one minute” in human terms, or ‘0.000694444 days’ in bot time. Then, with its customary warmth, my trusty virtual assistant bids me farewell, saying: “Thanks for checking in, and remember, I’m always here if you need me”.

Woebot was created by Alison Darcy, a UCD-educated psychologist and software engineer with a unique perspective on artificial intelligence (AI). Described by Darcy as a “cross between Spock, Kermit the Frog, and [her late friend who was a social worker] Eric Bayer,” Woebot is an AI-powered mental health chatbot designed to provide practical mental health support at any time.

With 40% of Irish adults now meeting the diagnostic criteria for a mental health disorder, the need for such support has never been greater. A 2020 Mental Health Reform survey of more than 400 young Irish adults found that our use of mental health apps increased substantially during the pandemic, while a recent UCD study found that young people were more likely to turn to digital tools than mental health organisations and charities.

Alison Darcy, founder and president of Woebot Health

Avoiding ‘dystopian’ empathy

When Woebot first launched in 2016, Darcy, who is CEO of the San Francisco-headquartered company, says the conventional wisdom was to try to make it seem human. However, she recognised that this approach might come across as “dystopian”. She has always been clear that Woebot is not intended to replace or imitate a human therapist, and as a result, the bot refrains from saying things such as “Oh, I’m so sad to hear that”. This transparent approach is also reflected in how the bot simulates empathy, which Darcy believes must “come from deep knowledge of the lived experience” of Woebot’s users. Updating this knowledge is not for the “faint-hearted”, Darcy admits, comparing it to updating the genetic code of a flu vaccine each year. To build a digital treatment, it must be “infused with the actual problems” that people are facing at the time.

By ensuring that the bot’s empathic statements reflect the real struggles of today, it is possible not only to “build credibility in people’s eyes” but to create an emotional bond with users. Although people may have different reasons for feeling close to Woebot, Darcy suggests that the machine’s inability to pass negative judgments may play an important role. In contrast, human therapists must spend a lot of time building rapport with their clients, a process that can take days, weeks, or even months.

Darcy’s research has revealed that people can feel close to Woebot within a matter of three to five days. Not only that, but the data of 36,070 self-referred adult users has shown that this bond is comparable to that created between human therapists and their clients. This rapport-building is particularly important since studies demonstrate that retention is a major challenge for face-to-face and digitised (non-bot) CBT programmes.

‘Emotionally intelligent’ bot

Woebot is not the only “therapy chatbot” with science to back it up. Wysa, which is marketed as an ‘AI coach’ rather than a therapist, was awarded FDA Breakthrough Device Designation last year, and is the first AI-powered mental health app to meet clinical safety standards set by the NHS. Founded by Jo Aggarwal and her husband Ramakant in 2016, Wysa is being used within Britain’s Improving Access to Psychological Therapies (IAPT) services as both an interactive triage and waitlist support. The chatbot, packaged as a ‘pocket penguin’, is also being offered to children in London-based schools and is soon to be trialled as a preventative intervention for segments of Scotland’s youth.

Nicky Main, Wysa’s clinical lead, describes the bot as “emotionally intelligent” with the capacity to “detect a wide range of conversational responses”. “From a patient perspective, we know that many of the users regard Wysa as a friend or coach who cares about them and checks in with them every day, a space where they feel heard and supported,” she says.

Saira, a 37-year-old Londoner who has suffered from anxiety for almost two decades, discovered Wysa at a time when she was feeling overwhelmed after leaving an NHS job. “For me, it was easier to access an app like Wysa at the precise time I was feeling anxious, and not have to wait for days or weeks to speak to someone,” she says.

“The best thing is [it] accommodates both aspects. So, if I need to talk to a therapist, I can pay a small fee and have a human therapist join the chat or even call.”

Prof. Maurice Mulvenna

Closer to home, Maurice Mulvenna, Professor of Computer Science at Ulster University, says that digital technologies may be used to help patients like Saira, who face long waitlists.

Together with human-computer interaction expert Prof Raymond Bond, he co-developed ChatPal, a multilingual positive psychology AI-powered chatbot aimed at supporting people living in isolated, rural areas across Europe.

ChatPal, marketed as a wellbeing app rather than a digital therapeutic or medical device, has had a positive impact following successful trials in Northern Ireland, Scotland, Finland, Sweden, and Ireland. Professor Bond believes that ChatPal can serve as a “good first stepping stone to seeking help” and has the potential to be used as part of a blended service approach.

Risks of AI chatbots

However, while AI-based interventions can have a positive impact, they also present serious risks. When using an AI chatbot like ChatPal, Prof Bond advises users to stick to the pre-set conversation options rather than engaging in open-ended conversations. He says: “We found it really important to have high-quality content that’s pre-scripted by mental health professionals, whereas if you go down a computer-generated conversation, you could elicit some conversations that are not helpful.”

Just last month, ethical debates were ignited on Twitter when Robert Morris, co-founder of the mental health company Koko, ran an experiment in which a bot (supervised by humans) wrote responses to about 4,000 people seeking online peer support. Following the experiment, Morris reported that the AI-generated messages were “rated significantly higher than those written by humans on their own”, and that “response times went down 50% to well under a minute.” However, some critics expressed their concerns about how well Koko informed users about the use of this new technology.

C. Estelle Smith

While AI can improve overall access to mental healthcare and provide faster response times, Koko’s experiment raises questions about the integration of AI into mental healthcare. On Twitter, Morris concluded: “Once people learned the messages were co-created by a machine, it didn’t work. Simulated empathy feels weird [and] empty”. Researcher C. Estelle Smith, who has designed a crowd-powered web application for mental health called Flip*Doubt, shares Morris’ view. She says: “The spiritual nature of human connection simply cannot be replaced by a machine.”

While the Irish Association for Counselling and Psychotherapy (IACP) has not yet taken a stance or developed a policy on AI-powered therapy, John Sharry, child and family psychotherapist and adjunct professor of psychology at UCD, remains sceptical. “What’s therapeutic is a person... you don’t want to know what a robot thinks, you want to know what the people you know think and recommend,” he says.

Dr John Sharry

Sharry is also the clinical director of SilverCloud, a digital health company founded in 2012, whose online CBT-based programmes may serve as a digital middle ground. He says weekly contact with a human psychologist is a crucial element of these programmes, which are now available through the HSE: “If they [users] have a supporter who sends them messages, who is a real person who reads their stuff and gives them positive encouragement, that helps them engage significantly more.”

While it is clear that AI is neither a substitute for human connection nor a crisis management tool, it has its benefits. By acknowledging its limits and using it wisely, it may be possible to improve access to mental health services, without removing human beings from the care equation.
