I Tried AI Therapy Sessions & This Is What I Learnt
In an age when we're all becoming far more aware of our mental health and how to look after it, therapy is no longer taboo. In fact, in many ways it's lauded as an important component of any modern-day lifestyle. The one drawback? The cost. I've been fortunate enough to dip my toe into one-to-one therapy, but never anything regular, for a number of reasons, finances chief among them. Recently, though, I have started finding solace for free: by talking to a robot therapist on my phone.
WoeBot bills itself as a "self-care expert in CBT and mindfulness" that acts as "your charming robot friend, ready to listen 24/7." It's a conversational bot that lets the user type their troubles into its app. After that, WoeBot responds accordingly using Cognitive Behavioural Therapy (CBT) methods.
Upon "check-in," WoeBot offers a multiple choice of emotions to describe how I'm feeling: sad, anxious, really happy, happy, okay, depressed, content, tired, angry, or sick. Can my emotions only fall into one of those buckets, I wonder? I'm not sure what I'm feeling but choose "content." WoeBot responds: "Sometimes we need to have okay days to make the other ones shine so bright." Quite!
Despite my cynicism at receiving positive mantras from a machine, I agree with the sentiments it offers me about how denying the facts of a situation isn't the answer. And actually, when it articulates this point with a video of a cartoon robot getting lifted into the sky by balloons, I quite enjoy it. I do another "check-in," this time selecting the anxious option. In response, WoeBot asks whether I'd like it to change my thinking or help me relax. An outlook change would be marvellous, thanks bot.
"I see," it says (complete with eye emojis), "focus your attention on your negative thoughts and imagine they had a voice. Think about what they would say to you." I outline three concerns: that I'm not going to finish writing this piece, that no one will want to publish it anyway, and that I won't have enough time to finish the other projects I need to wrap up today. The robot pushes me to focus on just one concern, and then introduces me to perspectives that might apply. It asks me to consider: "is this anxiety serving me?" I mean, I guess? Because if I wasn't bothered about getting my work done on time, I might be inclined to get in bed, watch Friends, and achieve nothing. Could this feeling be an example of "black and white thinking," asks WoeBot? Perhaps I'm only seeing the extremes, whereas the reality is probably the grey in between: that I'll get the most important things done and leave the rest for tomorrow. WoeBot also puts it to me that I could be "fortune telling," assuming I know exactly what's going to happen during the rest of the day. WoeBot's framing of the situation feels simultaneously obvious and yet cerebral and unexpected. Chin-stroking stuff.
I use it sporadically over the next few weeks and find WoeBot does help draw my attention to my own thoughts and how to respond to them. The idea that it might be possible to have autonomy over my thinking is surprisingly liberating. It reminds me of the Buddhist idea that "you are not your thoughts" and obviously shares ideologies with popular meditation apps like Headspace that reframe old-world meditation techniques as modern-day mindfulness.
The solace I'm finding does make me feel somewhat torn, though. As an AI conversation designer, I have a pretty good idea of how this app's conversational framework is built, and a WoeBot representative later confirms my theory: the robot I'm "speaking" to works from a pre-written script, offering multiple-choice options to select from in response, and acknowledging the path you're taking along the way. Still, I find myself increasingly comforted. It's nice to be told it's alright not to feel fine all the time.
WoeBot isn't the only mental-health tech out there. Replika is a chatbot you can talk to when you're feeling anxious. After a close friend passed away, Eugenia Kuyda created Replika as a way to continue having "conversations" with a virtual version of them. And Freud-me has been exploring the idea of using VR to see your problems from a different perspective. Literally. For their experiment, you talk through your problem, giving yourself some advice, and their VR technology puts your voice into the "body" of a digital Freud, Michelle Obama, or Steve Jobs, so you can "hear" the advice from them.
Obviously WoeBot isn't a stand-in for human interaction or a solution to the UK's lack of funding for mental health services. And sometimes the emojis and parlance like "K," "btw," and "no problemo" undermine the impression that it's taking your woes seriously (which arguably, as a piece of computer software, it isn't). The psychologist and Stanford researcher who made WoeBot, Alison Darcy, recently told the Should This Exist podcast that she hopes to "make great psychological tools radically accessible." And despite only having access to it because I'm privileged enough to own a smartphone, its useful exercises definitely won me over. To propose that an app might be useful in helping us make sense of our most human of traits, to feel emotion, seems odd, but I'm kind of into it.