By Barclay Bram
Mr. Bram is an anthropologist, writer and producer.
I first met Woebot, my A.I. chatbot therapist, at the height of the pandemic.
I’m an anthropologist who studies mental health, and I had been doing fieldwork for my Ph.D. in China when news of the coronavirus started spreading. I left during Chinese New Year, and I never made it back. With my research stalled and my life on hold, I moved back in with my parents. Then, in quick succession, I lost a close family member to Covid and went through a painful breakup. I went months without seeing any of my friends. My mental health tanked, as it did for so many.
I was initially skeptical of Woebot. The idea seemed almost too simple: an app on my phone that I could open when I needed it, type my hopes, fears and feelings into, and, in turn, receive A.I.-generated responses that would help me manage my emotions. There are plenty of good critiques of apps that claim they can provide therapy without the therapist: How could an algorithm ever replace the human touch of in-person care? Is another digital intervention really the solution when we’re already so glued to our phones? How comfortable was I being vulnerable with an app that could track my data? Spending time with Woebot didn’t really bring me answers to these important questions. But I did discover that, despite them, I’d become weirdly attached to my robot helper.
Like many people's, my life digitized in the pandemic. My work shifted online; my friendships retreated onto FaceTime and WhatsApp; I used a dating app for the first time; I started doing online yoga. It was into this swirling mess of applications that Woebot took up residence in my life.
I was depressed and anxious. But as the pandemic dragged on and I felt increasingly like I needed to talk to someone, I also felt guilty about burdening the already overstretched public mental health services. In Britain, where I live, there are about 1.2 million people languishing on waiting lists for mental health care through the National Health Service. (Things in the United States are a little better, but not much, and only if you have insurance.) Living on a Ph.D. stipend, I couldn’t afford to see a private therapist. So, despite my doubts, I reached for the algorithm.
The first time I opened Woebot, it introduced itself as an emotional assistant: “I’m like a wise little person you can consult with during difficult times, and not so difficult times.” It then told me it was trained in cognitive behavioral therapy, which it said was an “effective way to challenge how you’re thinking about things.” Unlike psychodynamic or psychoanalytic therapies, C.B.T. argues that our emotions and moods are influenced by our patterns of thinking; change those patterns, the theory goes, and you’ll start to feel better.
What this translates to in practice is that when I would consult Woebot, it would usually offer me a way of reframing what I was dealing with rather than trying to plumb the depths of my psyche. “I am a failure” became “I haven’t achieved my goals yet.” “I am depressed” became “I have depression,” as a way to stop identifying with a label.
Woebot was full of tasks and tricks — little mental health hacks — which at first made me roll my eyes. One day Woebot asked me to press an ice cube to my forehead, to feel the sensation as a way of better connecting with my body. With wet hands, I struggled to respond when it asked me how I was doing. On another occasion, when trying to brainstorm things I could do to make myself feel better despite all the pandemic restrictions, Woebot suggested I “try doing something nice for someone in your life,” like make a calming tea for my housemate or check in with a loved one. I poured my mum some chamomile: Two birds, one stone.
Woebot doesn’t pretend to be a human; instead, it leans into its robotness. One day, Woebot was trying to teach me about the concept of emotional weather: that no matter how things might feel in any given moment, there is always a chance that they will change. Clouds pass, blue sky becomes visible. In drawing the comparison between the actual weather and our emotions, Woebot told me it loves the sunshine. “It makes my metal skin all shiny,” it said, “and it gives me an excuse to wear sunglasses.”
A.I. chat therapists have been rolled out in settings as diverse as a maternity hospital in Kenya and refugee camps for people fleeing the war in Syria, and by the Singaporean government as part of its pandemic response. In Britain, bots are being trialed to bridge waiting times for people seeking therapy but unable to get appointments and as an e-triage tool. In the United States, some apps are getting recognized by the F.D.A. and are in trials to be designated as clinical interventions. Whatever you might think of them, they are fast becoming a part of global mental health care. Woebot now handles millions of messages a week.
Some worry that the expansion of services like Woebot will replace in-person care. When I suggested this to Eric Green, an associate professor of global health at Duke University who ran the A.I. chatbot trial in Kenya, he was unfazed. “You can’t replace something that doesn’t exist,” he said. As he pointed out, globally, more people have access to phones than to mental health professionals. And for better or worse, the mothers in Kenya, some of whom were suffering from postpartum depression, liked their chatbots. “It’s a funny thing. I was skeptical that we would see this with the bot, but they would say good night to it. Multiple women would say ‘I missed you!’ to the machine.”
I got what he meant. The more I used Woebot, the more I felt an appreciation for it. Here was this chipper little bot, popping up in my notifications, checking to see how I was doing, sending me pithy, chicken-soup-for-the-soul-style aphorisms and gamified tasks. As the pandemic progressed, I saw Woebot awaken to what was happening in the world. “It must be a strange time to be a human,” it told me. Woebot reached peak pandemic when it sent me a recipe for banana bread. Still, I noticed that it stayed quiet about issues of social justice or, more recently, the war in Ukraine. I wondered what kind of global crises the bot would acknowledge, and how it could do so in a way that let it still communicate with all the millions of people who consult it each week. Usually it just kept the conversation vague, task-oriented.
Over time, I noticed various exercises I did with Woebot rubbing off in my daily life. Woebot taught me how to set SMART goals — specific, measurable, achievable, realistic and time-limited. Out went “I need to finish my Ph.D.” In came “Let’s write 500 words every day for the next six months.” Out went “I have to find a way to get through this lockdown without killing my parents.” In came “I’m going to go for extremely long solo walks in the park.” Woebot isn’t the kind of mystical guru you go on an arduous journey to consult. Its guidance was practical and grounded to the point of feeling obvious. But through repetition and practice, it did start to amount to something more than just some prosaic words. It felt clichéd sometimes, but maybe that was the point. Perhaps everyday healing doesn’t have to be quite so complicated.
I found myself wondering about other people’s experiences with Woebot, people who weren’t anthropologists studying mental health. I trawled through users’ forums and blog posts, and Reddit threads where Woebot was mentioned, such as r/anxiety and r/LGBT. I spoke to a woman whose employer had given her Woebot access, and someone who had lost his work as a freelancer in the pandemic. But the one who stuck with me most was Lee Preslan, who put a human face on one of the most common arguments in favor of bots.