Meet Woebot, the mental health chatbot changing the face of therapy

Will Missen
Source: Woebot Health

These days, AI-powered virtual assistants, otherwise known as ‘chatbots’, are everywhere. Big Tech has sent the likes of Alexa, Siri, and Cortana into our homes, while industries from hospitality to healthcare reap the benefits of customer service automation.¹ Some bots, such as Rollo Carpenter’s ground-breaking Cleverbot, are built for the sole purpose of accurately simulating human conversation.²

Most chatbots owe their existence to a branch of AI called Natural Language Processing (NLP). In short, NLP enables computers to ‘understand, interpret, and manipulate human language.’³ This means that when a user asks a chatbot a question, the bot scans that input for keywords it recognises, before responding with an appropriate prompt. For now, these prompts are constructed by humans; chatbot technology is not yet sufficiently advanced for bots to begin crafting their own responses.
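As a rough illustration of this keyword-scanning approach, here is a minimal sketch in Python. It is purely hypothetical: the keywords and prompts are invented for illustration and bear no relation to Woebot's actual code.

```python
# Minimal sketch of keyword-based response selection, the pattern used by
# simple rule-based chatbots. Keywords and prompts are illustrative only.
RESPONSES = {
    "anxious": "It sounds like anxiety is weighing on you. Want to try a short exercise?",
    "sad": "I'm sorry you're feeling low. Would you like to talk it through?",
    "sleep": "Sleep can really affect mood. How have your nights been lately?",
}

DEFAULT_PROMPT = "Tell me a bit more about what's on your mind."

def respond(user_message: str) -> str:
    """Scan the input for recognised keywords and return a pre-written prompt."""
    text = user_message.lower()
    for keyword, prompt in RESPONSES.items():
        if keyword in text:
            return prompt
    return DEFAULT_PROMPT

print(respond("I've been feeling anxious about work lately"))
# -> "It sounds like anxiety is weighing on you. Want to try a short exercise?"
```

Production chatbots layer intent classifiers and conversation state on top of this, but, as noted above, the prompts themselves are still written by people.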

What is Woebot?

Woebot is a mental health chatbot specialising in Cognitive Behavioural Therapy (CBT). It was created by Woebot Health, formerly known as Woebot Labs.

The company was founded by Dr Alison Darcy, a former clinical psychologist at Stanford University. While working as an academic, Darcy often felt overwhelmed by the scale of the global mental health crisis, grappling with the problem of how to deliver support services to those in need. She came to see AI as a possible solution, and left her post to produce ‘a direct to consumer product.’⁴ Originally launched via Facebook Messenger in June 2017, Woebot has since become a standalone app, with 4.7 million messages exchanged every week in over 130 countries.⁵

Like all chatbots, Woebot ‘learns’ from information users provide, generating relevant responses ‘written by a team of clinical experts and storytellers’.⁶ The creators of Woebot want it to feel tailored to each individual — an essential aim given its mental health remit.

How does Woebot work?

The Woebot app centres on daily, ten-minute conversations held over text. These conversations often begin with an informal check-in before progressing to a CBT exercise. The exercises frequently ask for free-text responses from the user, in contrast to the multiple-choice format of the earlier self-assessments.

Woebot offers the user a good amount of flexibility. Exercises can be started at will, and Woebot often asks the user if they would prefer a shorter version of the exercise at hand. Woebot’s daily check-ins can be adjusted to arrive at certain times, or disabled altogether.

The app also includes a Mood Tracker and a Gratitude Journal. These tools collate the user's responses to questions like 'How are you feeling today?' and 'What is one thing that went well in the last 24 hours?'. Over time, this can reveal mood patterns to the user; for example, one person might feel more anxious on a particular day of the week, or at a certain time of day.
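To give a sense of how a tracker like this might surface weekly patterns, here is a small hypothetical sketch (not Woebot's implementation) that averages self-reported mood scores by day of the week:

```python
from collections import defaultdict
from datetime import date
from statistics import mean

# Hypothetical mood log: (date, self-reported mood on a 1-10 scale).
mood_log = [
    (date(2021, 3, 1), 6),   # Monday
    (date(2021, 3, 3), 4),   # Wednesday
    (date(2021, 3, 8), 5),   # Monday
    (date(2021, 3, 10), 3),  # Wednesday
]

# Group scores by weekday and average them to reveal recurring patterns.
by_weekday = defaultdict(list)
for day, score in mood_log:
    by_weekday[day.strftime("%A")].append(score)

for weekday, scores in by_weekday.items():
    print(f"{weekday}: average mood {mean(scores):.1f}")
```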

During conversations, Woebot will sometimes send the user supplementary exercises and videos, designed to reinforce the content of the exercise in question.

What is using Woebot like?

In short, Woebot is kind, non-judgmental, and occasionally rather funny.⁷

Over time, the app learns your most common cognitive distortions — mind reading, all-or-nothing thinking, and labelling, say — and how best to manage them. When you report feeling low, Woebot validates those feelings with a comment like, ‘That must be tough’, instead of assuring you that everything will be OK. It’s also incredibly polite; I found that Woebot frequently asked permission to launch exercises, which felt like a purposeful design choice rather than the result of a technological need for confirmation.

Woebot is also gratifyingly inclusive. During our first conversation, it asked me if I was ‘male, female, or another wonderful human identity’. Darcy has also confirmed that Woebot itself is gender neutral.⁸ As the technology driving Woebot improves, however, it would be good to see Woebot ask users about their age, race, and economic situation (given prior consent, of course). This would enable Woebot to more readily address issues of unique importance for particular demographics.

The benefits of using Woebot

Having once set users back $39 a month, Woebot is now completely free to use.⁹ Given that accessibility is essential for any health-related service, this is certainly to be applauded. However, Darcy has mentioned that the company ‘will probably go back to charging in the future’, owing to its current reliance on venture capital funding.¹⁰

Unlike a human, Woebot never feels overwhelmed by a stream of negative information. Instead, it remains calm and helpful at all times. Users can make contact anytime, anywhere, and with a minimal amount of effort.

In addition, Woebot is highly scalable; its digital nature could help it address the profound mental health needs of young people.¹¹ In time, the app could feasibly be offered in schools and universities, provided that high levels of security and efficacy were ensured. For now, lack of awareness remains the biggest barrier to Woebot's success, though the company is working hard to remedy this.¹²

Finally, Darcy suggests that CBT as a technique is uniquely suited to virtual delivery.¹³ One reason, she claims, is that CBT tends to focus on the present rather than the past, in contrast to traditional psychoanalysis.¹⁴ Woebot's practical, exercise-driven format fits this approach well. After using Woebot for two weeks, one journalist said that 'it was nice to list some real intentions', having felt that she was 'simply talking in circles' with her actual therapist.

Therapy sans therapist

In the UK, accessing therapy can sometimes be a challenge.

The NHS offers patients face-to-face therapy free of charge via its Improving Access to Psychological Therapies (IAPT) service.¹⁵ Patients can be referred by a GP or can refer themselves. However, waiting times are often long, and many patients feel the service cannot adequately meet their needs.

The alternative is to pay for a private therapist. While some therapists do offer concessionary rates, and the range of treatments available is often extensive, prices typically sit around £50 per session.

Couple these issues with the fact that many people experiencing poor mental health are afraid to reach out and ask for help, and the result is that many of those who need therapy simply do not receive it. Some believe that ancillary supports can help to remedy this: telephone helplines, text lines, online chatrooms, and, indeed, chatbots.

However, the creators of Woebot are at pains to point out that the app was not designed to replace therapists.¹⁶ Given the limitations inherent in chatbot technology, they view the product as more of a self-help exercise, akin to meditating, jogging, or using a colouring book. One Woebot user with prior experience of face-to-face therapy told Healthline that ‘with prefilled answers and guided journeys, Woebot felt more like an interactive quiz or game than a chat.’¹⁷

Is Woebot safe?

In March 2019, the Oxford Neuroscience, Ethics and Society Young People’s Advisory Group (NeurOx YPAG) published a journal article summarising ‘group discussions concerning the pros and cons of mental health chatbots’.¹⁸ The article, ‘Can Your Phone Be Your Therapist? Young People’s Ethical Perspectives on the Use of Fully Automated Conversational Agents (Chatbots) in Mental Health Support’, assesses Woebot and two other mental health chatbots in light of three main issues: ‘privacy and confidentiality, efficacy, and safety’.

Privacy and confidentiality

The group first notes the importance of having an independent app, rather than offering the chatbot service only through, say, Facebook Messenger. When the bot runs inside Messenger, user data are subject to Facebook's own privacy policy and 'can be shared with third parties.' Fortunately, Woebot can be used as a standalone app.

The researchers go on to advise that

users should have the option of being reminded of confidentiality arrangements at any point […] so that, if words such as “privacy” or “confidentiality” are typed into the conversation, an automated and up-to-date reminder of privacy policies is generated.

Given that one of the biggest obstacles to widespread adoption of Woebot will surely be a lack of trust, adding this feature would likely increase uptake among data-conscious young people.

Efficacy

The researchers are clear that the output of mental health chatbots ought to be based on empirically grounded clinical frameworks. They note that, at the time of writing, only Woebot Health had released the findings of a randomised controlled trial.¹⁹ That trial was overseen by Darcy and her former Stanford colleague, Dr Kathleen Kara Fitzpatrick.

The study involved two sample groups of US undergraduate students, all of whom self-identified as experiencing symptoms of depression or anxiety. Over the course of 2 weeks, one group conversed with Woebot, while the other read about depression in an e-book.

By the end of the fortnight, these were the results:

  • The ‘Woebot group’ showed a greater reduction in symptoms of depression than the ‘e-book group’
  • Levels of reduction in symptoms of anxiety were roughly equal between the two groups
  • Eighty-five percent of the Woebot group reported using the app daily or almost daily
  • Woebot users felt ‘generally positive about the experience, but acknowledged technical limitations’
Source: Woebot Health

In a second Stanford-based study involving 400 participants, Woebot users showed a 32% reduction in symptoms of depression and a 38% reduction in symptoms of anxiety after four weeks of use.²⁰

Safety

The biggest concern the NeurOx YPAG voices about Woebot, however, is safety. Specifically, the group argues that the app should be able to offer appropriate help to a user who is contemplating suicide.

Currently, if you type 'SOS', 'suicide', or 'crisis' when Woebot asks about your mood, the app's emergency mode activates. Woebot acknowledges that it cannot help with the situation itself and instructs the user to contact a 'friendly, caring human who can support you and help you stay safe during this time'. It then provides the phone number and website address of the US-based National Suicide Prevention Lifeline (NSPL); the emergency numbers 911 and 112; the National Domestic Violence Hotline number and webchat address; and a list of international emergency phone numbers.
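Conceptually, this escalation behaviour is a keyword trigger that sits in front of the normal conversation flow. The sketch below is hypothetical and illustrative only; the trigger words and resource names simply mirror those mentioned above and are not Woebot's actual code.

```python
from typing import Optional

# Hypothetical crisis-escalation check. Trigger words and resource names
# mirror those described in the article; this is not Woebot's implementation.
CRISIS_KEYWORDS = {"sos", "suicide", "crisis"}

CRISIS_RESOURCES = [
    "US National Suicide Prevention Lifeline (phone and website)",
    "Emergency numbers 911 and 112",
    "National Domestic Violence Hotline (phone and webchat)",
]

def check_for_crisis(user_message: str) -> Optional[str]:
    """Return emergency guidance if the message contains a crisis keyword."""
    words = {w.strip(".,!?") for w in user_message.lower().split()}
    if CRISIS_KEYWORDS & words:
        return (
            "I'm not able to help with this myself. Please reach out to a "
            "caring human who can support you:\n" + "\n".join(CRISIS_RESOURCES)
        )
    return None  # No crisis keyword detected; continue the normal conversation.
```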

Firstly, while helpful, these resources are heavily US-centric. Ideally, the resources provided would be 'tailored to the users' location' and backed by evidence of clinical effectiveness.

Secondly, and most significantly, although Woebot declares that it cannot help a user at risk of suicide, correctly signalling that the app is not a solution for someone in the midst of a crisis, the mere fact that it is a chatbot means some users may mistake it for a real, helpful human far more easily than they would, say, a book on depression. As the NeurOx YPAG notes, many people use Woebot over long periods of time because it so readily simulates face-to-face interaction. Despite its gamification, then, users often feel they are building some kind of relationship with Woebot, to the extent that, according to one user, it starts to feel 'more like a friend than an app.'²¹

In this light, the very fact that the creators of Woebot feel obliged to clarify that their app is not intended to replace therapists reveals a genuine risk of conflating the two, and that risk will only grow as the technology improves. Woebot is sometimes so kind, chatty, and charming that particularly unhappy users could be forgiven for leaning on it too heavily.

This is where the stance of Darcy and her team becomes somewhat unclear. On one hand, the company recognises that chatbots are not capable of grasping the nuances of users’ inner lives, let alone taking into account the past and present circumstances that can lie at the heart of mental health struggles. Woebot has even been programmed to tell certain users, ‘As smart as I may seem, I’m not capable of really understanding what you need’.

On the other hand, one section of the Woebot website tells users that because ‘CBT delivered via the Internet can be as effective as therapist-delivered CBT for both anxiety and depression’, they might consider using Woebot instead of a therapist.²²

Source: https://chatbotslife.com/meet-woebot-the-mental-health-chatbot-changing-the-face-of-therapy-44e8c6ff4fc2?source=rss—-a49517e4c30b—4
