Can robots do therapy?
In a new study, researchers from Pace University compared an AI-powered therapy app (Woebot) with three other interventions: a non-smart conversational program from the 1960s (ELIZA), a journaling app (Daylio), and basic psychoeducation (which served as the control condition). They found no differences between the groups in improvements in mental health. The resulting article was published in Computers in Human Behavior: Artificial Humans.
The findings reveal that Woebot does not offer benefits above and beyond other self-help behavioural intervention technologies. They provide partial support for two hypotheses: that Woebot would not differ from other app-based (non-psychotherapy-identified) interventions in leading to improvements in mental health, and that all three active interventions would be superior to a passive informational control group.
Overall, the results from this study support previous research illustrating the benefits of behavioural intervention technologies. The benefits experienced by participants in the journaling group support previous studies’ findings that daily journaling via an app promotes creativity and resilience and improves mood by allowing individuals to further reflect on their thoughts and behaviours.
The current study echoes growing concerns that the affordances of digital technology (convenience, ease of use, and low cost) are leading to complacency and a reduced standard for what is considered appropriate mental health treatment.
Woebot was initially created as a conversational agent programmed to deliver empathic, personalized messages to users while remaining transparent about being an artificial agent, rather than trying to pass as a human.
Nevertheless, Woebot’s website prominently features users reporting that talking to Woebot feels like talking to a human being who shows concern, as well as research findings that users report human-level bonding after interacting with Woebot for five days.
Woebot has been designed to emphasize its humanness and its availability to offer therapeutic care during hard times. Moreover, Woebot’s developers have conducted research suggesting that people develop a bond with Woebot comparable to the bond people develop with their human therapists.
These results raise the question of whether the public is being duped by AI hype. The study found no evidence that an AI-powered CBT bot led to better outcomes than a rudimentary conversational program from the 1960s, or even basic psychoeducation, and it gives reason to conclude that the claims surrounding mental health apps are not evidence-based. Yet, despite concerns about privacy and coercion, these apps continue to grow in popularity.
Eltahawy and colleagues argue that future chatbot research must demonstrate that chatbots are at least as good as existing psychotherapies (such as CBT delivered by a human therapist) before they are promoted as “effective” interventions. In short, a good-quality bot can be an aid for those who cannot afford psychotherapy, but it is still far from replacing it.