Designing and Evaluating a Conversational Agent for Mental Health Support

Many people with mental health issues face significant challenges in getting the help they need. Psychological counseling and psychiatric services can be a luxury expense for people under financial stress: 5.1 million American adults, including 2.8 million with severe mental illness, did not receive services because they could not afford the cost of care (Lipari, 2018). Beyond structural barriers, fear of being stigmatized also prevents people from seeking help for mental health concerns (Lannin et al., 2013).

To expand access to mental health services and to counteract the problem of stigma, internet-based and mobile applications for mental health interventions have grown rapidly. However, these mediated interventions are characterized by relatively poor adoption and adherence (Donkin et al., 2013). More recently, text-based conversational agents, or chatbots, have gained traction as the new generation of e-therapy. Powered by natural language processing techniques, these agents can engage clients in a therapeutic process using natural language as both input and output. This human-AI environment offers a "judgment-free zone" for clients who are concerned about stigma. By invoking anthropomorphism, conversational AI has great potential to provide emotional support with rapid-response capability.

Scholarship on psychotherapeutic chatbots remains sparse in both psychology and HCI. Recent psychology research has begun to evaluate the efficacy of using conversational agents for mental health interventions (Fitzpatrick et al., 2017; Park et al., 2019). While confirming that chatbots provide an affordable and effective method of delivering therapy, this line of research has not thoroughly explained the underlying mechanisms of why chatbots might be more accepted and engaging than other types of e-therapy. Therefore, the first goal of this study is to examine the socio-technical relationship through the lens of Technology Acceptance frameworks, evaluating factors related to the adoption and use of conversational agents for psychotherapeutic purposes. Moreover, previous research has not evaluated the therapeutic relationship between humans and anthropomorphic technology. Following the APA guideline that accounts of effective psychotherapy which omit the therapeutic relationship are "seriously incomplete and potentially misleading on both clinical and empirical grounds" (Ackerman et al., 2001), this study unpacks the human-agent therapeutic relationship by evaluating anthropomorphism and its impacts on people's perceptions of and interactions with chatbots.

Winter 2020