Would people talk to a chatbot for concerns about substance use?
The COVID-19 pandemic has seen an increase in the use of telehealth and online counselling to deliver mental health and addiction treatment. But would people talk to a chatbot – an artificially intelligent computer program that aims to simulate human conversation – if they had concerns about their substance use?
A recent study by Turning Point and Monash University has found that while clients and counsellors value in-person care and see it as important for addressing a client’s emotional needs, chatbots could also assist in online counselling by performing assessment, referral and other simple tasks.
Researchers conducted interviews with clients and counsellors from Counselling Online, Australia’s national alcohol and other drug online counselling service. The research explored clients’ and counsellors’ experiences of online care, and their views about whether chatbots might assist or constrain care that is delivered online in the future.
Both groups had concerns that chatbots lack a ‘human’ element important for empathic care, and clients said they would be less likely to share information with a chatbot. However, clients and counsellors agreed that having chatbots perform straightforward tasks, such as screening, triage or referrals, could free up counsellors’ time to provide more efficient care to more people.
While technology is useful in supporting people seeking treatment for substance use, particularly where there are barriers to help-seeking, the ability of humans to empathise with and understand each other is an important aspect of the counsellor-client relationship, and one a chatbot cannot currently emulate.
“You can’t make a computer understand what a human is feeling. They can’t exactly give the right answers, whereas a normal human can,” said Bryce*, who had concerns about cannabis use.
Although artificial intelligence (AI) technologies such as chatbots promise to revolutionise the future of mental health care, Dr Tony Barnett, who led the study, cautions against implementing such technologies without first consulting the clients and clinicians who will use them.
“Chatbots may enable counselling for substance use to be delivered to a larger population, without the constraints of traditional face-to-face healthcare, such as time, location or cost. However, we need to slow down, and carefully consider – what do clients appreciate about human contact, and how can we maintain empathic care in future digital healthcare delivery?” said Dr Barnett.
The findings of the study indicate that counsellors and clients view human counsellors as continuing to play a central role in providing care for the treatment of substance use. However, an opportunity exists for humans and machines to work together to deliver care more efficiently. Importantly, involving clients and clinicians in the design and implementation of future digital healthcare technologies will be essential in ensuring that they are not only useful, but also relevant and engaging.
The findings have been published in the International Journal of Drug Policy.
*Pseudonyms are used to protect the identity of participants.