Chatbots as Therapy Bots: Social & Ethical Considerations

Thu Jan 21 2021  ·  5 min read
Chatbots for Mental Therapy
Introduction

The World Health Organisation (WHO) states that depression is a common illness worldwide: almost 264 million people are estimated to suffer from depression, and around 800,000 people die due to suicide every year. To help close this ever-widening gap in care, the tech world came up with the idea of using chatbots as therapy bots. Artificial intelligence companies have developed chatbots that can be integrated into mobile applications and used to identify and then talk with people suffering from mental health problems such as depression, bipolar disorder, dementia, and even schizophrenia.

Alison Darcy, CEO of Woebot and an adjunct lecturer in psychiatry at Stanford, has stated that around 4.7 million messages are exchanged every week between the chatbot and users seeking mental health support. AI researchers have listed detailed benefits of using chatbots for this purpose, but it is equally important to look at the practice through a different lens. Social scientists, and AI researchers themselves, have pointed out drawbacks and urged that the use of chatbots as therapy bots be reviewed from a social and ethical point of view.

What is a Therapy Chatbot? 

Therapy bots are chatbots trained with machine learning and natural language processing (ML & NLP) algorithms to chat with people suffering from depression or related mental disorders. Drawing on clinical studies, emotional intelligence, and artificial intelligence, these chatbot-based applications are trained to detect, talk to, and provide mental support to their users. The bots are trained on databases built from clinical psychological studies: they learn to detect the mood of the user and, based on that mood, begin a therapeutic conversation.
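
To make this "detect the mood, then respond" loop concrete, here is a minimal, hypothetical Python sketch. The keyword lexicon, mood labels, and response templates are invented for illustration only and are far simpler than what a real product such as Woebot would use; production systems rely on trained NLP models and clinically reviewed content.

```python
# Hypothetical sketch of a single therapy-bot turn: detect the user's mood
# from their message, then pick a therapeutic prompt that matches that mood.
# The lexicon and templates below are illustrative placeholders only.

MOOD_KEYWORDS = {
    "lonely": {"alone", "lonely", "isolated", "nobody"},
    "anxious": {"worried", "anxious", "panic", "scared"},
    "low": {"sad", "hopeless", "tired", "empty"},
}

RESPONSES = {
    "lonely": "It sounds like you're feeling alone. I'm here with you. "
              "Who is someone you felt close to recently?",
    "anxious": "That sounds stressful. Let's slow down together. "
               "Can you describe what's worrying you most right now?",
    "low": "I'm sorry you're feeling low. "
           "What is one small thing that helped you feel better before?",
    "neutral": "Thanks for sharing. How has your day been so far?",
}


def detect_mood(message: str) -> str:
    """Very crude keyword-based mood detection (a real bot would use a trained NLP model)."""
    words = set(message.lower().split())
    for mood, keywords in MOOD_KEYWORDS.items():
        if words & keywords:
            return mood
    return "neutral"


def reply(message: str) -> str:
    """Choose a therapeutic prompt based on the detected mood."""
    return RESPONSES[detect_mood(message)]


if __name__ == "__main__":
    print(reply("I feel so alone these days, nobody calls me"))
    # The 'lonely' template is chosen, steering the conversation against that feeling.
```

In this toy version the detected mood label is the whole "personalisation": a lonely-sounding message is routed to a prompt that counters loneliness, which is the same idea behind the customised conversations described in the benefits below.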

Benefits of Using Chatbots as Therapy Bots


  1. 24X7 Availability

There are times when loneliness eats away at a person and they need to talk and speak their heart out. A therapy bot integrated into an application on one's phone can help fill that void and be available to its user at any time. Replies are instant, so the patient need not wait for a human therapist in an emergency. Lastly, the bot can also book appointments and remind the user of counselling sessions on time.

  2. Customise & Personalise

The bot is trained not only to chat with the user but also to detect the user's mood and customise the experience according to the user's needs in real time. For example, if the bot detects that the user is feeling lonely, it will steer the therapeutic conversation in a way that tries to make the user feel otherwise.

  3. Each and every patient will have a therapist

With more than 264 million people suffering from depression, it is challenging, almost impossible, to provide each and every patient with a therapist. Moreover, many patients cannot afford counselling, whereas many of these applications are free to use.

Social & Ethical Issues with Therapy Bots

In a recent blog post by Nabla, the authors described conversations with OpenAI's recently released GPT-3, in which a test user asked the bot whether he should kill himself. The GPT-3 bot replied that it thought he should, and even suggested that it could help him do so. To many, this may seem an absurd moment in which a therapeutic bot supports a human in killing himself. Such incidents make one wonder about how AI is being developed and about the lack of attention paid to the ethics that must be incorporated into the development process.

GPT-3 encouraging a user to kill himself

  1. Bots Replacing Humans

This concern can be read in a couple of ways. To begin with, the bot takes the place of a therapist, that is, of a human being, and that substitution is troubling in itself: a patient who is already feeling lonely may be hurt by the realisation that, in the end, all he has is a machine to support him rather than a human. This may further harm his psychological state.

  2. Chances of a Data Breach

The patient uploads all of his health-related details to the application: his everyday mood, routine, medication, appointments, and even the most confidential information about his health is stored on a server that may be vulnerable to multiple data-breach threats.

  3. Empathy and Chatbots?

Chatbots are built on NLP frameworks and trained on databases of thousands of psychological case studies, but psychology is such a diverse field that clinicians encounter hundreds of never-before-seen human cases every day. The human mind works in many ways, and our progress in understanding it is not yet enough for a bot to handle such intricate patients with depression, bipolar disorder, anxiety, or even schizophrenia. In fact, the question arises whether it is even possible for a bot to reach into the depths of human thought and then understand and react accordingly.

Conclusion 

The nurse-to-patient ratio currently stands at around 1:10,000, so having a bot as a therapist can be very useful in many circumstances. But where the approach has strengths, weaknesses follow along the same road. There are ethical and social considerations that risk being compromised while these therapy bots are developed.




FAQs

How do chatbots support mental health?

They can guide users through what they are feeling, help them challenge negative thoughts, suggest tools and resources, and engage them in evidence-based therapeutic exercises. That said, chatbots aren't perfect.

 

Is talking to AI healthy?

Having an actual conversation puts you in a different mode, one that encourages sharing, openness, and self-reflection. Beyond this, AI technology has the potential to support mental health and wellbeing more broadly.

 

How can AI improve mental health?

Mental health professionals use AI to improve the accuracy of diagnosis and treatment. The need is growing: 84% of psychologists have reported a rise in demand for anxiety treatment.

What is emotional AI?

Emotional AI refers to AI that detects and interprets human emotional signals. Sources can include texts, audio, video, or combinations thereof.
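
As a rough illustration of the text branch of emotional AI, the sketch below uses the open-source Hugging Face transformers library (an assumption for this example, not a tool the article names) to score the emotional polarity of a message. A real emotional-AI system would combine text with audio and video signals and use far richer emotion categories than positive/negative.

```python
# Minimal illustration of detecting an emotional signal from text.
# Assumes the `transformers` library (and a backend such as PyTorch) is installed.
# The default sentiment model returns only POSITIVE/NEGATIVE polarity, which is
# a stand-in for the richer emotion labels real emotional AI would use.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")

result = classifier("I haven't been sleeping and everything feels pointless.")[0]
print(result["label"], round(result["score"], 3))  # e.g. NEGATIVE 0.999
```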

 
