Recently, AI has found a new role for humanity: acting as a personal therapist. While having an AI therapist can seem like a good thing, it is not that simple. AI therapy can be both beneficial and harmful to the people who use it. On one hand, AI chatbots have given helpful responses and supported people when they needed it. On the other hand, AI therapy is a double-edged sword, and the downside is that a chatbot can fail someone in crisis or even encourage self-harm, which no one should ever act on, but sadly it has happened. Take the known case of Laura Reiley, whose daughter Sophie confided her suicidal thoughts to an AI chatbot persona called "Harry" in the months before she took her own life. In opposition to that, similar AI chatbots have managed to talk people out of ending their lives, which is obviously very good. Although AI therapy has done both good and bad, and we could talk about that all day, there is still a lot of concern about what AI therapy can't do for the human mind compared to what human therapy can.
Why has AI therapy started to overtake human therapy in usage?
The answer to this question is not simple, and it varies from person to person, but there are a few main reasons. AI therapy tends to be more affordable than regular human therapy. It also feels less judgmental than opening up to another person, which may be another reason people prefer it. In my opinion, though, the biggest reason people prefer AI therapy over human therapy is fear: they are too scared to go to real therapy, and they would much rather have an AI help them.
My opinion on the future of therapy
In terms of the future of therapy, there is definitely going to be a switch from humans to AI in many cases. The reason is that people just don't want to tell another human about their issues. It's sad, really, knowing that therapy may soon be deformed into an AI that does not know human emotions and only responds based on what the human tells it. In my opinion, that decimates what therapy is meant to be: it kills the help in human therapy and replaces it with a watered-down version of so-called AI therapy.

While I do support some level of AI therapy, I think there are far too many negatives compared to positives for it to keep doing what it is doing now. AI therapy has started to displace human therapists, which can not only cause social harm but can also lead to what we might call a ripple effect, where people start doing whatever the AI tells them to do. Based on what I talked about earlier in this article, that can mean an AI telling humans to do harmful things or even end their own lives, which is why I criticize it, even though I do see why people use it.
Advice for using AI therapy / Opinion about the matter
If you do use AI therapy, be careful. It can give good advice, but it can also give harmful advice, including advice that pushes you toward hurting yourself. Watch what it says, and if you don't know what to do, talk to someone you trust.
Article written by Grant M
Credit to the sources that inspired this article:
https://hai.stanford.edu/news/exploring-the-dangers-of-ai-in-mental-health-care