
Can AI-powered therapy work?

Jack plugging in to a laptop for online counselling

How long before therapy is provided by AI?

There is nothing wrong with simple mental health apps that help people meditate, relax, and be mindful. But there are misguided people out there who think that an app powered by artificial intelligence will one day be suitable for people suffering from mental health issues like depression, complex anxiety, or trauma.

While AI-powered therapy has its benefits, it also has its drawbacks. One major concern is the lack of human interaction. AI-powered therapy can provide support and guidance, but it cannot replace the human connection and empathy that come from a face-to-face therapy session. Another problem is the lack of personalised attention. AI-powered therapy may rely on algorithms and pre-written responses, which may not fully address an individual's specific needs and concerns. This can result in a lack of progress in therapy, and individuals may find themselves feeling frustrated and discouraged.

Technological advances have provided some major benefits for people who need therapy, such as online therapy that delivers video sessions and other kinds of support over the internet. But AI-powered therapy is nothing like real therapy delivered by people.

“AI can use the science of psychology to help people with mental health problems”

Can psychology provide AI with the knowledge needed?

Psychology has been used to try to validate the idea that there is an option that is cheaper and quicker than other types of therapy: it is called CBT. If you look at some of the past marketing, you will be intrigued by how it was promoted as the only verified and valid form of therapy that should be used.

Is therapy all about marketing?

Many people who have tried CBT would disagree with that statement; many found that it provides only short-term help, or that it does not work at all. In my experience as a therapist, in some complex cases it can even make people worse.

There are bad therapists about, as some people can unfortunately testify. Can an AI-driven therapist be as incompetent as a human counsellor? I would think yes, but not with any intended malice; it would simply be down to the inability of AI to understand even basic human emotions.

If it’s all down to marketing to make money, who decides if you need therapy? Can you fight against the power of the advertising world? Advertisers are very good at making people think they need something they do not, or even at inventing completely new ailments just to make money.

Therapy is not based on science

Therapy is not science. If you think it is, you have been wrongly persuaded. Many psychology studies are presented as scientific, but they are open to interpretation and are not based on scientific criteria that produce factual, repeatable results.

There is a reason why psychology is referred to as a lesser social science: it is very different from formal scientific subjects like chemistry and mathematics, which are grounded in the physical world rather than in someone's opinions or insights.

Psychology used as validation

Some psychologists/therapists/psychiatrists have used psychology as validation to make a lot of money, and have gained a lot of prestige, by pushing the opinion that CBT is the must-have option. Science requires the ability to form a theory and design experiments that can be repeated by others to independently verify the results. Although psychology can provide an interesting overview of human interaction, it fails miserably when trying to determine the outcomes of individuals in therapy.

AI will never understand the human equation

I think that psychology will never be able to provide any kind of cure for depression or anxiety, because humans are all free-thinking individuals with unique qualities that cannot be quantified in a way that provides a cure-all option. When people are in crisis, how would an AI understand when and why people can become suicidal? How can a computer comfort someone without sounding patronising or oblivious to what is going on? I am imagining a therapist called Spock counselling an upset client called Jim by raising an eyebrow and telling poor Jim to stop being illogical.

AI is limited by psychology

AI cannot overcome the limitations of psychology; this comes down to the inability to understand the human equation. It is just like how physicists can understand and give reasons for why some things work, but they still can't explain what gravity is actually made of or how to manipulate it.

They can measure its force and have ideas about what it could be, but in reality they do not have a clue. In the same way, psychologists do not know why some people do not develop PTSD after a traumatic event while others, unfortunately, do.

It will never happen

If you think it will never happen, think again. People are trying to use computer programs to deliver therapeutic help now, often as free therapy programs offering a CBT approach. This may look like a benevolently motivated effort, but in my opinion it has nothing to do with how beneficial it will be for humans in the end.

In my opinion, AI used to provide therapy will more likely be motivated by money: a way of delivering cheap therapy to the masses while making lots of money for the manufacturers of mental health apps and programs.

You now work for the supermarkets

Still not sure? Just look at the supermarkets. If someone had told you 15 years ago that you would be queuing up at self-service checkouts to do the work normally reserved for checkout staff, scanning products, packing your bags, and using the payment system to save the store money, would you have believed it?

While there was some small benefit for the customer, the stores can now reduce checkout staff and force people to use the self-service option. Maybe the customers should form a union and apply for a pay rise.

Self-service mental health

Can you imagine a fundamentally flawed mental health delivery system based on AI that was cheap to deliver and could be used en masse to reduce the need for actual therapists? Would it be used?

If it were introduced, how would that happen? Probably very slowly at first, with free apps and some positive spin. Then, later, a powerful advertising campaign and tempting offers aimed at health services and health insurance companies with big budgets.

Therapy versus marketing

Make no mistake, the people behind the AI have a powerful option available to them: it is called marketing, and it often works. The problem with big organisations is that the managers at the top who make the decisions don't usually understand mental health services. Even if they have some knowledge, it is not in the detail required to defend against the coming marketing tsunami of suggested benefits of AI therapy services.

AI therapy programs are all about the money

I may seem to have a pessimistic view of AI, but although it can sometimes bring us huge benefits, it can also bring us potentially dangerous outcomes.

Big business does not care about people, only profits and power. People are not machines; all humans need personal human interaction to feel connected and fulfilled.  Forcing AI therapy onto people would only create profits for the few and misery for the many in the long run.

Do a Google search for AI therapy and see what comes up; you may be surprised or even worried. I hope I am wrong; do you think I am?

