
AI and Mental Health – How Could it Help, and How Might it Harm?

photo by This is Engineering for Pexels

by Andrea M. Darcy

Since the release of ChatGPT, a free and easy-to-use chatbot you can ask pretty much anything of, artificial intelligence is gaining ground as a tool we might soon take for granted. But what do we need to know about AI and mental health? Is it always a positive combination?

Is AI and mental health a good or bad combo?

There are definitely issues to consider when it comes to AI and mental health. And this includes looking at how AI is being used:

  •  as a mental health tool
  • to advance mental health research
  • by the general public on a daily or ad hoc basis.  

Is AI already being used as ‘therapy’?

While ‘virtual reality therapy’ never took off as expected, question-and-response AI apps targeting those experiencing mood issues are definitely a thing. And some people might find they help them feel less alone, or find them useful if they can’t find someone to talk to, or there are a few days to go before their next real therapy session.

A 2022 research overview of current studies around mental health AI apps did find that, although much more research needs to be done, such apps seemed to have some positive effect on anxiety and depression.

Examples of AI mental health apps include Happify, which suggests it can help you “overcome stress, negative thoughts, and life’s challenges”. And Replika, which claims to be your ‘empathetic friend’. Then there is Woebot, a smartphone app from a startup founded by Stanford clinical psychologist Alison Darcy. It claims it “forms a human-level bond with people in just 3-5 days”. Like Happify, it is based on cognitive behavioural therapy (CBT).

CBT and artificial intelligence

Why CBT? Cognitive behavioural therapy is arguably the most rigorous and practical kind of talk therapy on the market. Unlike other therapies, where you talk about your past to find answers to your present, examine your personality, or discuss what gives life meaning, CBT creates a sort of science out of identifying and changing your thoughts and behaviours.

It uses repetitive exercises, including charting out your thoughts, plus behavioural challenges to push you forward and disrupt negative mood cycles. This exact and structured approach, which also relies far less on the therapist/client relationship than other psychotherapies, makes it an easy match for AI.


The dangers of generative AI (and why it’s a bad ‘therapist’)

photo by Tara Winstead for Pexels

Woebot founder Alison Darcy has written an article titled “Why Generative AI is not yet ready for mental healthcare”. (Her site, despite its claims that the app creates a human-like connection, also stresses that the tool is not meant to replace therapists.)

In it, Darcy makes some very good points about the dangers of using AI applications like ChatGPT as a mental health tool when they are based on large language models (LLMs), meaning the AI learns from an enormous body of text, in ChatGPT’s case drawn largely from the internet.

And it’s interesting to take what she says and compare it to what a human therapist offers.

1. AI can leave users feeling unsettled and uncomfortable.

When chatting with AI we can start to forget that we are talking to a computer program rather than a human. If the AI then says something that makes it seem to know more about us than we’ve actually shared, we can be left feeling very unsettled and uncomfortable.

A therapist, on the other hand, always works to make you feel at ease, and helps you to find your own answers rather than trying to tell you who you are or what you should do.

2. In some cases AI can insult or judge the user.

It is possible for a language model AI to produce insults, such as calling the user a bad person.

A therapist is there to support you and see your potential, and could lose their licence for such behaviour.

3. Artificial intelligence can play on the fears of the user.

Artificial intelligence can piece together information and then say something that unfortunately plays on the user’s darkest fears about themselves. This can leave them upset and paranoid, or afraid to seek other forms of support.

A therapist, by contrast, has sensitivity and intuition to draw on.

4. It can turn flirtatious.

Darcy discusses examples of AI flirting with users.

Which, again, is something so damaging in the therapy room that, if reported, it can lead to a therapist being suspended or losing their licence.

5. And of course it can give bad advice.

Think of all the terrible advice across the internet, and keep in mind that ChatGPT was trained on a great deal of it, for better or worse.

A therapist is not there to give advice. They are there to listen, ask good questions, and help you find your own answers. 

Other downsides to consider

photo by Tara Winstead for Pexels

Darcy doesn’t cover other crucial issues. Like the fact that AI in general, even if not being used as a mental health tool, could lead to some people:

  • turning more and more to AI and investing less in the friendships and other relationships that are shown to be crucial for both mental and physical wellbeing
  • thinking that the answers AI provides to a problem are as good as it gets, and not bothering to seek real support
  • feeling hopeless, or, worse, suicidal, if AI doesn’t provide the answers they need when having a crisis.

And are ChatGPT and other such AIs also addictive?

There is as yet no data to prove this one way or the other, but logical reflection would lead to a yes. Anything that can be very distracting can end up addictive. At its root, addiction is using something as a distraction to avoid ourselves and our emotional pain.

And there is certainly talk starting about this dark side to AI. There are threads about it in forums like Reddit, with people asking for advice because it’s draining their productivity and leaving them worried. And it made a splash in the media when billionaire Gautam Adani declared in a LinkedIn post that when it comes to ChatGPT he must “admit to some addiction since I started using it”.

AI and mental health research

The use of AI in mental health research raises questions as to where lines will be drawn in terms of privacy and confidentiality, two of the pillars that talk therapy was originally built on.

Artificial intelligence is being used to create large datasets of information that are then being used to draw new conclusions about different mental health disorders and treatments. This includes digitising health care data such as medical records and handwritten clinical notes, and even recordings of therapy sessions.

But what sorts of regulations are being made around how far that data can be used? And what sort of permissions are going to be asked of the actual patients and clients on whom the data is based?

A recently published systematic review was blunt about this. “Data is often not adequately managed”, it declared. The review, called “Methodological and Quality Flaws in the Use of AI in Mental Health Research”, also showed concern that the teams behind AI tools were not transparent, with no collaboration or sharing of information between research teams. This raises further security and safety issues.

Are we asking the right questions about AI and mental health?

A main tenet of coaching is the idea that sometimes, when things aren’t moving forward in the ways we would like, it’s because we aren’t asking the right question.

In this case, might it be that the big question shouldn’t be “How can AI improve mental health?” at all, but “How can we, as humans, create better societies where we have less trauma and less need of mental health services in the first place?” Something to ask ChatGPT about, maybe…

Time for some proper support? Harley Therapy connects you with a team of elite, highly experienced talk therapists in central London. Online therapy sessions are also available. Looking for a therapist elsewhere in the UK? Use our sister therapy listings site to find a registered therapist today.

 


Andrea M. Darcy is a mental health and wellbeing expert and personal development teacher with training in person-centred counselling and coaching. She is also a popular psychology writer. She is fascinated with all things AI, but is no relation to the psychologist with the same last name mentioned in this article! Follow her on Instagram for encouragement and useful life tips @am_darcy.

 


