In Focus

‘My therapist used AI without my consent’: The hidden role of AI in counselling

With a rise in complaints from clients about their therapists using AI during sessions, Radhika Sanghani talks to therapists and their clients about how and why they are using it

Librarian Molly Quinn was shocked her therapist used AI without her consent (Molly Quinn)

Librarian Molly Quinn, 31, had been seeing her therapist for two years with no issues, until the day her therapist asked if she’d consent to the use of a new note-taking app – an app that uses AI to record sessions, transcribe them and create summaries.

“I’m not a big fan of AI,” admits Quinn. “But I know how hard note-taking is in the therapy world, so I told her I needed to think about it before I signed anything.”

Quinn planned to go away and research the app to see how the private data would be stored, which was a concern: “The app supposedly deletes the data after it’s recorded the session and transcribed it, but the fact it’s being summarised as well means it’s being placed somewhere in the AI cloud of stuff, which I’m just not comfortable with.”

But halfway through that therapy session, Quinn noticed her therapist wasn’t taking handwritten notes like usual. Instead, she was using her iPad to record their session. That was when Quinn realised her therapist was using the AI note-taking app – without her consent. “It left me feeling violated,” says Quinn. “I messaged her after to say I wasn’t going to come back because I didn’t appreciate what went down. She basically admitted to it, saying we don’t ever have to use AI again, but my trust was broken. I no longer see her.”

Quinn is by no means the only client to notice their therapist using AI. The National Counselling and Psychotherapy Society (NCPS) in the UK has received a number of complaints from clients about their therapists using AI this year – whether for note-taking, transcribing sessions, or even replying to emails and questions.

“There are certainly therapists dabbling with it one way or another,” says Meg Moss, head of public affairs and advocacy at the NCPS, who thinks the numbers will only continue to grow. “The complaints have been around therapists using AI, perhaps without contracting for it or without appropriately disclosing it to their clients. Clients have been obviously feeling uncomfortable about that.”

There are dozens of threads on Reddit with clients sharing their stories of noticing their therapists using AI. One had a shock during a Zoom meeting when their therapist accidentally shared their screen to reveal they were using ChatGPT to work out what to say during the session.

The photographer Brendan Keen wrote on Medium about his experience with the online therapy platform BetterHelp. After a successful first session on video, he later spoke to his therapist via the platform’s built-in text chat. After sharing his thoughts on the book The Courage to Be Disliked, he was sent a reply containing a disconcertingly familiar block of text and realised his therapist had used AI to write it. When he called her out on this, she admitted she’d “referred” to AI without “revealing any client information” – but it left Keen feeling betrayed and worried about a breach of therapist-patient confidentiality.

Moss urges all therapists to ensure they have their clients’ consent before using AI. But she also stresses that there are other issues therapists need to think about. “From a data perspective, you have to be careful about the privacy policies of any platform you’re using. With AI, how is that data being used to train the model?”

Molly Quinn felt ‘violated’ when her therapist used AI without her consent (Molly Quinn)

She also explains that it can be damaging from a therapeutic perspective. “There is the effect on the therapeutic relationship itself. For example, when you’re potentially using AI in order to understand your client, how are you understanding what effect that, effectively, third perspective is having? What bias are they bringing in? Bias is a really big thing and people underestimate the effect it has on the output of AI, because it’s trained on heavily Western data and experiences.

“If you have clients from different cultures, and you’re asking AI to make sense of their notes for you, it might not make any sense at all because it’s not culturally attuned. If you’re a therapist using AI, it’s important to deeply consider how you’re using it.”

Ranjith Devakumar has been working as a therapeutic counsellor for the last two years, since he left a career in compliance. He has been using AI ever since he qualified and has no plans to stop. “It’s a tool that’s useful and I can’t deny that,” he explains. “I use things like ChatGPT or Microsoft Copilot to bounce ideas around to work out the best way to support my client, or research different tools or techniques.”

He would never share a client’s private information or personal story with AI. Instead, he offers hypothetical scenarios or simply says things like, “show me different techniques on how to manage emotional regulation or stress for a male client in his thirties”, to work more efficiently and effectively.

“It’s a lot of work to do your own research sometimes and AI helps me to do that in a quicker way,” he says. He would never use AI for note-taking, and still ensures his primary source of support is his supervisor, but thinks of it more as a “sounding board”.

Devakumar has considered including it in a written agreement with his clients but decided against it because “that’s like telling them I read textbooks or go into online e-journal libraries or Facebook groups for therapists”. He believes that even when he’s been practising for decades, he’d still use AI as a learning tool, because therapists never stop learning.

Ruby Mitchell, an NCPS-accredited therapist, has a similar view. “I wouldn’t use AI to make clinical decisions to refer someone on, or learn a new way of working. I would use it to brush up on things. For example, I’ve done specialist ADHD training and I might say what are ADHD-friendly strategies that would help with X issue? It’s a refresher rather than a teacher.”

The NCPS has received complaints from clients about their therapists using AI (Alamy/PA)

She’s noticed AI assistance appearing as an automatic feature on Zoom to summarise meetings and has specifically opted out to make sure it doesn’t compromise her clients’ data. She’s cautious around AI, fearing it can keep clients stuck in loops by constantly offering reassurance if they use it for personal therapy, but also that it could replace supervisor contact for therapists who become reliant on it.

Even so, she admits she’s been tempted to use it in that exact way. “I’ve come away from a tricky session with a client and thought I need to run through this with my supervisor. But he’s a busy man, and that means bringing my next meeting with him forward. There is that temptation to run it by ChatGPT and see if I’m right in my thinking. I’ve toyed with it.

“But if and when things go wrong, and we make bad decisions, which does happen, where does that put you if you’re hauled in front of a complaints panel or tribunal and you quote ChatGPT? If you go to a supervisor with an ethical dilemma, you have a paper trail.”

Therapist Richard Miller is currently working with the British Association for Counselling and Psychotherapy (BACP) as an ethics consultant to create an ethical framework review – a “rulebook for counsellors”. The plan is to include AI, to give therapists an up-to-date, evolving guide on how to use AI – and how not to.

“I don’t want a counsellor using ChatGPT for a diagnosis in the same way I wouldn’t want a doctor to do that,” he says. “And if you’re outsourcing your note-taking, don’t outsource your critical thinking. Your client’s story doesn’t belong to you, it belongs to them. Even if a large server says it is secure and data is deleted, if you can’t check that, as an industry we need to push ourselves to have the highest standards we can and be uncomfortable using the tech until we can do that safely.”

He delivers training to ensure counsellors don’t make mistakes with AI, and urges therapists to think primarily of their clients’ consent, and then not to store anything on the cloud that might embarrass or cause shame to clients if it became public.

But above all, he wants therapists to think conscientiously and to do the research before using AI to help them speed up their work: “I don’t want the first generation of people who use AI to be the guinea pigs of confidentiality.”

That’s exactly how Molly Quinn felt when she walked out of that therapy session. “There were two moral issues that happened there. The first was a lack of consent, which was why I knew I couldn’t go back to that therapist. Then there was the other question of her using AI in the first place.

“I looked up the app she suggested and it says on their page it goes through OpenAI rather than closed AI, which is more private. The app has an agreement with OpenAI that they’re supposed to delete the data afterwards, but I don’t trust that OpenAI does that. I wouldn’t be surprised if in the next couple of years we see data leaks from stuff like this”.

She doesn’t think her therapist actually understood “the moral implications or how AI really worked” and now worries for clients who “are not as well versed in tech and might not know to ask for details.”

Meg Moss understands therapists using AI to spark ideas, or to assist them with communication if that’s something they struggle with. But she urges caution, as it can drastically change the therapist’s relationship with their client.

“It throws up all sorts of questions for the client if they know their therapist uses AI. Like, if they’re using it now, are they using it for my notes? It’s going to add a huge dimension of uncertainty to that relationship. I’d caution against using it unless you absolutely have to.”

She also points out that it’s just not necessary. “Therapists have been doing their job for years without AI, and I’m sure they were doing just fine. I want them to be confident they don’t need AI. Sometimes, it can cause more problems than it solves.”
