The Rise of AI in Therapy: A Double-Edged Sword
Introduction
Artificial intelligence is gradually making its way into every profession, and therapy is no exception. However, this integration comes with unique ethical dilemmas, especially as some therapists secretly use tools like ChatGPT during sessions. A recent investigation by MIT Technology Review sheds light on this troubling trend, revealing that many patients feel betrayed, confused, and deeply unsettled upon discovering their therapists’ hidden reliance on AI.
A Shocking Discovery in Los Angeles
Declan, a 31-year-old from Los Angeles, stumbled upon his therapist’s unorthodox methods due to a technical glitch in their online session. When the video connection faltered, he suggested switching off the cameras. To his surprise, his therapist inadvertently shared his computer screen.
"Suddenly, I was watching him use ChatGPT," Declan recounted to the MIT Technology Review. He observed the therapist pasting parts of their conversation into the chatbot and reading back the AI-generated responses as if they were his own thoughts.
Flabbergasted but intrigued, Declan chose to play along during the session. “I became the best patient ever,” he recalled. “ChatGPT would ask if I thought my thinking was too black and white, and I’d respond exactly that way. My therapist seemed thrilled. I’m sure it was his dream session.”
The Confrontation: A Therapy Breakup
The tension escalated during their next appointment when Declan confronted his therapist about the use of AI. The therapist broke down in tears, admitting he had been “out of ideas” and had resorted to ChatGPT for guidance. Declan described the experience as akin to “a weird breakup”—especially after the therapist still billed him for that session.
The Impact of AI on Patients’ Perceptions
Declan’s experience is not an isolated incident. Laurie Clarke, the journalist behind the MIT Technology Review report, noted receiving an unusually polished and lengthy email from her own therapist. Initially, she found it to be a thoughtful gesture. However, upon closer examination, clues like American-style punctuation hinted at AI involvement.
Another patient, Hope, 25, shared her unsettling experience after receiving a consoling message following her dog’s death that contained a stray AI prompt. “It was just a very strange feeling,” she recalled. “Then I started to feel kind of betrayed. Trust issues were the very reason I was in therapy.”
Experts Weigh In: A Trust Crisis
The ethical dilemma surrounding the use of AI in therapy is starkly evident. While AI can assist therapists in crafting responses, the secrecy involved undermines the foundational elements of therapy: authenticity and trust. Adrian Aguilera, a clinical psychologist at the University of California, Berkeley, emphasized, “People value authenticity, particularly in psychotherapy. Using AI without disclosure can feel like you’re not taking the relationship seriously.”
Privacy Concerns with AI Tools
Alongside ethical concerns, privacy issues present another significant red flag. Duke University computer science professor Pardis Emami-Naeini cautioned that general-purpose AI tools like ChatGPT are not HIPAA compliant. This lack of compliance raises unsettling questions about the safety of patient information entered into such systems.
Burnout and the Temptation of AI
High burnout rates among therapists make the temptation of AI assistance quite apparent. Companies like Heidi Health and Upheal are already promoting HIPAA-compliant AI tools aimed at note-taking and session transcription. However, until a culture of transparency is established, patients may continually find themselves questioning whether their most intimate confessions are being processed by a human or a chatbot.
Conclusion
Declan’s revelation about his therapist’s use of AI was surreal rather than shattering, yet he expressed a chilling concern: “If I was suicidal, or on drugs, or cheating on my girlfriend—I wouldn’t want that to be put into ChatGPT.” This highlights the urgent need for open communication about the use of AI in therapeutic settings.
Questions and Answers
What prompted Declan to realize his therapist was using AI during their session?
- A technical glitch during an online session led Declan to suggest turning off cameras, which resulted in his therapist inadvertently sharing his screen, revealing he was using ChatGPT.
How did Declan feel about his therapist’s use of AI?
- Declan felt flabbergasted and decided to play along during the session. However, he later described the situation as akin to a "weird breakup" after confronting his therapist about it.
What ethical concerns are associated with therapists using AI?
- The primary ethical concerns revolve around the loss of authenticity and trust in the therapeutic relationship, as many patients feel betrayed when they discover AI is being used without their knowledge.
Are AI tools like ChatGPT compliant with privacy regulations?
- No, general-purpose AI tools like ChatGPT are not HIPAA compliant, which raises significant privacy concerns for patient information shared during therapy.
What steps are companies taking to address the need for AI in therapy?
- Companies like Heidi Health and Upheal are promoting HIPAA-compliant AI tools aimed at note-taking and session transcription, though a broader culture of transparency around AI use in therapy has yet to be established.