
The Risks of AI Note-Taking Software in Therapy: A Psychotherapist’s Perspective



Risk of AI Note-Taking Software

As artificial intelligence (AI) technology rapidly evolves, its application in healthcare, including psychotherapy, is growing. AI note-taking tools, designed to record and summarize therapy sessions, are marketed as time-savers for clinicians and ways to improve the efficiency of documentation. However, as a psychotherapist, I believe it’s essential to address the potential dangers of these tools and educate therapy clients about their rights. Spoiler alert! I do not use AI listening note-taking tools. I think there is a HUGE danger in doing so. There may be space for this in the future, but I’m not willing to risk my clients’ personal data or what is so sacred about psychotherapy and mental health counseling: confidentiality. I drew on sources from around the web in writing this post; they are referenced at the end.

Documentation: A Critical and Time-Consuming Task

On average, psychotherapists spend 20-25% of their workweek on documentation. This includes writing detailed session notes, treatment plans, and correspondence related to client care. This documentation is vital for providing ethical, evidence-based care and ensuring continuity in treatment. I can attest that documentation is one of the more laborious parts of our profession. Some clinicians take notes during sessions; others wait until after sessions. I take what are called “psychotherapy notes” (bulleted points jotted during sessions) and then create the formal session notes (also referred to as “progress notes”) after the client has left. The psychotherapy notes are my private notes and cannot be (and are not) seen by anyone, ever. The progress notes are the client’s personal records and can be seen by the client at any time. They are retained for the period set forth by your state’s record retention laws (usually around seven years), then destroyed. Progress notes are similar to the note your medical doctor writes after you see them.

AI tools promise to alleviate the progress-note workload by automating note-taking. However, the convenience they offer may come with significant trade-offs, particularly in privacy and the quality of the therapeutic relationship. There are now, broadly, two types of AI note-taking tools available to clinicians: listening and non-listening. This article is about the “listening” AI note-taking software.


Concerns About AI and Therapy

1. HIPAA Compliance and Data Security Risks

Under the Health Insurance Portability and Accountability Act (HIPAA), therapists are required to ensure the confidentiality and security of client data. AI note-taking tools often involve transcription services or cloud storage, which could expose sensitive client information to unauthorized access or breaches.

Some companies (I would hope any company offering this to psychotherapists) claim their tools are HIPAA-compliant, but compliance doesn’t eliminate risks entirely. For example, breaches in third-party data storage systems have occurred even in HIPAA-covered entities. Remember the massive data breach in early 2024 that crippled healthcare payment systems? The company will remain nameless for the purposes of this blog (just do an internet search; you’ll find it). It is critical for consumers to ask their therapist whether AI tools are being used and how their data is protected. If your therapist uses AI listening tools, you MUST be given the option to opt in or opt out, and your therapist will (hopefully) have signed a BAA (Business Associate Agreement) with the company offering the transcription services, ensuring HIPAA compliance. AI note-taking tools that do not listen do not require you to opt in, but the therapist is still responsible for having a BAA in place to ensure HIPAA compliance; the difference is that the non-listening service isn’t listening in on the session in real time.

2. Erosion of the Therapist-Client Relationship

The therapeutic relationship is built on trust and confidentiality. I firmly believe the use of AI to transcribe sessions can introduce a sense of surveillance that undermines the safe, non-judgmental space necessary for effective therapy. Clients may feel hesitant to share deeply personal information knowing that a machine is recording their words. Consider that I’ve had clients request we move all electronic devices out of the room for sessions. Have you ever been with a group of friends, talked about your bathroom renovation, and all of a sudden your social media is showing you toilet seat ads!? Need I say more?

3. Potential for Errors

AI systems are not perfect. They may misinterpret nuanced language, cultural context, or the emotional undertone of a conversation, leading to inaccurate or incomplete session summaries. This could impact the quality of care and the accuracy of documentation. A colleague who has obtained permission from some clients to use AI transcription software told me that a client made a joke referencing death (meaning the punch line was about death), and the AI software created an entire safety plan and instructed the therapist to call the police! Imagine if this therapist hadn’t proofread the note and the notes were later subpoenaed for, say, a child custody case.


The Role of Big Tech in Mental Health

The rise of big tech in the mental health space is concerning. I’ve offered services to some of my clients through big-tech firms over time, and I’m growing more and more leery of their place in our industry. Large companies are investing in teletherapy platforms and AI-driven tools, often at the expense of small, independent practices. These platforms tend to prioritize profit, sometimes lowering pay for clinicians and reducing the availability of highly personalized care. As a consumer, you may be thinking how great it is to access lower-cost services. But believe me when I tell you: you cannot and will not get the same benefit from an overworked, burned-out therapist as you will from one who is present and unworried about how they will pay their bills. Integrating tech and apps into a therapeutic treatment plan may have a place; I often provide apps and tech for clients to use between sessions. But to rely solely on an app is potentially dangerous. In a post-pandemic world where so many of us work from home, we need more human connection, not less.

This dilution of the industry risks turning mental health care into a transactional service rather than a deeply relational and individualized process. And this is exactly where these tech and auto-AI note-taking companies are heading, in my opinion: diluting the industry. Consider that one of the largest therapist platforms recently obtained a large investment from the largest US-based behavioral healthcare insurance company. Why? Because they see tech as the future and an opportunity to push out clinicians. They are turning the sacred space of the psychotherapy alliance (the client-clinician working relationship) into a widget-based business. This is VERY dangerous.

Studies have highlighted these concerns, with one report noting that some platforms prioritize growth over quality, which can lead to compromised therapeutic standards. Independent therapists, on the other hand, often provide continuity of care and greater attention to client needs. And to return to the note-taking error above, where the AI misinterpreted a joke: a human can understand a joke without risking a client’s safety or creating future problems should a note be needed in another context (e.g., a subpoena).


Your Rights as a Therapy Client

As a consumer of mental health services, you have rights, including:

  • Informed Consent: Your therapist must disclose whether they use AI tools in documentation and obtain your consent before doing so.
  • Data Privacy: You are entitled to know how your data is stored and shared.
  • Right to Opt-Out: You can refuse the use of AI tools in your sessions without jeopardizing your care.

Why I Don’t Use AI Note-Taking in My Practice

In summary, I choose NOT to use AI listening note-taking tools in my practice because I believe in protecting the confidentiality of my clients and maintaining the integrity of the therapeutic relationship.


Conclusion

While AI listening tools may have potential in mental health care, their application must be carefully considered. The risks to client privacy, therapeutic trust, and the overall quality of care cannot be overlooked. As big tech continues to expand into the mental health space, it is crucial to advocate for the preservation of independent practices and personalized care.

As a therapy client, you are empowered to ask questions and make informed decisions about your care. Therapy is about building trust and finding solutions that work for you, not about becoming a dataset in a tech-driven world.


References

  1. Spring Health, “2024 Workplace Mental Health Trends”: https://www.springhealth.com/blog/2024-workplace-mental-health-trends
  2. Mahalo Health, “Latest Technological Trends in Mental Healthcare in 2024”: https://www.mahalo.health/insights/latest-technological-trends-in-mental-healthcare-in-2024
  3. HIPAA Journal, “Recent HIPAA Data Breaches”
  4. NBC News, “Concerns Over Big Tech’s Role in Mental Health”