12. Using AI for Diagnosing in Mental Health: Friend or Foe?

In this episode of “Run Your Private Practice with AI,” I explore the transformative role of artificial intelligence in diagnosing mental health issues. While AI’s integration into diagnostics is a topic of debate, I delve into how it can complement traditional methods and enhance our clinical judgment. Join me as I discuss the potential benefits, ethical considerations, and practical applications of AI for diagnosing clients in the mental health field.

Say goodbye to the burden of mental health notes with automated note creation! 

https://www.berries.icu/  coupon code: 50FromKym (gets you 50% off the first month)

Join our Clinical AI Club at https://www.clinicalaiclub.com to explore these topics in greater depth and discuss practical ways to integrate AI into your daily practice. 

Key Points:

1. **AI as a Supplementary Tool:** AI is not meant to replace the DSM or clinical judgment but to provide additional data points that enrich our diagnostic processes.

2. **Confidentiality and Compliance:** Emphasizing the importance of maintaining client privacy and adhering to HIPAA regulations when using AI tools in mental health diagnostics.

3. **Efficiency and Precision:** AI can analyze vast amounts of data quickly, identifying potential health risks and comorbid conditions, and providing insights that might escape even the most trained clinical eyes.


If you’re interested in being a guest on the podcast to discuss your experiences with AI in your practice, please reach out. We’d love to hear from you!

Social Media Post: 

🌟 New Episode Alert! 🌟 Can we use AI to diagnose our clients? This is a hot topic of debate for sure. Dive into the fascinating world of AI in mental health diagnostics with the latest episode of “Run Your Private Practice with AI.” Discover how AI can enhance our clinical judgment, improve diagnostic precision, and provide personalized treatment plans. 🎧

🔗 Listen now and join the conversation: [Podcast Link]

🔗 Join the Clinical AI Club: [www.clinicalaiclub.com](https://www.clinicalaiclub.com)

#MentalHealth #AI #Therapy #ClinicalAI #Diagnostics #Podcast #Therapists #PrivatePractice



Everyone, welcome back to another episode of Run Your Private Practice with AI. Today we’re going to dive into something that’s somewhat controversial yet immensely fascinating to me, and it’s at the crossroads of technology and therapy. Let’s talk about how artificial intelligence is reshaping the way we diagnose our clients in the mental health field.

Before I start getting emails saying that diagnosing with AI shouldn’t be done, it’s important to underscore that while AI offers incredible capabilities, our primary tools for diagnosis should still be the DSM and our clinical judgment. AI doesn’t replace these foundational elements; rather, it supplements them, providing additional data points and insights that can really enrich our understanding and our decision-making processes when we’re diagnosing clients.

And before we get into this, a crucial reminder for all of us in private practice: when using AI tools, it’s paramount to ensure that no personal health information, otherwise known as PHI, is involved, especially with free AI tools. It’s essential to maintain confidentiality and to comply with the HIPAA regulations we have to adhere to in everything we do, so that our clients’ privacy is never compromised when we’re using tools like ChatGPT to help with diagnosing.

So, diagnosing mental health issues has always been a nuanced art that relies heavily on the dynamics between the client and the clinician. Traditionally, we’ve depended on our extensive training and experiential learning to interpret symptoms and behaviors. We use tools such as peer consultation and reading about symptoms, we collaborate with psychiatrists and other therapists in our offices or our professional networks to talk through cases and get help with diagnosing, and we even use standardized psychological tests to arrive at a diagnosis.
We might refer somebody out to a psychologist who will do some testing to help us fine-tune a diagnosis. But as technology evolves, so do our tools. This evolution brings us to the frontier of AI in diagnostics, which offers a new dimension of precision and efficiency that we can use to diagnose our clients more quickly, more effectively, and in some cases even more accurately, which I know is controversial. But let’s talk about it.

AI pushes the diagnostic boundaries further by integrating complex algorithms and machine learning to analyze data in ways that are, in a lot of cases, humanly impossible. For instance, AI can scrutinize vast arrays of client data quickly and accurately, and it can identify potential health risks or diagnosable conditions. There are systems now that use predictive analytics and can offer insights into potential comorbid conditions by noticing subtle patterns in a client’s behavior during therapy sessions, even eye shifts while the client is talking. We all know, from Neuro-Linguistic Programming for example, that you can tell a lot about people just from where they look when they’re talking about things, and there’s AI now that can do this. In some experiments I read about recently, at Harvard and some other places, they’re using NLP with AI to analyze what a person is really trying to say even when their words might be saying something completely different. Which is crazy to me, totally unbelievable. And obviously these capabilities are significantly surpassing the traditional methods we’ve used as clinicians, which is tough to say. I know it’s controversial, but AI really is taking over the therapeutic world, and I feel like we’ve got to get on board with that. But that’s a whole other episode, a whole other topic we can talk about.
One example of this is something called Woebot. This is an AI-powered chatbot that uses cognitive behavioral therapy principles to deliver mental health support to the people using it. While it’s primarily a therapeutic tool, its interactions can provide insights into a user’s mental health by tracking their mood and emotional health over time, and that can be super valuable for a diagnostic assessment. If somebody has been showing depressed symptoms for over six weeks, they’re probably going to meet the criteria for a diagnosis, or if they’re talking about grief symptoms, it may point toward prolonged grief disorder. It’s going to pick up on that, and it can help with diagnostics. The bot isn’t diagnosing people, but it’s taking that information and offering help, suggestions, and assignments suited to somebody who might have the diagnosis it’s picking up on.

So the benefits of incorporating AI into diagnostics are substantial and multifaceted, and there’s a lot to talk about here. If you’re listening to this and you’ve got ideas or thoughts about it, I’d love to hear from you. If you ever want to come on the podcast and talk about your feelings about diagnosis, or examples of where it’s gone wrong or where it’s been helpful, I’d love to have you on as a guest.

Another aspect of this is accuracy and speed: using AI can significantly reduce human error and expedite the diagnostic process, which allows us to intervene more effectively with our clients. Think of using apps that administer mini psychological assessments and give you immediate feedback and results.
An app like that can say, “you might want to consider this diagnosis or that diagnosis.” That’s AI working for you, thinking about the results and how they can be interpreted into a diagnosis, and then into interventions, treatment plans, and things that can really help our clients in the long run. And this can happen a lot more efficiently and quickly than with, say, a therapist who might always reach for the same diagnoses. I’m a PTSD therapist, so I tend to think everybody has PTSD. AI can help correct for that kind of thing and make the process more efficient and more accurate, because we all have blind spots. If you ask me, I think it’s going to help us more than hurt us, hopefully, and I’m super excited about the possibilities.

Another thing this can help with is more personalized treatment plans. Once you have the diagnosis, enhanced diagnostic precision means our plan can be more accurately tailored to each client’s unique situation, which obviously helps improve therapeutic outcomes. And for any of you who are insurance-based clinicians, you might have noticed that insurance companies are moving toward a value-based care system, so this idea of faster and more accurate outcomes is a hot topic that’s going to become more and more important in the coming years, especially for insurance-based therapists. Even out-of-network therapists whose clients submit superbills are still subject to reviews and audits from the insurance company. If you’re not demonstrating value-based care, this is going to become more and more of a problem in the future for all of us.
I think AI can really help with that, though. In addition to everything we’ve already talked about, AI also offers more accessibility for clients; it can help democratize access to mental health resources, especially for underserved populations. There’s a free version of Woebot out there that somebody can download on their smartphone and talk to when maybe they can’t afford a copay or the services of a private-pay therapist. These are other options for them. I know they’re not a replacement for therapists, but it’s happening every day, and in a way it’s positive for people who can’t get help, or don’t want to get help, or don’t feel comfortable going to a therapist but do feel comfortable talking to a chatbot. So AI can help by providing sophisticated and more affordable tools that support early diagnosis and proactive interventions for clients who might otherwise fall through the cracks. And if an AI bot tells someone, “it looks like you may have major depressive disorder; it could be a good idea to see a therapist or a psychiatrist to talk about medication options,” maybe it will push them toward treatment when before they thought they shouldn’t, or didn’t need to.

So those are my thoughts on how AI is coming into the diagnostic scene and helping us rather than hurting us, because I know a lot of us are worried: is AI just going to replace therapists altogether? Despite these really significant advancements, the integration of AI into mental health diagnostics does bring challenges and ethical considerations. Issues like privacy, consent, and the autonomy of AI decision-making are all paramount in this discussion as well, so I want to talk about that side of it too.


Keep in mind that we must be really cautious about the risk of over-reliance on technology in diagnosing complex human emotions and behaviors, which could potentially lead to oversimplification or misdiagnosis. If we get lazy with it and start relying on AI alone to do that part for us, we could miss things. It’s really tempting to put a case into a chatbot, have it say “looks like major depression,” agree, and go ahead and write a treatment plan around that, when that might not be it at all. So obviously we still need to use our clinical judgment, our decision making, and all those traditional tools we were using before AI came on the scene to make sure it’s an appropriate diagnosis.

We all know that in the realm of mental health, we often encounter complex cases where symptoms intertwine and the path to a clear diagnosis isn’t straightforward. That’s where artificial intelligence could play a transformative role in helping us make decisions. AI’s ability to sift through and analyze huge amounts of data could be super helpful: you could put in psychological test results, some progress notes, an update from a psychiatrist, pulling all of that together into a sort of case conceptualization, instead of us spending all that time and energy going through it ourselves and then trying to make diagnostic decisions from the information. It can speed up the process, but we still need our judgment in there. And AI can help us uncover underlying patterns that might escape even the most trained clinical eyes. I mean, we’re therapists, we’re not doctors.
Say there’s a clinical AI that has a psychiatric bent to it, or even a functional medicine perspective, areas we know nothing about. It might come in and say, “Have you had their magnesium levels checked to make sure that’s not why they’re not sleeping?” Things like that. It can supplement specialties we don’t have, or raise things we might not think about, which is really cool to me.

For example, consider a scenario where a client presents with symptoms that might suggest multiple overlapping psychological conditions. Traditionally, disentangling these to arrive at a really precise diagnosis could be challenging and time-consuming. You might have to refer out to a lot of different professionals and try to make a decision based on all the referrals, and we know it can take months to even get in to see a doctor, a functional medicine doctor, or, you know, an oncologist. There’s just so much that goes into it. So AI can take some of that information and make suggestions for you before you have to wait for all those ancillary referral pieces, which could be super helpful. And again, this isn’t about replacing our judgment but enhancing it, providing a range of possibilities we might have overlooked when we’re operating in a vacuum around our own training and what we typically diagnose people with if we’re specialists.

What’s really exciting for me is how AI can prompt us to consider new treatment interventions, goals, and objectives. With AI-generated insights, we can tailor our therapeutic approaches more precisely and adapt our strategies to better meet the specific needs of our clients.
For example, if AI suggests a potential comorbid condition that hasn’t already been considered by us or by somebody else on the team trying to figure out what’s going on, we can adjust our treatment plan to address the dual diagnosis, perhaps introducing new therapeutic techniques or modifying existing goals to better support the client’s needs. This level of customization in treatment planning, driven by AI, not only improves outcomes but also accelerates the therapeutic process. It allows us to act swiftly and confidently, and it arms us with data-backed suggestions that help ensure our interventions are as effective and relevant as possible. Because with AI, we can say, “Please find some research around this diagnosis and which interventions have been helpful,” or “Please search for evidence-based therapeutic interventions that we can put into this treatment plan,” to make sure we’re giving the client the best possible care.

So the next time you’re faced with a complex case that seems to defy straightforward analysis, consider how AI tools might help clear the fog a little bit, offering clearer paths forward that benefit you and your clients. You’re not struggling in session to know where to go with things, and your clients get a more direct path to healing and improving their overall mental and physical health. As we know, our clients are biopsychosocial, so there’s all kinds of things going on all the time.

With that being said, let’s talk about expanding our AI toolkit just a little bit. On this podcast, I always like to give practical solutions and little tips and tricks you can use to get started with this stuff, even if you’re not using sophisticated AI tools to analyze large client data sets.
So imagine you’re reviewing a transcript of a recent session with a client, and something they said really stands out to you, but you’re not quite sure what the emotional undercurrents might be. You can use ChatGPT to help decipher the emotional tones that might be embedded in the words somebody is using, and even to suggest a possible diagnosis from that, which I know sounds pretty wild, but you can.

So here’s a simplified, practical prompt that you could use. You can enter it directly into ChatGPT’s chat box, or into any of the free AI chatbots out there: “I’m a therapist. I had a session with my client who said this in session: ‘I’m fine, I guess I’m just tired.’ I want you to analyze the emotional tone of the statement and suggest possible underlying feelings or concerns that might be expressed indirectly.” By feeding the statement into ChatGPT, the tool can analyze the text and provide insights for you.

Here’s an example of what ChatGPT said about that prompt. It’s funny, I just love how these answers come up; I took a couple of excerpts to share with you. ChatGPT says: the phrase “I guess” suggests uncertainty or ambivalence, which, combined with “I’m just tired,” indicates a sense of resignation or weariness. The use of “just” minimizes the issue, which might imply that the speaker is downplaying their feelings or concerns, and “fine” is often used as a polite, noncommittal response, suggesting a reluctance to delve deeper into true feelings. Then it goes on to say: when we look at the possible underlying feelings or concerns, we can identify several key points.
The mention of being tired could be a surface-level expression of a deeper emotional exhaustion or burnout, possibly related to stress, anxiety, or depression. The client might be avoiding discussing their true feelings or problems, hiding feelings of sadness, frustration, or overwhelm. The nonspecific nature of “I’m fine” can sometimes indicate a sense of isolation, or the feeling that others do not understand or cannot help with their deeper issues. The uncertainty in “I guess” could reflect indecision or confusion about their emotions or their life circumstances, and the downplayed expression might suggest a need for support, or for a safe space to discuss more profound concerns.

Then it goes on to say: now let’s consider how we might approach this therapeutically. We can encourage the client to explore their feelings further by asking open-ended questions like “What do you think might be contributing to your tiredness?” or “Can you tell me more about what ‘fine’ means for you right now?” Reflecting the words back to them can validate their feelings and encourage deeper exploration, such as saying, “It sounds like you’re feeling quite tired; can you tell me more about that?” Asking about their daily activities, sleep patterns, and any recent changes or stressors can uncover potential contributors to their fatigue. And normalizing their feelings, by acknowledging that it’s common to feel tired and overwhelmed and that it’s okay to discuss these feelings in detail, can also be helpful.

So, what do you think about that response? It’s just unbelievable, in my opinion, and I’d love to hear what you think. Maybe give the prompt a try and let me know what you come up with. I don’t know about you, but I’ve been in sessions where a client has said something odd and I’m left wondering: what did they mean by that? How should I have responded? What does it say clinically? You can put little phrases in like that, without violating any PHI, and use it as a sort of clinical assistant to help you think about things in a different way and guide your sessions with your clients.

You can then follow up with a new prompt to get further toward a diagnosis, since we’re talking about diagnoses in this episode: “Now I’d like you to analyze this tone and give me a potential diagnosis that can help me develop interventions and treatment strategies for this client.” Here’s the response ChatGPT gave me: considering the statement “I’m fine, I guess I’m just tired” and the emotional undertones of resignation, minimization, and surface-level responses, we might suspect several potential underlying issues. The client’s expression of fatigue and ambivalence could be indicative of emotional exhaustion, possibly pointing to conditions such as depression or anxiety, and the minimization and avoidance of deeper issues might suggest that the client is struggling with feelings of overwhelm and stress.
Then it goes on to list potential diagnoses based on the emotional tone and underlying feelings. One potential diagnosis could be major depressive disorder. Symptoms of MDD often include persistent fatigue, feelings of sadness or emptiness, and a lack of interest in activities; the client’s statement might reflect the low energy and emotional numbness associated with depression. Another possible diagnosis could be generalized anxiety disorder, characterized by excessive worry, fatigue, and difficulty concentrating, which could also align with the client’s expression of tiredness and uncertainty. And then, I’m not going to read all of it to you, but it goes on to give me interventions and treatment strategies. For the potential diagnosis of major depressive disorder, interventions could include cognitive behavioral therapy to address negative thought patterns, and it gives me ideas around that.

So it’s very interesting. You could continue to chat with the chatbot about interventions, and then tell it to make a treatment plan based on the couple of interventions you like for that client. It really just blows my mind that AI can come up with this. And this is just an initial step toward integrating AI into your practice. It’s small, and it’s a way for you to begin experiencing the transformative power of AI firsthand in diagnosing and working with your clients around emotional material you might not be sure how to treat, what to do with, or how to diagnose. If you’re as intrigued by the potential of these technologies as I am, don’t hesitate to join our Clinical AI Club at www.clinicalaiclub.com, where we explore these topics in greater depth and discuss practical ways to weave AI into our daily practices.
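For listeners who want to script this workflow rather than paste prompts by hand, here is a minimal Python sketch of the two prompts from the episode. The function names and the naive identifier screen are illustrative assumptions, not part of anything discussed on the show, and the screen is absolutely not a substitute for a vetted de-identification process or HIPAA compliance review; it only catches careless pastes of obvious identifiers.

```python
# Sketch of the episode's two-step prompting workflow.
# Assumptions: build_tone_prompt / build_diagnosis_followup are hypothetical
# helper names; the PHI screen below is a crude illustration, NOT a
# compliance safeguard.

import re

# Crude patterns for obvious identifiers (phone numbers, dates, emails).
_PHI_PATTERNS = [
    r"\b\d{3}[-.]\d{3}[-.]\d{4}\b",   # phone-number-like digits
    r"\b\d{1,2}/\d{1,2}/\d{2,4}\b",   # date-like strings
    r"\b[\w.+-]+@[\w-]+\.[\w.]+\b",   # email addresses
]

def looks_deidentified(text: str) -> bool:
    """Return False if the text matches an obvious identifier pattern."""
    return not any(re.search(p, text) for p in _PHI_PATTERNS)

def build_tone_prompt(client_quote: str) -> str:
    """First prompt from the episode: analyze the emotional tone of a quote."""
    if not looks_deidentified(client_quote):
        raise ValueError("Quote appears to contain identifying information.")
    return (
        "I'm a therapist. I had a session with my client who said this in "
        f"session: \"{client_quote}\". Analyze the emotional tone of the "
        "statement and suggest possible underlying feelings or concerns "
        "that might be expressed indirectly."
    )

def build_diagnosis_followup() -> str:
    """Second prompt from the episode: ask for a potential diagnosis."""
    return (
        "Now I'd like you to analyze this tone and give me a potential "
        "diagnosis that can help me develop interventions and treatment "
        "strategies for this client."
    )

prompt = build_tone_prompt("I'm fine, I guess I'm just tired.")
```

The string `prompt` is what you would paste (or send via an AI chat interface of your choice) as the first message, with the follow-up sent in the same conversation so the model retains the original quote as context.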
So let’s embrace the future of therapy, leveraging cutting-edge technology to enhance our understanding and treatment of mental health using AI. I hope you enjoyed today’s episode, and I’d love to hear your thoughts; please reach out any time. If you’re really into AI, you’re using it in your private practice, and you’d like to be a guest, please feel free to reach out and share your experiences with me. Have a great day.