Expert Q&A

AI-Assisted Therapy

January 1, 2026
Omer Liran, MD, MSHS
From The Carlat Hospital Psychiatry Report

Omer Liran, MD, MSHS

Co-Director, Cedars-Sinai Virtual Medicine; Assistant Professor, Department of Psychiatry & Behavioral Neurosciences; Co-Founder and CTO, Xaia, Los Angeles, CA. 

Dr. Liran is co-founder of Xaia, a company that develops AI-assisted therapy and clinical documentation tools. This content has been peer-reviewed to ensure it remains balanced and educational. Relevant financial relationships have been mitigated. 


CHPR: Dr. Liran, please begin by telling us a little about yourself. 
Dr. Liran: I’m a psychiatrist at Cedars-Sinai Medical Center in Los Angeles, and I co-direct the medical center’s Virtual Medicine lab. Our lab uses artificial intelligence (AI) and virtual reality (VR) to improve care for patients and lighten the bureaucratic load on physicians.

CHPR: What drew you to bringing these kinds of technologies to psychiatry? 
Dr. Liran: I’ve been fascinated by AI for a long time, and especially with how VR, augmented reality (AR), and AI can be applied in health care. Psychiatry faces a worsening shortage, and we’re not training enough new psychiatrists to meet demand. I believe technology has to be part of the solution.

Glossary of Terms

Artificial intelligence (AI): Computers that mimic aspects of human reasoning, like understanding language or spotting patterns.

Virtual reality (VR): A headset-based experience that immerses patients in a fully digital environment, such as a beach or forest.

Augmented reality (AR): A mix of digital and real life (eg, overlaying virtual images or text into real surroundings via a phone or glasses).

Scribe: An AI assistant that listens during the visit and helps with documentation, flagging safety issues and suggesting next steps.

Copilot: An AI tool that runs in the background during a clinical encounter to assist the clinician (eg, drafting notes, pulling up relevant information, or flagging safety concerns).

Chat/AI chat: A structured conversation where the AI asks patients questions and records their responses, following a clinician-designed workflow.


CHPR: How can AI and VR be a solution to that problem? 
Dr. Liran: We see them as ways to extend a psychiatrist’s reach. The field is moving toward platforms that assist with every stage of the patient encounter. For example, they can help with intakes by synthesizing information from the chart and from structured AI chat interactions that gather basic history before the visit. Some next-generation systems also incorporate an AI scribe or copilot during the encounter to help document the conversation, highlight potential safety concerns, and instantly pull up information you might need, such as medication side effects. They’re also evolving toward assisting with after-visit documentation, such as drafting clinical notes that include a differential, a proposed treatment plan, and even coding recommendations. Taken together, these tools give psychiatrists more time to focus on their patients and boost both patient and provider satisfaction.

CHPR: For clinicians who are interested in integrating AI into their work, what’s the best place to start?
Dr. Liran: The easiest entry point is to use AI scribes or speech-to-text tools that many EHRs already support. These can cut down on documentation almost immediately. Beyond that, the APA Learning Center and the Digital Medicine Society have courses on topics like generative AI in health care (Editor’s note: See the “Where to Start With AI” table below).

CHPR: Does the technology also have a direct therapeutic role for patients, beyond supporting clinicians?
Dr. Liran: Absolutely. Depending on the platform, AI tools can be used to teach skills such as relaxation training or breathing interventions. A growing area of development is longitudinal care, like programs in the style of cognitive behavioral therapy in which patients can practice skills between sessions, complete homework, then review their progress with the AI at the next session. The idea is to extend the reach of psychotherapy by supporting the skills-based components with AI while the clinician handles the relational and diagnostic parts of care.


CHPR: Are there ethical or legal issues we should be aware of?
Dr. Liran: Safety is the biggest concern. If the AI mishandles a suicidal patient, the results could be tragic. Good systems have multiple built-in safeguards, but the risk isn’t zero. And there’s the worry about people using general chatbots to replace therapists. Unsupervised models can just tell people what they want to hear or amplify delusions, and that can be very dangerous. If a manic patient believes they’re the emperor of the world, a chatbot might just agree with that. There have been recent reports of chatbots reinforcing psychosis (Fieldhouse R, Nature 2025;646:18–19). 

CHPR: How can clinicians tell whether an AI therapy tool is safe and reputable?
Dr. Liran: Look for four things: (1) Clinical oversight: Does it connect to a provider? (2) Evidence: Has it been studied? (3) Safety: Does it have built-in safeguards for crises? (4) Credibility: Is it affiliated with a trusted health system or university? Those are good signs you’re dealing with a responsible product.

CHPR: How do different systems out there compare to one another?
Dr. Liran: There is now a broad spectrum of digital mental health tools, ranging from unsupervised wellness chatbots like Woebot, Wysa, or Replika, to app-based structured therapy programs such as Headspace, SilverCloud, or Meru Health, and finally to clinician-integrated platforms that are designed to work alongside psychiatric care. The key distinctions are safety guardrails, evidence base, and how care is escalated when symptoms worsen.

CHPR: Speaking about how care is escalated, what happens if a patient is in crisis—for example, if they are suicidal?
Dr. Liran: Some platforms already attempt to detect when a patient may be at risk and notify the clinician, and this is likely to become more sophisticated over time. But even when the technology flags a safety concern, escalation still falls on a human clinician. The AI isn’t calling 911. That’s also the point where human connection really matters. I don’t believe AI, even when it’s super-intelligent, will ever make psychiatrists obsolete, because there’s something special about human connection, about people talking to each other. I’d be worried about a future where AI that only mimics empathy is left to care for patients on its own.

Where to Start With AI

Examples of AI Documentation/AI Speech Tools Already in Use

  • EHR-embedded AI scribes (eg, Epic)
  • Dictation platforms (eg, Nuance Dragon Medical One)
  • Ambient scribes that transcribe visits in real time (eg, Nuance DAX, Sunoh.ai, DeepScribe)
Educational Resources
  • APA Learning Center CME: Courses such as AI Explained: Practical Applications for the Modern Psychiatrist (education.psychiatry.org)
  • Digital Medicine Society: Short course on generative AI in health care (dimesociety.org)
  • “How Artificial Intelligence Helps Doctors Focus on Their Patients”: A free video overview on how AI documentation tools work in clinical settings and reduce charting burden (www.tinyurl.com/56yybyau)
  • Artificial Intelligence and Machine Learning for Primary Care (AiM-PC): Free curriculum relevant to psychiatrists on the fundamentals of AI and machine learning, ethical and social implications, how to critically evaluate AI tools, and practical guidance on integrating AI into clinical encounters (www.tinyurl.com/yvwmykxz)
  • Artificial Intelligence in Health Care: Free audio and video series from the American Medical Association (www.tinyurl.com/4p5zv892)


CHPR: How have patients felt about sharing their personal details with an AI? 
Dr. Liran: Survey data so far suggest that many people feel comfortable disclosing sensitive information to AI, reporting for example that they find it to be nonjudgmental and patient (Spiegel BMR et al, NPJ Digit Med 2024;7(1):22). Some patients do report that the tone can feel robotic or emotionally flat, which is a reminder that this isn’t a substitute for human connection. But for many, especially early in treatment, that sense of psychological safety can lower the barrier to opening up.

CHPR: What are the interfaces usually like? 
Dr. Liran: Many systems run on VR and AR headsets like the Quest and the Apple Vision Pro, but those are quite expensive. Mobile versions are generally much more accessible. On a phone, patients can talk with the app by voice, just like a conversation, or switch to text mode (which younger patients seem to prefer these days), in which case it looks like any other chat app.

CHPR: It's too bad the headsets are so expensive. They provide such an immersive experience. 
Dr. Liran: It really is the future, but we’re not quite there yet. When headsets become lighter, more comfortable, and more like glasses, people will use them more. Right now, I can’t wear a headset for more than 30 minutes before it feels too heavy. And in hospital settings, especially psychiatric units, there are added concerns. You don’t want to hand patients a device with cords and straps that could pose risks. So, while the technology is promising, it still has practical limitations.

CHPR: Are there certain patients whom you think AI tools are better suited for than others? 
Dr. Liran: AI tends to be most helpful for patients who are stable enough to engage with structured therapeutic content. That includes many patients with anxiety disorders, mild to moderate depression, insomnia, chronic pain, or stress-related conditions. We need to be more cautious with patients who are highly dysregulated, actively psychotic, manic, or in acute crisis, where misinterpretation of language or delayed escalation could cause harm (Grabb D et al, arXiv preprint arXiv:2406.11852). There are also practical considerations. For example, VR headsets are not a good fit for patients who are severely agitated or behaviorally disorganized, because the hardware itself can become a safety risk. However, a non-immersive tool, such as a scribe assisting the clinician or a simple breathing or meditation module on a mobile device, may still be appropriate in those cases—as long as the clinician remains in charge of the overall course of care.

CHPR: Is VR used for trauma, like PTSD exposure therapy?  
Dr. Liran: Yes. VR exposure therapy is well studied and used by the VA, although it hasn’t received FDA clearance for a psychiatric indication (www.tinyurl.com/yhvy7tx5). One of the ongoing questions is how AI might eventually assist with therapist-guided trauma work in a safe, regulated way.

CHPR: Where do you see this technology in five years? 
Dr. Liran: The technology is accelerating extremely fast. If we had this conversation six months ago, it would be different from today. These tools will become far more capable, but also riskier. AI models have complex internal decision-making processes that we don’t fully understand, with unpredictable outputs. We need strong safeguards and clinician oversight to steer them toward good.

CHPR: Thank you for your time, Dr. Liran.

Hospital Psychiatry
KEYWORDS AI therapy clinician oversight digital psychiatry mental health technology virtual reality therapy