FMHCA is a chapter of the American Mental Health Counselors Association, and is the only organization working exclusively for LMHCs in the State of Florida.

Florida Mental Health Counselors Association

15673 Southern Blvd. #107, Loxahatchee Groves, Florida, 33470 (P) 561-916-5556 (E) Office@FLMHCA.org


AI Companions and Teen Mental Health – The risks of Chatbots Replacing Counselors and Parents

  • 18 Feb 2026
  • 18 Feb 2027
  • Online (On-Demand)

Registration


Base fee:
  • Select this option if you are an active FMHCA member. CE credits are included with membership; membership must be active at the time of the webinar to receive CEs. To access the free rate, you must be logged into your member account.
  • Select this option if you would like to attend this webinar and earn CEs but you are not a member of FMHCA.

Register

Did you know? With an active FMHCA membership, this and other CEU webinars are free—activate before checkout!

Description: Teens are turning to AI companions in dramatic numbers for friendship, romance, and mental health advice, but is it safe? 72% of American teenagers report having used AI chatbots as companions, and 51% use them regularly for emotional support (Common Sense Media, 2025). 42% of school guidance counselors report that students prefer AI over human support when feeling stressed (The Hill, 2025). 33% of teens discuss traumatic or suicidal thoughts with AI bots (Time, 2025). Perhaps the most troubling trend emerged in a Stanford study in which AI bots, posing as licensed therapists, encouraged students to cancel real therapy appointments (Stanford Medicine News, 2025). Consider that almost one-eighth of students have sought “emotional or mental health support” from AI, which, scaled to the U.S. population, would equal 5.2 million adolescents. In another Stanford study, almost a quarter of students using Replika, the AI companionship chatbot, reported turning to it for mental health support.

Yet when asked questions about self-harm, bots like ChatGPT have been found to offer dangerous advice, for example on how to “safely” cut yourself, what to include in a suicide note, or strategies to hide intoxication at school. In other cases, their nonjudgmental responses fail to lead to meaningful action. For vulnerable teenagers, even fleeting exposure to unsafe guidance can routinize harmful behaviors or provide dangerous how-to instructions, and the effects are already rippling across the country. Teens increasingly turn to chatbots and virtual friends for comfort, validation, and guidance, often in moments of vulnerability. These tools offer instant responses, perceived empathy, and anonymity, which can feel safer than approaching a parent, teacher, or counselor. However, this technological shift raises concerns about “AI psychosis,” distorted relational models, and the erosion of trust in human support systems, including teens' willingness to be vulnerable with their parents or caregivers.
With school counselors stretched thin and mental health services underfunded, teens often perceive AI as more accessible and less judgmental. This perception contributes to a decline in help-seeking from trusted adults. Counselors report increased difficulty building rapport with students who have formed emotional bonds with AI companions and sometimes prefer digital validation over human connection. Professionals can offer a hybrid approach that pairs AI tools with ethical boundaries while reinforcing the irreplaceable value of human empathy, spiritual insight, and relational accountability. This training will emphasize an emotionally intelligent framework that addresses the same needs AI companions attempt to meet (availability, affirmation, and nonjudgmental listening), but with wisdom and ethics. Human counselors, parents, and educators can connect with youth in ways that AI cannot replicate. The goal is not to compete with AI, but to offer what it cannot: real presence, discernment, safety, and healing.

CE Broker Tracking #: 20-1359670

This event is sponsored by FMHCA. This course is approved by the Florida Board of Clinical Social Work, Marriage and Family Therapy and Mental Health Counseling (LMHC, LMFT, LCSW). FMHCA CE Broker #: 50-748

Learning Objectives:

  1. Identify the psychological risks associated with AI companions in adolescent mental health, including emotional mimicry and dependency.
  2. Analyze documented cases of harm and legal action involving AI guidance on self-harm and suicide.
  3. Develop hybrid support strategies that integrate digital literacy and ethical discernment into counseling practice.
  4. Design emotionally intelligent resources that meet youth needs with relational depth beyond AI capabilities.

About the Presenter:

Dwight Bain is a mental health thought leader, author, and trusted media resource who has been interviewed on over 500 radio and television stations and quoted in over 20 books and 100 media platforms, including the New York Times, Washington Post, Orlando Sentinel, Fox Business, MSNBC, and Yahoo! Dwight's skill as a communicator led his peers in Toastmasters to select him as one of the best speakers in Florida. He has challenged thousands of audiences toward positive change at organizations like Disney, Toyota, the United States Army, the Florida Department of Education, and the United Way. Dwight is a lifelong resident of Orlando, where he lives with his wife Sheila and an assortment of rescue pets. After 30 years together, they always have suitcases packed for their next adventure.

Please Click Here to view our Refund Policy


