
AI Chatbots for Treatment | Psychology Today




In 2025, three US states (Utah, Nevada, and Illinois) took important steps to limit the role of artificial intelligence in mental health care. These states enacted laws prohibiting AI chatbots from offering therapy, diagnosis, or treatment decisions, while still permitting their use for administrative support such as scheduling and documentation. Utah's restrictions came into effect first, on May 7, followed by Nevada on July 1 and Illinois on August 4. Enforcement varies by state. In Illinois, for example, the Department of Financial and Professional Regulation may impose fines of up to $10,000 per violation in response to consumer complaints.

Lawmakers in all three states justified the restrictions by pointing to patient safety concerns and the need to clarify where AI fits in health care. While Illinois law explicitly delineates the boundary between administrative support and clinical practice, Nevada law prohibits AI systems from providing professional mental or behavioral health care. The Utah framework focuses on regulating "mental health chatbots" used by state residents, requiring disclosure, privacy protections, and clear restrictions to prevent AI from impersonating licensed professionals. These measures reflect growing anxiety about what happens when untested tools interact with vulnerable users, especially in moments of crisis.

Florida International University (FIU) professors Dr. Jessica Kizorek and Otis Kopp focus their academic research on the intersection of artificial intelligence and the emotional anxieties of college students and young professionals. Their research at FIU reveals consistent themes: students worry that AI will replace human roles, yet use it every day for learning, creativity, and stress relief. As one FIU undergraduate explained, "One of the things that stresses me about AI is that it makes the job market much more difficult. Jobs that historically suited university graduates are now being scaled back because AI is smarter and more efficient than people."

Others expressed concern about cognitive overdependence. "AI makes people stupid, which makes me feel uneasy," another student said. "I realized I started using AI for everything, and it stopped my brain from thinking. Having AI do it all doesn't develop critical thinking and problem-solving skills." A third articulated a broader uncertainty: "One way AI worries me is the confusing role that people play in the equation. What should I use it for? How do I use my brain in the process? How do I optimize the interaction between humans and computers, using AI the right way rather than relying on it as a crutch?" This tension reflects the reality of mental health care. AI cannot replicate the accountability, empathy, and subtle judgment of trained clinicians. However, when used appropriately in activities such as journaling, it can play a constructive role in well-being, mindfulness, or creative exploration.

For colleges and universities, the prohibitions raise both academic and practical questions. Students majoring in psychology, sociology, and related mental health fields may find that training opportunities vary depending on the state where they study. In Illinois and Nevada, campus counseling centers and affiliated clinics may no longer integrate AI chatbots into treatment sessions. Faculty and students conducting research may still study AI tools, but clinical use is off limits, and institutional review boards (IRBs) will apply more stringent restrictions. In Utah, the rules allow more regulated experimentation with AI, though still not in a therapeutic role. Students interested in exploring digital therapy may therefore have more opportunities to study disclosure, compliance, and ethical frameworks in Utah than in Illinois or Nevada.

Counseling centers themselves are also affected. Human-delivered services remain the national standard, but AI tools still appear behind the scenes, automating intake forms, scheduling, or triage documentation. On Illinois and Nevada campuses, students seeking chatbot-based emotional support will not find such services officially approved or integrated by counseling centers. In Utah, approved chatbot vendors may still be accessible, but they must follow disclosure and safety requirements. For faculty overseeing clinical training, these laws shift the focus toward policy, ethics, and classroom supervision rather than experimentation with AI as a therapeutic tool.

The prohibitions also raise important questions about liability and professional responsibility. Penalties generally target providers or businesses that market chatbots as treatment, but questions remain about what happens when patients use AI tools on their own. Legal scholars suggest that liability usually falls on the tool's developer unless a licensed professional actively recommended it, but this area of law is still evolving. Faculty preparing the next generation of clinicians should reinforce the importance of professional accountability and prepare students to navigate these gray areas, even when patients turn to outside tools.

The regulatory landscape continues to shift nationwide. Other states, including New Jersey, Massachusetts, and California, are discussing measures of their own. Many aim to prevent chatbots from posing as therapists, increase transparency, and ensure clinician oversight. At the federal level, agencies such as the FDA and the Department of Health and Human Services may ultimately regulate AI mental health tools, although Congress has not yet acted. One proposed bill would even block states from passing new AI regulations for a decade, creating potential tension between state and federal approaches.

For psychology and mental health students, and for students weighing where to attend university, these differences may factor into their decisions. Choosing Illinois or Nevada means limited exposure to chatbot therapy in academic or clinical settings, while Utah allows slightly more flexibility under regulation. In states without a ban, students may encounter a wider range of pilot programs and research opportunities, though oversight is tightening everywhere.


Meanwhile, students' use of AI outside official channels continues to grow. National surveys show that most teens and young adults have experimented with AI companions for journaling and stress relief. This trend underscores a reality: while legislation limits clinical use, informal reliance on chatbots remains widespread. On campus, this creates a dual challenge: complying with state laws while recognizing that students often rely on AI tools in their personal lives.

In practice, these new laws are not about rejecting technology but about clarifying where it belongs. For now, AI may support administrative tasks, personal reflection, and research into compliance and ethics, but the role of therapist remains reserved for licensed professionals. For students preparing to enter psychology, sociology, or counseling, the state where they choose to study will shape not only their access to specific technologies but also how they learn to navigate the boundary between innovation, regulation, and human care.



