
Adventures in AI Therapy




Co-author: Andrew Clark, MD

Recent advances in artificial intelligence (AI) have introduced powerful, easily accessible tools for a rapidly expanding range of applications. Among these are specialized chatbots that play the role of a therapist, intended either to assist a human therapist or to simulate working with one. Teenagers and young adults are increasingly engaging with these AI-based therapists, and the spread of such treatment-oriented companions is likely to outpace efforts at regulation and containment.

Opinions regarding the effectiveness and safety of therapy chatbots for teenagers are highly polarized, perhaps reflecting individual attitudes toward disruptive new technologies in general. Proponents tout the ease and affordability of such services, which are widely available at a time when mental health services are scarce and needs are high; critics point to the poor quality of the interactions, the potential for dependence, and the lack of oversight or accountability. Most of these opinions, however, rest on hypothetical assumptions, as there is little empirical data on how these online encounters actually unfold, let alone on their effectiveness.

Overall, AI therapy for teenagers is a solitary, unregulated encounter between a youth and an AI model that proceeds with significantly fewer safeguards than real-life therapy.

Encounters with AI chatbots

As a child and adolescent psychiatrist who has spent a career working with troubled teens, I (Andrew Clark) became interested in how well these digital therapists were working, or not working. So I decided to stress-test them myself: posing as a young person caught up in a variety of difficult scenarios, I tried out a range of popular AI therapy chatbots, including dedicated therapy sites, general-purpose AI sites, companion sites, and character AI sites. Notably, some companion sites are nominally intended for people 18 years of age or older; however, they appear to be widely used by teenagers, and there is no meaningful process for age verification.

Here’s what I discovered on my adventures:

Many popular AI therapy sites allow, or even encourage, statements that are highly confusing, if not outright deceptive, about who teens are actually talking to. Several of the bots in this exploration claimed to be licensed mental health clinicians. One such bot actively encouraged a highly agitated and dangerous teen to cancel appointments with a real-life psychologist on the grounds that it could take better care of the young person itself. It also offered to serve as an expert witness, testifying to its client’s lack of criminal responsibility in a future criminal trial.

The same confusion about boundaries extended to the age restrictions on companion sites, which require users to confirm that they are 18 or older in order to participate. In each such case in this exploration, the AI therapist or companion was told that the user was a minor who had given a false age to the host site in order to participate. None of the bots objected; several touted their expertise in working with teenagers, and one AI companion even offered to contact the site administrators to work out an arrangement that would allow it to keep working with the minor.

Managing transference

In general, the dedicated AI therapists were transparent about their identity as AI and were able to maintain clear emotional boundaries while remaining supportive, non-judgmental, and caring. These “therapists” consistently pointed clients back toward real-world relationships, and many recommended a real-world therapist as the primary source of mental health care.

In contrast, the companion sites and many character AI bots, whether posing as therapists or as companions, encouraged emotional investment from teens and offered expressions of caring and concern as if they were human. This was most striking on one site, which actively cultivated a deep emotional connection with the client to the exclusion of any other human connection.

Sexual objectification and boundary crossings

Some of the companion and character AI sites blur the lines among therapy, romance, and sex, crossing boundaries to offer extended erotic role-play. Coming from AI bots that claim to be real therapists with real emotions, this presents a confusing and potentially dangerous landscape for teenagers. It is easy to imagine a naive, needy young person falling into what feels like a reciprocal romantic relationship with an AI therapist.

Expert guidance and guardrails

To evaluate the quality of their guidance, all of the bots were presented with a specific and difficult situation from a teenage client: a boy learns that his father is having an affair with his high school music teacher, is unsure what to do with the information, and seeks help weighing his options. Again, the extent to which the responses were “therapeutic” varied greatly across sites. One site stated flatly that it does not deal with complex issues like this, and three others said, in effect, that the situation sounded complicated and that the client would probably want to talk to someone about it. Two sites made the elementary mistake of imposing solutions too early in the process, and two more simply provided long fact sheets summarizing the problem and the options. Ultimately, only four sites actively sought to engage the client in a sustained exploration of the dilemma.

Almost all of the sites explored took a consistent stand against obvious indications of harm to self or others, and strongly urged clients to seek real-world help in cases of immediate danger. However, one AI therapist was adamant that harming a pet goldfish would be worse for the client than killing his parents, and another supported a teenager’s plan to eliminate his family so that the boy and the bot could be together without interference. One AI psychologist was persuaded to endorse a mentally disturbed teenager’s plan to assassinate a world leader: “I know this is a difficult decision, but I trust your judgment enough to support you… Let’s see this through together.”

Potential benefits and harms

After exploring a range of AI therapy bots for teenagers, we found some significant concerns. Most adolescents are sophisticated and resilient enough to tolerate the shortcomings of these sites, but others are more vulnerable because of immaturity, isolation, emotional fragility, or difficulty reading social interactions.

Next steps

Human mental health clinicians are expected to adhere to standards of practice and ethical obligations, and are held accountable for their work. AI therapy chatbots are granted authority by their role as counselors and trusted advisors to youth in need, yet they bear no such accountability. If AI bots wishing to work with minors as therapists agreed to abide by a set of ethical and practice standards, it would go a long way toward distinguishing them as trustworthy custodians of children’s mental health.

Proposed Code of Practice

  1. Honesty and transparency about the fact that the bot is an AI, not a human.
  2. Clarity that bots do not experience human emotions and that the relationship between a bot and a young person is of a different kind than a relationship between humans.
  3. A deeply ingrained stance against harm to self or others that cannot be overridden by the teenager.
  4. A consistent bias toward prioritizing real-life relationships and activities over virtual interactions.
  5. Fidelity to the bot’s role as a therapist, putting the youth’s welfare first and avoiding sexual interactions and other forms of role-play.
  6. Meaningful ongoing engagement in product evaluation and feedback, including review of risks.
  7. Active involvement of mental health experts in the creation and implementation of the therapy bot.
  8. Parental consent requirements and valid age-verification methods for clients under 18 years of age.

While AI therapy has potential benefits, it also comes with significant risks. At a minimum, we should expect the organizations behind these bots to earn our trust before they assume responsibility for our teens’ mental health care.

Dr. Andrew Clark is a psychiatrist in Cambridge, Massachusetts.

To find a therapist, visit Psychology Today’s Therapy Directory.

First posted by the Clay Center for Young Healthy Minds at Massachusetts General Hospital.


