A disturbing trend: young people are seeking help from artificial intelligence instead of psychiatrists.

Young people in mental health crisis are increasingly seeking help from artificial intelligence. Experts warn that such “therapy” carries the risk of providing harmful advice, neglecting professional treatment, and emotional dependence on AI.



As the British daily The Times recently reported, young Britons are seeking help from artificial intelligence instead of psychiatrists and psychotherapists. They use AI to diagnose their problems when they feel anxious, worried, overwhelmed by too many responsibilities, etc.

The phenomenon is taking on worrying proportions in the British Isles. In March this year alone, 16.7 million posts appeared on TikTok whose authors admitted to regularly using ChatGPT as a therapist, and in videos published on social media young people insist that AI is the only “person” they can talk to about their mental health problems.

As the authors of the publication emphasize, the popularity of artificial intelligence in the UK is undoubtedly related to the inefficiency of the local public health system. The average waiting time for a psychiatric consultation in the NHS (the equivalent of the Polish National Health Fund) already exceeds 18 months. The high costs of private therapy, which reach 400 pounds per month, are also significant.

The trend of replacing contact with a specialist with advice from ChatGPT is global, however, and is beginning to reach Poland as well.

As experts emphasize, artificial intelligence (in the form of therapeutic chatbots) can be helpful in psychiatry, providing support to doctors or therapists.

A worrying phenomenon, however, is young people’s use of popular language models such as ChatGPT while bypassing specialists. Such tools should be used under the supervision of an experienced therapist, but this is not always the case.

“We hear about this trend: young people treat ChatGPT as a substitute for another person, a friend, or even a therapist. They have no one in their environment to turn to with their problems, and adolescence is an age at which you don’t run to your parents. Young people are looking for solutions here and now, and ChatGPT provides them with very affirming answers. However, there are certain risks associated with this, because artificial intelligence will not confront a young person with their real health problem,” emphasized Aleksandra Olawska-Szymańska, MD, of the Świętokrzyskie Center of Psychiatry and provincial consultant for child and adolescent psychiatry, in an interview with PAP.

Research shows that young people often report a subjective improvement in well-being after contact with a chatbot, but this is frequently just a placebo effect.

Chatbots are designed to give users maximum comfort and emotional support, but paradoxically this may reduce their motivation to seek professional treatment.

“Besides, it is impossible to protect large language models against possible errors. There is a risk that chatbots may provide inappropriate or even harmful advice, especially in cases of more complex mental disorders,” noted Dr. Marcin Rządeczka, head of the Multimodality Research Laboratory at the Institute of Philosophy of Maria Curie-Skłodowska University in Lublin, in an interview with PAP.

He notes that users often ignore the warnings displayed during conversations with AI, which state that the chatbot is not a professional therapist, and instead treat its responses as an authoritative source of psychological advice.

Another potential threat is emotional dependence, fostered by AI’s constant availability and instant responses.

Excessive trust in a virtual therapist can lead a person struggling with emotional problems to rely on AI advice instead of seeking help from qualified specialists.

According to Ewa Górko, director of the Department of Mental Health at the Office of the Patient Rights Advocate, artificial intelligence can be used as an auxiliary tool in psychiatry, but it should not replace a doctor. In her opinion, a potential danger of the widespread use of “advice” offered by AI is the elimination of personal contact with a specialist, which is indispensable, especially in psychiatry.

“Every person in a mental health crisis, especially minors, should have the opportunity to have personal contact with a professional. This guarantees patients the right to health services. Making a diagnosis in psychiatry requires conversation, interaction, observation and interpretation of the patient's behavior. On this basis, a comprehensive assessment of the patient's health condition is made, and artificial intelligence does not guarantee this,” emphasized Director Ewa Górko.

In her opinion, AI tools can be used to educate and support both patients and their parents.

“However, when it comes to making a diagnosis or providing medical advice to people in a mental health crisis, it is a risky tool, and leaving such tasks solely to artificial intelligence is inconsistent with the principles of providing health services,” noted Ewa Górko.

In addition, seeking advice from chatbots has not yet been regulated by law. “Access to artificial intelligence advice is not limited by age. Meanwhile, the law clearly states that it is the legal guardians of persons under 16 who make decisions regarding treatment; the opinion of a minor is taken into account only once the minor has turned 16,” emphasized Director Ewa Górko.

Another problem is the absence of regulations on legal liability for any deterioration in a patient’s health caused by inappropriate or even harmful guidance received during interaction with a chatbot.

However, experts have no doubt that AI-based medical advice will become more common as patients’ needs grow.

Experts appeal to parents to be attentive to their children’s needs, so that the children do not have to seek psychological support in the virtual world.

“Let's take care of our relationships with our loved ones, because if they are not there, it will be difficult to notice the often alarming signals when it comes to the child's mental health and functioning,” emphasized Aleksandra Olawska-Szymańska, MD.

The number of patients who need specialist support is growing every year, due both to the deterioration of children’s and adolescents’ mental condition after the pandemic and to greater social awareness of the problem.

Last year, over 279,000 young people received support from the National Health Fund, which means an increase of over 130% compared to 2019. Before the pandemic, the fund spent PLN 260 million on psychiatric services for children and adolescents, and now it is over PLN 1 billion. (PAP)


