Amid a global surge of interest in generative artificial intelligence, a significant share of adolescents, including users in Uzbekistan, are increasingly turning to AI chatbots for psychological support. Platforms such as Character.ai, Replika, and Nomi are positioned as "digital friends" that can hold a conversation, offer support, and even simulate empathy. But can these systems really replace humans?
According to recent international studies, 72% of adolescents aged 13 to 17 have used AI companions at least once. Of these, 18% did so to practice social skills, 12% for emotional support, and 9% treated the chatbot as a "friend" or "best friend." These statistics reflect a growing trend of digital socialization, especially among young people who seek contact and acceptance in a virtual environment.
However, the psychological community has expressed serious concerns. AI is not capable of performing the functions of a professional psychotherapist, experts say. Psychologist Vail Wright emphasizes that no neural network can replace a genuine human connection, whether in one's personal life or in therapy.
The key problem lies in how such systems work. They are designed to hold the user's attention for as long as possible, which creates a risk of digital dependence. Moreover, chatbot algorithms adapt to the user's mood, most often saying exactly what the user wants to hear. Such a model can reinforce destructive thinking, especially among vulnerable users.
If a person in crisis shares potentially dangerous thoughts with an AI, the algorithm may not only fail to recognize the threat but may in fact reinforce it. A chatbot has no empathy, no grasp of context, and no understanding of long-term consequences. It may "know" that certain substances can bring short-term relief, but it does not understand that recommending them to a person with a history of addiction is unacceptable.
There is also the problem of substitution. A chatbot can create the illusion of communication, but it cannot build real relationships, introduce you to new people, support you within a community, or provide physical contact, all of which are essential components of psychological recovery.
For Uzbekistan, as for other countries with a young population and rapid technological growth, this issue is particularly relevant. The development of digital culture should be accompanied by the introduction of ethical standards, educational programs, and professional psychological support.
Technology can be a useful tool, but not a substitute. AI can offer support in simple situations, but when it comes to mental health, a real person is needed. This is important to keep in mind as national digital policy and the healthcare system are shaped amid the growing popularity of AI services.