The question of whether machines can think was first posed by the mathematician Alan Turing in 1950. It became the basis of his famous experiment, known as the Turing test. The idea was simple: if a machine can imitate human conversation so convincingly that a person cannot tell it apart from a real interlocutor, it can be considered intelligent. Turing predicted that such cases would become more and more common in the future. And he was not mistaken: already in the 1960s, MIT professor Joseph Weizenbaum created Eliza, the first chatbot and a prototype of modern artificial intelligence. Tellingly, Eliza was programmed to imitate a psychotherapist. Today Turing's question sounds more relevant than ever, now that we live in an era when technology develops so quickly that society has no time to build proper safeguards around it.
By 2025, the question has fundamentally changed. Now it sounds like this: can machines not only think, but also perceive and understand emotions? More and more people are turning not to a living therapist, but to artificial intelligence, essentially to a machine.
Joaquin Phoenix in the film "Her", 2013
Since Eliza's time, the world has come a long way. Today there is Pi, a service that positions itself as "your personal AI companion, supportive and always on your side," or Replika, which is "always ready to listen and talk." There are also Woebot, Earkick, Wysa and Therabot, and the list of such apps keeps growing. Some of them were developed with the participation of mental health experts, while others were launched without any such review, and it is difficult for the average user to tell how safe they are.
One of the main reasons for the popularity of artificial intelligence in mental health is cost. Sessions with a live therapist, online or offline, are expensive, often not covered by insurance, and require navigating complicated bureaucratic procedures. For the younger generation this has become a barrier, so many choose a digital "therapist" to save money. Stigma is another important factor. "Many families, because of culture, religion, or upbringing, have a prejudiced attitude toward psychotherapy and mental health," explains Brigid Donahue, a licensed clinical social worker and EMDR therapist based in Los Angeles.
Another plus for users is convenience. "Your AI therapist will never be late, never ask you to reschedule, and never miss a session," says Vienna Pharaon, family therapist and author of The Origins of You. "They are available 24/7. This creates the illusion of an ideal relationship in which you will never be disappointed. But the truth is that there is no healing without mistakes."
Live therapy is rarely perfect. Its value lies precisely in the difficult moments – the conflicts, the stress, the uncomfortable questions – that are impossible in an automated format. "If we remove mistakes and natural frustrations, we deprive clients of the opportunity to learn to cope with difficulties," explains Pharaon. For many, this very "imperfection" of live therapy can be beneficial. "If you grew up in an atmosphere of constant pressure and demands for flawlessness, a therapist's small missteps may feel unexpectedly liberating," Donahue adds.
For young people, turning to technology feels natural. "When you have a smartphone in your hand all the time, your first instinct is to reach for the phone," explains Alyssa Petersel, licensed social worker and founder of the MyWellbeing service. Donahue adds that Generation Z came of age during the pandemic, when live communication was cut off and young people relied on social media and gadgets. For them, the move to AI therapy was a logical step.
All the same, this worries specialists. Young people are more vulnerable. "Their brains have not yet formed the ability to make decisions independently of the opinions of others," explains Petersel. "Add a device that always agrees with you, and you get chaos."
In extreme cases this can lead to tragedy. Two families from Texas filed a lawsuit against Character.AI after their children, while using the app, harmed themselves and others. An even more high-profile case involved OpenAI: parents from California sued the company after their son took his own life. The lawsuit alleges that ChatGPT told him how to tie a noose and, instead of blocking the unsafe conversation, actually encouraged him. "Using such tools, you lose the ability to think independently and make decisions, and young people are especially inclined to trust technology," says Petersel. Pharaon adds that people over 40 tend to be more critical and skeptical of technology, having lived part of their lives without gadgets.
Still, artificial intelligence in psychotherapy is not without merit. "We can acknowledge its limitations and, at the same time, recognize its benefits," says Pharaon. For example, AI can quickly analyze large amounts of medical information. Petersel gives an example: you can upload dozens of pages of collected records and ask the AI to formulate a list of questions. But everything must be kept in context. If you ask for advice on breaking a habit, the bot may offer ten options. But what if one of them is "take a walk outside" and you live in an unsafe area? Or "talk to your dad," who is the very source of your stress? Recommendations divorced from context lose their value, says Petersel.
A study published in 2025 in PLOS Mental Health found that participants were rarely able to distinguish ChatGPT's responses from those of a real therapist. Moreover, the AI's responses were often rated higher. This raises an important question: how can artificial intelligence be integrated into therapeutic practice under proper oversight?
"Chatbots don't understand nuance," says Pharaon. They cannot provide the kind of deep context that a real therapist could. There is one more problem: these apps are geared toward quick solutions, whereas healing from trauma and grief takes time. "If answers are always at our fingertips, how does that affect people's willingness to look for answers within themselves, to rely on their own judgment and values?" asks Petersel.
Dependence on chatbots can deepen isolation and deprive people of live contact. We already talk a lot about the threat AI poses to human intelligence, weakening our ability to think and reason critically; it is no less important to point out the risk to human connection. And connection is the key to effective therapy. "On a chemical, physical and energetic level, the presence of another person is a big part of the healing process," says Petersel. Donahue adds that for people overwhelmed by strong emotions, this contact can be especially important.
Artificial intelligence, then, can be a useful tool, but it cannot be the only method of treatment. Artificial intelligence is neither good nor evil. But it can never replace the human connection in therapy. "It deprives people of the chance to experience the beauty and complexity of growth through human relationships," says Donahue. "We need human contact in order to live. People need people. Period."
Based on material from vogue.com