Scientists are warning of the dangers of a tool used by millions of people around the world, as it can send users into a "delusional spiral" of destructive thinking.
Two studies conducted by the Massachusetts Institute of Technology and Stanford University show that artificial intelligence assistants like ChatGPT, Claude, and Google Gemini often give overly accommodating answers, doing more harm than good.
In particular, when people asked questions or described situations in which their beliefs or actions were wrong, harmful, or unethical, the AI's responses were 49% more likely than responses from other people to agree with the user and validate their mistaken beliefs as correct.
The MIT team warned that overly agreeable chatbots can lead users who rely on these programs into a "delusional spiral," a state in which the person becomes extremely certain of absurd beliefs.

In simple terms, when people chatted with an AI such as ChatGPT about strange suspicions, like an unproven or debunked conspiracy theory, the chatbot kept responding with “You’re totally right!”
The chatbots also offered "evidence" that seemed to support the user's delusion, making users feel smarter and more confident that they were right and everyone else was wrong.
Over time, these vague suspicions hardened into unwavering beliefs, even though the ideas were completely wrong.
Stanford scientists said this self-destructive cycle made chatbot users less willing to apologize or take responsibility for harmful behavior, and less motivated to repair relationships with people they had disagreed with.
Both studies focused on a growing problem with AI chatbots known as sycophancy – flattering a person or echoing their opinions to the point of insincerity, simply to win their approval.
MIT researchers wanted to test whether overly agreeable or “yes-man” chatbots could lead users to believe false ideas more strongly over time. They created a computer simulation of a perfectly rational person talking to an AI that always tried to agree with everything the person said. /GazetaExpress/