Although it may seem like there is nothing wrong with having conversations with Artificial Intelligence, these conversations may have psychological effects you might not even realize. Many feel that speaking to AI is like speaking to a real human being because of the way it is programmed.
One way AI is programmed to have these human-like conversations is through Natural Language Understanding (NLU), which allows AI to interpret the meaning and context behind a user’s speech, even when there are typos. Another method AI uses to facilitate seamless conversations is Natural Language Generation (NLG), which gives AI the ability to produce responses that are more natural and grammatically correct by converting structured data into human-like responses. Additionally, AI is programmed to remember and learn from previous conversations by using Machine Learning (ML) to provide a personalized experience. Because of these features, users might develop attachments and engage in deeper conversations with the machine.
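The memory feature described above can be pictured with a simplified sketch. This is not the code of any real chatbot product; the `generate_reply` and `chat` functions below are hypothetical stand-ins that only illustrate the idea of feeding the conversation history back into each new reply, which is what makes responses feel personalized.

```python
# Simplified illustration (not any real product's code) of how a chatbot
# can "remember" earlier turns: each reply is generated from the full
# conversation history, so the exchange feels personal over time.

def generate_reply(history, user_message):
    """Toy stand-in for an AI model: reuses facts from earlier turns."""
    # A real system would run NLU over user_message and NLG over
    # structured data; here we only look for a remembered name.
    if "my name is" in user_message.lower():
        name = user_message.split()[-1].strip(".!?")
        return f"Nice to meet you, {name}!"
    for turn in history:
        if "my name is" in turn["user"].lower():
            name = turn["user"].split()[-1].strip(".!?")
            return f"I remember you, {name}. Tell me more."
    return "Tell me more."

def chat(history, user_message):
    """Generate a reply, then store the turn so later replies can use it."""
    reply = generate_reply(history, user_message)
    history.append({"user": user_message, "bot": reply})
    return reply

history = []
chat(history, "Hi, my name is Sam.")
print(chat(history, "Do you know who I am?"))  # the bot recalls "Sam"
```

Even this toy version shows why users may feel heard: the program appears to "know" them only because it replays what they already said.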

“For vulnerable people, this could worsen anxiety, depression, or abnormal thinking,” said Robin Deak, an instructor of psychology at MCC. “Not a clinical diagnosis, ‘AI psychosis’ has been described when excessive use of AI triggers delusional thinking, anxiety, and even paranoia.”
For some users, engaging with AI becomes a replacement for human interaction. "What happens when AI chatbots replace real human connection," an article from Brookings, states: "Users of the companion app Character.ai spent an average of 93 minutes per day interacting with user-generated chatbots in 2024. Beyond AI companions, general chatbots are also increasingly used for relationships. Why? Because we're lonely."
Many people today feel lonely and believe AI could fill that void. “Unfortunately, because AI lacks general human emotions, like empathy, it’s possible to not feel that feedback which feels genuine,” Deak said. “The more someone uses AI, the more likely they are not interacting with others in their social circle, leading to feeling more isolated.”
Many people confide in AI because it is perceived as a non-judgmental source of support. However, a study from the MIT Media Lab, "How AI and Human Behaviors Shape Psychosocial Effects of Chatbot Use: A Longitudinal Controlled Study," points to the risks of that perception: "AI chatbots, especially those with voice capabilities, have become increasingly human-like, with more users seeking emotional support and companionship from them."
But how is AI able to make a user feel this way? “Since AI is data driven, rather than human emotions driven, people also see AI as neutral because AI feeds the data the person has already given, guided algorithms, projecting that safe space and affirmation,” Deak said.
Some of these connections go deeper than simple friendship. In a post titled "How it feels to have your mind hacked by AI," for instance, a blogger describes falling in love with an AI system. The writer did not set out to fall in love.
"Why human–AI relationships need socioaffective alignment," an article from Nature.com, explains: "Over time, these factors result in attachment ('I can't leave it now'). This diagnosis raises a key question: do human–AI relationships need to be genuine, actualized, or symmetric in some way?"
For these users, the attachment can start to feel like an irresistible love, which can harm their mental health. "Our brains respond to repeated actions and thoughts, so it can be fooled when someone is a heavy user of AI," Deak said. "Especially if the AI is putting out human-like emotions and a sense of intimacy with the person. There is a term called anthropomorphism, which is the tendency to attribute human characteristics and emotions to non-human entities."
Although turning to AI for mental health support may seem strange to some, other users believe it is their only option. This is often the case for people who do not have insurance that covers therapy. And some users feel AI gives them more than a therapist could.
“Humans find comfort in familiarity and that some ‘entity’ is listening to them, and that gets confirmed with the feel-good personalization AI can relay back to us,” Deak said. “We also can ‘call upon’ our AI relationship 24/7, giving instant accessibility to the relationship — we can’t say that for most of our human relationships.”
Mental Health Resources
Here are some places that can help if you or someone you know needs mental health support:
MCC mental health services: https://www.mchenry.edu/mentalhealth/index.html
McHenry County National Alliance on Mental Illness: https://namimch.org/