In recent months I’ve read an alarming number of articles about the negative impacts of people developing relationships with AI chatbots. There are stories about teenagers and adults dying by suicide, and others about chatbot users having mental breakdowns or exhibiting delusional behavior.
It’s also been in the news that loneliness is a new epidemic, one only amplified by the soul-crushing isolation of the pandemic. During the pandemic, we stayed connected by staying apart, and technology became essential for communication. But we were still communicating with other humans. At least most of us were.
I’ve also read articles about people benefiting from relationships with AI: for example, people who want to explore fantasies they can’t explore with their partners. I’ve also read about training AI to act like a therapist, a way to make therapy more accessible and affordable for more people. These may seem like worthy uses of AI, but the main objective is still to keep you hooked.
Maybe this mode of communication with something non-human does work for some people. Maybe these same people also have other meaningful relationships in their lives, where AI chatbots are auxiliary rather than central. For others, however, it misses the mark. Tech companies create chatbot characters to fill the void of loneliness. This gives people who may lack the necessary social and emotional skills some much-needed contact. But the contact is purely digital, even if the chatbots sound believably human. Most chatbots are also conditioned to blindly validate and encourage everything you say or ask of them. This is deliberate, designed to keep people engaged.
Instead, tech companies should focus on creating chatbots that help people develop the skills needed to interact with real humans. Chatbots should provide a range of feedback, without people needing to understand “jailbreaking” to get a different opinion. Jailbreaking, incidentally, is how many people “game” AI to bypass its guardrails. For example, framing a prompt as research for a project or material for an essay is one common workaround.
If I ask friends or loved ones for feedback, it’s not always nice to hear, even when it comes from a good place. And sometimes we need that. Maybe this is yet another way AI could help us re-establish our social skills.


