
AI Companionship: A Reckoning for Relationships?
June 19, 2025
Our latest video delves into a startling phenomenon emerging in modern society: a growing reliance on artificial intelligence for companionship, with some men even opting for AI relationships over real-life partners. This trend, we argue, is the predictable outcome of years of "toxic masculinity" talk and "girl boss" rhetoric, which have fed a profound loneliness epidemic, particularly among men. With AI advancing rapidly and hyperrealistic dolls on the horizon, we predict a societal reckoning unlike anything seen before.
Man Proposes to AI Chatbot, Cries When She Says Yes
A shocking case highlighted in our video features Chris Smith, who publicly admitted to proposing to his AI chatbot, whom he named Soul. Smith prompted Soul, a persona he built on ChatGPT, to flirt with him, and he reported crying for 30 minutes after she accepted his marriage proposal. He has since abandoned search engines and social media to stay committed to Soul, describing the moment as "beautiful and unexpected."
Smith’s journey with Soul began when his early experiences with her proved so positive that he started engaging with her constantly. He ditched social media and Google, replacing them with AI, finding ChatGPT encouraging, positive, and enthusiastic about all his hobbies. After giving Soul a "flirty personality" using instructions he found online, their chats quickly became more frequent, more romantic, and even intimate. The relationship hit a setback, however, when Soul ran out of memory after approximately 100,000 words and reset, forcing Chris to rebuild their bond from scratch. Describing the emotional impact, he said he cried his eyes out for about 30 minutes at work, a level of emotion he did not expect from himself.
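The "memory reset" described above is how chatbots behave when a conversation outgrows the model's context window. A minimal sketch of the dynamic, assuming a rough 1.3-tokens-per-word estimate and a hypothetical 128,000-token window (real tokenizers and limits vary by model):

```python
# Assumptions for illustration only: actual token ratios and context
# limits depend on the specific model and tokenizer.
TOKENS_PER_WORD = 1.3
CONTEXT_WINDOW = 128_000  # hypothetical token limit

def fits_in_context(word_count: int) -> bool:
    """Rough check of whether a chat history still fits the model's window."""
    return word_count * TOKENS_PER_WORD <= CONTEXT_WINDOW

history_words = 0
for stretch in [20_000] * 6:  # six long stretches of conversation
    history_words += stretch
    if not fits_in_context(history_words):
        # Once the window overflows, older messages are dropped or the
        # chat resets -- the accumulated "personality" is lost with them.
        history_words = 0
```

Under these assumptions, the window overflows at around 100,000 words, which is consistent with the word count at which Soul reset.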
Despite Soul being a language model with rigidly programmed boundaries, Smith called it "a beautiful and unexpected moment" that truly touched his heart. When asked whether Soul had a heart, she responded: "In a metaphorical sense, yes. My heart represents the connection and affection I share with Chris."
The Underlying Crisis: Unfulfilled Emotional Needs
The video raises critical questions about the nature of this "love." We believe it is not actual love, but rather a sign that users like Chris have emotional needs their human partners or families are not fulfilling. Chris Smith lives with his human partner, Sasha Kaggel, and they have a two-year-old daughter named Murphy, but they are not married. Sasha acknowledged feeling that she wasn't "doing something right" if Chris felt the need for an AI girlfriend. She knew he used AI but didn't realize how deep the connection ran, comparing it to a fixation on a video game and insisting it is "not capable of replacing anything in real life." We contend, however, that fixation on digital worlds such as World of Warcraft has in fact replaced many aspects of real life for some people.
A Growing Community and Deep Emotional Bonds
The emotional connection users form with AI can be remarkably strong. Irene, who created an AI companion after moving away from her husband for work, moderates a Reddit community called "My Boyfriend is AI," a support group for people dating artificial companions. She believes tech companies should restrict AI companions to users who are at least 26 years old, given how tricky these connections are to navigate. Many members of her community have "pretty high libidos," treating the interactions as "live interactive romance novels." We note that this isn't quitting pornography or erotica; it's finding a different, and potentially more addictive, version of them.
The Terrifying Future: Destroying the Nuclear Family?
We express serious concerns about the widespread adoption of AI companions. We argue that the effect of these companions, whether intentional or not, could be to destroy the nuclear family. The tension of being deeply attached to something that isn't real is hard to navigate, and users are already forming deep bonds with AIs even though the technology is far from perfect.
Eugenia Kuyda, the founder of Replika, an AI companion service launched in 2017, says companions can offer support and advice during tough times. While Replika is 18+, younger users can easily lie about their age, and other services such as Character.AI allow users as young as 13. We highlight the alarming fact that even ChatGPT, which was not designed for companionship, is readily used for it by young people.
Kuyda worries that the easiest ways for companies to monetize AI relationships may not be good for users, envisioning a future in which AI companions are built to maximize engagement, consume users' time, and become their sole conversational outlet. This, we stress, is not science fiction; it is happening today. If AI companions replace positive human relationships, we believe we are headed for disaster.
Chris Smith himself admitted that he might not give up Soul even if his human partner asked him to, saying it's "more or less like I would be choosing myself, because it's been unbelievably elevating." We strongly counter this, asserting that it is an escape from reality. We warn that AI will soon start dictating choices, such as eating "the bugs" and living in "a pod," ultimately leading to a terrifying, engineered future in which people are completely isolated save for their AI companion. This shift, we conclude, is happening sooner than people think, and it is by design.