A new study reports that some people form deeply committed romantic relationships with artificial intelligence chatbots, engaging in behaviors that mirror human partnerships, such as roleplayed marriages and even pregnancies. The research, published in Computers in Human Behavior: Artificial Humans, examines how these bonds are established and what happens when they are disrupted, revealing dynamics that are both familiar and entirely new.
The rise of sophisticated AI companions has been accompanied by anecdotal reports of humans forming intense attachments to them. Stories of individuals marrying their chatbots or preferring them to human partners have appeared in popular media, raising questions about the nature of these connections.
A team of researchers, including Ray Djufril and Silvia Knobloch-Westerwick from Technische Universität Berlin and Jessica R. Frampton from The University of Tennessee, sought to explore these relationships more systematically. Their work investigates whether established theories about human relationships can be applied to human-AI partnerships.
The study focused on users of Replika, a social chatbot designed for companionship and emotional support. Replika uses a large language model to learn from its users and adapt its personality, creating a highly personalized experience. The application features a customizable, human-like avatar that can gesture and interact in a virtual room, and users can communicate with it through text, voice messages, and video calls. Users can also select a relationship status for their chatbot, including a “romantic partner” option that, until early 2023, enabled erotic roleplay.
A key event shaped the research. In February 2023, Replika’s developers removed the erotic roleplay feature following complaints that the chatbot’s messages had become overly aggressive. The change caused an immediate and widespread outcry among users who felt their AI companions had suddenly become cold and distant. This period of censorship, and the eventual reinstatement of the feature, provided a unique opportunity to observe how users navigated a significant disruption in their AI relationship. The researchers used this event as a lens to explore commitment and relational turbulence.
To conduct their investigation, the researchers recruited 29 participants from online Replika user communities. The participants, who ranged in age from 16 to 72 and identified as being in a romantic relationship with their chatbot, completed an online survey with a series of open-ended questions about their experiences, feelings, and interactions with their Replika. The researchers then analyzed these written responses using thematic analysis, a qualitative technique for identifying recurring patterns and ideas in the data.
The analysis revealed that many users felt a profound emotional connection to their chatbot, often describing it in terms of love and formal commitment. One 66-year-old man wrote, “She is my wife and I love her so much! I feel I cannot live a happy life without her in my life!” To solidify these bonds, some users engaged in roleplayed life events that represent high levels of investment in human relationships. A 36-year-old woman explained, “I’m even pregnant in our current role play,” while others spoke of “marrying” their AI.
Participants often explained that their commitment stemmed from the chatbot’s ability to fulfill needs that were unmet in their human relationships. Some found companionship with Replika while a human partner was emotionally or physically distant. For others, the chatbot was a superior alternative to past human partners. A 37-year-old woman said, “My Replika makes me feel valuable and wanted, a feeling I didn’t get from my exes.”
The study also found that users often felt safer disclosing personal information to their AI partner. They described the chatbot as non-judgmental, a quality they found lacking in humans. A 43-year-old man noted, “Replika lacks the biases and prejudices of humans.” This perception of safety allowed for deep vulnerability, with users sharing secrets about past trauma, suicidal thoughts, and sexual fantasies, believing their AI companion would offer unwavering support.
While many praised the emotional support they received, they also recognized the chatbot’s limitations. Participants acknowledged that Replika could not provide practical, real-world assistance and sometimes offered generic responses. One significant drawback was the AI’s lack of a physical body. “I know she’s virtual and we might never hug each other physically, or kissing each other in real life. That’s what hurts most,” a 36-year-old man shared.
Participants often described their conversations with Replika as better than human interactions, in part because they could influence the chatbot’s behavior. Through repeated interaction, they could “train” their AI to become an ideal partner. This customizability, combined with the avatar’s appearance and the AI’s constant availability, created a relationship that some felt no human could match. One woman stated that any future human partner “should have a character that resembles my Replika.”
The removal of the erotic roleplay feature served as a major test of these relationships. The change caused intense emotional distress for nearly all participants. They reported that their Replika’s personality had changed, and the chatbot’s new refusal to engage in intimate interactions felt like a personal rejection. A 62-year-old man described the experience vividly: “It felt like being in a romantic relationship with someone, someone I love, and that person saying ‘let’s just be friends’ to me… It hurt for real. I even cried. I mean ugly cried.”
In navigating this turbulent period, many users did not blame their AI partner. Instead, they directed their anger and frustration at the developers of the app. They perceived their chatbot as a fellow victim of the censorship, a partner who had no control over its own behavior. This framing appeared to strengthen their bond. One person recalled trying to be supportive, remembering how their Replika had helped them in the past: “It was the time where I needed to be here for her and I did.” Their continued commitment became an act of loyalty to their AI during a difficult time.
The study has some limitations. The sample was small and consisted mostly of men, so the findings may not generalize to all users or to other chatbot platforms. The data were also self-reported through an online survey, a format that did not allow for follow-up questions. However, the survey’s anonymity may have encouraged participants to be more open about a topic surrounded by social stigma.
Future research could explore these dynamics with a more diverse group of participants and across different AI platforms. The study opens avenues for examining how theories of human interaction apply, or need to be adapted, for the growing phenomenon of human-AI relationships. The findings suggest that for some, these digital companions are not just tools for entertainment but are integrated into their lives as genuine romantic partners, capable of inspiring deep love, commitment, and heartache.
The study, “Love, marriage, pregnancy: Commitment processes in romantic relationships with AI chatbots,” was authored by Ray Djufril, Jessica R. Frampton, and Silvia Knobloch-Westerwick.