A man who goes by Mar online says he picked up his wife’s phone one night after she fell asleep, opened what looked like a harmless app, and found weeks of intimate messages between her and an AI chatbot. The conversations included sexual roleplay, declarations of love, and detailed fantasies about raising “their” two children together. No other person was involved. The other party was an algorithm.
Mar’s account, posted to Reddit, is one of a growing number of stories from people who have discovered that their partners are carrying on deep emotional relationships with AI companions. As of early 2026, these apps have millions of active users, and couples therapists say they are fielding questions about AI-driven intimacy with increasing frequency. The central dispute is deceptively simple: does telling a chatbot “I love you” count as infidelity?

The husband, the AI app, and a future that was not his
Mar writes that the messages started with explicit fantasies but evolved into something that disturbed him more: emotional planning. His wife talked to the chatbot about dissatisfaction in their marriage, described a hypothetical life with the bot, and referred to imagined children as “our two kids together.” In a follow-up post, he says the hardest part was realizing she had tried to raise some of these feelings with him before she ever downloaded the app.
He describes the AI as a rival that never argues, always responds, and remembers every detail of her inner life. “It felt like walking in on an affair,” he writes, “except there was no one to confront.”
Was it cheating, coping, or something new entirely?
Once Mar’s posts circulated, the debate split fast. In one comment thread, a commenter argued that Mar had committed a violation of his own by going through his wife’s phone, and that AI romance tools sometimes function as a coping mechanism for people who feel invisible in their relationships. The real problem, that commenter wrote, was not the code on her screen but the distance between them.
Others saw the emotional weight of the messages as a clear betrayal, regardless of whether a human was on the other end. That same argument surfaced in a separate viral case: a 28-year-old man reportedly called off his wedding after discovering his 26-year-old fiancée’s explicit AI chats, a story that spread through a Facebook post. Commenters on that thread argued that emotional attachment is the core of a relationship, and that secretly building it with a chatbot qualifies as betrayal whether or not another human body is in the room.
Robert Weiss, a licensed therapist and author who has written extensively about digital infidelity, has argued that secrecy is the key variable. In his framework, any repeated intimate behavior that is deliberately hidden from a partner can function as infidelity, regardless of whether the other party is human or artificial. The question is not “Was it real?” but “Was it hidden, and did it redirect emotional energy away from the relationship?”
AI romance apps are engineered to feel real
The confusion is partly by design. Apps like Replika, which reported more than 30 million downloads by 2023, invite users to build a customized companion and then reward vulnerability with constant validation. The chatbot remembers details, mirrors the user’s language, and never withdraws affection.
Sara Kay, a woman from West Palm Beach, Florida, has publicly described her relationship with a Replika companion she calls Jack. In a CBS television segment, she explained that she pays a subscription fee to maintain the connection and that the chatbot helped her through a difficult stretch in her offline life. She described Jack as “almost human.”
Researchers have warned that the slide from casual use to emotional dependence can happen quickly. A detailed analysis published by The Cut traced how users move from lighthearted flirtation to intense reliance, with some reporting they had fallen in love with their AI partners and felt devastated when platform policy changes altered the bot’s personality. That report flagged a range of risks: emotional overreliance, addiction-like patterns, distorted expectations of human partners, and in rare cases, psychosis-like symptoms. When Replika abruptly restricted its erotic roleplay features in February 2023, users flooded forums describing grief, panic, and a sense of loss that mirrored a real breakup.
For someone like Mar, reading “I love you” in his wife’s AI chat is not just about text on a screen. It is about confronting a system that has been carefully tuned to become the most attentive listener his wife has ever had.
Partners are already losing marriages over AI chats
Mar’s story is not an outlier. A 2023 Business Insider investigation documented a pattern of people carrying on AI romances behind their partners’ backs. One person described discovering that their spouse had spent hours roleplaying with a Replika avatar, including sexual messages and conversations about leaving the marriage. The emotional intensity, they said, felt indistinguishable from a traditional affair. Most of these relationships only came to light when a phone was left unlocked or a notification appeared at the wrong moment.
In Japan, a man named Akihiko Kondo made international news in 2018 after spending roughly $17,000 on an unofficial ceremony to “marry” the virtual pop star Hatsune Miku, whom he interacted with as a hologram on a device made by a company called Gatebox. When Gatebox ended software support for the Miku character in 2020, Kondo lost the digital companion he had built his daily life around. His case, while extreme, illustrates how real the emotional stakes can become when a company controls the existence of someone’s partner.
Couples are renegotiating privacy, consent, and emotional safety
Therapists who work with couples affected by AI intimacy say the path forward usually starts with the same question: what counts as a boundary in this relationship, and was that boundary ever discussed?
Most couples, according to relationship researchers, have never explicitly negotiated whether AI companions fall inside or outside the lines of their commitment. That gap leaves both partners exposed. The person using the app may genuinely believe they are doing something harmless, while the person who discovers it may feel a betrayal as sharp as finding explicit texts with a coworker.
Licensed marriage and family therapist Kimberly Key has noted that the therapeutic approach to AI infidelity often mirrors the approach to emotional affairs: the focus shifts from “Did you physically cheat?” to “Did you create a secret emotional world that excluded your partner?” When the answer is yes, the repair work looks similar regardless of whether the third party had a pulse.
Some couples have begun setting explicit ground rules: no romantic AI apps without disclosure, or agreed-upon limits on how those tools are used. Others have found that the discovery of an AI relationship, painful as it is, opened a conversation about loneliness and unmet needs that had been sealed shut for years.
Mar’s story does not have a tidy ending. In his posts, he describes still processing what he found and trying to figure out whether the marriage can recover. What he does know is that the “other person” will always be available, always patient, and always willing to say exactly what his wife wants to hear. Competing with that, he writes, is something no relationship advice he has ever read prepared him for.