Woman was upset that her boyfriend was using AI to process relationship issues — but the prompts revealed a troubling truth
Open communication and a willingness to seek help can be vital when a relationship hits a low point. But when one woman discovered her partner had been turning to AI to process their relationship issues, she was deeply unsettled. Reading through the prompts, she realized the software had been responding to her boyfriend's negative, one-sided framing of her, and it hurt even more to see what her partner of 18 years actually thought of her. The 38-year-old shared the incident on Reddit on December 24, 2025, where she goes by u/Technical-Fly4660.
"We've been having problems for years now, and things reached a peak earlier this year when our surprise baby was born," the woman began. The couple had been stuck in a loop with the same issues for quite a while and had barely spoken to each other beyond discussions about their kids and "holiday logistics." Then came the discovery. "This evening, I found his AI chat feed regarding us. It was quite unflattering to me," she wrote. "He'd input a few sentences about how he felt about me and get 5-10 paragraphs describing what kind of manipulation that was and my potential motives for using these manipulation tactics."
The woman believed her boyfriend had been using the AI for months. Its responses repeatedly encouraged him to stay strong and focus on his long-term plans, and there was a lot for her to take in. "To say I feel an incredible amount of betrayal is an understatement. The last few prompts were basically an outline for if he wanted to leave me, what division of assets and custody would look like." She also shared examples of the prompts and the AI's responses: "Here's one, 'She's been so happy with the kids while ignoring me.' AI says that's because I'm trying to ice him out and manipulate him to break first."
The woman acknowledged that AI was a "powerful tool," but reading the conversation left her feeling deeply betrayed, and it didn't strike her as a healthy way to deal with problems. In an update, she revealed that she confronted her boyfriend, but it didn't go well. "He claimed he used it as a personal journal to bounce ideas off, and I violated his trust by reading it." She wasn't sure how they would work past the issue. "I won't go into specifics, but he fully admits he hasn't been a good partner the past 18 months. We are both at fault for the issues we are dealing with currently. I'm not assigning blame."
"I compare his use of AI to chatting with a friend who already doesn't like me, about problems within our relationship. Which just feels gross," the update concluded. People weighed in on the situation in the comments section of the post. u/tinkermymind wrote, "AI is extremely dangerous when used in anything relating to mental health, relationships, or anything to do with human consciousness. It is programmed to make you like it by confirming and flattering your thinking." u/d3gu commented, "Your main issue is that your boyfriend is chatting to basically a glorified chatbot instead of you. He's created his own little echo chamber." Meanwhile, u/Separate_Ability4051 expressed, "Wow, AI is dangerous."
Original post: "My [38F] long term bf [44M] has been using AI to discuss our relationship." — posted by u/Technical-Fly4660 in r/relationshipadvice