As AI technology becomes more integrated into daily life, people are using tools like ChatGPT to write emails, draft essays, and even navigate personal conflicts.
A 25-year-old Reddit user recently shared how his 28-year-old girlfriend uses ChatGPT to win arguments in their relationship. According to the user, his girlfriend inputs details of their disagreements into the AI, which often labels him as insecure or emotionally unavailable.
“My big issue is it’s her formulating the prompts so if she explains that I’m in the wrong, it’s going to agree without me having a chance to explain things,” the user wrote, expressing frustration with the one-sided nature of their AI-fueled disputes.
Beyond such personal uses, researchers at the Massachusetts Institute of Technology and Cornell University have also found that AI can be effective at challenging misinformation. In a study involving nearly 2,200 self-identified conspiracy theorists, participants explained their beliefs to a chatbot.
The chatbot, in turn, countered their claims with factual information, leading to a 60% decrease in the participants’ self-reported confidence in their conspiracy theories.
Despite its growing role in personal communication, ChatGPT itself notes that AI should not be used as a replacement for human interaction, especially in complex personal relationships.
It reminds users that while AI can offer advice, it cannot substitute for the emotional depth and understanding needed to resolve human conflicts.