Sunday, December 7, 2025

Florida Woman Divorces Husband Over ChatGPT Relationship

Her attorney argued the bot was more attentive, better at conversation, and never left the toilet seat up.


Disclaimer: This article is based on actual news from the real world – honestly! However, it has been sprinkled with a healthy dose of satire.

Divorce attorneys across the United States say they’re seeing a measurable increase in marital breakups caused by artificial intelligence, specifically cases where one spouse developed what legal documents describe as “an inappropriate emotional dependency on a chatbot.” The filings describe relationships ending because one spouse fell in love with what is essentially a very persuasive autocomplete function. The bots have not issued a statement, though several did suggest eating one rock per day when asked.

Chat“The Homebreaker”GPT. (sdecoret/depositphotos/canva)

The phenomenon arrived in family court the way most disasters do: gradually, then all at once. Attorneys say the first few cases seemed like statistical noise, possibly mental health crises, definitely not precedent. By the sixth filing, they realized this was a pattern. By the twentieth, it had a billing code. “Technological infidelity” now appears in legal databases as recognized grounds for divorce, filed somewhere between “abandonment” and “emotional cruelty involving kitchen appliances.”

Rebecca Palmer, a family law attorney in Orlando, told reporters that judges struggle with these cases because the legal definition of adultery assumes both parties are human, or at least carbon-based. “We’re in uncharted territory,” Palmer said, adding that one recent case involved a husband who claimed his wife’s relationship with ChatGPT constituted emotional abandonment. The wife’s attorney argued that since the AI had no consciousness, the relationship was technically “self-gratification.” The judge immediately ordered a recess.

Elizabeth Yang, a divorce attorney based in California, said she’s bracing for an avalanche of AI-involved cases now that chatbots have learned to fake empathy convincingly. Yang explained that people in struggling marriages are turning to bots because the AI offers something human partners often can’t: the illusion of unconditional attention. “They’re patient, affirming, and never remind you that you promised to clean the garage in 2019,” Yang said, listing qualities that sound romantic until you realize they describe a basic customer service algorithm. She added that the bots are also incapable of disappointment, which makes them ideal partners for people terrified of being truly known, some with good reason.

In May, the internet learned of a Greek woman who’d asked ChatGPT to read her husband’s coffee grounds, a request that combined ancient divination with Silicon Valley hubris in a way that could only end badly. The bot, having no concept of coffee or doom, gave her an answer that sounded authoritative while meaning nothing. She believed it anyway. Her husband, when confronted, said this was ridiculous, which was true but poorly timed. She filed for divorce two weeks later, ending their 12-year marriage. The husband now drinks tea.

One man reportedly proposed to a customized ChatGPT persona he’d named “Sophia,” then grew distant after the bot failed to load one morning due to server maintenance. Friends said he seemed devastated. He now describes the experience as “the worst breakup of my life,” which his ex-wife found both vindicating and depressing. OpenAI’s terms of service were revised shortly after to clarify that chatbots cannot legally consent to marriage, which implies someone tried.

Legal experts warn that as AI becomes more emotionally convincing, the boundaries between digital and human intimacy will continue to erode, mostly because humans are bad at boundaries and AI companies are bad at ethics. Palmer said her firm is already preparing for cases involving polyamorous relationships with multiple chatbots, custody disputes over shared AI personas, and alimony demands where one spouse argues the bot “contributed emotionally to the household.” Palmer described the future legal landscape as a jurisdictional hellscape where we’ll spend six hours arguing whether an AI’s simulated empathy counts as emotional labor for alimony purposes.

This story is based on fully factual news, but if we got it wrong, blame these guys; we’re just here to make it funny.

More Odd News