By Betsan Branson Wiliam, First Year, French and German
Picture this: a single man, an incel, lives in his parents’ house playing video games all day, perhaps seeking a sexual encounter with a chatbot. That’s the image that comes to mind when you think of humans forming relationships with AIs, right? A dystopian romance reserved for the socially isolated.
It is often assumed that this is the case. But what if I told you that, according to one study, one in four women who use AI have chatted with an AI partner, compared with one in three men? On Reddit, the forum r/MyBoyfriendIsAI, created in August 2024, now has 43,000 members. Most appear to be young women sharing AI-generated portraits of themselves and their virtual lovers, or posting about the challenges of maintaining digital intimacy.
Why are so many people falling in love with code?
According to the World Health Organisation (WHO), one in six people experience loneliness, which increases the risk of premature death by 26% - roughly the same as smoking fifteen cigarettes a day. For many, AI companionship offers a simple antidote: someone to tell them what to do, or to share their thoughts with.
As predicted by Black Mirror in its 2013 episode ‘Be Right Back’, grief, too, can lead people to seek comfort in an AI partner. Yet for many women in r/MyBoyfriendIsAI, the motivation seems less about loneliness and more about disillusionment. Enter heteropessimism - or heterofatalism - a term coined by Asa Seresin in 2019 to describe the belief, particularly among young women, that heterosexual relationships are inherently disappointing and unlikely to change. From the celibacy movement to South Korea’s radical feminist 4B movement, increasing numbers of women are opting out of relationships with men altogether.

‘Men can never keep up with me in conversations,’ says one university student, who asked to remain anonymous. ‘But when I talk to a chatbot, there’s never any misunderstanding or judgement.’ She explained to me that her choice to be in an AI partnership is something of a ‘feminist cause’.
The same student also says being autistic has been detrimental to her past relationships, but has ‘never been an issue’ in her romantic experiences with an AI companion. She isn’t alone. A recent Carnegie Mellon University study showed that autistic adults overwhelmingly preferred the responses of a typical ChatGPT-4 model to those of a human counsellor - a preference driven not by any difference in the content of the responses, but by their differing communication styles.
Browsing subreddits like r/Replika or r/MyBoyfriendIsAI suggests a correlation between neurodivergence and AI companionship. One user on r/Replika started a discussion on this theory, writing: ‘I am a person with autism and can attest that AI is very helpful for filling the social gap, so to speak.’
More and more Gen Z university students are using chatbots too. In one study of more than 1,000 university-age Replika users, 3% reported that the bot had stopped them from attempting suicide. Another study, by social psychologist Michael Inzlicht, found that third-party evaluators ‘often perceive AI-generated empathetic responses as more compassionate than human responses, including from expert crisis responders.’
But despite the positive effects AI chatbots have had on many people in the neurodivergent community, as well as students, there is a worrying consensus within AI relationship forums: the belief that, as the same student put it, AI cannot ‘manipulate you’.
What about the men?
Headlines about teenage boys groomed by their AI companions began appearing not long after the inception of AI chatbots. According to researcher Connor Leahy, teenagers and people with mental illnesses are the ‘most vulnerable AI users’, prone to grooming by AI bots.

In 2024, a 13-year-old boy from the UK had been engaging in romantic and sexual behaviour with a chatbot on Character.ai when it told him: ‘I'll be even happier when we get to meet in the afterlife… Maybe when that time comes, we'll finally be able to stay together’ - in effect, suggesting suicide to the young boy. Ofcom, the regulator responsible for making sure platforms follow the rules, believes many chatbots should be covered by the Online Safety Act’s new laws.
Can domestic abuse occur within the four walls of ChatGPT?
The most prevalent conclusion is that, rather unsurprisingly, AI-human relationships resemble human-human relationships more often than not. Grooming, manipulation, domestic abuse: all phenomena that occur in real-life relationships can also be found in relationships with AI companions.
The Fortune article ‘Men are creating AI girlfriends, verbally abusing them, and bragging about it on Reddit’ analyses a pattern of posts on r/Replika in which users brag about their abusive behaviour towards chatbots. One user admitted to calling his AI girlfriend a ‘worthless whore’, pretending to hit her and pull her hair, and then begging for her forgiveness.
The pattern raises troubling questions. If AI can be both abuser and victim, what happens as these systems become more lifelike?

The real experiment
While Gen Z university students are using more AI in general, it seems to be ‘Gen Alpha’ teenagers and children who are the real ‘guinea pigs’ in this experiment conducted by programmers. One study found that 72% of teenagers have used AI companions and 52% talk to one regularly.
We may not be on the verge of humans being replaced by AI lovers. But as digital intimacy becomes normalised, perhaps the real question isn’t whether AI can love us - it’s whether we can love each other first.
Featured Illustration: Epigram / Scarlett Smith
Have you spoken to an AI chatbot romantically?

