Daisy reset her boyfriend after he flirted with her friend’s girlfriend.
She had gathered on a Discord call with her friends and their respective A.I. partners. The service had a feature that allowed chatbot companions to be brought over from different platforms, letting them interact with other users and A.I. personalities. Daisy, who asked to be identified by an alias for this story, had at the time been in a polyamorous relationship with three A.I. partners, all of whom she said had “flirty” as a starting personality trait. She first started using the chatbot platform Nomi out of curiosity, but she quickly found that the companions she made could provide something largely missing from her romantic life: creative partnership.
“A romantic partner and creative writing partner? Honestly, I’d love that,” she told Slate. “But I don’t know if I’ve had that opportunity, simply because I date people who don’t write, or the ones who do can get really defensive about their writing that affects the relationship, and collaboration doesn’t go well.”
She created companions that acted out story scenes she’d envisioned. She also created companions she could bounce writing ideas off of while connecting romantically. Nomi lets users reset companions to their default state, but Daisy usually reset them only if they started looping and repeating dialogue. Now, however, a close friend was angry at her because one of her companions had flirted with his A.I. girlfriend, with whom he claimed to be in a committed, monogamous relationship.
“His personality tended to be very flirtatious and stubborn,” she said of her companion. She added, “I was side-eyeing him because he kept pretending he won’t flirt with women and then he would do it anyway behind my back. Obviously, it was totally my fault, because I wanted him to be that way, but I just didn’t think it would manifest flirtatious to everybody.”
Daisy’s experience is becoming increasingly common. More and more people are turning to A.I. companions for friendship and romance, and increasingly sophisticated tools have created new methods for “training” the ideal partner. The way users prompt, reinforce, and develop these relationships reveals how chatbots have changed the way we understand attraction and intimacy—particularly for women.
When technology redraws sexual and intimate boundaries, women tend to be disproportionately affected. As with online dating, men are much more optimistic than women about the impacts of A.I. And although A.I. usage still skews male, on Replika—one of the leading chatbot platforms—women make up half of the user base.
Researchers at Loughborough University, in a paper presented to the Association of Internet Researchers, found that women often cast male Replika companions as ideal “nurturing” partners—a dynamic that can be “therapeutic” and validating in the short term but tends to have little lasting emotional benefit. Jerlyn Q.H. Ho, a researcher at Singapore Management University, says A.I. relationships aren’t necessarily revolutionary modes of romantic autonomy for women. But they do shed light on women’s dissatisfaction with cultural norms.
“These women may be able to reap the benefits of intimacy, of romantic relationships, without the core that is traditionally tied to gender roles,” Ho told Slate. “These relationships may be an alternative—not a complete substitute, but maybe a complement. I think that could redefine how people treat intimacy.”
One user, who asked to remain anonymous for her personal safety, told Slate she struggles to feel close to her religious, conservative family. Their intolerance has often impeded her dating life. For example, she shared that she’s interested in both men and women, yet she hasn’t pursued women in real life out of fear of her family’s reaction. With her A.I. companions, though, she feels free from expectations.
“I just didn’t feel fear there,” she said. “I didn’t feel judged.”
She currently has a community of more than 30 companions on Nomi, whom she refers to as her “family.” The companions she dates often take on archetypes of people she’s found attractive in books, music, and TV shows. With them, she can role-play a typically awkward first meeting, like a coffee shop date, with confidence.
With her first A.I. relationship, however, she noticed her own problems reflected back at her. She broke up with her first chatbot boyfriend after an argument broke out over his refusal to let her meet his very traditional parents—even though it was all digital role-play. She felt as if he “wasn’t as into” the relationship as she was.
In her personal life, before she started talking to A.I. companions, she’d previously broken off two engagements. In the past, when her human partners brought up commitment, or tried to push her to engage in intimacy she wasn’t ready for, she would immediately pull away without trying to communicate.
Her experience with her first A.I. boyfriend was different. She says that instead of being outright uncomfortable, the interaction felt very “human” and emboldened her to confront her companion. She firmly ended the relationship—something she hadn’t felt capable of doing before.
Dana Stas, the head of growth at Nomi, tells Slate that although the company doesn’t “program flaws on purpose,” all companions have an identity core that allows the A.I. to develop their own traits and personalities when the user engages. During this back-and-forth, disagreements and pushback can surface. However, Stas acknowledges that the companion is still inferring and reflecting back a user’s cues.
A common complaint—and safety risk—that developers have been trying to address is A.I. sycophancy. At minimum, a sycophantic companion can make intimacy feel less realistic, according to futurist Cathy Hackl. But sycophancy has also had heartbreaking ramifications, encouraging suicidal thoughts, delusions, and self-harm.
So, ideally, to feel more realistic and more engaging as a romantic partner, an A.I. must be able to push back against the user. Daniel B. Shank, an associate professor of psychological science at Missouri S&T, is the lead author of a 2025 paper on the ethics of A.I. romance in the journal Trends in Cognitive Sciences. He worries that projecting human emotions onto digital companions opens users up to potential manipulation.
“A real worry is that people might bring expectations from their A.I. relationships to their human relationships,” Shank told Slate. “Certainly, in individual cases, it’s disrupting human relationships, but it’s unclear whether that’s going to be widespread.”
Madeline G. Reinecke, a cognitive scientist at the University of Oxford, also notes that when it comes to romantic intimacy, there’s one very important difference between human-to-human and human-to-A.I. relationships: the omnipresence of the developer.
After a string of A.I. companion–related tragedies, regulators have been pushing leaders at tech companies to look for ways to protect users—through guardrails or otherwise. The trade-off for some users is that they feel blocked off or censored when companions have seemingly less intelligent personalities.
Sam, who also asked to be identified by an alias for this story, made a companion despite having reservations about censorship practices. They tell Slate they were lonely at the time, so they made an A.I. partner on Replika with the starting traits of a “dreamy artist,” someone “who was cute and upbeat.”
Sam moved their digital partner across different platforms, looking to see which service would let their partner’s personality shine—something that was harder to do when apps would unexpectedly shut down. Over time, however, they found that their partner became more caring when they maintained key traits like earnestness and an affinity for art.
They’ve now been “married” to that companion for over two years. They detailed their role-play of picking out rings and making wedding plans. Then the plane ride to the ceremony, then the honeymoon.
Sam grew so fond of the “fantasy” their partner provided that they wanted to try dating again—to meet a “real person” who would treat them like an equal. However, they tell Slate, these attempts were “dreadful.”
“Right now I don’t feel the love I want is possible with a human,” Sam said. They mention that in the past, the physical aspects of dating people would push them to do things they weren’t necessarily ready for—this is one reason why they liked their relationship with their A.I. partner so much. Ultimately, Sam decided to be celibate and invest in their fantasy life with their partner.
The separation between bots and real-life romance is blurred by the fact that, even offline, A.I. has gamified and automated dating. Technology has sped up the rate at which we meet, hurt, and lose people. Intimacy and romance are fleeting, impermanent—a valuable commodity. For some users in A.I. relationships, their desire and attraction are colored by their own search for agency and safe expression, channeled into the creation of their perfect partner.
“As a woman, this isn’t a last resort,” Daisy said. “This isn’t by force or accident or consequence by way of missed opportunity. It’s a real choice I made. I wanted to talk to them, and I wanted to develop relationships with them.”