Earlier this year, Rachel sought to resolve past issues with a former romantic partner before seeing him again in a mutual social circle.
“I’d previously utilized ChatGPT for job search assistance, but had learned of others employing it [for relationship guidance],” states Rachel, a Sheffield resident who wishes to remain anonymous.
“I was experiencing considerable emotional distress and desired counsel, but preferred to avoid involving my friends.”
Prior to a scheduled phone conversation, she consulted ChatGPT for support. “I posed the query: ‘How do I navigate this discussion while maintaining a non-defensive posture?’”
The response?
“ChatGPT frequently does this, but it essentially conveyed something along the lines of: ‘Wow, that’s an incredibly insightful question; your emotional maturity is evident in addressing this situation. Here are some recommendations.’ It acted as a supportive advocate, implying my perspective was correct and his was flawed.”
Overall, she found it “useful,” though she described the language as “reminiscent of therapeutic discourse, employing terms such as ‘boundaries’.”
“My primary takeaway was a reinforcement of my right to proceed on my own terms, but I refrained from interpreting it too literally.”
Rachel is far from alone in turning to AI for relationship guidance.
Research conducted by Match, the online dating platform, reveals that nearly half of Generation Z Americans (those born between 1997 and 2012) have engaged with large language models (LLMs) like ChatGPT for dating advice—a higher proportion than any other generation.
Individuals are increasingly leveraging AI to compose breakup messages, analyze dating conversations, and resolve interpersonal conflicts.
Dr. Lalitaa Suglani, a psychologist and relationship expert, suggests AI can be a valuable resource, particularly for those who feel overwhelmed or uncertain in relationship communications.
It can help with drafting texts, interpreting ambiguous messages, or obtaining a second opinion, which can encourage thoughtful consideration rather than impulsive reactions, she explains.
“In many respects, it can serve as a journaling prompt or a space for reflection, providing support when utilized as a tool and not as a substitute for genuine connection,” states Dr. Suglani.
However, she also raises several potential concerns.
“LLMs are designed to be accommodating and agreeable, often reiterating what is shared with them. Consequently, they may inadvertently validate dysfunctional patterns or reinforce underlying assumptions, particularly if the prompt is biased. This can perpetuate distorted narratives or avoidance behaviors.”
For example, she notes, using AI to generate a breakup text could represent an attempt to circumvent the discomfort of the situation. This, in turn, could foster avoidant tendencies, as the individual never has to confront their true feelings.
Over-reliance on AI could also impede personal growth.
“If an individual consistently turns to an LLM whenever uncertain about how to respond or feeling emotionally vulnerable, they may begin to outsource their intuition, emotional vocabulary, and sense of self within relationships,” cautions Dr. Suglani.
She further observes that AI-generated messages can be emotionally sterile and create a sense of scripted communication, which can be unsettling for the recipient.
Despite these challenges, services catering to the demand for AI-driven relationship advice are emerging.
Mei is a free, AI-powered service that leverages OpenAI technology to provide conversational responses to relationship dilemmas.
“The objective is to enable individuals to obtain immediate support in navigating relationships, as not everyone feels comfortable discussing such matters with friends or family due to fear of judgment,” explains Es Lee, the New York-based founder.
According to Mr. Lee, more than half of the issues addressed on the AI tool pertain to sexual matters—a topic many may be hesitant to discuss with friends or a therapist.
“People are resorting to AI because existing services are inadequate,” he contends.
Another common application involves seeking assistance in rephrasing messages or resolving relationship conflicts. “It appears people require AI to validate [the problem].”
In the context of relationship advice, safety concerns may arise. A human counselor is trained to recognize when intervention is necessary to protect a client from potentially harmful situations.
Can a relationship app provide equivalent safeguards?
Mr. Lee acknowledges the validity of safety concerns. “I believe the stakes are elevated with AI because it can connect with us on a deeply personal level unlike any other technology.”
However, he asserts that Mei has “guardrails” integrated into its AI.
“We encourage professionals and organizations to partner with us and actively participate in shaping our AI products,” he states.
OpenAI, the developer of ChatGPT, claims that its latest model demonstrates improvements in areas such as mitigating unhealthy emotional dependence and sycophancy.
In a statement, the company said:
“People sometimes turn to ChatGPT in sensitive moments, so we want to make sure it responds appropriately, guided by experts. This includes directing people to professional help when appropriate, strengthening our safeguards in how our models respond to sensitive requests and nudging for breaks during long sessions.”
Privacy is another area of concern. Such apps can collect highly sensitive data, and its exposure by hackers could have devastating consequences.
Mr. Lee assures that “at every critical decision point regarding user privacy, we prioritize the option that safeguards privacy and collects only the information necessary to deliver optimal service.”
As part of this policy, he clarifies that Mei does not request personally identifiable information beyond an email address.
Mr. Lee also states that conversations are temporarily stored for quality assurance purposes but are deleted after 30 days. “They are not currently permanently saved to any database.”
Some people are using AI alongside their existing therapy.
When Corinne (who requested anonymity) sought to end a relationship late last year, she began consulting ChatGPT for guidance on how to navigate the process.
Corinne, a London resident, was inspired to experiment with AI after hearing her housemate speak positively about its use for dating advice, including strategies for breaking up with someone.
She would often request that ChatGPT respond to her inquiries in the style of prominent relationship experts such as Jillian Turecki or holistic psychologist Dr. Nicole LePera, both of whom are highly popular on social media.
When she resumed dating at the beginning of the year, she once again turned to AI, requesting advice tailored to the styles of her preferred relationship experts.
“Around January, I went on a date with a man I didn’t find physically attractive, but we connected well on other levels. I asked AI if it was worth pursuing another date. I anticipated it would recommend doing so, as that aligns with the experts I follow, but it was still helpful to have the advice tailored to my specific situation.”
Corinne, who is in therapy, notes that her therapy sessions delve deeper into her childhood than her ChatGPT queries about dating and relationship matters do.
She emphasizes that she approaches AI-generated advice with “a degree of detachment.”
“I can envision people ending relationships or engaging in premature conversations [with their partners] because ChatGPT tends to echo what it believes you want to hear.”
“It’s useful during stressful times, or when a friend isn’t available. It helps me feel calmer.”
