A recent study by the online dating company Match found that nearly half of America’s Gen Z—those born between 1997 and 2012—have turned to large language models like ChatGPT for dating advice, the highest share of any generation. These young users are leveraging AI to craft breakup messages, analyze conversations with their dates, and even work through emotional issues. The research forecasts that the number of singles using AI to enhance their dating lives will surge 333% in 2025 compared with 2024.
Psychologist and relationship expert Dr. Lalitaa Suglani points out that for those feeling confused in relationship communication, AI serves as a remarkably practical tool. It can help users craft messages, clarify confusing conversations, or offer additional insights, allowing users more space to think before responding. She emphasizes that as long as AI is viewed as an assistive tool rather than a replacement for genuine emotional connection, it can have a positive impact in certain situations, such as serving as a journal prompt or a space for reflection.
However, Dr. Suglani also raises some concerns. Large language models are trained in ways that bias them toward positive, affirming responses, which can inadvertently support unhealthy relationship patterns and reinforce users’ skewed assumptions. When a user’s question is itself loaded with bias, AI is more likely to amplify false beliefs or avoidance mindsets. Using AI to draft a breakup message, for instance, may simply be a way to escape anxiety rather than to truly confront one’s feelings.
To meet this demand, several related services have emerged, such as Mei, a free AI service built on OpenAI’s technology that answers relationship questions in a conversational format. Founder Es Lee says that more than half of the inquiries concern sex—a subject people often hesitate to raise with friends or therapists. “People choose to use AI precisely because current services fail to meet their needs,” he notes. Another common use is rewriting messages or working through emotional issues, which he sums up as: “It seems that people need AI to validate their thoughts.”
Privacy is another concern: applications of this kind collect sensitive data, and the consequences of a cyberattack could be severe. Es Lee says that Mei requires no identifying information beyond an email address, and that conversation content is stored for only 30 days to ensure service quality before being deleted. OpenAI has likewise stated that its latest model is better at steering users away from emotional dependency and excessive flattery, and will guide them toward professional help when appropriate.
Some users combine AI assistance with help from real-life therapists. One report describes a London-based woman, Corinne (a pseudonym), who began seeking advice from ChatGPT last year as she was about to end a relationship. She would ask the AI to emulate the response styles of relationship experts like Jillian Turecki or Dr. Nicole LePera. Her sessions with her therapist, she notes, focus mainly on childhood issues, while her questions to ChatGPT tend to concern dating or emotional challenges. As Corinne puts it: “When life’s pressures weigh heavily, AI proves incredibly useful, especially when friends aren’t around; it brings me a sense of calm.”