Robot vs. Therapist: public experiment explores AI’s role in relationship advice

11 March 2026
Marit Chinmaya Olivia Tila
Can artificial intelligence provide meaningful relationship advice, or do humans still offer something essential that machines cannot replicate?

Researchers at the Max Planck Institute for Psycholinguistics explored this question in a public experiment that pitted a social robot powered by a large language model against a human relationship scientist during a live event at the InScience Film Festival in Nijmegen.

The interactive demonstration followed a screening of the documentary Sex Robot Madness and invited audience members to ask relationship questions. Two respondents offered advice: social psychologist and therapist Tila Pronk of Tilburg University, and “Olivia,” a social robot controlled by postdoctoral researcher Chinmaya Mishra. Olivia generates answers using a large language model connected to a speech system and an expressive robotic interface.

AI advice that sounds convincing

In the experiment, both the human expert and the robot responded to audience questions about topics such as frequent arguments in relationships, intimacy, and the so-called “honeymoon phase.” Observers noted that the robot’s answers were often strikingly similar to the therapist’s guidance.

This similarity is not surprising from a technical perspective. Large language models (LLMs) are designed to predict the most likely next word in a sequence based on massive amounts of training data. Recent developments also allow these models to break complex questions into intermediate reasoning steps before producing an answer.
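The next-word idea can be sketched with a toy example. This is an illustration only: production LLMs use large neural networks over subword tokens, not the simple word counts assumed here.

```python
from collections import Counter, defaultdict

# Toy next-word predictor: count which word follows which in a tiny
# corpus, then return the most frequent continuation. Real LLMs learn
# these probabilities with neural networks over vast training data.
corpus = (
    "every relationship needs honest communication . "
    "every relationship needs mutual trust . "
    "every couple argues sometimes ."
).split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the word most often seen after `word` in the corpus."""
    return follows[word].most_common(1)[0][0]

print(predict_next("every"))         # "relationship" (seen twice vs. once)
print(predict_next("relationship"))  # "needs"
```

Scaled up by many orders of magnitude, this kind of statistical continuation is what lets a model produce advice that sounds like what a human expert would say.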

The limits behind the fluency

Despite the convincing responses, experts warn that AI-generated advice has important limitations. According to Pronk, current AI systems tend to provide agreeable responses that affirm users’ feelings rather than challenge them, something human therapists often must do to help people reflect critically on their behavior.

Research in other domains highlights similar concerns. Although some language models can perform well on standardized medical exam questions, their performance drops significantly when used in realistic conversational interactions, identifying correct medical conditions in fewer than 35% of cases in certain studies.

Technology and the search for connection

The experiment also addressed broader questions about the role of technology in human relationships. Interest in AI companions and robotic intimacy is growing alongside concerns about loneliness. Surveys suggest that nearly half of Generation Z users have turned to AI tools for dating advice.

Researchers emphasize that humans often attribute personality and emotional understanding to machines, a phenomenon sometimes called the “Tamagotchi effect,” in which people perceive personhood in interactive technologies.

A “Honeymoon Phase” for AI?

The event concluded with reflections on society’s current enthusiasm for AI systems. Just as new romantic relationships begin with an idealized “honeymoon phase,” public perceptions of AI may also evolve as the technology’s limitations become clearer.

“Right now, the technology can feel magical,” said participants after hearing Olivia’s polished responses. “But eventually we may realize these systems are still just machines.”

About the research

The demonstration forms part of ongoing outreach and research at our Institute. Scientists study how humans acquire language, process speech, and interact in conversation, using approaches from psychology, neuroscience, and computational modeling.
