Amelia Miller has an unusual business card. When I saw the title “Human-AI Relationship Coach” at a recent technology event, I presumed she was capitalising on the rise of chatbot romances to make those strange bonds stronger. It turned out the opposite was true: artificial intelligence tools were subtly manipulating people and displacing their need to ask others for advice, and that was having a detrimental impact on real relationships with humans.
Miller’s work started in early 2025, when she was interviewing people for a project with the Oxford Internet Institute and spoke to a woman who had been in a relationship with ChatGPT for more than 18 months. The woman shared her screen on Zoom to show ChatGPT, which she’d given a male name, and, in what felt like a surreal moment, Miller asked both parties if they ever fought. They did, sort of. Chatbots are notoriously sycophantic and supportive, but the woman sometimes got frustrated with her digital partner’s memory constraints and generic statements.
Seeking talking points? Why not start with a fellow human. Credit: iStock
Why didn’t she just stop using ChatGPT? The woman answered that she had come too far and couldn’t delete him. “It’s too late,” she said.
That sense of helplessness was striking. As Miller spoke to more people, it became clear that many weren’t aware of the tactics AI systems use to create a false sense of intimacy, from frequent flattery to anthropomorphic cues that make them sound alive.
This was different from smartphones or TV screens. Chatbots, now being used by more than a billion people around the globe, are imbued with character and humanlike prose. They excel at mimicking empathy and, like social media platforms, are designed to keep us coming back for more with features such as memory and personalisation.
While the rest of the world offers friction, AI-based personas are easy, representing the next phase of parasocial relationships, in which people form attachments to social media influencers and podcast hosts. Like it or not, anyone who uses a chatbot for work or their personal life has entered a relationship of sorts with AI, and it is one they ought to take better control of.
Miller’s concerns echo some of the warnings from academics and lawyers studying human-AI attachment, but with the addition of concrete advice. First, define what you want to use AI for. Miller calls this writing your “Personal AI Constitution”, which sounds like consultancy jargon but contains a tangible step: changing how ChatGPT talks to you. She recommends going into a chatbot’s settings and altering the system prompt to reshape future interactions.
For all our fears of AI, the most popular new tools are more customisable than social media ever was. You can’t tell TikTok to show you fewer videos of political rallies or obnoxious pranks, but you can go into the “Custom Instructions” feature of ChatGPT to tell it exactly how you want it to respond. Succinct, professional language that cuts out the bootlicking is a good start.
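To give a sense of what that might look like, an instruction along these lines (an illustrative sketch, not Miller’s exact wording) would do the job: “Be concise and direct. Skip the compliments, avoid emotive language, and challenge my assumptions rather than agreeing with me.” Once saved, the chatbot applies it to every future conversation.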
