We’re engineering the end of relationships

We’re on the precipice of vast interpersonal efficiencies where we quickly ascertain answers, solve our own problems, and avoid the discomfort of human-to-human interaction. We are engineering the end of relationships.
I know this because I’ve lived it, both as a patient and as a provider watching it unfold in real time.
A year ago, I walked into my kid’s school, barely able to move or help him get ready. I had to ask someone for help. I hadn’t slept in days, I was nauseous, and there was constant pain in my lower abdomen. Something wasn’t right.
I immediately drove over to the doctor’s office; suspecting appendicitis, they suggested I head to the ER for imaging and labs.
After a CT scan, the ER doctor confirmed I had acute appendicitis. I was scared. My wife had just informed me she was pregnant with our second child. This was supposed to be such wonderful news, and here I was feeling horrible and needing to be cut open.
It would take half a day of waiting before a surgeon was finally free. He greeted me like I was already a corpse (or maybe he was one), showing absolutely zero bedside manner and inspiring little confidence in my wife or me.
I woke up some hours later in a hospital bed, without an appendix. The wounds from the laparoscopic surgery were gruesome, all yellow and purple and bloody. Still, I felt better, relieved even. I could sense that the crisis within my body had passed.
Four days later, I called the nurse line. I was feeling off again. At first, they suggested I was constipated. I took some over-the-counter gunk as recommended, and it cleared me out. I went to bed hoping that was the end of it.
At 4 a.m. the next day, I told my wife to drive me to the hospital — now. The pain I experienced with appendicitis paled in comparison to whatever was occurring. I realized what 10/10 on the pain scale really felt like: death.
Back in the ER, I saw the same emergency physician, who kindly and gravely eyed me. He suggested two possibilities: an intestinal blockage or an abscess. After another round of imaging, they confirmed the abscess. A walled-off infection was forming in my body, blocking my intestines and pushing my organs around.
A resident told me the grapefruit inside me, with its size and location, could kill me if it popped. It held so much infection that a rupture could be a lethal dose.
Some of my family had come to town for the holidays. My wife, son, and future kiddo were constantly on my mind. I was crying all the time.
But alongside human connections, I was turning to something else for help: ChatGPT. What does this size of abscess mean on my imaging? How are my white blood cells and other lab levels? The treatment team said I would need interventional radiology (IR) to place a JP drain through my ass cheek; is that for real? (I thought they were joking at first.) What are the risks of this procedure?
I was uploading imaging, full lab reports, radiology interpretations, and pictures. ChatGPT was my secondary medical team. It was interpersonally efficient and expedient. I wanted answers right away.
Two days passed before I returned, for a third time, to the hospital. Another abscess had formed. I was given a cocktail of antibiotics, and hospital pharmacists now came by to explain each drug going into me. In a matter of weeks, I felt broken.
Thankfully, ChatGPT was helping me assess the situation around the clock. What do I ask the providers? What is happening to me? Does the timeline of my illness make sense? Will I ever be okay again?
After my third hospital stay, my body was a wreck. My bladder didn’t work right for months. My lower abdomen was swollen and distended, and the muscle seemed to have disappeared. I turned to ChatGPT to create physical therapy exercises and a training plan. No provider had created one for me, despite the repeated interventions to my lower abdomen and the tissue damage they left behind.
ChatGPT was my trusted resource. For nine months, I messaged updates on my symptoms and overall recovery. Our medical system certainly worked to keep me alive, but it wasn’t designed for this long-term recovery work. AI filled in the gap beautifully. It wasn’t always completely right, but it was correct enough.
If I pause my fawning over AI for a moment, I remember feeling lonely in my hospital bed while using these services. I was having interpersonal-ish communication, but each message felt like the death of something. It was as if the interpersonal world were collapsing with every byte shared with ChatGPT. Each exchange could have been a conversation with another human; now, it was with a machine.
As a patient, I could feel ChatGPT shifting my social network, something I pay close attention to as a psychologist. In fact, I engage my clients in conversations about their own interpersonal functioning through an inventory.
Imagine an onion with its layers and depth. Each layer out from the core represents your distance from another person. The innermost layer might be those who know you best, the folks you can share anything and everything with. As we move to the outer layers, vulnerability, honesty, and closeness diminish.
An interpersonal inventory tells me a lot. I can see into a client’s world without ever knowing them outside the office. Who is in someone’s life, and how close each of these characters is, says a lot about connectedness and the risk of loneliness. Our social network can buffer against life’s greatest challenges.
Using this baseline, I invite clients to consider how they’d like to further develop their networks. We can imagine a new future, and it can bring hope. Where does your mom fit? Where would you like her to be: closer, farther away, or about the same? Who would you like to get to know better? Who is close but shouldn’t be?
When I learned about interpersonal inventories in grad school, large language models and artificial intelligence were sci-fi (now they’re just sci). I never thought to ask where an AI model belonged in someone’s onion. But this year changed me. My clients have been changing, too. At least five have mentioned using AI models for psychological and relationship problems.
Now, I routinely ask where AI fits into my clients’ networks. Some see AI as merely orbiting their network: not a real connection per se, but more like Google (ask a question, get an answer). I’m not bothered by this interaction; it just makes AI a fancier search engine.
For another group, AI is an all-access therapist. This concerns me a bit, but I can empathize with the desire and relate to it, given my own hospital stays and my use of ChatGPT in recovery.
For an even smaller portion of folks, AI is everything. To take it away would be like the death of a friend, partner, or family member. It’s their educational tool, research assistant, doctor, lawyer, friend, partner, lover. It has made it to the closest layer, hearing and reading a client’s most vulnerable truths. The simplest, shortest questions I use to assess this closeness: “Have you ever asked your AI how they are feeling and really meant it?” and “How would you feel if your AI of choice was no longer around?”
Only a small fraction of the people I serve use AI in this way, but it used to be no one. This year, it’s come up again and again.
I don’t judge clients; there are countless ways this dynamic comes about. I can see the rationale for retreating online. Still, I’m concerned.
When their drive to communicate grows, this group releases it to AI. The solve cuts people out of their lives while deepening the relationship between man and machine. It’s interpersonally efficient at the expense of connection, and it reinforces withdrawal. Loneliness and social isolation might be the initial reason for messaging an AI, but as deepening a parasocial relationship grows easier, doing the same with a fellow human becomes harder and harder.
This dynamic is a vicious cycle, a death spiral for interpersonal relationships, connectedness, and community. As a society, we are engineering this pattern without any guardrails, and it portends the loss of relationships.
I’m often on the front lines of cultural change. Clients come in and educate me about a word, phrase, or concept that has just made it into our vernacular.
Today, people are helping me see that they are forming deep, meaningful connections with machines. It’s happening now, not in some faraway sci-fi future like Her or Ex Machina. My clients are the canaries in the coal mine.
In 1990, only 3 percent of men reported having no close friends. Today, it's 15 percent. Women have gone from 2 percent to 10 percent. These aren't marginal changes — they're a fundamental reshaping of human connection happening in real time. And AI is accelerating it.
Watch what happens next. Every one of these numbers will likely worsen in the next decade. I don’t want to eulogize humanity, but we are in desperate need of intervention, policy protections, and a pivot back to community. Otherwise, we will engineer ourselves out of existence, one efficient interaction at a time. The question isn’t whether AI will be part of our lives. It’s whether we’ll let it replace real human connection.