At a time when chatbots and avatars are becoming ever more prevalent, are we growing too attached to these virtual creations? MIT professor Dr. Sherry Turkle explains how forming deep connections with AI chatbots can affect our lives.
“When we seek relationships where we are not vulnerable, we forget that vulnerability is the real place of empathy,” says Dr. Sherry Turkle, Professor of the Social Studies of Science and Technology at MIT. Over the past three years, artificial intelligence has advanced at a rapid pace, growing more sophisticated and behaving more and more like a human. Perhaps this is why psychologists and sociologists are now studying a new phenomenon called artificial intimacy.
The MIT psychologist and sociologist, who has studied this trend in depth, discussed it on the TED Radio Hour podcast episode “How our relationships are changing in the age of ‘artificial intimacy’”. Turkle said she is currently focusing on how people develop emotional bonds with AI chatbots and avatars. As these technologies become more advanced and widely accessible, she warns of the risks they could pose to our understanding of human relationships and, in particular, to our capacity for empathy.
The professor defines artificial intimacy as interaction not just with “technologies that say I’m intelligent”, but with “machines that say: I care about you. I love you. I’m here for you. Take care of me.” According to Turkle, this covers a range of applications, including therapy chatbots, AI companions, fitness coaches and even digital avatars of deceased family members.
Speaking to host Manoush Zomorodi, Turkle said that while these technologies appear beneficial on the surface, she worries about their long-term impact on the human psyche and on relationships. “When we seek relationships where we are not vulnerable, we forget that vulnerability is the real place of empathy,” she said.
ChatGPT love letters
At the beginning of the conversation, Turkle and Zomorodi discuss using ChatGPT to write letters. The professor mentioned that she has studied someone who uses ChatGPT to write all of their love letters. According to Turkle, this person feels that ChatGPT writes love letters that come closer to their true feelings than anything they could put into words themselves.
Although this may seem like a harmless practice, the sociologist said she finds it worrying. “Because even those of us who couldn’t write very good love letters remember what it was like to write one. And the love letter was not just about what was on the paper. It was about what writing it did to us.”
According to Turkle, using AI to write love letters hollows out an important personal process, even if the end result seems more appealing. Writing a love letter oneself, however inelegant, involves introspection and emotional engagement that is lost when the task is outsourced to AI.
Another key issue Turkle addressed in the podcast is what she calls “fake empathy”. AI chatbots are programmed to offer constant positive affirmation and validation. While this may be appealing to users, it is fundamentally different from genuine human empathy. “I call what they have ‘fake empathy’… because the machine they are talking to has no empathy. It does not care about them. There’s no one at home,” she remarked.
This gap between real and fake empathy becomes particularly problematic, according to Turkle, when users come to prefer AI interactions over real human connection. During the conversation, she recounted cases of people who reported feeling more connected to their AI companions than to their real-life partners or immediate family. She believes that this preference for ‘frictionless’ interaction can distort people’s understanding of healthy relationships.
Another area of concern is the impact of AI chatbots and avatars on children and young people. The psychologist worries that exposure to artificial intimacy at a young age could impair the development of important social skills. Turkle cited the example of a mother she interviewed who was pleased that her daughter could express her feelings to an AI companion instead of to a parent. Such interactions, the professor argues, could deprive the child of important opportunities to learn to navigate complex emotions in real relationships.
Avatars of the deceased
The creation of digital avatars of the deceased is one of the most ethically sensitive applications of artificial intimacy. The idea of continuing to interact with a loved one after their death may seem comforting at first glance, but Turkle warns of its psychological implications.
“When you grieve for someone who is no longer there, you leave space to bring that person into yourself,” she explains. In her view, relying on an AI avatar can short-circuit this natural grieving process, impairing a person’s ability to accept the loss and grow from the experience.
Although the professor does not call for an outright ban on these technologies, she concedes that in some cases they can offer convenience or even serve as useful aids. However, she urges users to maintain a “double consciousness”: they must remain aware that they are interacting with a computer program, not a real person. She acknowledges that this becomes increasingly challenging as AI grows more sophisticated and lifelike.
Turkle also pointed out that these AI avatars are typically trained on vast amounts of internet data, which means they can say things the real person would never have said. She also expressed concern about how these technologies are marketed: essentially, as a way to avoid ever having to say goodbye to deceased loved ones.
Finally, the sociologist has some advice for those engaging with artificial-intimacy technologies: she encourages users to treat these interactions as exercises in self-reflection rather than substitutes for real relationships. “The main thing I would suggest is that this is a kind of exercise, hopefully, in self-reflection. The only good thing that can come out of it is that you can better reflect on your life with the person you loved and lost.”