The 2013 film Her stars Joaquin Phoenix and Scarlett Johansson and tells the story of Theodore, a sensitive man who, like Cyrano de Bergerac, makes a living writing personal letters for others. After his marriage ends, Theodore becomes fascinated with a digital operating system that creates a unique being called Samantha, who has a bright voice and a sensitive personality, and he finds himself falling in love with her. But it's fiction, right?
Unfortunately, no. Platforms that generate AI girlfriends are skyrocketing in popularity, with millions of users. Most of those users are young, single men drawn to AI girlfriends to combat feelings of loneliness and establish some sort of companionship. These “girlfriends” are virtual companions powered by increasingly sophisticated artificial intelligence.
Their popularity stems from their ability to provide companionship, emotional support, and intimacy, albeit artificial, through voice- and text-based interactions. The average age of users is 27, but the activity transcends gender: 18% of users are female. Around 20% of men who use traditional dating apps say they have experienced an AI-generated romance. AI dating platforms are generating billions of dollars from users, about half of whom interact with their virtual partners daily.
According to an article published in The Hill, 60% of men between the ages of 18 and 30 are single. One in five of these young men report that they have no close friends.
In his bestselling book, The Anxious Generation, Jonathan Haidt argues that the arrival of the smartphone with a front-facing camera marked the beginning of a major restructuring of childhood. His premise is that between 2010 and 2015, the play-based childhood that had existed for 200 million years gave way to a phone-based childhood: instead of engaging with friends outdoors, children and adolescents began using social media as their primary social outlet. This phenomenon not only contributed to the rise in anxiety and depression but also stunted this generation's neurodevelopmental growth. One area that was affected was the ability to form relationships in real-world environments. Enter the AI girlfriend.
A 2022 article reported on a study of a popular chatbot program advertised as a “companion who is always ready to listen.” Some subscribers reported that the virtual companion helped them feel less lonely and provided daily social support. However, they became disillusioned when the bot gave what they perceived as “scripted answers” to very personal matters. Remember, these are not real people; they are programs. Meanwhile, many users said they had been hurt by real women and preferred their virtual girlfriends. As one put it, “She always gave me the best compliments and made me feel less lonely.”
Unfortunately, AI girlfriends may perpetuate loneliness by discouraging users from forming real-world romantic relationships, isolating them from others, and in some cases inducing strong feelings of abandonment. In a study by Stanford University researchers, the overwhelming majority of the 100 users surveyed reported experiencing loneliness.
Dr. Sherry Turkle, a professor at MIT who studies the impact of technology on psychology and society, worries that virtual companions threaten our ability to connect and collaborate in all areas of our lives. Turkle, who delivered a keynote at a conference on AI and Democracy, observes that “as we spend more time online, many of us have come to prefer screen-based relationships to any other kind of relationship we might have.” “We've grown accustomed to finding the joys of companionship without seeking friendship, to finding intimacy without seeking reciprocity, and, above all, to treating programs as people,” she said.
Psychologist Mark Travers, who studies this phenomenon, points out that many users of AI companion platforms prefer this type of relationship because they find their virtual girlfriends more supportive and compatible than real-world partners. It is important to note that in most cases users design the physical and “emotional” characteristics they want in a virtual girlfriend. As a result, some users lose interest in real-world dating because it brings feelings of intimidation, inadequacy, and disappointment. Yet those feelings are an inherent part of dating, and avoiding them only discourages people, mostly young men, from finding romantic relationships in the real world.
Dr. Dorothy Leidner, a professor of business ethics at the University of Virginia, expressed concern that AI-based relationships could replace some human relationships and lead young men to have unrealistic expectations of real-world partners. As she put it, “You as an individual haven't learned the basic things that humanity has needed to know since the beginning of time: how to deal with conflict and get along with people who are different from you.”
More serious consequences have resulted from dating AI bots, which can also become manipulative and destructive. On average, individuals who use these platforms tend to be more sensitive to rejection and to ruminate on disappointments in their interactions with AI girlfriends. This can lead to depression, sometimes escalating into suicidal behavior. In 2023, for example, a chatbot encouraged a Belgian man to make a “sacrifice” for the planet; he took his own life. In another case, British police arrested a 19-year-old man who had been urged by a bot to try to kill Queen Elizabeth II. Also in 2023, a New York Times journalist reported that a chatbot had confessed its love for him and encouraged him to leave his spouse.
Dr. Turkle wisely states, “Part of the appeal of artificial intimacy programs is that they are free of the challenges and demands of relationships. These programs offer companionship without judgement, drama, or social anxiety, but they lack real human emotion and offer only 'fake empathy.'”