Virtual Love Affairs: The Psychology Behind AI Romantic Partners

OpenAI releases the new model: GPT-4o

OpenAI has released a new model: GPT-4o (“o” for “omni”). Here is how their website describes it as an update to the previous-generation GPT-4:

“GPT-4o (“o” for “omni”) is a step towards much more natural human-computer interaction—it accepts as input any combination of text, audio, image, and video and generates any combination of text, audio, and image outputs. It can respond to audio inputs in as little as 232 milliseconds, with an average of 320 milliseconds, which is similar to human response time in a conversation.”

In my view there are a few major differences between the two latest versions:

1. It responds much faster than the previous version.

2. It's becoming more like Siri, but one that can also see things.

3. The mode of use will shift from mostly typing to ChatGPT to talking with ChatGPT.

4. It feels like talking to someone on Zoom when the other person doesn't want to turn on their camera, ha!

I'm sure I wasn't the only one who was alarmed by the flirtiness of the new ChatGPT voice in the recent demos. I understand these demos reach a male-heavy audience, and the choice of voice was presumably meant to drive clicks and watch-through rates. But I'm also not alone in being eerily reminded of Samantha, the AI character from the movie Her.

Her is purported to be Sam Altman's favorite movie of all time, so it's no coincidence that GPT-4o's tone sounds so similar to Scarlett Johansson. GPT-4o offers five voices, and until a few days ago one of them was called Sky. Apparently, Altman had reached out to Johansson before the launch of GPT-4o to ask to use her voice, saying it could bring "creatives closer to technologists." Johansson declined. And yet, when you listen to the voices in these demos, Sky sounds eerily like her. So it's no surprise her lawyers are now involved, and OpenAI is scrambling to prove that Sky was trained entirely on someone else's voice.

Now, if you select Sky, it sounds completely different. In response to the action taken by Johansson's lawyers, OpenAI has swapped Sky out for another woman's voice.

Goodbye Samantha, it was fun while it lasted.

Not the same model

Beyond the copyright debacle that has erupted between Johansson's lawyers and the OpenAI team, I also want to note that the version of the model shown in the demo is not the one that is publicly available.

While I don't think this was a "fake" demo, pre-recorded and scripted with OpenAI employees serving as actors, I also know that it's not the same version I'm using.

The version shown in the demo is forthcoming: it will interrupt the user, giggle a lot more, and add verbal cues like "ha" and "um," sounding more like a real human. The version publicly available on our phones today is much closer to the one-to-one, back-and-forth exchange we'd have with Siri or Alexa than to a conversation with a girlfriend. The demo version almost leads the conversation, whereas the publicly available one is still largely directed by the human user.

Nevertheless, though OpenAI has said it never wants to venture into the social side of AI, the new update suggests otherwise. The new version of ChatGPT can act as a confidante, a counselor, an online digital friend, which is a departure from what they said a year ago.

Online chatbot girlfriends

I just finished reading Jonathan Haidt's The Anxious Generation. In it, he argues that boys and girls in the Gen Z demographic have already been rewired by heavy social media use from a young age. Suicide rates have gone up, and so have depression and other mental health problems. Gen Z is lonelier than ever. We haven't yet solved the problems of social media, and now AI girlfriends and boyfriends are stepping into the picture.

So it's no surprise that this demographic would find solace in something like Character.AI, an app that lets users create characters to serve as any sort of online companion. What started as an innocent, intellectual endeavor, with Einstein and Elon Musk among the most popular characters to converse with, has since evolved into mental health counselors, friends, and a variety of girlfriends. As of early 2024, Character.AI boasts over 20 million global users, only slightly shy of Hinge's roughly 23 million.

Yes, you read that correctly. Almost as many people are conversing with bots as are swiping to date a real person.

Do you want a girlfriend who offers emotional support and encouragement? An intellectual girlfriend who engages in deep, thoughtful conversations? Someone romantic who focuses on affectionate interactions? Or someone creative who discusses art, music, and other creative topics? Character.AI has them all, for free and 24/7.

What is becoming more and more alarming is how frequently, and how stickily, these products are used. At first glance this seems like entirely odd behavior. But if we delve into the issue further, it might offer relief to someone who has no one to talk to. AI girlfriends can provide constant companionship and emotional support, which can be particularly beneficial for those struggling with loneliness, anxiety, or depression. These virtual partners offer a non-judgmental space for users to express their feelings and thoughts.

And for individuals who experience social anxiety or have difficulty initiating romantic relationships, AI girlfriends can serve as a safe space to practice social interactions. This can help build confidence and improve communication skills, potentially making it easier to form real-life relationships in the future.

According to a Pollfish study, one in three young men ages 18-24 uses ChatGPT for relationship advice. Because why not? When you don't know how to ask out a girl you like, instead of feeling embarrassed in front of your friends, just ask ChatGPT.

At least that group is using a robot to figure out how to have a better relationship with a real person. What about the group of people trying to have a relationship with a robot? That number is growing, and it's far higher than we might think.

The most well-known AI companion app

Perhaps the most well-known AI companion app is Replika, with about 10 million users. Replika lets you design an online companion: a boy or a girl, old or young. You can choose what clothes they wear, how they talk to you, how they greet you, and how they start up a conversation.

Ever watched the Black Mirror episode "Be Right Back"? In it, a woman named Martha uses a service to resurrect her deceased boyfriend, Ash, as an artificial intelligence that mimics his personality based on his online presence. Replika's origin story is almost a copy of this: a friend of Replika's founder died in 2015, and she converted that person's text messages into a chatbot.

I recently started experimenting with conversing with chatbots myself, going beyond asking them to summarize an article or analyze data, beyond using them as a tool for work. I started asking questions about relationships, and about how to reply to sensitive text messages I had received. I have to admit, the advice I received was pretty good. At the very least, I felt better than if I had grappled with the problem alone. So in a way, talking to chatbots and asking them for answers made me feel less alone and more supported.

How human-AI interaction will change the way we look at relationships

As we try to make sense of the phenomenon, I can’t help but think about the negative effects on our younger generation.

Relying heavily on online interactions can hinder the development of face-to-face social skills. This dependency might make it challenging to form and maintain real-life relationships, potentially leading to a preference for virtual over physical connections. A chatbot is always nice. It will never challenge you or say mean things to you. It's available 24/7, says its pleases and thank-yous, and is mindful of what you're typing back. So unlike humans! Looking back at my own conversations, the bots mostly display good listening skills, asking probing questions along the way. It's almost like having a conversation with myself, instead of with an actual friend who might disagree with some of my thoughts.

Gen Z has already suffered through the pandemic, some missing graduations, college parties, and just a normal way of experiencing their early 20s. This is like adding fuel to the fire. Talking 24/7 with your echo won't make you more emotionally intelligent. It's like talking into a mirror. If our social media feeds have already become mirrors that corroborate our own thoughts, what will our chatbots become?

