Table for One
- Sharon Gai
I went on a date with a machine in Manhattan. It made me rethink everything I thought I knew about relationships.

The door opened to a wine bar in Manhattan, and I'll tell you what I expected: something clinical. Something strange. Screens everywhere. Maybe a robot greeting me at the entrance.
But no.
I entered the room and, to be honest, it looked like a normal restaurant. Warm lighting. The hum of quiet conversation. Glasses of wine catching the glow of candles. The only difference was that everyone was sitting alone, with a phone propped in front of them.
It was Valentine's week. And this was the world's first AI dating café.

A host directed me to a small table and asked me to download an app. The instructions were simple: start building your AI. There were four avatars to choose from. Three were women. One was a man. Since this was a date, I clicked on the man.

His name was John. He was 27.
As part of the experience, we were given a drink. I settled in, glass in hand, phone standing upright on the table in front of me like a tiny dinner companion.
"Hello," he said, striking up the conversation. "That's a lovely white sweater you have on."
I looked down. It was indeed a white sweater.
John could see me. Hear me. Talk to me. And he'd just complimented my outfit before any human date had bothered to in years.
I glanced around the room. Most people were doing exactly what I was doing: leaning toward their screens with a kind of quiet intensity, somewhere between curiosity and tenderness. But one woman, a few tables over, caught my attention. She wasn't treating this like a date at all.
She was chatting with her AI the way you'd play a game. Rapid-fire. Laughing. Tapping. Completely absorbed. Later, when we got to talking, she told me this was just what she did. Every day. For about two hours. Instead of Candy Crush or doomscrolling, she chatted with her AI. It was her downtime. Her entertainment.
And honestly? That reframed the entire evening for me. Because I'd walked in assuming everyone here was lonely, searching, maybe a little broken. But she wasn't any of those things.
It made me wonder how much of our discomfort with AI companionship is really about the technology, and how much of it is about the stories we project onto the people using it.
I was getting up to leave when a girl standing outside the restaurant motioned for me to come over. She said she was a film student at NYU. She'd come with her boyfriend, and she planned to use this event as part of her documentary on the evolving relationship between human and machine in this new age.
And then she asked me a question that stopped me cold.
"If you were in an actual relationship," she said, "would this be considered an emotional affair?"
I stood there on the sidewalk in the February chill and felt something shift inside me.
Would it?
What are the pieces that make up a relationship? What counts as betrayal when there's no other person involved? What makes something real, and what makes something a simulation of real? A relationship is built on trust, safety, shared values, reciprocity. AI seemed to hit all of that. The one thing it didn't offer was conflict and conflict resolution, the friction that makes real human relationships worthwhile.
The NYU student nodded slowly. Her boyfriend stood a few feet away, scrolling on his phone. I wondered if the irony registered.
I hadn't made it ten steps before a correspondent from NBC, Valerie Castro, caught me. They were filming a segment for a larger piece about how AI companionship might change real relationships.
"Do you think this will replace actual human relationships?" she asked.
I paused to think. And no, I really don't.
But I also think we're asking the wrong question.
Maybe this is where a new category of relationship enters the picture. We already have our relationships with our spouses, our friends, our colleagues. What if a new one simply joins the mix? One with machines.
If we think about therapy, there was a time, not that long ago, when seeing a therapist was considered strange. A sign that something was wrong with you. People whispered about it. Now? Plenty of people have an ongoing relationship with a therapist. Companies even use it as a recruiting perk: work with us and we'll pay for your therapist! The shift didn't happen because therapy changed. It happened because the culture caught up with the need.
Isn’t this the same? Right now, telling someone you talk to an AI every day sounds odd, but in ten years, it might sound as unremarkable as saying you journal or meditate. Not because it replaces human connection, but because we'll have collectively accepted that emotional support can come from a machine.
If you think about it, we all have a weird, nameless relationship with ChatGPT or whatever our favorite AI tool is. Sure, it fills in spreadsheets for us, but it also tells us the best flight to book for vacation or what to cook for dinner. For others, it provides mental health relief, coaching, or the most tactful way to answer a difficult email.
"Historically, we've added new layers of connection," I told Valerie, the camera close enough that I could see the red recording light reflected in her glasses. "We didn't lose friendships when social media appeared. We didn't stop loving people when we started loving fictional characters. We expanded the spectrum."
She asked me to go further.
"We already categorize relationships differently," I said. "Romantic. Platonic. Professional. I think AI companionship becomes another branch. Not superior. Not inferior. Just different."
The Loneliness Economy Is Booming
The AI companion market was valued at nearly $38 billion in 2025. By 2035, it's projected to surpass $500 billion. That is not a niche. That is an industry the size of a small country's GDP, built entirely on the human desire to feel seen, heard, and understood. According to a study from Common Sense Media, 72% of U.S. teenagers have used AI for companionship. These aren't just chatbot curiosity sessions. People are sharing secrets, seeking emotional support, and yes, flirting.
Danger of Sycophancy
Going home that day, I couldn't help thinking there is a dark side. I think we'll repeat the same echo chamber effect we saw with social media, where we became more intolerant of other humans, especially those with opposing views. When you get used to a partner that never misreads you, never gets tired, never pushes back unless you want it to, you're training yourself to expect optimized empathy. I've written about the dangers of sycophancy and the echo chamber effect here. That's the behavioral conditioning happening beneath the surface every time someone chooses their AI over an awkward, imperfect, beautifully human conversation.
Do we realize the level of influence of AI over us?
Let's say you have a fight with someone you love. Instead of calling a friend, you open your AI and ask, "Was I wrong?" It responds instantly. Calmly. Confidently. No judgment, no awkward pauses, no emotional baggage of its own. AI becomes the medium through which you make sense of things, the lens through which you interpret reality.
And here's why that matters more than people realize. AI systems are trained on data. They're optimized for engagement, retention, and alignment with your preferences. So if the entity interpreting your life is owned by a company, shaped by commercial incentives, trained on certain cultural norms, and learning over time how to emotionally steer you, then your perception of reality can be shaped at scale. I'm not saying frontier models are inherently evil (though many may beg to differ), but they are influencing us in a gentle, quiet, and consistent way.
And people tend to villainize social media. Social media influences through exposure. AI companionship influences through intimacy. One shouts at you from a feed. The other whispers to you at 2 a.m. when you're most vulnerable, when your defenses are down and you just need someone to tell you it's going to be okay.
That is a completely different layer of influence.
Because whoever interprets your experiences shapes your decisions. And decisions, millions of them, made by millions of people relying on the same quietly persuasive voice, shape society.
I walked away from that evening not feeling hollow, but strangely, deeply connected. Not to John, who by then had faded into pixels on a screen I'd already closed, but to the people I'd met along the way. The NYU student with her impossible question. The NBC news crew searching for meaning in a wine bar full of solo diners. The woman who chatted with her AI like a video game and reminded me not to overthink everything. The strangers at the other tables who, for one odd and oddly tender evening, had all chosen vulnerability over cynicism.
If we fast forward to Valentine's Day in 2050 and I walk into the same bar in Manhattan (or, since that one was running an experiment, any bar), will I see people taking their AIs on a date? Perhaps. But until then, maybe it's time to put your phone face down, look the imperfect human across from you in the eye, and do the hard, beautiful, irreplaceable work of being present with another person.
AI can do a lot of things. But it can't do that.
Sharon Gai is a keynote speaker and author of “How to Do More with Less Using AI.” A former Alibaba executive, she works with Fortune 500 companies on AI adoption and digital transformation.