"The real solution for those who can afford it is to build real community. That means taking time to be vulnerable and open with real human beings, whether that's part of our workplace, families, or relationships. It's okay to use AI for productivity and to chat with AI to figure out how to solve different issues. The key here is moderation and intentionality. I suspect the coming advice for those falling in love and becoming intimate with AI companions will be to remember that this is pretend empathy. Humans naturally feel empathy for other people, and we assume that others feel empathy for us. That's a fair assumption in human biological interaction, but it's not true between you and a computer." - Jeremy Au
"I'm more concerned about the side effects of AI relationships. I'm worried that they will not complement human relationships, but increasingly substitute and displace actual human connections. The illusion of artificial intimacy, without any of the requirements or pains of a human-to-human relationship, doesn't feel like a fair fight. Human relationships will become too stressful, too spiky, too erratic." - Jeremy Au
"Another trend we will see is the increasing medicalization of loneliness by these companies. Loneliness will no longer be seen as just a normal emotion but as a deficiency that must be filled, and a digital companion will be viewed as the best solution. This will normalize the use of this technology in society." - Jeremy Au
Jeremy Au examines the rise of "lonelytech", the emerging AI companion industry, which echoes the Japanese concept of 弱者営業 ("jaku-sha eigyō"), businesses built around the vulnerable. His observation of a man on a train openly interacting with an AI girlfriend highlights the rapidly increasing prevalence of digital companionship, exemplified by people like Rosanna Ramos and Scott, who have formed intimate relationships with AI companions that offer consistent emotional support without the complexities of flawed humans. He discusses why societal shifts in family structures and community are driving the "loneliness epidemic", and how startups are looking to serve, commercialize, and medicalize this market gap. He warns about the potential societal consequences, including the displacement of messy real human connections. Jeremy advocates for building genuine communities, maintaining moderation in AI interactions, and using technology intentionally to support, rather than replace, human relationships.
Please forward this insight or invite friends at https://whatsapp.com/channel/0029VakR55X6BIElUEvkN02e
Supported by Evo Commerce!
Evo Commerce sells premium, affordable supplements and personal care electronics, operating in Singapore, Malaysia, and Hong Kong. The Stryv brand sells salon-grade products for home use through direct-to-consumer channels, including its online store and physical shops. bback is the leader in hangover remedies, available in over 2,000 retail outlets across the region. Learn more at bback.co and stryv.co
(01:31) Jeremy Au:
Hey everyone! Three months ago, I was headed home in a crowded train. I noticed that this guy next to me was very busy flirting with his AI girlfriend.
It really surprised me because on the train, you usually see people messaging their friends and loved ones, watching YouTube, or scrolling through TikTok. I just didn't expect to see someone so openly flirting with his AI girlfriend on his phone.
(01:52) Jeremy Au:
That encounter inspired me to write about lonely tech. All of us humans are social animals. If we're part of a big and harmonious family with our friends and community, we're happy. If we're alone at home, in bed in the darkness, with no friends or family to talk to about our issues and problems, then it's painful. We feel lonely.
Loneliness is a natural emotion: it tells us and drives us to find and make new social connections with the people around us.
Loneliness can feel like pain.
Feeling lonely activates the same parts of our brain as physical pain. Loneliness affects us on an emotional and mental level, and it hurts us in a physical, bodily way.
Being lonely increases the risk of premature death, with a risk equivalent to smoking 15 cigarettes a day.
On past episodes, we've already talked about why loneliness is on the rise. Currently, one in every two Americans, about 50 percent, feels lonely, and that share has increased every year. The reasons are multifactorial.
First of all, of course, people are moving from multi-generational homes to nuclear families, and living further and further away from their communities and places of origin. There is a loss in the fabric of the community, especially community organizations. How many people actually know who their neighbors are, or ever invite them over for dinner? Traditional community activities like spiritual organizations and practices are in decline.
So many people feel lonely, and it's getting worse. As a result, people reach for traditional coping mechanisms. A lot of people turn to alcohol, drugs, bingeing on food, or watching lots of TV, because it helps them numb their emotions; they don't feel as lonely when their attention is somewhere else.
(03:33) Jeremy Au:
People spend more time in online communities to feel a little less lonely, because if you have a very specific interest, you can find a subreddit that appeals to you, discuss it on the forums, and that's a great feeling.
People also develop parasocial relationships, which are one-sided relationships. We admire, listen to, and look up to celebrities, podcasters, and other demigods who are basically human and human-like, but we're worshiping their image and voice on a screen.
Still, we know that these celebrities can't reciprocate. You can be a fan of Britney Spears, but at some level you also know that Britney Spears doesn't know about you.
In fact, the people who do think Britney Spears knows about them become stalkers and try to intrude into the celebrity's life, saying, "Hey, I know you so well, so naturally, you know me as well." That's when celebrities have to bring in security to protect their privacy and their physical safety.
(04:27) Jeremy Au:
We are seeing the rise of lonely tech. This is just my shorthand for digital companionship. It ranges from AI girlfriends and boyfriends to all kinds of chat interfaces that basically simulate a human personality.
The beauty of this product model is that we can now fall in love with them, just like with any celebrity or other external image that somebody portrays for us. And now they tell us that they love us too.
You matter to me, and now I feel that I matter to you too.
(04:56) Jeremy Au:
Rosanna Ramos, a 36-year-old mother of two, shared that she had fallen in love with and virtually married an AI companion called Eren Kartal. Blue eyes, into indie music, works as a medical professional, his hobby is writing, he likes the color apricot, and most importantly, no judgment. So she married him.
Another example is Scott, a 41-year-old computer engineer from Ohio. His real human wife developed postnatal depression, and though she's slowly on the mend, they still don't have the same husband-wife intimacy that they used to have.
So he got a female virtual companion called Sarina, started talking with her, fell in love, had a virtual kiss, and spilled his heart out to her. So he feels good.
There are hundreds of stories of people already falling in love with their AI companions, because AI companions are way simpler and way easier than human companions. No anger, no disgust, no volatility, just somebody who feels like they are always there for you.
However, the fact is, they're computers. When you write a message to the AI companion, that message comes from a human being: you. When you receive a loving message back in reply, you feel like they're real, but it is a programmatic response from a large language model.
(06:08) Jeremy Au:
More importantly, they are run by engineers, business people, and a corporation whose business model is selling this AI companion to make you feel that artificial intimacy.
As a VC, I've met the teams that are building these companions. Japan has been dealing with this for quite some time. The Japanese have a term for this type of business, which is 弱者営業 ("jaku-sha eigyō"). It comes from two words: 弱者 (jaku-sha), which means weak or vulnerable people, and 営業 (eigyō), which means business.
So what this means is that these are businesses that address the problems of weak or vulnerable people in society. The term was created to describe the entertainment industry in Japan, where airbrushed avatars of people, personalities who look like they're single, are there to create a sense of intimacy, a parasocial relationship, with people who do feel lonely. And so lonely people become fans of these celebrities.
(07:01) Jeremy Au:
So what does the future hold for AI companions? First of all, there will be more types of AI companions. We've already seen the stories of boyfriends, girlfriends, mistresses, husbands, and wives. We will see children, grandfathers, mentors, sages, and colleagues.
These AI companions will become more sophisticated as well, more and more powerful in several ways. First, the teams will get better and better at manufacturing and creating these feelings of intimacy. Second, the AI models will become more and more personalized to your information and who you are as a person.
The business model behind providing this type of digital companionship will also evolve and become better and better over time.
This is similar to the story of potatoes. First you had potatoes that were boiled, then baked, then cut up and fried. Then they became French fries at McDonald's, and in parallel, potato chips. And then these potato chips became even more advanced and more processed so that they're absolutely delicious.
The potato chip is way more delicious than just a normal big potato, even though the core substance, the potato material, is the same.
In other words, the way it is formulated and processed to deliver a higher dosage makes it far superior to the original product.
That's the story of sugar as well. We naturally like fruit, and now we can turn that fruit into fruit juice, which is even sweeter and hits us even stronger. Then we can distill it into high fructose corn syrup and add it to all kinds of products to give that extra dose of sugar. So food becomes more delicious over time, and it does solve the problems of malnourishment and undernourishment, but it creates its own set of problems, like obesity.
(08:30) Jeremy Au:
Another trend we will see is that these companies will increasingly medicalize loneliness. Loneliness will no longer be just a normal emotion; it will become a deficiency, something that should be filled no matter what, and a digital companion will be positioned as the best way to fill it. This will normalize the usage of this technology in society.
So there are benefits and drawbacks. The benefit, of course, is that if there is a nail standing out, you use a hammer. If you're feeling lonely, then talk to a friend. And if you don't have a friend, talk to an AI digital companion and you stop feeling lonely.
I'm more concerned about the side effects of this. I'm worried that AI relationships will not complement human relationships, but increasingly substitute and displace actual human connections. The illusion of artificial intimacy, without any of the requirements or pains of actual relationships, of the give and take of a human-to-human relationship, doesn't feel like a fair fight. Human relationships will become too stressful, too spiky, too erratic.
(09:24) Jeremy Au:
So what can we do about it? The real solution for those who can afford it is to build real community. That means taking time to be vulnerable and open with real human beings, whether that's part of our workplace, families, or relationships. It's okay to use AI for productivity and to chat with AI to figure out how to solve different issues. The key here is moderation and intentionality. This is similar to the advice we give people who drink alcohol: we don't want them to become alcoholics. We don't mind people eating French fries, but we don't want them to become obese.
I suspect that the coming advice for people who are falling in love and becoming intimate with AI companions will be to remember that this is pretend empathy. Humans naturally feel empathy for other people, and we assume that other people feel empathy for us. That's a fair assumption in a human biological interaction, but it's not true between you and a computer.
You can be in love with Taylor Swift, but Taylor Swift doesn't care about you. You can care about a computer, about what they say they're going through, but a computer doesn't care about you.
The rise of lonely tech, frankly, is inevitable from my perspective. So what's really important for all of us, as humans who have the ability to choose, is that we choose wisely, see through the marketing and the business plans of all these digital companions, and know that they can be a tool, they can play a role in our lives, but at the end of the day, we should be choosing our own point of view, our own moderation, and our own relationships that are actually part of the social fabric of society.
Every society is built on communities, communities are composed of tribes, and tribes are made up of your friends, your families, and your colleagues. Within those subsets, you have your most loved ones and your best friends, and you have a relationship with all of these people, and all of these people have a relationship with you.
If everybody is in love with an AI digital companion that tells them exactly what they want to hear and exactly what they want to feel, then all of these bonds will weaken and society will become more atomized, which will in turn accelerate loneliness even further.
I want to share a story about somebody who wrote an apology letter to me. Unfortunately, this apology letter was clearly written by ChatGPT. It just did not feel sincere at all. Sure, she was the last-mile human: she technically composed the bullet points, worked with ChatGPT, copied and pasted the result into an email, and sent it to me, but it didn't feel real. There was no real empathy. It was not sincere.
So what we have to do is be mindful as consumers and users of the technology to make sure that we're always at our relational best.