The Emotional Economy: When AI Becomes a Friend
From friendship apps to therapy bots - how tech is answering our need for connection
How many good friends do you have?
I mean really good friends. Not a follower on LinkedIn, but the kind of friend who will help you move house. Research across different cultures and generations consistently puts the answer at about 5 to 10 people.
✋🏼 On the one hand, we have the rise of self-reported loneliness in society, among people of all ages.
👐🏼 On the other hand, the most common use of AI bots is now ‘therapy and companionship’.
Today’s article brings these two hands together.
This week, as we open the doors to the Work 3 Community, we’re reflecting on the most human part of work: connection.
Join the Work 3 Community — And Help Shape the Future of Work
🎉 We’re live. Today, we launch the Work 3 Community — a new space to explore the transformation of work, together. Whether you're a researcher, freelancer, policy nerd, L&D lead, or curious human — this is your invite to think more deeply about how we work, and what comes next.
💥 Join as a Work 3 Community Member to get access to monthly webinars and group discussions.
🚀 Or step up as a Work 3 Founding Member (limited to 100) to get 1:1s, early invites, and our upcoming report: The Impact of AI on Work.
📬 Work 3 Weekly (Free) - stay subscribed and keep receiving our weekly ideas, essays, and briefings. Free forever.
We believe the next chapter of work needs more questions than answers — and that the best way to figure it out is together.
I Kissed An App and I Liked It
What happens when the fastest-growing category in tech isn’t productivity—but intimacy?
In a survey created by Marc Zao-Sanders, ‘therapy and companionship’ is the no. 1 use of LLMs in 2025, indicating a shift from the practical to the emotional.
In this context, therapy involves structured support and guidance to process psychological challenges, while companionship encompasses ongoing social and emotional connection, sometimes with a romantic dimension.
There have been some interesting examples of people using apps for companionship and therapy:
Replika – an app that allows users to form romantic relationships with AI avatars.
“the AI companion who cares – always here to listen and talk and always on your side”
The app can send voice messages and even hold live video calls, with the bot appearing in human-like form on screen. There’s a fun investigation here.
Character.AI – lets users chat with avatars of fictional characters and real people, from Napoleon to Katy Perry.
DAN – which stands for Do Anything Now – is a “jailbreak” version of ChatGPT. This means it can bypass some of the basic safeguards put in place by its maker, OpenAI, such as not using sexually explicit language.
In China, the Maoxiang app is very popular. It allows users to create their own virtual characters to interact with. It has 2.2m monthly active users; for comparison, DeepSeek has about 14m.
HereAfter.ai – creates an interactive avatar of a deceased person, trained on their personal data.
AI therapists like Woebot and Youper – provide emotional health assistance.
Wysa – used by national health services in the UK, offering chat functions, breathing exercises and guided meditation, with escalation pathways to helplines.
All very 2025. But is all this bonding with inanimate objects new?
What Came First: Technology or Bonding?
From 1950s letter-writing pen pals to Tamagotchis, Furbys, furry friends 🐱🐶 and Facebook, humans have long formed bonds with the non-human and the distant.
A few years ago, I heard the philosopher Alain de Botton explain that successful platforms solve deep-seated human problems.
What problem does (did) Facebook solve?
The answer is Loneliness.
So it is a logical progression that Facebook might try to solve the loneliness problem without relying on the unpredictable humans who click ads and fund the model. In recent speeches, Mark Zuckerberg has outlined how Meta AI could help people simulate difficult conversations they need to have with people in their lives.
Over time, AI companionship will become increasingly personalised, and that personalisation will make it rather powerful.
Going back to the original question: how many ‘friends’ do you need?
Dunbar’s Magic Number
The number of stable relationships a person can cognitively maintain is about 150.
According to the anthropologist Robin Dunbar, our friendship groups can be thought of as layers of an onion.
There is a layer of approximately 10 people with whom we share most of our social activities: trips to the theatre, dinners, or hikes. The next layer, consisting of around 50 people, represents our weekend group: the folks we would invite to a barbecue or a big party. And finally, we come to the layer of 150 people, our entire social network. These are the acquaintances that we ‘could invite to a massive party’ or ‘greet without hesitation at an airport’.
This makes me think about my real social network, in real life and online (watch out for future Work 3 Newsletter articles on this topic).
Dunbar’s layers remain stubbornly intact—even in the age of social media.
So, as humans, we face cognitive limits on our networks. What are the other challenges of having inanimate friends?
Fiendishly Difficult Ethics
Having compelling AI companions for work, for mental health support, or as romantic partners raises some tricky ethical issues:
Emotional dependency – could AI reduce our need—or ability—to form human bonds?
Privacy – is it OK that your data sits on a server, ready to be hacked and monetised?
Consent – can a relationship be meaningful if one side lacks consciousness?
Control - what happens when emotional support is owned by a single platform?
But I also see a massive upside in combining AI’s pattern recognition with accessible apps.
Democratising Coaching, Therapy & Growth
If you are relatively lucky, you will have experienced the benefits of a coach, maybe for sport or music, or of a counsellor.
We happily pay an expert an hourly rate of, let’s say, $100.
But not everyone has access to this expertise.
An estimated one million people are waiting to access mental health services in the UK alone.
From $100/hour to $100/year—emotional support at scale could become the most profound shift of the decade.
An app will never replace an emotionally empathetic expert with decades of experience.
Today’s apps might be a 5/10 coach - but with enough data and time, they’ll become a solid 8.
This could be a massive growth opportunity for billions of people.
Synthesis and Conclusions
If the 2010s were about people’s need for ‘status’ online—manifesting in curated Instagram feeds and filtered selfies—then maybe the 2020s are about people’s need for ‘belonging’.
As Marshall McLuhan said:
"All media are extensions of some human faculty — psychic or physical."
How we use technology holds a mirror to us as a society.
Maybe, with the right safeguards, technology can help us not only survive—but connect and flourish in surprising ways.
This article is not really about work, but two thoughts spring to mind.
Some work is emotional, and AI companions might just be able to handle some emotional tasks even better than humans.
The employer brand – a.k.a. ‘Why would I want to work for you?’ Modern workforces include all sorts of people with different needs, including emotional and social ones. If smart employers in competitive industries can meet those needs with physical or digital solutions, they might just have an edge.
In the emotional economy, relationships aren’t just personal—they’re programmable.
One of your many friends,
Andy
Hey Andy, what a delightful read. It touches on something I believe we all feel but don’t always express: that the emotional layer of work — trust, energy, how people perceive showing up — often drives things forward or stalls them. (Funny sidebar, I am Latino and have worked from the outset in the U.S., and now the UK. The emotionality was programmed out of me in order to succeed in these kinds of environments. And here I am again.)
I’ve been building GoodMora to help strategists understand how companies are truly structured — and one recurring theme is how invisible emotional dynamics shape outcomes. We discuss systems and data, but it’s often misalignment, fear, or simple disconnection that disrupts things. In the business ontology maps, this tends to fall under culture. However, when examining human versus company performance, I see them as distinct. Culture fosters common ground. Emotion can serve as a lever to enhance individual contributions. I have only just begun contemplating that and require more evidence.
That said, the concept of an “emotional economy” really resonates. If we’re integrating AI into our working lives, it cannot merely focus on outputs. It has to improve its ability to read the room — and assist us in doing the same.
Thanks for writing this — seriously. More conversations like this, please.
Great read as always! I think it's scary that people are turning to chatbots (e.g. ChatGPT) and starting to use them as psychologists. I can see the place, and the benefit, of people having someone to turn to, but chatbots are not psychologists. I believe most people will not prompt them correctly to make them act as a true psychologist (can they even do that?), and they will end up with something that simply affirms them while leading them down the "wrong" path.