6 Comments
Gary The AI Strategist

Hey Andy, what a delightful read. It touches on something I believe we all feel but don't always express: that the emotional layer of work (trust, energy, how people feel about showing up) often drives things forward or stalls them. (Funny sidebar: I am Latino and have worked from the outset in the U.S., and now the UK. The emotionality was programmed out of me in order to succeed in these kinds of environments. And here I am again.)

I’ve been building GoodMora to help strategists understand how companies are truly structured, and one recurring theme is how invisible emotional dynamics shape outcomes. We discuss systems and data, but it’s often misalignment, fear, or simple disconnection that disrupts things. In business ontology maps this tends to fall under culture. However, when examining human versus company performance, I see them as distinct: culture fosters common ground, while emotion can serve as a lever to enhance individual contributions. I have only just started thinking about that and need more evidence.

That said, the concept of an “emotional economy” really resonates. If we’re integrating AI into our working lives, it cannot merely focus on outputs. It has to improve its ability to read the room — and assist us in doing the same.

Thanks for writing this — seriously. More conversations like this, please.

Andy Spence

Thanks so much Gary, and glad this one resonated with you. The topic forces us to think about our lives at different levels. About relationships: if we can be companions with an app, what does that tell us about our human relationships? And about work: how do we design work systems that take emotions, bonds, and trust into account? My experience working on large, complex change programmes taught me just how important this is.

Gary The AI Strategist

This is a space to explore further, my friend. Not long ago, the term "emo" was a curse word in business. I remember the CEO telling me that the thing they treasured most was an employee's ability to kill a colleague.

Johannes Sundlo

Great read as always! I think it's scary that people are turning to chatbots (e.g. ChatGPT) and starting to use them as psychologists. I see the place for, and the benefit of, people having someone to turn to, but chatbots are not psychologists. Most people will not prompt them in a way that makes them act like a true psychologist (can they even do that?), and will instead end up with something that confirms their views while leading them down the "wrong" path.

Dorothy Dalton

This development is scary and beyond creepy. I would be interested to see the gender breakdown of these relationship bots: how many of the creators are female, and what percentage of the users?

Andy Spence

The gender impact of this technology is fascinating. Many earlier versions were designed by software engineers, who are mostly men (around 80%), and this seems to be the case with AI engineers too. Sociologists have warned of the danger of using tech to mirror the sexist, analogue parts of our society. Yet the dating story from China that caught my attention specifically mentions women looking for avatar boyfriends. What does this tell us about our society? (Article here: Why Chinese women are looking to ChatGPT for love, https://www.bbc.co.uk/articles/c4nnje9rpjgo)
