Discussion about this post

Gary The AI Strategist

Hey Andy, what a delightful read. It touches on something I believe we all feel but don't always express: that the emotional layer of work — trust, energy, how showing up is perceived — is often what drives things forward or stalls them. (Funny sidebar: I am Latino and have worked from the outset in the U.S., and now the UK. The emotionality was programmed out of me in order to succeed in these kinds of environments. And here I am again.)

I've been building GoodMora to help strategists understand how companies are truly structured — and one recurring theme is how invisible emotional dynamics shape outcomes. We discuss systems and data, but it's often misalignment, fear, or simple disconnection that derails things. In business ontology maps, this tends to get filed under culture. But when I examine human versus company performance, I see them as distinct: culture fosters common ground, while emotion can serve as a lever to enhance individual contributions. I've only just begun thinking that through and need more evidence.

That said, the concept of an “emotional economy” really resonates. If we’re integrating AI into our working lives, it cannot merely focus on outputs. It has to improve its ability to read the room — and assist us in doing the same.

Thanks for writing this — seriously. More conversations like this, please.

Johannes Sundlo

Great read as always! I find it scary that people are turning to chatbots (e.g. ChatGPT) and starting to use them as psychologists. I see the value in having someone to turn to, but chatbots are not psychologists. Most people won't prompt them well enough to make them act like a true psychologist (can they even do that?), and instead they'll end up with something that simply validates them while leading them down the wrong path.
