Let’s celebrate unsafe behaviour
Economists used to believe we humans were completely rational beings who, given all the available information, would always make the best decision – and that the only reason we make poor decisions is that we don't have the full picture.
That was until Nobel prize-winning psychologist Daniel Kahneman changed everything.
Working with his long-time collaborator, the psychologist Amos Tversky, Kahneman spent decades proving that people aren't rational. In fact, they showed our brains are hardwired to make the wrong decisions in particular circumstances, because of the mental shortcuts we take – heuristics – and the biases they give rise to. Perhaps you've read Kahneman's bestseller on the subject, Thinking, Fast and Slow?
It turns out that even with complete information, we humans still make the wrong decisions. And this transformed economics – with far-reaching effects for industry too.
All of a sudden, economists had to rethink the established theories and formulas that modelled how people behave – it turned their world on its head. And a lot of what Kahneman and Tversky theorised played out in real-world events, like the financial crash of 2008 – proving markets are far from perfect, mainly because humans are involved.
So what does this mean for safety?
Well, let's look at how we interact and engage with people on safety, to try to influence the decisions they make at work (remember, not all decisions involve deliberation – many are split-second choices, made by staff at every level).
In the safety world, we see organisations who’ve spent decades disciplining people for making the wrong decisions and taking risky shortcuts. That’s fair enough in the case of gross negligence. But in the majority of cases, more enlightened organisations realised long ago that blaming people for minor mistakes is unfair. It’s effectively reprimanding them for being human, and all this does is perpetuate the everyday stigma we attach to mistakes.
Yet many organisations still strive towards a vision of cultural perfection, where they put confidence in their systems to influence staff to make all the right choices, all of the time – so mistakes never happen, and neither do accidents.
You see this a lot with zero harm initiatives – people know they’ll never really reach safety target zero, yet still they strive for what’s effectively an impossible goal.
Remember Kahneman and Tversky? Well, this is completely at odds with their findings – that humans are hardwired to make mistakes. Zero isn't just practically unachievable – it's theoretically impossible.
It can never happen.
The sense of psychological safety
A better vision would be to accept that people are going to make mistakes, then focus on creating an environment where those mistakes are openly discussed, where admitting to them is celebrated, and where everyone learns from them.
That should be the true vision of cultural perfection.
The evidence backs this up too, in some especially fascinating research. Google – with their near-unlimited money and resources – studied their most high-performing teams (in research known as Project Aristotle) and identified that a significant component of their success was what they call a sense of psychological safety.
Their research found that in the most high-performing teams, people felt able – and were encouraged – to speak up and admit their biggest failures. People saying things like "I don't get this" or "I've made this mistake" – that's what seems to determine success. And it was down to leaders to create an environment where it was OK to do that (although at Tribe we've known this for years!).
Similarly, Google have a thing called 'The Moonshot Division', and the leader of that team decided to award bonuses – not for the completion of projects, but for failure. So when someone puts their hand up and admits they might overspend or miss a deadline, the boss rewards them there and then – putting them on stage and making a whole ceremony of recognising them for speaking up!
This defies conventional wisdom. But it also highlights an important point: people don't realise something is a mistake or a failure until it happens.
Great, but who has Google’s budget?
Well, budget or not, what you can do is accept that people make mistakes – but they don't make them knowingly. Most people act with good intentions: they believe they're doing the right thing, for the right reasons, at the time.
The real mistake is letting people carry on making the same mistakes, or hushing them up – instead of learning from them, and preventing more in future.
In the context of your average factory floor, Joe Bloggs might leave the machine guard off so he can get his job done quicker. Feeling guilty, he slips off home without filing a near-miss report. Joe is lucky, but soon an accident happens – then comes the whole post-mortem, digging around until it reveals the truth, at which point all Joe's colleagues admit they too took the same risk many, many times.
A better way is for Joe Bloggs to feel he can say: "Hands up – I did something dangerous today. I made a mistake, and I want to sort it because I know it's the right thing to do." Inspired by his courage, Joe's workmates put their heads together and prevent the accident from ever happening in the first place – making work safer and more efficient.
What follows is a snowball effect: people look out for each other and pre-empt even more mistakes, and the whole cycle becomes self-sustaining. This is what we at Tribe mean when we talk about true, sustainable culture change – this is the gold standard.
But won’t a lack of negative consequences encourage riskier behaviour?
I suppose it might encourage some people to push the boundaries a bit more in how they do their jobs – but is that really a bad thing? Especially when you encourage that thought process of self-identifying a mistake and speaking up about it, with others nearby doing the same.
Plus, when mistakes are fully explored at source, everyone becomes far more aware of – and engaged with – the real-life, painful consequences of letting mistakes continue. That in itself should temper dangerous risk-taking behaviour.
So let’s accept, recognise and learn when people make mistakes
And by that I mean unsafe behaviour too.
Our job is, yes, to try to make fewer mistakes. But can't we also celebrate that precious moment when someone realises they've made one, shares it with colleagues, and avoids a major catastrophe?
That’s the fundamental change that’ll transform your business, on a journey of continual cultural improvement.
We need to move on from this negative culture of treating mistakes as bad – as deserving of punishment – and stop building systems to reform people and compensate for their failings. All that does is resist what's natural, and bury valuable opportunities for improvement.
Instead, let’s accept our flaw – our natural propensity for mistakes. And let’s reward people when they speak up about the inevitable – when (not if) it happens. Let’s aspire for open, engaging cultures at work, instead of an unrealistic vision of the future where everyone behaves like flawless robots.
That’s an impossible fantasy – Kahneman and Tversky knew it, and now you do too.
About Mark Ormond
Managing Director
Mark is an experienced consultant and strategic leader with a background in information systems, culture and engagement. He has over a decade of experience partnering with a wide variety of large organisations internationally, helping them create more mature cultures and higher levels of engagement. Latterly he has done this as head of Tribe Culture Change, a consulting organisation formed from the partnership of Hill Solomon and JOMC – organisations with over 40 years' collective experience in this area.