There's a risk in your change programme that rarely makes it onto the risk register.

It sits at the executive table. It arrives early. It often opens the first steering committee with something like: “I’ve been through one of these before. Here’s what we need to do.”

It’s not a bad actor. It’s not a toxic leader. In most cases it’s a capable, well-intentioned person whose confidence has quietly outrun their calibration.

It’s one of the more nuanced risks in complex change — and it’s almost never named directly.

There’s a concept that explains it precisely. And it’s rife right now, for anyone paying attention to the upswell of evangelists in a certain recent trend in technology. (More on that in a moment.)

David Dunning and Justin Kruger put it on paper in 1999.

The Dunning-Kruger effect — and why it gets more interesting with seniority.

The original paper’s title is one of the great understatements in academic history: “Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments.”

The finding in plain language: people with limited knowledge of a domain consistently overestimate their competence in it. Meanwhile, people with deep expertise tend to underestimate theirs — because the more you know, the more clearly you can see what you don’t know.

I explain it with a story about my good friend Angus Kurtze.

I played Australian Rules football at lower-grade amateur level. If my son asks me how good I was, I’ll probably say: “I went alright, mate. Good hands...hit up targets, solid 8 out of 10.”

Meanwhile “Kuta” played well over 120 games at SANFL level, captained South Australia, and was regularly scouted by AFL clubs. From the age of 19 he trained alongside men who went on to professional careers. Ask Angus (a laconic and humble gent by nature) the same question and he’d probably shrug: “Oh, I went OK. Maybe a 7.”

That’s the Dunning-Kruger effect. My narrow frame of reference gives me (vastly) misplaced confidence. Angus’s deep immersion in the game (which continues today, off the field) gives him a far more accurate read of the actual standard required.

The only real casualty in my case is my son’s football development.

Raise the stakes to a complex programme of work and the consequences are somewhat different.

What this looks like for those leading change

The executive sponsor who has “been through a SAP implementation before” and therefore knows how long change management really takes. (Less time than the change manager is asking for, apparently.)

The portfolio director who led a restructure five years ago and is running the same playbook on a digital transformation with fundamentally different dynamics.

The CEO who watched a competitor’s programme stumble and concluded they know exactly what went wrong — without ever having sat in the rooms where the real problems were happening.

None of this is bad faith. But they’re working from a reference class that’s too narrow, too old, or too far removed from the actual complexity in front of them. And because they’re senior, their confidence carries weight, shaping resourcing decisions, shortening timelines, and overriding the people closest to the work.

As I’ve written in my piece on the Planning Fallacy, leaders consistently underestimate cost, time and risk while overestimating the value of their own prior experience. The Dunning-Kruger effect offers one explanation for why: they don’t know enough about this specific context to know what they’re missing.

(And yes — if you’ve noticed a certain trend involving AI tools lately, where absolute confidence and very limited experience are combining in interesting ways...the Dunning-Kruger effect is rife.)

Let’s play a game: “Never have I ever...”

Have you ever walked into a programme feeling like you understood the culture, only to discover three months in that you’d read it partially (if not completely) wrong?

Have you confidently scoped a change that “wouldn’t take long” — and then watched it quietly double in complexity...then cost?

Have you sat through a governance forum and realised your read of the stakeholder landscape was based entirely on the last organisation you worked in, not this one?

I certainly have.

The Dunning-Kruger effect is not a leadership problem. It’s a human one. The question isn’t whether it applies to you; it’s which situations it applies to, and whether you’ve created the conditions to find out before it gets expensive.

The inverse of the Dunning-Kruger effect is just as real — and just as underused.

The people in your programme who are most hesitant, most likely to flag risks, most inclined to slow things down — are often the ones who know the most.

The change practitioner saying “we need more time on embedding”, or the respected team leader questioning the benefits realisation, isn’t necessarily being difficult. They may have just seen enough programmes to know where this one heads if you skip that step.

In change programmes, those people are your most accurate early warning system. Whether they feel safe enough to use it is a leadership question. (More on who these people are and how to find them — here.)

One practical thing that actually helps.

Here’s something I’ve used across very different organisations (think: councils, manufacturing sites, universities) that consistently surfaces the Dunning-Kruger dynamic without anyone needing to name it.

Draw the work out. Visually. In the room. Together. After first setting the ground rules: no egos, no hierarchy.

Not a polished process map built offline and presented as a fait accompli. A genuine working session where the people who actually do the work map it out on a whiteboard in real time, with the stakeholders (who think they understand it) watching and asking clarifying questions only.

What happens in those sessions is consistently revealing. The executive who “knows this process” discovers three variants they didn’t know existed — think system workarounds or customer preferences. The change manager who scoped the training requirement realises the work is twice as complex as the project plan assumed. The subject matter expert who’s been quiet in every steering committee suddenly has the most important thing to say — because the visual gives them something concrete to react to or helps them find the words they were looking for.

I’ve written before about learning through experimentation as a core change principle — and this is one of the most useful experiments you can run. Crude prototyping of the actual work, in the room, with the right people around it. It’s the fastest way to recalibrate a room full of mismatched confidence levels.

Making work visible levels the reference class. It’s hard to maintain unfounded confidence when the work is in front of everyone and visibly doesn’t match your assumptions.

The question worth sitting with.

Look around the table at your next programme review and ask honestly: who in this room has a reference class that’s actually relevant to this change?

Not who has been through a change. Who has been through this kind of change, at this kind of scale, in this kind of organisation?

And then: are those people doing most of the talking?

Flyvbjerg’s reference class forecasting is one structured tool for addressing this at a programme level. But the more immediate intervention is cultural — creating the conditions where deep knowledge gets heard, and where confidence without calibration gets respectfully interrogated.

Most organisations manage the risks they can see: the ones on a register, or parroted from a Board report.

The subtler ones — well-meaning but unjustified confidence, sitting at the table — are worth paying attention to too.

The Dunning-Kruger effect was first described by David Dunning and Justin Kruger in their 1999 paper “Unskilled and unaware of it,” published in the Journal of Personality and Social Psychology. The Planning Fallacy piece referenced above explores the companion bias — and why experienced leaders are often the last to update their estimates.