Importance of calibrating expert opinion and the consensus trap

By: Petar Bielovich
2025 Cyber Skills Challenge at the QT Hotel in Canberra. Photo: CPL Hamid Farahani

Opinion: When a key decision goes against the consensus view, it can rattle nerves and undermine confidence. How can costly misreads be avoided and fresh certainty be brought to the decision-making process?


The Reserve Bank of Australia’s decision to hold interest rates steady in July 2025 came as a shock. As was noted at the time, “Financial markets had previously ascribed a 97 per cent chance of a rate cut, with a consensus of economists agreeing that a cut was the most likely scenario”, and yet rates remained unchanged. The decision cast doubt on further rate cuts that had been considered locked in for the remainder of 2025.

So, how did the decision end up being the opposite of what most experts expected? The answer lies in how expert opinion is treated and how uncertainty is factored into decision-making processes.


Put simply: it’s not enough to capture the consensus among experts. There also needs to be a mechanism to test and calibrate what the experts are saying, so the likelihood of an outcome can be assessed accurately.

This requires more data points, more sophisticated mathematics, and specialised decision-making tools and methodologies. The good news is that these enablers and inputs all exist today, and leaders can apply them with clear outcomes and demonstrable return on investment.
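To make the idea of calibration concrete, here is a minimal, illustrative sketch (not the methodology of any particular vendor or bank, and with invented forecast data): a Brier score measures how far an expert’s stated probabilities sit from what actually happened, and a simple reliability table shows whether events the expert rates at, say, 90 per cent really occur about nine times in ten.

```python
# Illustrative sketch of calibrating expert probability forecasts.
# All forecasts and outcomes below are invented for demonstration only.

def brier_score(forecasts, outcomes):
    """Mean squared error between stated probabilities and observed outcomes.
    0.0 is perfect; always saying 50% scores 0.25 regardless of outcomes."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

def reliability_table(forecasts, outcomes, bins=5):
    """Group forecasts into probability bins and compare each bin's average
    stated probability with the observed frequency of the event."""
    table = []
    for b in range(bins):
        lo, hi = b / bins, (b + 1) / bins
        in_bin = [(f, o) for f, o in zip(forecasts, outcomes)
                  if lo <= f < hi or (b == bins - 1 and f == 1.0)]
        if in_bin:
            mean_forecast = sum(f for f, _ in in_bin) / len(in_bin)
            observed_freq = sum(o for _, o in in_bin) / len(in_bin)
            table.append((round(mean_forecast, 2), round(observed_freq, 2), len(in_bin)))
    return table

# A well-calibrated expert who says "97%" should be wrong about 3 times in 100.
forecasts = [0.97, 0.9, 0.8, 0.6, 0.97, 0.3, 0.7, 0.95]
outcomes  = [1,    1,   0,   1,   0,    0,   1,   1]   # 1 = event happened
print(brier_score(forecasts, outcomes))
print(reliability_table(forecasts, outcomes))
```

Scoring rules like this are one simple way to track, over many decisions, whether an expert’s confidence matches reality; the tools referenced in this article would sit on top of exactly this kind of feedback loop.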

Back in 2023, Atturra created an explainable AI solution to capture and assess human expertise and improve decision-making speed and accuracy for our clients. The solution has since been adopted by several clients across multiple use cases. It is used to assess future confidence in regulatory and business decisions and the risk management around them, as well as to identify and prioritise actions to address any gaps. Clients have reported considerable time savings and appreciate the improved rigour, transparency and consistency of their decision making.

Naturally, the same level of explainability and certainty is not present in every important and complex decision made in government or business, in Australia or around the world, even though those decisions affect us all.

If even the experts can occasionally miss, mistakenly dismiss or underestimate the impact of different data points or uncertainties on a critical decision with material consequences, as the July cash rate decision shows, then all organisations have reason to pause and reflect on the confidence they place in their own decision making.

The factors that complicate decision making

Complex decisions involve high stakes, many factors and diverse stakeholder interests, yet uncertainty is the hardest element to manage. Ignoring it can lead to overestimating positive outcomes, falling prey to confirmation bias and being caught off guard when reality differs.

Humans are well placed to recognise the “known unknowns” and to anticipate the existence of “unknown unknowns” in any situation. Many things can create these unknowns.

Consider, for example, the sheer volume of data to be processed, questions about its accuracy or origin, the difficulty of applying complex logic consistently, and the ambiguity and bias that come from language and opinion. In response, organisations and even whole industries have developed ways to factor uncertainty into their decision making.

In hazardous industrial environments, for example, a culture of ‘chronic unease’ is encouraged. This mindset keeps people alert to possible hazards and failures and prompts them to consider how to prevent them. The goal isn’t to create fear, but to ensure that unknowns are always part of operational decision making.

Cyber security works in a similar way. Given that the threat landscape constantly evolves, practitioners accept that incidents are inevitable. It’s not a matter of “if”, but “when”. Their job is to assess the current state while also planning systems, controls and protections that can withstand unknown threats.

In both cases, subject matter experts play a vital role. They interpret data and analytics, apply context and use their expertise to “fill in the gaps” contextually, providing a likely explanation given the available evidence.

Here, explainable AI can add value. Not by replacing the human expert, but by objectively reviewing and sharpening their judgement. AI can reinforce whether a decision aligns with the evidence, but it can’t spot what’s missing or act on a hunch. Humans still provide the critical insight, especially in specialist fields. What matters most is being able to calibrate those human judgements in a way that is visible and transparent.

By having a means to capture and calibrate expert opinion, organisations can improve the efficacy of their decision making in the most complex scenarios, while having the assurance and governance needed for good decision making now and into the future.

Petar Bielovich is data and analytics director for Atturra. He has more than 25 years’ experience working with clients, including Australian Defence, Boral, Telstra and Nestle.
