“Productivity isn’t everything, but in the long run, it is almost everything.”
‘Productivity’ is not a word that inspires. Indeed, it’s often taken to mean ‘working even harder’. And yet productivity determines so much – including what those of us inspired by the NHS really value: better treatments, enhanced patient care, improved working lives.
Assumptions about NHS productivity therefore really matter. Over the long term, even small changes in these assumptions can have radical implications for planning. We can see this in debates over the productivity assumptions for the 10 Year Health Plan.
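To give a sense of why, here is a minimal sketch in Python. The growth rates are purely illustrative assumptions, not figures from the report; the point is simply how small annual differences compound over a ten-year horizon.

```python
# Illustrative only: how a small difference in an assumed annual
# productivity growth rate compounds over a ten-year planning horizon.
# The rates below are hypothetical, not figures from the report.

YEARS = 10

def cumulative_growth(annual_rate: float, years: int = YEARS) -> float:
    """Total productivity growth implied by a constant annual rate."""
    return (1 + annual_rate) ** years - 1

for rate in (0.005, 0.010, 0.020):  # 0.5%, 1% and 2% a year
    print(f"{rate:.1%} a year -> {cumulative_growth(rate):.1%} over {YEARS} years")

# 0.5% a year compounds to roughly 5% over the decade, while 2% a year
# compounds to roughly 22% - a large gap in what a plan can assume.
```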
And so we were delighted to work with the Health Foundation to explore this question – and to develop methods for improving estimates while doing so.
We went to the experts. Why?
Human judgement – and even the very notion of expertise – has come under fire in recent years. Experts are human, and humans are fallible. We’re prone to biases, overconfidence, low blood sugar, and the occasional mishap.
And so, it might seem that data would offer a safer guide. And the NHS has plenty: objective, consistent, and impartial. Data that doesn’t tire, get bored, fall prey to groupthink or respond to social incentives.
But data has its limits. Historical trends can tell us a lot, but they cannot predict the future – and they tell us little about breaks in those trends. How will AI influence healthcare productivity? And how do we account for shocks like the pandemic, which delivered an unprecedented hit to efficiency?
This is where the method of expert elicitation comes in.
In simple terms, we gathered and combined predictions from experts using a structured, rigorous approach. This isn’t a casual brainstorm or a productivity-themed game of chance. This is a serious attempt to harness the wisdom of an expert crowd.
The method is everything. And expert elicitation is an approach the Strategy Unit has been developing for several years now. (You can read more about that here: https://bmjopen.bmj.com/content/14/10/e084632.full)
In essence, the selected experts are presented with the best available data to inform their forecasts. The method combines this evidence with their experience and judgement. The process is designed to minimise bias and capture not just what experts expect to happen, but how confident they are in their predictions. Subjective, yes; but far from unscientific.
The outcome is a richer, more nuanced view than data alone could provide. It blends historical insight, professional expertise, and a careful measure of uncertainty. This is essential when trying to predict something so fundamental a decade into the future.
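The report sets out the approach in full. As a purely illustrative sketch of the aggregation idea (not the protocol or results reported here), the snippet below shows how elicited judgements – each expert’s central estimate and uncertainty – might be pooled into a single distribution, using an equal-weight linear opinion pool. The numbers are assumptions for demonstration only.

```python
# Purely illustrative sketch of pooling elicited expert judgements.
# The figures and the equal-weight linear opinion pool are assumptions,
# not the Strategy Unit's protocol or the report's results.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical elicited views: each expert's central estimate and
# 90% credible interval for annual productivity growth (%).
experts = [
    {"median": 1.0, "low": 0.2, "high": 1.8},
    {"median": 0.6, "low": -0.5, "high": 1.5},
    {"median": 1.4, "low": 0.8, "high": 2.2},
]

samples = []
for e in experts:
    # Approximate each expert's view as a normal distribution whose
    # 5th-95th percentile range matches the stated interval.
    sd = (e["high"] - e["low"]) / (2 * 1.645)
    samples.append(rng.normal(e["median"], sd, size=10_000))

# Equal-weight linear opinion pool: mix the experts' distributions.
pooled = np.concatenate(samples)

print(f"Pooled median: {np.median(pooled):.2f}% a year")
print(f"Pooled 90% interval: {np.percentile(pooled, 5):.2f}% "
      f"to {np.percentile(pooled, 95):.2f}%")
```

Pooling in this way preserves both the spread of views between experts and each expert’s own uncertainty – which is what makes the combined picture richer than any single point forecast.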
The report below provides the detail. We think the results are illuminating and thought-provoking. We also see lessons for the future of expert elicitation.
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.