Data analysis presents a limitless opportunity to improve decision making within the NHS. Good analysis can identify myriad ways of improving population health and the quality and effectiveness of patient care: by thoroughly understanding the present, modelling the future, and evaluating whether different courses of action are having their intended effect.

But the current use of data analysis to support decision making in the NHS is variable. Although ‘better use of data’ is a central feature of the NHS Long Term Plan, too much complex analytical work is still being done in isolation.

Every day in the NHS, decision makers are being fed information – often by external providers – with little sense of the ingredients or the method. This information frequently arrives in a black box, with the data and approaches used entirely hidden – perhaps shrouded in ‘IP’ or ‘commercial confidence’.

Surely we all want a learning NHS where decision makers are clear-eyed and open about the information they use? But this requires real change.

Fundamentally, it requires different groups of analysts to share methods, data and code. A one-off piece of analysis applies to a particular time and place. But sharing, reviewing and improving the methods and data that underpin it would provide a source of dynamic learning for groups across the network.

Openness allows critique and improvement. We have the means to do this, with modern analytic methods and open source tools such as R and Python.

So at the Strategy Unit we are working with others to change things. Many organisations seem to be swimming in the same direction, and we see a shared agenda with the Health Economics Unit, ApHA, NHS-R, the Health Foundation, Ben Goldacre, OpenDataSavesLives, the Midlands Decision Support Unit Network and many, many others.

We aim to produce knowledge with staying power. If we develop a new analytical model for one NHS client, we won’t sell it again. When we produce innovative models, we publish them for everyone under open access principles, sharing code and data.

We did this in the summer with our work on waiting lists for planned care, and we recently released our mental health surge model to help Trusts estimate the post-COVID-19 burden on their mental health services.

The idea is that anyone who needs our analysis can take it, adjust it and improve it, and then feed it back into the network for the next user. This is a fundamental guiding principle for us.

If we’re doing it, why doesn’t everyone? And critically, why is the NHS willing to purchase analysis from providers who won’t share their code, or document what they’ve done transparently enough for it to be replicated?

The NHS in its clinical activities sits atop a great scientific endeavour. A fundamental principle of the scientific method is ‘replicability’ of analysis. We all know that this is the route to advancing knowledge.

I am privileged to be part of national discussions underway about how we build on the innovative spirit that COVID-19 unleashed and ensure that the new Integrated Care Systems are designed as ‘learning systems’. One simple proposition that I offer into that discussion is that we should all agree that learning systems and ‘black box’ analysis are incompatible.

That has important implications for how we give NHS analysts the skills and the space to work in ways that facilitate sharing. But it also requires a culture change in how the NHS commissions analytical support from external suppliers.

In my view, that commissioning should not happen without securing the use of open source code and full transparency of method. In the few instances where there is real IP or innovation in external analysis (and typically there really isn’t), let’s talk about that explicitly and consider what a fair price would be for that knowledge to be made ‘open’.

The consequences of choices made in the NHS can be profound. Who would defend making these choices in ignorance?