What should decision makers do with analysis that challenges deeply held assumptions? In this blog, Fraser Battye reflects on a surprising recent finding about community services.

Sometimes analytical results fit your preconceptions. For example, the Institute for Fiscal Studies (IFS) found that Sure Start led to gains in educational achievement – and that gains were greatest for children from ethnic minority and low-income backgrounds.

Sure Start is policy catnip for me. Preventative, early years, neighbourhood-based, addressing disadvantage, supporting families (etc.). I accept evidence in its favour easily, allowing me to retain an apparent belief in evidence-informed decision making.

But sometimes analytical results are counterintuitive. More than that: sometimes they hit hard at closely held beliefs. What then? 

We recently completed analysis for the Integrated Care Boards of the Midlands, taking an all-too-rare look at community health services, with a specific focus on the over-65s.

Reading the draft report was an instructive experience… 

Some of the findings went along with my preconceptions. For example, we found that people living in deprived areas had better access to post-discharge community services than those from less deprived areas. 

Ah! Excellent! NHS services working to counteract the inequalities that blight our society. This was good news – especially after our work showing gross and growing inequity of access to planned care. And here was a clear implication: more community services please.

But then came a finding so stark that I read it, then re-read it, and then re-re-read it. We found that:

“…a decrease in community service contact rates was associated with a decrease in emergency admission rates…areas with the largest decreases in community services contact rates tended to have the largest reductions in urgent care use.”

Less contact with community services = less contact with urgent care!?! 

The effect was small, but statistically significant. The analysis was carefully and appropriately hedged and caveated, and we had – tentatively – explored explanations and hypotheses. But it still felt like something of a bombshell (especially to someone whose mum was a District Nurse…).

The logic for local decision makers – and especially desperate local Finance Directors – seemed sadly clear: to reduce pressure on urgent care, simply hack away at community services. This way you save money by saving money.

Obviously that’s madness. But why nod along to findings on Sure Start, while squirming and seeking to reject this finding on community services? 

Do I really believe in evidence-informed decision making – or does this only apply to results that I like?

This wasn’t the first time Strategy Unit work had induced a squirm. I remember us presenting to a room full of GPs, where our evaluation demonstrated that their scheme to prevent emergency admissions was, erm, causing, er, emergency admissions. That was awkward. And quiet. We weren’t invited back.

We also found no evidence that cuts to social care budgets led to increases in emergency hospital admissions. Eeek. Not the result we had expected – or, perhaps, wanted. 

So what is to be done? If squirming and rejecting isn’t useful, what might be? What should decision makers do with our findings on community services?

The first thing they should do is nothing. They should sit on their hands and tie their purse strings. Single studies often pop out strange results that won’t replicate or be supported over time. More evidence will emerge.  

In the case of our social care analysis, we drew confidence as the IFS and others produced similar results. The finding we presented to the GPs eventually became one of many popped balloons in the world of admissions avoidance. And in this case, there is a large NIHR project looking at community services that is due to report soon. 

The second thing decision makers might do is change the analytical lens. Our lens was wide, looking at all community services. Maybe that perspective is simply too broad; maybe a more detailed look is needed to guide action. For example, we recently found that increased contact with community services was associated with reduced use of hospital care for those at the end of life. If demand reduction is the aim, maybe ‘more/less community services’ is an unhelpful way of looking at things.

The third thing decision makers could usefully do is consider the basic rationale for community services. Community services do work of high and immediate value to patients; maybe they shouldn’t be justified so much in terms of reducing demand for hospital care. If findings like ours begin to accumulate, it may even be that this demand-reduction rationale falls away altogether.

Finally, and most challengingly, one thing we should try to do is retain a commitment to evidence. We’re all human. We all have preconceptions, and initiatives that we like and dislike for reasons that have nothing to do with evidence – even while, if we’re honest, sometimes pretending that they do. But if we abandon evidence – and only use findings that support our preconceptions – then we have abandoned something utterly foundational. Maybe the truth is uncomfortable, but discomfort is a price we should be very happy to pay.