The report below is from a short project that explored a simple set of questions:

Are there ways of spotting poor analysis that don’t rely upon technical knowledge? 

Are there ‘rules of thumb’ that could orient decision makers – who typically lack analytical expertise – when thinking about the likely reliability of quantitative analysis?

How might decision makers know when a more detailed, specialist review of technical work is needed? Can their intuitions be primed?

To explore these questions, we interviewed 11 experienced analysts, asking:

  • When you come across quantitative analytical work, how do you – as you read it – start to assess its quality? What tests are you using?
  • Could any of these tests be applied by a non-analyst? 

The results were fascinating. Firstly, the expert analysts described many of the same tests: they seemed to be looking for the same quick markers of quality. Secondly, they thought that some of these markers could be spotted by a non-analyst – if they were primed to look for them.

In the report, these findings are brought together into a set of ‘rules of thumb’, set out on a single page.

There is no suggestion that rough pointers can replace analytical expertise. There is no doubt in the Strategy Unit’s mind that decision makers need access to high-quality analysts whose advice they can trust.

Instead, the ‘rules of thumb’ are perhaps best seen as ways of priming the senses and of:

  1. Guiding the non-analyst decision maker to a position where they know when to seek a more detailed review by a specialist. The ultimate value of the ‘rules’ might therefore lie in promoting better working relationships between decision makers and analysts.
  2. Helping the non-analyst spot obviously bad analytical work.

The full report is below. Background to the work is also described in a joint blog with the Health Foundation.