Feature / Programmed for improvement

28 May 2013


Analysing programme spend alongside outcomes could be the starting point for clinical commissioning groups keen to get the most out of investments. Steve Brown reports

While programme budgeting data has probably been used by all commissioners in some form over the past 10 years, examples of its use to fundamentally inform health investment decisions are rare. Yet there are increasing calls for the new clinical commissioning groups to tap into this rich data source as they look to challenge spending patterns and redesign pathways around patients.

Programme budgeting data, published in the past by the Department of Health and now by NHS England, provides a breakdown of all NHS spending by 23 programmes of care. At a national level it provides an interesting insight into how NHS money is spent across activities. For example, at 12% of a total of £92bn in 2011/12, mental health disorders account for the single biggest category of spending. Circulation problems, including coronary heart disease, consume a further 7.5%, while gastrointestinal system problems take another 5%. But the real value comes when spending is analysed locally alongside outcome data.

‘Unless you know where you are spending your money, you can’t make informed decisions about how to improve commissioning and population health,’ says Phil DaSilva, author and lead at the NHS Right Care programme, whose work to help NHS bodies tackle variation is being taken forward by a range of national bodies. ‘We need to encourage CCGs to understand how and where they allocate resources. There are nonsensical annual conversations about increasing budgets without any real understanding of what is currently being spent on a service.’

He highlights one recent discussion between commissioners and a provider, where the sticking point was a requested increase of £240,000. ‘The commissioners couldn’t and wouldn’t find the extra funding for the services, but no one actually knew what the total budget was. We helped them understand what they were spending and what they were using it for. We found the £240,000 in the budget by stopping doing things that were adding no value to patient care and reallocating it.’

He adds that typically, commissioners would not know how much they spent on specific programmes, but would know the exact value of their contract with their main provider. Mr DaSilva says the focus needs to be on systems and improving population health, rather than on individual organisations and contracts.

There are three aspects to the current quality and productivity drive: current services could be made more efficient; services could be provided in new ways – for example, more community-based services that help avoid hospital admissions; and commissioners could reallocate resources between budgets in ways that deliver higher value for patients overall.

Mr DaSilva says the starting point for all these improvement journeys must be understanding existing levels of spend and outcomes on a programme by programme basis, in absolute terms and relative to commissioners in similar areas.

In general terms, the NHS is being pushed to think more in terms of whole patient pathways rather than the discrete contributions made by individual organisations. Programme budgeting supports this by detailing spend in overall programme areas and, following recent changes, breaking this down into care settings.

Several overlapping tools, developed by various bodies, then enable this spend data to be put alongside quality and outcome data to help build up a bigger picture about spending patterns. These include an Atlas of variation (above), setting out access and outcome measures for the different programmes of care; NHS comparators, providing data on admissions, attendances and prevalence; and an inpatient expenditure variation tool.



SPOT makes its mark

But perhaps the most powerful – or at least most useful visually to get things started – is the spend and outcomes tool (SPOT). This uses a quadrant analysis to identify outliers in terms of spend and outcome to give a high-level indication of areas that would benefit from more scrutiny. A simple spine chart (overleaf) shows variation in spend and outcomes compared to similar CCGs, while a bar chart shows spend by programme compared with CCGs in the same Office for National Statistics cluster.
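The quadrant logic behind SPOT can be sketched in a few lines. This is an illustrative simplification, not the tool itself: the commissioner names and figures below are invented, and real SPOT analysis uses standardised spend per head and programme-specific outcome indicators compared against a peer cluster.

```python
# Sketch of a SPOT-style quadrant analysis (all data hypothetical).
# Each commissioner's spend and outcome for one programme of care is
# compared with the median of its peer group, then placed in a quadrant.
from statistics import median

# (name, spend per head in £, outcome score where higher is better)
peers = [
    ("CCG A", 310, 72),
    ("CCG B", 270, 80),
    ("CCG C", 350, 65),
    ("CCG D", 290, 78),
    ("CCG E", 330, 70),
]

spend_med = median(s for _, s, _ in peers)      # 310
outcome_med = median(o for _, _, o in peers)    # 72

def quadrant(spend, outcome):
    """Label a commissioner relative to the peer medians."""
    high_spend = spend >= spend_med
    good_outcome = outcome >= outcome_med
    if high_spend and not good_outcome:
        return "higher spend, worse outcome"   # prime candidate for scrutiny
    if high_spend and good_outcome:
        return "higher spend, better outcome"
    if not high_spend and not good_outcome:
        return "lower spend, worse outcome"
    return "lower spend, better outcome"

for name, spend, outcome in peers:
    print(f"{name}: {quadrant(spend, outcome)}")
```

As the article stresses, a commissioner landing in the "higher spend, worse outcome" quadrant is not proof of poor commissioning – it is a signpost for further questions.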

In the past, there have been criticisms about the spend data within programme budgeting, which is based on PCT returns. Phil Wilcock, senior manager in the analytical service at NHS England, says this is still a typical response when people are first presented with the analysis.

‘Trust in the data is an issue,’ he says. ‘People either don’t believe it or complain it is too old. But we have increasing numbers of case studies where people have overcome initial resistance and where it has been used to improve care.’

He admits the spending data isn’t perfect, but it doesn’t need to be perfect to be useful and is improving each year. And the fact that it is also used alongside a number of other data sources, including hospital episode statistics and quality measures, means organisations can gain greater confidence in the key messages.



Asking questions

Bryn Shorney, senior analytical service lead at NHS England, says the aim is to build up a full picture. ‘The idea is to triangulate spend, outcomes and the potential drivers of spend. If you are looking at a service with high relative spending and poor outcomes, any organisation would want to ask questions. Programme budget data is only the starting point. It is there to present questions, not to answer them. Organisations should ask questions and dig down in more detail, bringing in other data sources, local and national, to explore services and variation in more detail.’

Both the NHS England analysts say programme budgeting remains a work in progress. But some aspects of the data are better than others. ‘Primary care prescribing is very high quality,’ says Mr Shorney. ‘Inpatient spend is mainly very good, while community data needs to improve.’

Recent developments to collect and analyse the spending data by care settings should help commissioners navigate the data. ‘It allows users to look at some aspects of spend with confidence and highlights the areas to be more wary of, where you need to bring in other sources of data – activity, prevalence and outcomes – to tell a story,’ he says.

Similarly, Mr Wilcock says there are areas where the outcome measures are more readily available. There are several outcome measures for cancer, for example, but very few for disorders of blood.

Professor Matthew Cripps, now a programme director with NHS Right Care, first used programme budgeting data and the associated tools while at Western Cheshire PCT. Having been through a turnaround programme to eliminate a £42m deficit, the PCT was keen to adopt a process based more on continuous improvement across the whole system. It used programme budgeting and spending and outcome tools to prioritise service areas for review and identify those parts of the pathway needing attention. Applying business process engineering techniques, it generated £7m of savings across three service areas.

The 2011/12 programme budgeting data, released in March, remains based on PCT spending submissions. However, some of the tools have started to be adapted for CCG use – with spend and outcome fact sheets now available for all CCGs. Currently this involves applying spend per head data from the predecessor PCT to the populations of the constituent CCGs.
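The interim apportionment described above is simple arithmetic: the predecessor PCT's spend per head for a programme is applied to each constituent CCG's population. A minimal sketch, with entirely hypothetical figures and names:

```python
# Sketch of apportioning PCT-level programme spend to constituent CCGs
# (all figures and names hypothetical).
pct_spend = 46_000_000        # PCT programme spend, £
pct_population = 500_000      # PCT registered population

ccg_populations = {           # constituent CCGs of the predecessor PCT
    "CCG North": 200_000,
    "CCG South": 300_000,
}

spend_per_head = pct_spend / pct_population   # £92 per head

estimated_ccg_spend = {
    name: spend_per_head * pop for name, pop in ccg_populations.items()
}
# The estimates sum back to the PCT total by construction.
```

The obvious limitation – and the reason Professor Cripps calls it a temporary hurdle – is that every CCG carved from the same PCT inherits the same spend per head, so genuine variation between them is invisible until CCG-level returns arrive.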

Professor Cripps argues this is a temporary hurdle, as future publications (from 2015 covering 2013/14 spend data) will be based on CCG returns. But this is no reason to delay. ‘A lot of CCGs are the same shape as the [predecessor] PCT, but most others have only split into two or three,’ he says. ‘I’ve yet to find any that is an outlier on, say, diabetes purely because of the performance of the other two.’

Sue Baughan, associate director of Public Health England’s Northern and Yorkshire knowledge and intelligence team, which developed the SPOT tool, agrees that the data is perfectly valid for CCG use. For example, on the outcome side, some of the data is already based on GP information and so already reflects actual CCG performance.

Like his colleagues at NHS England, Professor Cripps insists the data is robust enough to provide signposts for improvement work. But he thinks more could and should be done to improve the data, making the programme budgeting tools even more valuable. He says it is noticeable that data improves in areas where the data is being used and says a campaign is needed to secure further improvements.

Could do better

While he recognises that improvements to data collection elsewhere – in community services and as a result of a focus on costing – will benefit the programme, he believes that ‘finance could do better’ and that formal audit might help accelerate improvement.

Warrington CCG – and Warrington PCT before it – was an early adopter of the improvement approach started at West Cheshire, building on the programme budgeting analysis. A Right Care case study suggests that in its first year, this programme delivered a £15m turnaround programme and helped eradicate a recurrent deficit problem in year two.

Cheryl MacKay, programmes director at the CCG, says the PCT was undertaking service reviews three times a year as part of turnaround. ‘Programme budgeting information was fundamental in identifying the areas we started looking at in terms of determining our expenditure compared to other PCTs,’ she says.

Trauma and injuries, musculoskeletal and mental health were its key outliers, with all these areas contributing to high levels of unscheduled care. This has led to a focus on non-elective work and, while unscheduled admissions remain high compared with national levels, there has been significant progress. ‘Last year we managed to hold non-elective admissions flat against predicted growth of 8%-9%,’ she says. ‘And this year they are coming down.’

For chief finance officer Stephen Sutcliffe, programme budgeting data is all about forcing people to think in terms of programmes and have a conversation about the key issues. ‘We can agree what we are spending as part of a programme and then start to drill into what we are getting for the spend.’ Mr Sutcliffe says he is keen to tie resources more to programmes as the CCG moves forward.

Despite pockets of good work across the country, Professor Cripps says there are still places he visits that ‘do not know about programme budgeting or the atlases and SPOT tool’. ‘If they are not using these tools, how are they deciding where to put their reform efforts and focus quality improvement work?’ he asks.

Ms Baughan agrees that concerns about data quality are misplaced. She was involved with pilot work with Derbyshire CCGs looking at how they could use spending and outcome data. Initial analysis of outlier programmes was undertaken using national data. But, because of some concerns over the data, the model was recreated using local data. She says the overall messages were the same.

The work in Derbyshire prioritised four programmes for improvement – cancer, respiratory, circulatory and endocrine/metabolic disorders – and identified potential gains of £20.8m across the whole cluster.

The work involved developing more detailed commissioning for value packs. These packs build on the basic SPOT factsheets by adding in local analytical data and drilling down into spend across care settings and looking at a broader range of outcomes. Public Health England is now talking to NHS England and Right Care about the possibility of producing these packs for all CCGs.

NHS England is keen to see other economies applying the data in similar ways. It is thinking through how it can streamline the various tools on offer, bringing them together into a single analytical system. But even ahead of this, there are powerful opportunities to inform local investment decisions.