Feature / Reference point

01 March 2011


Reference costs provide a huge tome of data – one that is more valuable and offers more insight than many finance managers give it credit for. The Audit Commission’s Chris Raspin delivers the analysis.

The Department of Health recently published the 2009/10 reference cost data. Calculating the unit costs of most NHS treatments, covering £51bn of spending, is no mean feat. Of course, there are examples of poor-quality data – Audit Commission audits are finding them – but the current information is fundamentally right. It is more useful than you might think, it reveals some interesting correlations and it is getting more consistent.

The Audit Commission’s review of the uses of reference costs found that 92% of trusts and PCTs use them. Some finance practitioners, such as those developing patient-level costing, think reference costs are unnecessary. But even for these trusts, reference costs are the only opportunity to compare costs across the NHS and therefore to question whether service and cost efficiency could be improved. In short, reference costs are QIPP (quality, innovation, productivity and prevention) friendly.

Far more people question the accuracy of reference costs. On a scale of one to four, users in our review scored reference costs at two for data quality. They are right to do so. The Audit Commission’s audits are finding submissions where care over preparation was inadequate and where tariffs could subsequently be wrong. And one large error by a single trust in the 2009/10 submission forced the Department to recalculate the entire data set, affecting not only the trust in question but many others too.

But there are recent improvements in consistency and variability. And our pilot and early audits suggest that many, if not most, trusts’ submissions are materially correct in overall terms – the total cost quantum, total activity counts, and overall allocations of costs to activities. This does not mean the cost of every healthcare resource group (HRG) at these hospitals is accurate, but it does point to accuracy at a higher level.

Unravelling the index

Others fail to use reference costs because of the complexity. The reference cost index (RCI) is simple conceptually, reflecting how expensive the trust is relative to a theoretical trust with national average costs.  But trusts have struggled to unravel the workings of the index. They could not find how it was calculated without delving into the huge database itself. 

To plug this gap, the Department has this year published an intermediate level analysis (organisation level data) that shows the cost variance (the difference between local and average unit cost in total financial terms) for each specialty in a hospital. 
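As a minimal sketch (with entirely hypothetical figures), the specialty-level cost variance is simply the gap between local and national average unit cost, scaled up by local activity to give a total financial impact:

```python
# Hypothetical sketch of the specialty-level cost variance: the gap
# between local and national average unit cost, scaled by local
# activity to give a total financial impact.

def cost_variance(local_unit_cost, national_unit_cost, activity):
    """Total cost variance for one specialty, in pounds."""
    return (local_unit_cost - national_unit_cost) * activity

# A specialty treating 4,000 patients at £1,250 each, against a national
# average of £1,100, shows a £600,000 adverse variance.
print(cost_variance(1250, 1100, 4000))  # → 600000
```

A negative result would indicate a favourable variance, with local unit costs below the national average.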

If you haven’t looked at this, you should. Most trusts have several multimillion-pound variances and you will want to ask yourself whether differences in efficiency or problems with data quality are the cause.

The Department has also published the full detailed database online. Better organised than previously, it still takes some knowledge of Microsoft Access. But the Department has provided a series of ready-made queries to help, for example, to find the individual RCI for any department or specialty in your trust.

The index itself shows a similar pattern to previous years (see diagram overleaf).

Specialist heart, children’s and orthopaedic trusts are clearly more expensive than district general-type acute hospitals, which suggests the latest version of healthcare resource groups, HRG4, struggles to compensate fully for casemix – hence the continued need for specialist top-ups.

But other specialist trusts, notably the two women’s trusts and Moorfields Eye Hospital NHS Foundation Trust, appear at the lower end of the index. This suggests benefits from economies of scale, although it could be for other reasons. Cancer hospitals are at both extremes, possibly because of the difficulty of counting and costing chemotherapy consistently.

London teaching hospitals, closely followed by their provincial counterparts, dominate the top of the acute list (the acute group on the chart above includes both district general and teaching hospitals).  But there are some at the other end of the scale: Imperial College Healthcare NHS Trust and Southampton University Hospitals Trust each have an index of only 90.

Mental health trusts do not show any particular pattern except at the extremes. The lowest two values are in London trusts, and all three very high trusts are in the provinces. All five appear to be outliers, suggesting data quality may be important. Ambulance trusts (not shown) occupy a narrower range (85 to 112) than other types of trust.

Efficiency

The reference cost index measures only relative efficiency – each year it is rebased to 100. It is hard to judge small year-on-year changes in efficiency by looking at individual reference costs at the HRG level because they can be erratic at this detailed level (see table right).

For example, the reduction in CABG costs in 2008/09 may have occurred because one trust radically overstated its activity. The cause may be either a coding error (causing another activity to be similarly understated) or an undetected error in allocating costs.
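A toy example (entirely made-up figures) shows why a single trust's error matters: the national average unit cost is pooled total cost divided by pooled activity, so one overstated activity count drags the average down for everyone:

```python
# Made-up figures: the national average unit cost is pooled cost over
# pooled activity, so one trust's overstated activity count distorts
# the average that every other trust is compared against.

def national_avg_unit_cost(costs, activity):
    return sum(costs) / sum(activity)

costs = [10_000_000, 12_000_000, 11_000_000]  # per-trust CABG costs (£)
activity = [1_000, 1_200, 1_100]              # true case counts

print(round(national_avg_unit_cost(costs, activity)))  # → 10000

activity[2] = 11_000  # one trust's coding error: tenfold overstatement
print(round(national_avg_unit_cost(costs, activity)))  # → 2500
```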

To measure overall efficiency reliably, activity needs to be summed. This is complicated by differences from year to year in casemix, and because different ‘currencies’ – for example, bed days, outpatient appointments and finished consultant episodes – cannot simply be added together. 

But by considering cost-weighted activity, it is possible. On that basis, over the two years from 2007/08 to 2009/10, total acute spending including the costs of non-admitted patients within reference costs increased by 18%. Over the same period, total workload increased by 9%, implying that unit costs continued to rise, by around 8% (1.18/1.09), over the two-year period.
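One way to sketch the approach (with hypothetical unit costs and activity, not the national figures): value each year's activity at a fixed set of national average unit costs, so the different currencies become additive:

```python
# Hypothetical sketch: bed days, outpatient appointments and finished
# consultant episodes cannot be added directly, but each can be valued
# at fixed national average unit costs, giving a cost-weighted activity
# total that is comparable across years.

BASE_UNIT_COSTS = {"episode": 1500.0, "outpatient": 120.0, "bed_day": 250.0}

def cost_weighted_activity(activity):
    """Value an activity profile at base-year national average unit costs."""
    return sum(BASE_UNIT_COSTS[c] * n for c, n in activity.items())

year1 = {"episode": 10_000, "outpatient": 50_000, "bed_day": 20_000}
year2 = {"episode": 10_600, "outpatient": 55_000, "bed_day": 21_500}

growth = cost_weighted_activity(year2) / cost_weighted_activity(year1) - 1
print(f"workload growth: {growth:.1%}")  # → workload growth: 7.2%
```

Because both years are valued at the same unit costs, the ratio isolates the change in workload; dividing actual spending growth by this workload growth then isolates the change in unit costs.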

Correlations

At first glance, acute trusts appear randomly spread. However, in 2005/06 the Audit Commission noticed a link between NHS financial allocations and reference costs: trusts in better funded areas tend to have higher reference costs. The Commission has not updated its attribution of allocations to trusts since 2005/06. But even using these old data as a proxy for current resource shares, the link is still observable in 2009/10 (at the 99% confidence level).

Other links are observable, too. The link between the reference cost index and the private finance initiative is statistically significant at the 99% confidence level: PFI hospitals seem to have higher reference costs.

Another statistically significant link implies that income outside of payment by results (PBR) may in part be funding inefficiency in trusts. In fact, this is likely to be the mechanism by which PCTs with higher allocations support local trusts, increasing their unit costs despite strict PBR rules. 

Each of these effects, though statistically significant, is weak. The links with PFI and non-PBR income each explain about a tenth of the total variation in reference costs. But they add up.  Regression analysis, based on 63 non-specialist, non-foundation acute trusts, confirms that the two effects are statistically significant and between them explain over 20% of the variation in RCI.
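To illustrate (with made-up data, not the Commission's 63-trust set): the share of variation a single factor explains is the squared correlation between that factor and the RCI:

```python
# Made-up data: the share of RCI variation a single factor 'explains'
# is the squared Pearson correlation (r squared) between the two.

def r_squared(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

# hypothetical non-PBR income shares (%) and RCI scores for five trusts
non_pbr_share = [10, 15, 20, 25, 30]
rci = [98, 101, 97, 102, 100]
print(f"{r_squared(non_pbr_share, rci):.2f}")  # → 0.15
```

A value of around 0.1 to 0.15, as here, is exactly the kind of weak-but-real effect the article describes: statistically detectable, yet leaving most of the variation unexplained.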

Data quality

The spread of reference costs index scores between trusts has reduced markedly over the past few years, and this trend has continued during 2009/10.

While this is not a guarantee that accuracy has improved, it does suggest that the method, calculation and collection of reference costs may have become more robust. For both main sectors, variability (the degree to which scores are spread out) in index scores is lower now than ever before (see diagram above).

Consistency of results from year to year also suggests data quality in reference costs is getting better. The order of trusts in the RCI league table in 2009/10 was more similar to the pattern in the previous year than it had ever been. In other words, trusts tend to occupy the same position from one year to the next.
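One way to quantify that league-table stability (a sketch with hypothetical index scores; the simple formula below assumes no tied ranks) is Spearman's rank correlation between consecutive years, where a value close to 1 means trusts keep roughly the same position:

```python
# Hypothetical RCI scores for six trusts in consecutive years:
# Spearman's rank correlation measures how stable the league-table
# order is. (This simple formula assumes no tied ranks.)

def spearman(x, y):
    def ranks(values):
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

rci_2008_09 = [92, 105, 98, 111, 100, 87]
rci_2009_10 = [90, 103, 101, 113, 99, 88]
print(f"{spearman(rci_2008_09, rci_2009_10):.2f}")  # → 0.94
```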

Interestingly, despite the general improvement over time, both analyses show the same temporary worsening in 2006/07, which coincided with the introduction of HRG4 to reference costs.

Audits of 2009/10 reference cost submissions, the basis of the current publication, are now under way at all acute and specialist trusts as part of the Audit Commission’s PBR assurance framework. The audits will assess the accuracy of individual trusts’ data.

The findings are being discussed with trusts and PCTs locally and we will summarise the findings in our annual report on the assurance framework in the summer.

The analytical tools developed within the audits to identify exceptional activity figures and cost allocations are already available to NHS users via the PBR benchmarker (www.audit-commission.gov.uk/pbrbenchmarking). We hope that, by using these tools, trusts will be able to identify data quality problems in last year’s submission. If they do, they can take steps now to make next year’s submissions easier and more accurate.

In summary, the data seems to be improving and expected links are finally beginning to emerge. But costs stubbornly continue to rise.

INDEX FOCUS

The 2009/10 reference cost index (RCI) ranges from 157 to 80, revealing a slightly broader range of costs than in 2008/09, though this may be partly due to data issues, writes Seamus Ward. The 2009/10 figures – 57% above national average costs down to 20% below – compare with a 2008/09 range of 148 to 75 (48% above and 25% below the average).

The RCI ranks organisations on the cost of delivering the same activity. An organisation with an index of 100 has costs equal to the national average. The figures include an adjustment for the market forces factor, to account for unavoidable cost differences, and include excess bed days.
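A minimal sketch of that calculation (hypothetical HRGs and unit costs; the published index also applies the market forces factor adjustment): the RCI is a trust's actual cost divided by what the same activity would cost at national average unit costs, times 100:

```python
# Hypothetical sketch of the RCI: actual total cost compared with what
# the same activity would cost at national average unit costs. (The
# published index also adjusts for the market forces factor.)

NATIONAL_AVG_COST = {"hip_replacement": 5600.0, "cataract": 800.0}

def rci(actual_total_cost, activity):
    expected = sum(NATIONAL_AVG_COST[hrg] * n for hrg, n in activity.items())
    return 100 * actual_total_cost / expected

# 500 hip replacements and 2,000 cataracts would cost £4.4m at national
# average unit costs; an actual spend of £4.84m gives an index of 110.
print(round(rci(4_840_000, {"hip_replacement": 500, "cataract": 2_000})))  # → 110
```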

The RCI for acute providers – other providers can have problems with data collection and have more complex, and thus more expensive, casemix – ranges from 114 to 80, which compares well with the 2008/09 acute provider range of 119 to 79. Trusts where the RCI position moved include Southend University Hospital NHS Foundation Trust (89 in 2008/09 and 80 in 2009/10); Heatherwood and Wexham Park Hospitals NHS Foundation Trust (102 to 84); and North Devon Healthcare NHS Trust (91 to 84).

Heatherwood deputy chief financial officer (financial planning) Alastair Haggart says that although the trust had started to make big cost reductions in 2009/10 as part of the three-year turnaround plan agreed with Monitor, the 102 figure for 2008/09 was likely to be overstated. ‘We had been significantly understating costs and activity in unbundled areas of the tariff,’ he says. ‘A recent audit of 2009/10 reference costs by the Audit Commission identified two areas where activity was overstated and suggested the RCI was actually around 89.’ The trust accepted these conclusions and implemented changes.

Other trusts’ RCI moved for different reasons. The Whittington Hospital NHS Trust index rose slightly from 111 in 2008/09 to 113 in 2009/10. Director of finance Richard Martin says there is a straightforward explanation. ‘The increase is due to a reduction in the trust's market forces factor. Had this not occurred, our index would be unchanged from the previous year,’ he says.



Chris Raspin is senior development manager in the Audit Commission health directorate.