Comment / Costing: moving beyond the average

05 September 2011


If you lie with your head in the oven and your feet in the fridge, on average you are at a comfortable temperature. The NHS payment system operates in a world of averages.

Those of us in the finance function know the reliance on national average cost data goes deeper than simply using the average cost of a particular procedure across all providers. First, healthcare resource groups are themselves a result of an averaging process. They bring together clinically similar procedures or treatments that consume similar levels of resources to turn an unmanageably large number of codes and classifications into a usable currency – a form of averaging nonetheless.

But it is in the current reference costing process itself where the average is really king. The predominantly top-down approach means the costs submitted by each organisation – for example for the oft-quoted hip replacement – are simply the average cost across all the procedures it undertook within that particular HRG. And the process itself is bedevilled by averaging – for example, assuming equal shares of theatre or staff time, irrespective of the actual time spent in theatre or on wards by individual patients.
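To make that distortion concrete, here is a minimal sketch (in Python, using entirely hypothetical figures and field names, not drawn from any real costing return) of how sharing a theatre cost pool equally across the patients in an HRG can diverge from apportioning it on the theatre minutes actually used.

```python
# Hypothetical theatre cost pool for one HRG over a period
theatre_pool = 12_000.0  # total theatre cost (GBP), illustrative only

# Actual theatre minutes recorded for four patients (illustrative)
patients = {"P1": 60, "P2": 90, "P3": 150, "P4": 300}

# Top-down, equal-share apportionment: every patient gets the same slice
equal_share = theatre_pool / len(patients)

# Patient-level apportionment: weight each patient by actual minutes used
total_minutes = sum(patients.values())
by_usage = {p: theatre_pool * mins / total_minutes for p, mins in patients.items()}

for p, mins in patients.items():
    print(f"{p}: {mins:>3} min  equal share £{equal_share:,.0f}  "
          f"by usage £{by_usage[p]:,.0f}")

# The 300-minute case is costed the same as the 60-minute case under
# equal shares, but five times as much when actual usage drives the split.
```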

Costs are in truth an average of an average of an average. That is not to say that reference costs haven’t been useful, or that they have no further uses. But for some purposes we can now do so much better. Patient level costing provides an opportunity to sidestep some of the problems with average costs.

You can still build up costs to whatever level you want – HRGs being the obvious example. But by assigning costs directly to patients on the basis of actual usage wherever possible, you eliminate much of the unhelpful averaging that can undermine reference costs. And – this is the crucial point – you have the detailed data on which to understand your higher level averages.

So if you have a higher than average cost for doing your hips, you can interrogate the data and find out why. You may be able to confirm the ‘more complex casemix’ argument, or you may find that you are using an unnecessarily high-cost prosthesis or that theatre session lengths are sub-optimal for maximising throughput. This is how to engage clinicians: not simply telling them they are expensive, but giving them the data to prove it, understand why and, if possible, fix it.
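As a sketch of the kind of interrogation this makes possible, the following hypothetical example builds an HRG average from patient-level records and then breaks each patient’s cost into its drivers (prosthesis price, theatre minutes, ward days). All names, rates and figures here are invented for illustration; a real patient-level costing system would of course hold far richer data.

```python
# Hypothetical patient-level records for hip replacements within one HRG.
# Field names and values are illustrative only.
hips = [
    {"patient": "A", "prosthesis": 1_200, "theatre_min": 90,  "ward_days": 4},
    {"patient": "B", "prosthesis": 2_400, "theatre_min": 110, "ward_days": 5},
    {"patient": "C", "prosthesis": 2_400, "theatre_min": 140, "ward_days": 7},
]

THEATRE_RATE = 15.0   # GBP per theatre minute, illustrative
WARD_RATE = 250.0     # GBP per ward day, illustrative

def patient_cost(rec):
    """Total cost for one patient from its recorded resource use."""
    return (rec["prosthesis"]
            + rec["theatre_min"] * THEATRE_RATE
            + rec["ward_days"] * WARD_RATE)

costs = [patient_cost(r) for r in hips]
hrg_average = sum(costs) / len(costs)
print(f"HRG average cost: £{hrg_average:,.0f}")

# Drill down: which driver explains the spread around the average?
for rec, cost in zip(hips, costs):
    print(f"{rec['patient']}: total £{cost:,.0f}  "
          f"prosthesis £{rec['prosthesis']:,}  "
          f"theatre £{rec['theatre_min'] * THEATRE_RATE:,.0f}  "
          f"ward £{rec['ward_days'] * WARD_RATE:,.0f}")
```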

To deliver QIPP (quality, innovation, productivity and prevention) improvements, we need to start looking at redesign across our local systems. This can only happen with a full understanding of current costs.

Beyond the improved business understanding that better patient level costing can deliver, there is the tariff. The Department of Health already uses patient data to inform adjustments to some reference cost-based prices. But patient level cost data could and should have a more fundamental role.

Monitor will take responsibility for tariff setting in the new world and will set out the rules for cost collection that will provide the data to underpin this task. It has to make sense to move towards a data set that is built up in a more accurate and clinically meaningful way. Whether that means national average costs based on data built up from the patient level or some form of sampling process is up for debate. But patient level costing is surely the clear direction of travel.

Many organisations are already pursuing more detailed clinical costing. But the economic context and the push to improve quality and productivity make the case more compelling for those yet to start. We can vote with both our feet and our heads.



Tony Whitfield is finance director and deputy chief executive at Salford Royal NHS Foundation Trust