A&E statistics 'worse than reported'


The official monthly accident and emergency figures in England for the final months of last year claim more than 85% of patients were treated or assessed within the four-hour target - but those figures are almost certainly wrong.

Estimates by NHS England - taking account of factors that have since come to light - suggest December's figure was 84.9%.

At the time, this would have been the worst on record, but February's figure was even lower.


These estimates suggest that October, November and December saw national levels below those still appearing on each month's performance spreadsheet.

The new "alternative" figures are buried on the NHS England website and not highlighted in any way.

In a letter to NHS England, Ed Humpherson, director general for regulation at the UK Statistics Authority (UKSA), says: "I am… disappointed that A&E data collection guidance, based on agreed principles, has still not been published.

"This uncertainty continues to undermine the public confidence in the official statistics compiled from that data."

A&E figures form one of the key benchmarks for assessing NHS performance.

Money handed to hospitals by NHS national organisations is often linked to performance measures, including A&E waiting times.

So if they are incorrect, even by a small amount, there are important implications.

The latest rap over the knuckles is embarrassing for NHS England.

Walk-in centres

All this dates back to revelations by BBC News in January that some hospitals were adding patients from other local walk-in centres into their A&E data.

They were doing what they thought NHS managers wanted; the managers were anxious that every route patients take into emergency care was covered. This included direct referrals to wards by GPs.

The trouble was that it wasn't clear who was adding what to their data.

At least six hospitals saw their A&E performance improve significantly in the month of October after including these new "patient pathways".

An email trail led to suspicions that health service leaders were encouraging hospitals to pull in a wider range of data in order to make their figures look better. This was subsequently denied.


Following the BBC story, NHS England got to work combing through spreadsheets submitted by trusts.

The order went out to exclude everything not strictly recognised as A&E activity.

The organisation's statistics experts were then able to provide assurances that the figures from January were clean.

They told the UK Statistics Authority that there would be ongoing work to look at what sort of extra activity might be included in future in the A&E statistics, for example minor-injuries clinics.

In March, NHS England said: "An assessment exercise has now raised questions over the inclusion of up to 26,000 pathways of care per month in October, November and December 2017.

"It is estimated that national A&E performance in previous months could have therefore been affected by between 0.11 to 0.18 percentage points."
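The arithmetic behind that estimate is straightforward: the four-hour figure is simply the share of recorded attendances seen within four hours, so stripping out wrongly included pathways that were almost all "within target" nudges the percentage down. The sketch below illustrates this with hypothetical round numbers - only the roughly 26,000 excluded pathways per month comes from NHS England's assessment; the attendance totals and the split of those pathways are invented for illustration.

```python
def four_hour_performance(within_target, total):
    """Percentage of A&E attendances seen within four hours."""
    return 100 * within_target / total

# Hypothetical month: 2,000,000 attendances, exactly 85.0% within target.
total = 2_000_000
within = 1_700_000

# Suppose ~26,000 walk-in-centre pathways were wrongly included, and
# nearly all of them (here, a hypothetical 25,500) met the target.
extra_total = 26_000
extra_within = 25_500

reported = four_hour_performance(within, total)
corrected = four_hour_performance(within - extra_within, total - extra_total)

print(f"reported:  {reported:.2f}%")                            # 85.00%
print(f"corrected: {corrected:.2f}%")                           # 84.83%
print(f"shift: {reported - corrected:.2f} percentage points")   # 0.17
```

With these assumed inputs the shift is about 0.17 percentage points - within the 0.11 to 0.18 range NHS England quoted - showing how a modest number of extra "pathways" can move a national figure by a small but record-setting margin.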

This information was put on the NHS England website but not linked in any way to the regularly updated figures.

The spreadsheet recording historical data has not been changed. And there has been no asterisk noting that recalculations have taken place.

There is no indication of any update at the six trusts where performance improved considerably in October.

Labour's Jon Ashworth said it was "damning for ministers that the UK Statistics Authority has had to step in and confirm that last winter was even worse than presented in the NHS".

The letter from the statistics watchdog was sent to Mark Svenson, who is head of profession for statistics at both the Department of Health and Social Care and NHS England.

Ed Humpherson, of UKSA, has called on NHS leaders to prominently publish an explanation of the issues and the next steps being taken.

Some may think this is a pedantic debate about fractions of percentage points.

But at stake here is the credibility of what should be a gold-standard benchmark allowing patients and the media to hold local hospitals and the NHS nationally to account.