Why death rates aren't an exact science

Image caption: Some say a high mortality rate should act as a smoke alarm - an alert that needs looking into

Interpreting mortality data in the NHS is a tricky business.

The health service is damned if it doesn't act quickly enough - as was the case with Stafford Hospital - and then damned if it does - as has happened with child heart surgery in Leeds.

The problem is that compiling and interpreting the data is not an exact science.

A high death rate could be said to be a smoke alarm - a sign that something may be wrong and, therefore, needs looking into.

But just as a smoke alarm can be set off for a rather innocuous reason - the smoke from frying food, for example - so a high death rate can arise for perfectly justifiable reasons.

Mortality figures are risk-adjusted, which means they take into account factors that could increase the likelihood of death, but even so their accuracy does depend on all the data being submitted properly.

There are more than 300 different variables that hospitals need to provide, covering factors such as age, time of diagnosis, deprivation, lifestyle and existing illnesses.
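Risk adjustment of this kind is commonly summarised as a standardised mortality ratio: observed deaths divided by the deaths that would be expected given the case mix. The sketch below is purely illustrative - the patient fields and risk weights are invented, and real models draw on hundreds of variables - but it shows the basic arithmetic:

```python
# Sketch of a risk-adjusted mortality measure: the standardised
# mortality ratio (SMR) = observed deaths / expected deaths.
# The patient records and risk weights below are invented for
# illustration; real models use hundreds of variables.

def expected_death_probability(patient):
    """Toy risk model: higher age and more comorbidities raise risk."""
    risk = 0.02  # baseline risk for any admission
    risk += 0.001 * max(patient["age"] - 40, 0)
    risk += 0.01 * patient["comorbidities"]
    return min(risk, 1.0)

def standardised_mortality_ratio(patients):
    observed = sum(p["died"] for p in patients)
    expected = sum(expected_death_probability(p) for p in patients)
    return observed / expected

patients = [
    {"age": 70, "comorbidities": 2, "died": 1},
    {"age": 55, "comorbidities": 0, "died": 0},
    {"age": 80, "comorbidities": 3, "died": 0},
    {"age": 45, "comorbidities": 1, "died": 0},
]

# An SMR above 1 means more deaths than the case mix predicts;
# below 1 means fewer.
print(round(standardised_mortality_ratio(patients), 2))
```

The key point survives the simplification: if fields such as weight or comorbidities are missing, the "expected" denominator is wrong, and the ratio is skewed - exactly the problem described in the Leeds case below.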

It is not unheard of for data to be submitted incorrectly or not at all.

'Poor quality'

In the case of Leeds General Infirmary - where child heart surgery was halted just before Easter amid concern about high death rates, only for the suspension to be lifted this week - crucial details were missing, which appears to have skewed the overall picture.

NHS medical director Sir Bruce Keogh explains: "In order to take case mix into account so that we compare apples with apples and oranges with oranges, one of the simple things you need is the weight of a baby.

"But weight was missing in 35% of the cases... so that data was very poor quality."

So does that mean mortality figures should not be trusted?

No, according to Prof Brian Jarman, one of the UK's leading experts on mortality data.

He believes the NHS is getting better and better at using and understanding the data.

There are now several different ways of compiling the data. Dr Foster, the research group he works for, looks at four key measures - deaths in hospital, deaths within 30 days of discharge, deaths after surgery and deaths among patients with low-risk conditions.

This allows them to cross-reference hospitals with high rates on one measure against data from another.

For example, according to the latest figures, there were more than 50 cases where a trust had higher than expected death rates on one measure.

But Dr Foster was able to narrow that group down to 12 by looking for those trusts with high rates in two categories.

Prof Jarman says: "It really does depend on us getting good quality data.

"That is why we say mortality rates are not perfect, they are a sign that something may be wrong and should definitely not be ignored.

"If we have got good data I would say a high mortality rates is nearly always a sign of something not being quite right."