Measures of the quality of hospital treatment could begin next year, Stephen Dorrell, the Secretary of State for Health, indicated yesterday. His comments followed publication of the latest league tables of hospital performance, to the usual barrage of criticism that they measure quantity rather than quality.
Pilot studies on the success rate of different types of operation and treatment should be completed late this year, Mr Dorrell said. "We shall then consider how best to use the results to inform the public about the clinical standards achieved." This will be subject to the pilots' establishing that sensible measures of clinical performance can be produced.
Measures used include waiting time for in-patient and out-patient appointments, day surgery rates, and operations cancelled and not performed within a month. Forty-two per cent of hospitals scored five stars, up from 29 per cent two years ago - a rise of 13 percentage points. A general improvement was marred by a small overall increase in the number of operations not performed within a month of cancellation.
Jim Johnson, chairman of the British Medical Association's consultants committee, protested that the tables could be misleading. "It is nonsense to say a hospital gets five stars when a nurse sees 95 per cent of patients within five minutes of arriving at casualty, if they then wait six hours to be actually treated."
Mr Dorrell, however, argued that it was entirely right that the urgency with which a patient should be seen was assessed as soon as they reached hospital. He added: "We would like to publish more about clinical success rates." A dozen potential measures are being examined, including deaths within a month of operation, surgical wound infection rates, deaths in hospital after heart attacks, and damage to the brain or other organs following surgery.
The measures are being developed in conjunction with the Joint Consultants Committee, which represents the medical Royal Colleges and the British Medical Association. Professor Sir Norman Browse, its chairman, said the committee had no objection to publishing clinical outcomes "provided they mean something". The committee would resist publication of death rates, as has been done in Scotland. "But we are looking for four or five conditions or problems we could focus on, where we could make real comparisons."
Mr Johnson, however, argued that clinical indicators were "even more difficult" than the present ones. In his region, for example, one hospital had been found to have twice the death rate of another for aortic aneurysm - weakening of the main artery wall, which can lead to rupture. But one was a teaching hospital taking less urgent cases for specialist care, while the other was a district general hospital taking emergencies. Once that was allowed for, the difference in death rates disappeared.
There are also concerns within the Department of Health that focusing on only one or two clinical indicators could distort treatment, as hospitals strive to achieve high ratings on the few indicators that are measured.
Yesterday's tables show no hospital is universally good or bad, but Halton Hospital on Merseyside scored the highest proportion of five-star ratings for the second year running, with 39 out of 52. Other good performers included the South Manchester Hospitals, East Gloucestershire NHS Trust, Epsom, Central Middlesex, the Central Nottinghamshire Trust and Walton Hospital, a community Trust in Chesterfield.
In general, big city hospitals, including some famous teaching hospitals, tended to fare worse than smaller units, a feature, Mr Dorrell said, of their being "the biggest management challenges".
National and regional versions of the tables are available by calling 0800 555777, or on the Internet at http://www.open.gov.uk/doh/tables96.htm
Leading article, page 11