Last week's publication of the first league table from the Financial Times contained a minor upset: Oxford University was put at number three rather than the expected number one or number two position. The world's most famous university had been beaten by Cambridge and Imperial College London. What ignominy! It was surely no coincidence that the FT's table appeared just before the annual offering from the Times. Next week, beginning on 11 May, the Times will weigh in with its league table, honed again this year to take account of objections from universities.
For despite their fascination with the tables - and the importance they accord them - most universities dislike them and make strenuous objections in an attempt to influence the methodology. In essence, they object to newspapers deciding which measures are important, giving them a weighting and then feeding all the data in some mysterious way into a super-league table ranking one institution against another.
Professor Michael Sterling, vice-chancellor of Brunel University and chairman of the Higher Education Statistics Agency, says that virtually every attempt to produce a league table by combining a range of performance indicators in some weighted sum is wrong. "It's very misleading," he says. "It's trying to compare chalk and cheese."
Universities are too different from one another, the argument goes. The indicators are subjective and much of the information they contain is unreliable. Yet the league tables acquire the aura of gospel, winging their way around the world and influencing student decisions from Johannesburg to Jakarta.
New universities are particularly sore about being judged by criteria that suit the purposes of old universities: research rankings, A-level points and proportion of postgraduates. They want other measures to be used, such as the percentage of mature students, whether the university complies with the Government's desire to widen access, and ethnic-minority intake.
"Research informs the bulk of the tables," says Professor Leslie Wagner, vice-chancellor of Leeds Metropolitan University. "That must be an explanation why the old universities are always on top. The tables are essentially measuring a mission that we don't have and that the Government tells us we shouldn't have, and it's not in the interests of society for us to pursue. What's needed are lots of tables, not one super-league table."
Apparent staff-to-student ratios are improved by the inclusion of staff who are partly funded for research purposes, he points out. Library and computer spending may also be boosted if the university is big on research. And teaching-quality assessments take resources into account, too.
Some critics say that the percentage of students achieving first-class honours degrees is a suspect measure, because universities can control the numbers receiving good degrees. Others wonder why the percentage of overseas students and the percentage of students going into postgraduate research should be indicative of excellence, as the FT table suggests.
Gill Grinstead, academic registrar and company secretary at London Guildhall University, believes that a fairer method would be to have some core indicators for all institutions - teaching quality, research, library spending, for example - and some optional ones. "Then you could see the kind of institution you are talking about," she says. "You could nominate the performance indicators you wanted to be judged by, but some would be mandatory. It would be more informative than what we have now."
This year, the Times has changed its table - by dropping accommodation as a measure and replacing it with spending on facilities per student (sports, social and health) - but that still doesn't satisfy critics. All this concern has spurred some universities to come together into an informal think-tank to look at what they can do to minimise the effect of league tables on student recruitment. "Too much attention is paid to them in the view of most people," says Christine Hodgson, head of communications at the University of East London, and a think-tank member.
The Committee of Vice-Chancellors and Principals is also taking action. On 29 June it is holding a conference entitled "The Use and Abuse of League Tables", which is expected to hear from, among others, Professor Wagner and a speaker from North America, where university league tables were invented. But it's not only universities and newspapers that are interested in universities' relative performance. The Treasury, concerned about value for money, has become involved too. At its prodding, the Department for Education and Employment (DfEE) has asked the Higher Education Funding Council for England (Hefce) to look at the matter. As a result, a steering group was set up earlier this year under the chairmanship of Bahram Bekhradnia, Hefce's director of policy, to investigate performance indicators for groups of universities. It is expected to report by the end of the year.
Another joint working group wrestled with the issue earlier in the Nineties under the chairmanship of Professor Sterling, and found it remarkably tricky. Some universities spend more on each student, for example, than others, but their spending is higher because they have raised more money from outside sources. Does that make them more, or less, efficient? After much discussion, a set of performance indicators was proposed and accepted by the Scottish and Welsh funding councils and the DfEE, but not by the English funding council. Hence the second attempt to sort the matter out.
The concern of higher-education experts is that indicators are blunt instruments, easily misused and misinterpreted. In general, the sector objects to the tables on the grounds that the quality of a university cannot be captured authentically through numbers alone.
It's hard enough for research purposes, according to Professor Alan Smithers, director of the Centre for Education and Employment Research at Brunel University. He says that if you start attaching importance to the numbers by putting them into league tables or using them as a basis for allocating funds, universities will simply find ways to improve those statistics. "You may get the numbers changing without any real change in the institution," he says.
"The only point of having targets is to improve things. I'm all for improving the quality of higher education but I don't think this is a way of doing it. I've no objection to the Times and the FT having their fun. If I were advising my own daughter I would still be very keen for her to go to Oxford, whatever appeared on the listing. I'd still operate on the gut feeling that this is where the bright people will congregate, this is where the staff are, and this is where the employers will come to recruit."
Given the Treasury's interest, will we see official league tables? Probably not, and certainly not a single super-league table. But there may be official league tables on certain aspects of higher education - for example, student drop-outs, student performance and universities' ability to recruit students from ethnic minorities. "It's an extraordinarily difficult thing to get agreement on - what should be included and what weight should be given to different aspects," says John O'Leary, editor of The Times's league table. "That's what most of the argument about our league table is, every year."