Parents need to consider numerous factors when assessing the league tables, writes Education Editor Judith Judd
This year's secondary school league tables, which are published by The Independent, today offer more information to parents about the state of the nation's schools than ever before. Despite repeated criticism from teachers, the Government is as convinced as its Conservative predecessors that the publication of schools' results helps to spur teachers on to greater efforts. Ministers say that parents have a right to know how their children's schools are performing in relation to others.

But the controversy about publication remains as fierce as ever. Headteachers and teachers say that the tables, even with the extra information which is included this year, present an unfair picture of schools and that none of the measures used gives a true indication of quality.

The Department for Education provides an alphabetical list of around 5,000 schools and more than 400 colleges in each local authority. As in previous years, our tables rank schools according to the percentage of pupils awarded five A* to C grades at GCSE.

Teachers attack this because it takes no account of the ability or background of a school's intake - for some pupils, a grade D or E may represent a huge achievement. So parents need to be aware of the backgrounds of a school's pupils before rushing to judgement. For some schools, the national average score of 46.1 per cent may be a very good result.

Critics have also pointed out that it encourages teachers to concentrate on those students who may just scrape a C while neglecting those at the very bottom of the heap. They suggest that league tables are to blame for the widening gap between the results of those at the top and those at the bottom.

Ministers say that parents understand the A* to C measure and that it is here to stay. But they are keen to reduce the number of young people leaving school without any qualifications and they want schools which are complacent about the number of pupils achieving C grades to push them on so that they get As and Bs. So this year's tables include, alongside the A* to C measure, a "points score" or the average number of points per candidate, which gives an overall indication of how the school is doing in relation to all of its pupils.

GCSE is graded from A* to G and the score is worked out by giving one point for a G and eight points for an A*, adding these together and dividing by the number of candidates.
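The calculation described above can be sketched in a few lines of code. Note that the article states only the two end points, one point for a G and eight for an A*; the intermediate values below are an assumption of one point per grade step.

```python
# Sketch of the GCSE "points score" described above.
# The article gives only the end points (G = 1, A* = 8);
# intermediate grades are assumed to rise one point per step.
GRADE_POINTS = {"G": 1, "F": 2, "E": 3, "D": 4, "C": 5, "B": 6, "A": 7, "A*": 8}

def average_points_per_candidate(results):
    """results: one list of GCSE grades per candidate.

    Sums the points for every grade achieved across the school,
    then divides by the number of candidates.
    """
    total = sum(GRADE_POINTS[grade] for grades in results for grade in grades)
    return total / len(results)

# Two candidates: one with an A* and a C, one with a single G.
# (8 + 5 + 1) / 2 candidates = 7.0 points per candidate.
print(average_points_per_candidate([["A*", "C"], ["G"]]))
```

The sketch also makes the teachers' objection visible: because every extra entry adds points, a candidate sitting eleven subjects lifts the average more than one sitting eight, regardless of the grades achieved.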

Teachers are still unhappy with this as a performance indicator because it gives an advantage to schools which enter their pupils for a large number of subjects - even a G is worth a point. Schools which enter pupils for 11 or 12 GCSEs are competing with those which concentrate on eight or nine. Quality, say heads, may be sacrificed to quantity. And wouldn't it make more sense for young people to be out on the football pitch or playing the clarinet than piling on yet more academic qualifications?

Both politicians and teachers want to find a way of comparing schools which allows for differences in intakes and judges them on the progress their pupils make - known as "value-added". This year's tables were originally intended to do this with a new "progress measure" which graded schools A to E according to the progress their pupils had made between the national curriculum tests at the age of 14 and GCSE. Schools with a similar performance in the 14-year-old tests were compared with each other. This was dropped after ministers decided it unfairly penalised schools which did well: North Halifax grammar school, where 96 per cent of pupils got five top grades, received the same mark for progress - a D - as the nearby Ridings School, the failing school now given a clean bill of health, where the figure was 3 per cent.

The revised tables pick out the 25 per cent of schools which have improved most, according to this measure. However, no score is available for schools, including many independent schools, which do not take the 14-year-old tests.

Parents may also want to consider the column showing the difference between a school's results in 1995 and this year. Though this offers some indication of progress, teachers point out that differences in the ability of year groups cause fluctuations from year to year.

Parents must also remember that the results reflect a school's past and not its future. Results are for pupils who started secondary school five years ago. Schools change quickly if the head moves on. Poor schools can be turned round within two years and good ones may decline just as quickly.

A-level is a good yardstick for judging a comprehensive school because it shows how well it does once the least able and least motivated pupils have left at the age of 16.

For independent schools, the picture is complicated. Some refuse to take pupils into their sixth forms unless they have reached a good standard at GCSE, so they are off to a flying start. Schools which enter pupils for more A-levels and those which enter them for General Studies - sometimes without offering any teaching in the subject - have an advantage over the rest.

At the other end of the scale, truancy rates provide a useful key to a school's quality. Inspectors say that high truancy is one indication that a school is failing, though the battle against poor attendance is harder in deprived areas.

So there is a limit to what the tables tell parents. Ministers emphasise that none of the measures in the tables should be read in isolation and that parents should look at them all before making a judgement. If you are choosing a school, there is no substitute for a visit, a meeting with the headteacher, and a first-hand assessment of the atmosphere in the classroom and the behaviour of the pupils themselves.