Pupils' homes 'no excuse' for poor results

THE BEST schools can add as much as two grades to their pupils' GCSE results compared with the worst schools, according to a new study.

The dramatic difference, revealed by research into the exam results of nearly 12,000 state secondary school pupils in Lancashire, contradicts the claim that children's academic performance is determined mainly by home background and that the quality of individual schools makes little difference.

The researchers - from London University's Institute of Education - concluded that, if an average child went to the best school in the county, he or she would end up with seven grade Cs at GCSE - results equivalent to the old O-level passes. The same child, if he or she went to the worst school, would get seven grade Es - well below O-level standard.

Professor Peter Mortimore and Dr Sally Thomas carried out the study as part of the search for ways of comparing schools fairly. The Government is already publishing annual league tables of exam results but critics say that these are unfair. Some schools have poor results, they argue, not because they teach badly but because they have a poor intake of pupils.

Ministers have asked their advisers to look at new ways of calculating schools' results - proposals are likely to be published soon - but they have resisted suggestions that the results should be adjusted to make allowances for pupils' social and economic backgrounds.

The Lancashire research shows that home background can have an important influence, but only on the results of a small minority of schools.

Dr Thomas said that pupils' homes should not be used as an excuse for poor performance. 'It is only important in those schools where there are very high numbers of disadvantaged pupils,' she said.

The most significant influence on exam results, the researchers conclude, is how much pupils have already achieved when they arrive at secondary school at 11. Lancashire tests all children at this age, so the researchers had this data available. It enabled them to identify which schools did better than expected, given their intake, and which did worse.

The researchers also found that results were affected by the number of girls in a school's intake - on average, girls perform better than boys at GCSE - and that children who had changed secondary school did worse than their peers.

When all these factors, including home background, were taken into account, most schools did about as well as expected. But 20 per cent did significantly better, 24 per cent significantly worse. At the extremes, the difference between 'best' and 'worst' was equivalent to two GCSE grades for the average child.

However, the research results suggest that parents should not choose schools purely on the basis of averages. They need to look at performance in different departments; a school which teaches English well is just as likely as any other to teach maths badly. And different types of pupils do better in different schools. Some schools that do exceptionally well with the most able pupils do exceptionally badly with the least able. The reverse also applies.

Dr Thomas said: 'The major finding is that most schools are doing reasonably well but some schools can be identified as doing particularly well and others as doing badly. In some schools which look as though they are doing well, the maths department is terrible.'