At a time when access to information about public services is seen as a right rather than a privilege, it seems churlish to complain about the dissemination of any data about schools. But when does information become misinformation, or even disinformation?
The league tables are compiled by newspapers on the basis of the Department for Education's school-by-school summary of A-level results. As last year, there will be complaints about the technical accuracy of the tables, but their major flaw is that they do not compare 'like with like'.
The issue of appropriate and valid ways of reporting on schools' performance (as measured by examination results) is of vital importance. But taken on its own, raw information about schools' examination results is an inadequate measure of performance and can be highly misleading. Raw results are the appropriate answer to the question 'How has my son or daughter performed?' but not to questions about the school's performance. For these questions, other information is needed to identify what researchers call the value-added component.
By value-added, we mean an indication of the extent to which schools have fostered the progress of pupils over a particular period (usually from entry to the school until public examinations). The term relates to the relative boost a school gives to a student's previous level of attainment. Therefore, accurate baseline information about pupils' prior attainment is needed. It is also important to point out the statistical uncertainty attached to any individual school's results; this applies both to the measurement of value-added and to the measurement of raw results.
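The idea of a value-added measure can be sketched in a few lines of code. The example below is an illustrative simplification, not the researchers' actual method: it treats value-added as the residual from a simple least-squares regression of exam score on prior attainment, using invented pupil data.

```python
# Illustrative sketch only: value-added as the gap between a pupil's actual
# exam score and the score predicted from prior attainment alone.
# All figures below are invented for illustration.

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    return slope, intercept

# Hypothetical pupils: (prior attainment at entry, exam score at exit)
prior = [40, 55, 60, 70, 85]
exam = [45, 58, 57, 75, 90]

slope, intercept = linear_fit(prior, exam)

# A pupil's value-added is the actual score minus the predicted score;
# positive values mean more progress than prior attainment would predict.
value_added = [y - (intercept + slope * x) for x, y in zip(prior, exam)]
```

A school-level figure would then be an average of such pupil-level residuals, though real studies use far richer models (adjusting for background factors and quantifying the statistical uncertainty the text mentions).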
The most important predictor of examination performance is prior attainment at entry (whether this is measured at the start of infant, junior or secondary schooling). Pupil intake of schools varies considerably, and research has shown that a school's relative effectiveness can be determined only if like is compared with like. Factors related to students' background, such as fluency in English, sex, age and parents' education, are powerful predictors of examination achievement. As a result, schools in disadvantaged inner-city neighbourhoods face quite different challenges from schools in more affluent suburban areas.
The use of league tables based on raw results alone can lead, therefore, to complacency among schools serving educationally advantaged communities. In reality, the progress made by pupils in some of these schools is poorer than that of pupils in schools serving less advantaged intakes, and their examination results, which on the surface may seem relatively good, are lower than they might be. Conversely, there are schools in disadvantaged communities where pupils make far better progress than pupils in schools with more advantaged intakes, and yet which, in terms of raw results, appear mediocre. Neither kind of school is assisted by the publication of raw league tables. In the former, the need for improvement may not be appreciated; in the latter, such league tables may seriously demoralise staff.
We can demonstrate the difference the value-added approach makes by comparing schools' ranking in terms of raw results with their ranking based on the progress of pupils. In a study of reading attainment in 49 schools we found that a fifth of the schools moved up or down 20 or more places when their value-added results were compared with average raw scores, and more than half moved between five and 19 places.
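The kind of rank-shift comparison described above can be sketched as follows. The school names and scores are invented; the point is simply to show how a ranking by raw score and a ranking by value-added can be compared school by school.

```python
# Hypothetical sketch: rank schools by raw score and by value-added,
# then measure how many places each school moves between the two tables.
# All data are invented for illustration.

raw_score = {"A": 72, "B": 65, "C": 80, "D": 58, "E": 69}
value_added = {"A": -3.0, "B": 4.5, "C": 0.5, "D": 6.0, "E": -1.0}

def ranks(scores):
    """Assign ranks 1..n, best score first."""
    ordered = sorted(scores, key=scores.get, reverse=True)
    return {school: position + 1 for position, school in enumerate(ordered)}

raw_rank = ranks(raw_score)
va_rank = ranks(value_added)

# Places moved up or down when value-added replaces raw score.
shift = {s: abs(raw_rank[s] - va_rank[s]) for s in raw_score}
```

In this invented example, school D tops the value-added table despite the lowest raw score, moving four places; in the reading study, a fifth of the 49 schools moved 20 or more places.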
For the majority of schools in today's league tables, the rank position in raw results will provide little indication of the boost they have given to student achievement.
The publication of examination results alone is therefore not enough for proper judgements to be made about schools' performance. Other aspects, such as student attendance, attitudes to school and learning, and behaviour, are also important. In our work on the effectiveness of schools, we have always looked at a broad range of achievements. Nonetheless, academic results are important, and it is essential that useful information about pupil progress is employed. A school that was able to promote non-academic achievements, yet failed to promote academic progress, could not be classified as truly effective.
There is no point in providing more information about school performance unless such information can be used to assist schools in assessing themselves and provide a stimulus for improvement. The Institute of Education in London is setting up an international Centre for School Effectiveness and Improvement in the belief that such a centre can help to raise standards.
Our work includes a comparative study of secondary schools. It is examining variations between schools in student performance at GCSE in terms of value-added in total examination scores and subject differences at the departmental level. We are also concerned with the question of consistency from one year to another. If school performance does vary greatly between departments and/or over time, this has implications for schools. It may also have lessons for parents who can only make one choice of secondary school.
Sir Ron Dearing, the Chairman of the Schools' Curriculum and Assessment Authority, recently recommended that further work on value-added measures of school performance be conducted as a priority, and the Government appears to have accepted this. At present, however, there is no national database that would enable value-added comparisons for all schools to be calculated, except at A-level. For A-levels, GCSE results can be used as baseline measures.
In the meantime we are conducting a project to examine the feasibility of using nationally available data about schools' catchment areas to put GCSE results in context. Clearly, data about prior attainment would allow more accurate value-added measures of performance to be calculated on a regular basis. Only in this way can an adequate profile of school performance be created and schools that are consistently particularly effective or ineffective identified.
The author is a research officer at the Institute of Education in London. The article was written jointly with Professor Peter Mortimore, deputy director of the institute, and Dr Sally Thomas, research officer.