Judge schools over five-year period, says exam board
Schools in England should be judged by five years' worth of results rather than just one, an exam board says.
A Cambridge Assessment study found "surprisingly high levels" of school results volatility year on year.
Variations in results were of "serious concern" in many of the 150 schools analysed, even after the impact of marking quality had been removed, it said.
Heads backed the report's call, saying decisions on schools should not be made on the basis of one year's results.
School league table positions are based on headline GCSE results for one year only.
Under the current system, schools are considered to be failing if fewer than 40% of their students achieve at least five GCSEs at grade C or above, including English and maths, and they do not meet national averages in pupil progress.
'Complex factors'
The exam board's group director of assessment and development, Tim Oates, said: "Underlying school-level volatility may be an enduring and persistent feature of education."
This meant "that school performance - in terms of exam results - should be judged on a five-year picture rather than one-off annual drops or increases", he added.
"This is a very important finding," he said, "and one which challenges many assumptions, with implications for the approach to accountability and for accountability measurements."
The study did not investigate all the causes of volatility - but it suggested marking quality and grade boundaries had little impact on the variability of results, as results remained volatile when these factors were removed from the equation.
The exam board analysed GCSE results in maths and history across all England's schools between 2008 and 2013, then focused on 150 of the most stable schools, stripping out the impact of marking quality and shifts in grade boundaries.
Study author Tom Bramley said: "Exam results in a school may go up or down in unanticipated ways, caused by a wide and complex set of factors.
"When swings occur, they could be because of what is happening in the school or the children's lives, they could be to do with the assessment itself or the way that national standards are applied, or to do with teaching and learning.
"But what our study shows is that when we've taken account of the variations which can be attributed to quality of marking and to the location of grade boundaries, surprisingly high levels of year-on-year volatility in exam results remain."
He added: "Schools should still monitor exam results for an administrative error which might have occurred and should still look for and alert exam boards to peculiar outcomes - but everyone in the system should be aware of the level of volatility typical in the context of the complex system which is schooling."
Brian Lightman, general secretary of the Association of School and College Leaders, said: "We strongly agree with the findings of the report.
"It highlights the risks associated with the impact of drawing sweeping conclusions about the impact variations in exam results which happen anyway.
"It rightly highlights the need to look at trends over a period of years.
"They do say that of course schools should scrutinise results carefully, and from time to time there will be human error, but what contributes to those results is a complex set of factors.
"The big thing for us is that we see decisions being made which affect the careers of our members because of one set of results and dramatic changes made to schools when actually these variations in results are just normal variations between year groups."