I do not wish to downplay the significance of nationally published data, and the purpose of this blog isn’t to be defensive; our results over time, as well as your own child’s experiences of Coleridge, will hopefully allow you to make your own judgements about the school.
However, I do think that anybody viewing nationally published data (and certainly anybody using it to make decisions about where to send their child to school) should be clear about precisely what the information means, how it is collated, and what its limitations are for creating a picture of a school. Explaining this can be dull and technical, and probably won’t result in a blog post that goes viral. Nevertheless, I believe it to be important.
Nationally published data, such as that found on Compare School Performance, is concerned only with the progress and attainment of children in Year 6. The information, which appears on the summary page and which is presented as a litmus test for the effectiveness of the school as a whole, is actually based only on the performance of 120 children during four mornings of pressurised testing in early May. Many schools tailor their entire curriculum and timetable towards the SATs tests – we do not.
Because no two groups of 120 children are ever the same, and because the needs and abilities of each cohort change year on year, our results tend to go up and down over time – sometimes we’re well above average, sometimes we’re slightly below. This looks strange and gives the impression of a school which performs erratically, especially when our figures are compared to the national averages, which always seem to rise smoothly and consistently over the same period. It should be understood, however, that this is not necessarily about a lack of consistency in the quality of teaching; it has more to do with the individual strengths and weaknesses of the cohort, and with the size of the data set being analysed. Small quantities of data, such as those produced by a single school, tend to produce spiky results, but as the data sets become larger and more pupils’ results contribute to the averages, the spikes are smoothed out. In short, performance which fluctuates from one year to the next is normal, and most schools experience it.
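For readers who like to see this for themselves, the effect of cohort size on year-to-year fluctuation can be shown with a short simulation. Everything here is invented for illustration – the mean, the spread, and the cohort sizes are assumptions, not real SATs figures – but the statistical point holds regardless of the exact numbers:

```python
import random
import statistics

random.seed(42)

def cohort_average(n):
    """Average scaled score for a simulated cohort of n children, each
    drawn from the same (made-up) national distribution: mean 104, sd 7."""
    return statistics.mean(random.gauss(104, 7) for _ in range(n))

# Five simulated 'years' for a single school (120 pupils) versus a
# national cohort (600,000 pupils). Every child is drawn from the
# identical distribution, so any year-to-year movement is pure
# sampling noise, not a change in teaching quality.
school_years = [cohort_average(120) for _ in range(5)]
national_years = [cohort_average(600_000) for _ in range(5)]

def spread(xs):
    return max(xs) - min(xs)

print(f"school averages swing by   {spread(school_years):.2f} points")
print(f"national averages swing by {spread(national_years):.2f} points")
```

The school-sized samples swing by a point or two from year to year while the national average barely moves – exactly the spiky-versus-smooth pattern described above.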
Progress measures are often deemed to be the gold standard of school performance, but again it is important to know how these measures are calculated in order to understand their value as a tool for making judgements about the effectiveness of a school. The methodology is complicated, but anybody wishing to understand it fully can read about it here. In simple terms, however, a child’s Key Stage 2 results are compared to those of other children nationally who are of similar ability. This is done by putting the children into Prior Attainment Groups based on the results of the Key Stage 1 tests which they undertook back in 2014. However, because these groups are determined by a combined score of the child’s English and Maths results, the progress measures for individual subjects at the end of Key Stage 2 are never completely accurate.
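The basic shape of that calculation can be sketched in a few lines of code. This is a simplified illustration only – the band names, scores, and group sizes below are all made up, and the real DfE methodology involves many more groups and pupils – but it shows the two steps described above: group children by prior attainment, then compare each child to the average of their group:

```python
from statistics import mean
from collections import defaultdict

# Hypothetical pupils: (KS1 prior-attainment band, KS2 scaled score).
# Bands and scores are invented purely for illustration.
national = [
    ("band_low", 96), ("band_low", 100), ("band_low", 98),
    ("band_mid", 103), ("band_mid", 105), ("band_mid", 101),
    ("band_high", 108), ("band_high", 112), ("band_high", 110),
]

# Step 1: work out the national average KS2 score for each
# prior-attainment group.
groups = defaultdict(list)
for band, score in national:
    groups[band].append(score)
group_avg = {band: mean(scores) for band, scores in groups.items()}

# Step 2: a pupil's progress score is their own KS2 result minus the
# average result of pupils who started from the same KS1 position.
def progress(band, ks2_score):
    return ks2_score - group_avg[band]

print(progress("band_mid", 106))   # positive: above average for their group
print(progress("band_high", 107))  # negative: below average for their group
```

Note that because the grouping is done on a combined English-and-Maths starting point, a child strong in one subject and weaker in the other is measured against a single blended baseline – which is why the per-subject progress figures can never be completely accurate.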
Of course, real meaning can be derived from nationally published performance data, and indeed it is; we regularly analyse data and identify patterns of strength and weakness which we use to drive school improvement and to monitor the impact of new practices and initiatives. However, this can only really be achieved by looking at data in detail over time, and by viewing it in conjunction with the school’s own information for other year groups too. Though there is now some facility for parents to do this kind of analysis on the DfE site, the emphasis still appears to be on producing simplistic standalone figures, wrapped in traffic light colours that everyone can understand. Ironically, however, the process of simplifying and reducing data in this way has actually made it more misleading.
Above all, the most important thing to bear in mind about nationally published performance data is that it only ever provides a pinhole view of what happens in schools. There is no measure of progress and attainment in other subject areas like PE or the arts; no measure of children’s happiness, self-esteem, and emotional wellbeing; nor is there any data regarding children’s creativity, capacity for critical thinking, or enjoyment of and interest in learning. These things are equally important as, if not more important than, progress and attainment in the 3 Rs, and they are also things at which Coleridge Primary School is, in my view, Well Above Average.
Image by Charles J Sharp, sharpphotography, CC BY-SA 4.0.