I started collecting data from test scores and teacher assessment judgements earlier this week. So far, around 80 schools (3,500+ pupils) have shared their KS1 test scaled scores, and nearly 60 (around 2,500 pupils) have shared their Teacher Assessment judgements (in the main three bands). So, what does it show?
Scaled Score Data
Despite – or perhaps because of – the concerns about the difficulty of the reading tests, it is Reading which has the highest “pass rate”, with 65.5% of pupils achieving 100 or greater. (Similarly, the median rate for schools was just over 65%.)
Maths was not far behind, with 64.2% of pupils achieving 100 or greater, although the school-level median was again around 65% – slightly higher than the overall pupil rate. The results for GPS were lower (at around 57%), but this was based on a far smaller sample of schools, as many did not use the tests.
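It may be worth noting why the pupil-level rate and the school-level median can differ, as they do for Maths above: the pupil rate pools every child together (so larger schools carry more weight), while the median treats each school as one data point. A minimal sketch, using entirely made-up figures, shows the two calculations side by side:

```python
# Illustrative only: (pupils in cohort, pupils scoring 100+) per school.
# These figures are hypothetical, not the collected data.
schools = [
    (60, 45),
    (30, 18),
    (25, 15),
    (90, 54),
    (15, 12),
]

# Pupil-level rate: pool all pupils together (larger schools count more).
total_pupils = sum(n for n, _ in schools)
total_passing = sum(p for _, p in schools)
pupil_rate = 100 * total_passing / total_pupils

# School-level median: each school's own percentage, middle value taken.
per_school = sorted(100 * p / n for n, p in schools)
median_rate = per_school[len(per_school) // 2]  # odd count: middle element

print(f"Pupil-level rate: {pupil_rate:.1f}%")   # 65.5%
print(f"School median:    {median_rate:.1f}%")  # 60.0%
```

Here one large-ish school with a strong result pulls the pupil-level rate above the school median; a few small schools with weak results would pull in the opposite direction.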
The spread of results can be seen approximately by the proportion of schools falling within each band in the table below.
For example, just 2% of schools had more than 90% of children achieving a scaled score of 100 in Reading, while 43% of schools had between 60% and 69% of children scoring 100+.
Notably, the range in Maths results is slightly broader than in Reading.
Teacher Assessment Judgements
The order of success in the subjects remains the same in the collection of Teacher Assessment judgements, with Reading having the highest proportion of pupils reaching the expected standard or greater, closely followed by Maths – and Writing trailing some way behind. However, the most striking difference (surprising or not) is that the proportions are all around 10 percentage points higher than in the scaled score data.
According to teachers’ own assessment judgements, some 74% of pupils are reaching the expected standard or above in Reading, 73% in Maths, and around 68% in Writing.
Similarly, the spread of teacher assessment judgements shows more schools achieving higher proportions of children at the expected level – and includes one or two small schools achieving 100% at expected or above.
There are notable shifts at the bottom end. For example, 16% of schools had fewer than half of children achieve 100+ in Maths, whereas only 4% of schools have fewer than half of their children achieving the expected standard when it comes to Teacher Assessment.
It’s important to note that the data is not from the same schools, so any direct comparison should be treated with caution – but it does raise some interesting questions.
Have I said we’re dealing with a small sample, etc? Just checking.
But within that small sample, the proportions of pupils being judged as “Working at Greater Depth within the Expected Standard” are:
Obviously there are many flaws with collecting data in this way, but it is of some interest while we await the national data. If you have a Year 2 cohort, please do consider sharing your data anonymously via the two forms below: