Some initial thoughts on KS1 data


I started collecting data from test scores and teacher assessment judgements earlier this week. So far, around 80 schools (3500+ pupils) have shared their KS1 test scaled scores, and nearly 60 (nearly 2500 pupils) have shared their Teacher Assessment judgements (in the main three bands). So, what does it show?

Scaled Score Data

Despite – or perhaps because of – the concerns about the difficulty of the reading tests, it is Reading which has the highest “pass rate”, with 65.5% of pupils achieving 100 or greater. (Similarly, the median rate for schools was just over 65%.)

Maths was not far behind, with 64.2% of pupils achieving 100 or greater, although the median score was slightly higher for schools, again at 65%. The results for GPS were lower (at around 57%), but this was based on a far smaller sample of schools, as many did not use the tests.
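The two figures quoted here are computed differently: the overall pass rate pools every pupil together, while the school median takes each school's own percentage and finds the middle value. A minimal sketch of that distinction, using a hypothetical list of per-school returns (the numbers are illustrative, not from the actual collection):

```python
from statistics import median

# Hypothetical per-school returns: (pupils in cohort, pupils scoring 100+).
# Sizes and counts are illustrative only, not the real submitted data.
schools = [
    (30, 20),
    (45, 30),
    (25, 16),
]

# Pupil-level "pass rate": all pupils pooled together across schools.
total_pupils = sum(n for n, _ in schools)
overall_rate = 100 * sum(p for _, p in schools) / total_pupils

# School-level median: each school's own percentage, then the middle value.
school_rates = [100 * p / n for n, p in schools]
median_rate = median(school_rates)
```

With uneven cohort sizes the two figures can diverge, which is why a pupil-level rate and a school-level median need not match exactly.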

The spread of results can be seen, approximately, from the proportion of schools falling within each band in the table below.


For example, just 2% of schools have more than 90% of children achieving a scaled score of 100 in Reading, while 43% of schools had between 60% and 69% of children scoring 100+.
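The banding described above can be sketched as a simple grouping of each school's pass rate into 10-point bands, then reporting the share of schools in each band. Everything below (the rates, the band labels) is illustrative, not the actual table:

```python
from collections import Counter

# Illustrative school-level pass rates (percentage scoring 100+), not real data.
school_rates = [58.0, 63.5, 65.0, 67.2, 72.4, 91.0]

def band(rate):
    """Map a 0-100 rate to a 10-point band label; 90-100 share a top band."""
    low = min(int(rate // 10) * 10, 90)
    return f"{low}-{low + 9}%" if low < 90 else "90%+"

counts = Counter(band(r) for r in school_rates)
shares = {b: 100 * c / len(school_rates) for b, c in counts.items()}
```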

Notably, the range in Maths results is slightly broader than in Reading.

Teacher Assessment Judgements

The order of success across the subjects remains the same in the collection of Teacher Assessment judgements: Reading has the highest proportion of pupils reaching the expected standard or greater, closely followed by Maths, with Writing trailing some way behind. However, the most surprising difference (or perhaps not) is that the proportions are all approximately 10 percentage points higher than in the scaled score data.

According to teachers’ own assessment judgements, some 74% of pupils are reaching the expected standard or above in Reading, 73% in Maths, and around 68% in Writing.

Similarly, the spread of teacher assessment judgements shows more schools achieving higher proportions of children at the expected level – and includes one or two small schools achieving 100% at expected or above.


There are notable shifts at the bottom end. For example, 16% of schools had fewer than half of children achieve 100+ in Maths, whereas only 4% of schools have fewer than half of their children achieving the expected standard when it comes to Teacher Assessment.

It’s important to note that the data is not from the same schools, so any such comparisons are very unlikely to be accurate, but it does raise some interesting questions.

Greater Depth

Have I said we’re dealing with a small sample, etc? Just checking.

But of that small sample, the proportions of pupils being judged as “Working at Greater Depth within the Expected Standard” are:

Reading: 17%
Maths: 16%
Writing: 11%

More Data

Obviously there are many flaws with collecting data in this way, but it is of some interest while we await the national data. If you have a Year 2 cohort, please do consider sharing your data anonymously via the two forms below.

10 thoughts on “Some initial thoughts on KS1 data”

  1. Matthew Clark 9 June 2016 at 10:03 pm

    Interesting results. Will you do the same for KS2 data? Will you also separate writing data from moderated and non-moderated schools? I would be interested to know if that would show a difference. Thanks.

    • Michael Tidd 9 June 2016 at 10:05 pm

      It is my intention to collect KS2 TA data, yes.
      And if I get enough data I’ll also look at moderated vs unmoderated schools.

  2. @bethben92 9 June 2016 at 10:15 pm

    When we finalise TA I will add it. What scaled score did you take to show GD in maths? Our moderator told us the maths test wasn’t enough evidence for the interim assessment criteria of GD. We had some over 108 but can’t give them it in TA, so our TA won’t match test outcomes.

  3. Jane 9 June 2016 at 10:31 pm

    Possibly a stupid question… Do you know if we are going to be told what test score is equivalent to ‘above expected’?

    • Michael Tidd 9 June 2016 at 10:49 pm

      We won’t be – they’re not comparable.

  4. Matt 10 June 2016 at 1:37 pm

    I am wondering whether the reason for some schools achieving 115% is that they may calculate their levels in a different way.
    I used to show the %age at Level 2C+, then Level 2B+, then Level 2A+, then Level 3+ (so the first figure would include all children at Levels 2C, 2B, 2A and 3).
    When I filled in your questions I was unsure whether the %age of children at EXS should include the children at GDS (effectively an EXS+ figure). I presumed not, as the question is quite carefully worded. Maybe some others weren’t sure, but if they had 15% at GDS that would explain it.

    • Michael Tidd 10 June 2016 at 9:05 pm

      I definitely found several of those – and indeed in some cases could iron out the issues.
      But some seemed far more erratic!

  5. Matthew 13 June 2016 at 11:40 am

    Hi Michael
    The calculations page on your KS1 Scaled Scores page does not seem to be working properly: the rows in the final table do not add up to 100. They used to, but no longer… Great piece of work tho.

    • Michael Tidd 13 June 2016 at 12:53 pm

      Thank you for spotting that – I have now corrected it.

  6. teachingbattleground 13 June 2016 at 6:20 pm

    Reblogged this on The Echo Chamber.
