As with the KS2 data, I haven’t got room for all the caveats that this needs, so let me cover it all by saying that **this could all be nonsense.** However, since when did ‘being nonsense’ ever stop something happening in education?

Having collected data from teachers kind enough to share their sample test results, I now have around 1000 results for each subject; enough to make some very broad generalisations about the scores that may be of interest to Year 2 teachers and school leaders.

So, what can we see so far?

### Reading

Looking at the data from tests taken this term alone (as earlier tests are likely to show lower attainment), the following can be said of a sample of **985** pupils’ test scores:

- The mean average score *in the sample* was **20 marks**
- The median score *in the sample* was **21 marks**
- The middle 50% of students scored between **11** and **28** marks
- **If** the pass mark were set at the 50% mark (i.e. 20/40), then **53%** of pupils would have reached the expected standard

Obviously there is still some time until May, and so children’s scores will unquestionably improve.

The graph shows the spread of results in the sample:

### Grammar, Punctuation & Spelling

Looking at the data from tests taken this term alone, the following can be said of a sample of **1277** test scores:

- The mean average score *in the sample* was **17 marks**
- The median score *in the sample* was **16 marks**
- The middle 50% of students scored between **9** and **25** marks
- **If** the pass mark were set at the 50% mark (i.e. 20/40), then **40%** of pupils would have reached the expected standard

Obviously there is still some time until May, and so – again – children’s scores will unquestionably improve.

The graph shows the spread of results in the sample:

### Mathematics

Looking at the data from tests taken this term alone, the following can be said of a sample of **1489** test scores:

- The mean average score *in the sample* was **29 marks**
- The median score *in the sample* was **28 marks**
- The middle 50% of students scored between **18** and **39** marks
- **If** the pass mark were set at the 50% mark (i.e. 30/60), then **47%** of pupils would have reached the expected standard

Obviously there is still some time until test week, and so children’s scores will unquestionably improve.

The graph shows the spread of results in the sample:

### Compare your own data

Thanks to Tarjinder Gill, you can now access an interactive spreadsheet that allows you to enter your own class’s data and compare it to the dataset through some simple graphs. Read more about these sheets at this blog post.

### More Caveats

I will have made mistakes here. Some people may have made mistakes when entering data. Someone may have made up their data completely. Some of the data will be from the first week of January, while some will be from this week. Some teachers will have marked more generously than the test markers might; others will have been stricter. The schools who submitted data were entirely self-selecting. The tests are still weeks away, and practice will be going on up and down the country. And frankly, there is no predicting what the DfE might have in mind.

Nevertheless… if teachers can take some comfort from the fact that their data is not horrendously different from everyone else’s, then our work here has been worthwhile.

### More data please

We’ll never have anything like a random sample, or even a well-chosen one, but the more data the better, surely. Please do continue to contribute by sharing your data. See my previous post for more details.
