Monthly Archives: March 2016

One-page markscheme for KS2 GPS test

Just a quick post to share a resource.

As I plough through marking the 49 questions of the KS2 sample Grammar test, I find that flicking back and forth in the booklet is a nuisance, so I’ve condensed the markscheme into a single-page document.

You’ll still want the full markscheme to hand for those fiddly queries, but this makes for a quicker run through the majority of easy-to-mark questions. Where a question has tickboxes, I’ve simply indicated which numbered box should be ticked; where words should be circled or underlined, I’ve noted the relevant words; and for grid questions, I’ve copied a miniature grid into the markscheme.

Feel free to share: One-page GPS markscheme

Of course, once you’ve marked the tests, please also share your data with me so we can start to build a picture of the national spread of results – see my previous blog.

Unfair?

[Image: DfEOfsted.png]

Y2 and Y6 tests: data collections

Last month I started collecting data from schools who had used the Y6 sample tests, to try to gather some useful information that we could all share. That project has been so successful that I have extended it several times. So, firstly, let me provide the links to the forms where I’m collecting data from schools – please do share any data you have collected over the last term about your pupils by visiting these documents:

So, what have I found so far? Here is some summary information for key areas:

Key Stage 1 DfE Sample tests

Subject | Mean average score | Median score | Interquartile range
Reading | 20 marks | 22 marks | 12–29 marks
Grammar, Punctuation & Spelling | 18 marks | 17 marks | 10–26 marks
Mathematics | 29 marks | 29 marks | 19–39 marks

More detail is available on this previous post. You can access a spreadsheet to present your own school’s results in a comparison table and graphs alongside the national data collected so far. Access the comparison spreadsheet here: Key Stage 1 Comparisons

Key Stage 2 DfE Sample tests

Subject | Mean average score | Median score | Interquartile range
Reading | 29 marks | 30 marks | 23–36 marks
Grammar, Punctuation & Spelling | 35 marks | 35 marks | 26–44 marks
Mathematics | 59 marks | 59 marks | 39–79 marks

More detail is available on this previous post. You can access a spreadsheet to present your own school’s results in a comparison table and graphs alongside the national data collected so far. Access the comparison spreadsheet here: Key Stage 2 Comparisons

Key Stage 2 Writing Assessments

I did try to collect KS2 Writing assessment data, but the results seem highly dubious in places. Nevertheless, I present some detail here:

  • Median %age working at expected level or above: 68%
  • Whole-sample %age working at expected level or above: 66%
  • Whole-sample %age working at greater depth: 12%

More details on the Writing data can be found at this blog post.

Further data

I am continuing to collect data from the sample tests, and also from the Optional Tests produced by Rising Stars, so please do share your school’s data (completely anonymously) via the spreadsheet links below:

Collecting Data: Rising Stars Optional Tests

As the Key Stage tests loom, many teachers and schools have been kind enough to share thousands of pupil results from the sample tests published by the DfE. I have collated these and used them to give some comparisons for us. Hopefully, after this May’s live tests, we’ll also get an indication of how the sample tests compare to the real thing.

Alongside this, I know that many schools are using the Rising Stars Optional Tests to track pupils’ progress towards the ‘expected standard’ in Years 2 and 6. Working with Rising Stars, I am now aiming to collect similar data from schools who have used the new versions of the Optional Tests (Set A). This will be collected in the same way as before, through an open spreadsheet, and then shared with Rising Stars. The aim is to allow them to draw out some key messages in the near future.

In addition, Rising Stars will then go on to complete a correlation study with the live tests. This will give schools valuable information about how the Optional Tests marry up to the national tests, which will help teachers to make more accurate judgements and predictions in future years.

If your school has used the Rising Stars Optional Tests (Set A) in Years 2 and 6, please do share your data anonymously via the spreadsheets below.

Year 2 spreadsheet link here

Year 6 spreadsheet link here

If you cannot access the sheets, please feel free to email your school’s data to info@primarycurriculum.me.uk

Fancy Quizzing the DfE?

I have to admit, I was slightly surprised. When Optimus Education asked me to lead a session interviewing a representative from the department, I did wonder if they might find themselves unavailable on the proposed date. That was perhaps a little unfair of me, as the staff at the STA have actually done a good deal to make themselves available over the past year or so, and this event provides a great opportunity to access the experts.

And so it is that on Wednesday 25th May I’ll be interviewing the Head of Assessment Policy, Catherine Wreyford, at the Primary Assessment and Data Use conference in London. Naturally, I could happily fill the time with plenty of questions of my own, but the main aim is to allow delegates to send in their questions for me to put to Catherine.

Of course, by then, all the fun and excitement of this year’s tests will be over, so what questions could we possibly have? Well, I wouldn’t wonder if there weren’t quite a few. Some initial topics spring to mind…

  • Teacher Assessment (that won’t be completed until June 30th)
  • Challenges of reporting to parents
  • The prospect of RAISEonline
  • Plans for assessment in 2017

And who knows what questions we might have about the tests themselves, so soon after they have happened?

More details about the event can be found at www.oeconferences.com/Primaryassessment
and a discount of 20% is available for my followers by quoting code MT20 when booking.

Delegates attending will have the opportunity to email questions to be asked – and I’ll do my best to get some answers!

[Image: Pri assess banner]

Some data on KS2 Writing Estimates

Following on from the work on the KS2 Sample Test results, I asked last week for people’s estimates (predictions?) for their outcomes on the new KS2 Writing Teacher Assessment framework in 2016.

The results have been … well, shall I say… erratic?

Data were shared by well over 300 schools, covering nearly 12,000 children, which is a fantastic result. In fact, there were initially more than that, but it turns out that teachers are not good at reading instructions: I had to delete some data that clearly made no sense (29 out of 17 children reaching the expected standard?!), and then correct some entries where people had clearly entered percentage values rather than pupil numbers.
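
Out of interest, the cleaning rules amounted to something like the sketch below. This is a minimal illustration rather than the actual process – the field names and the percentage flag are hypothetical, since spotting a percentage entry was ultimately a judgement call:

```python
# Minimal sketch of the data-cleaning rules described above.
# Field names and the percentage flag are illustrative assumptions.

def clean_record(cohort, at_expected, entered_as_percentage=False):
    """Return a usable pupil count, or None if the row makes no sense."""
    if cohort <= 0:
        return None                       # no cohort size: unusable
    if entered_as_percentage:
        # Correct rows where a percentage was entered instead of a count.
        return round(at_expected / 100 * cohort)
    if at_expected > cohort:
        return None                       # e.g. 29 out of 17: impossible
    return at_expected

print(clean_record(17, 29))                               # None - discarded
print(clean_record(50, 66, entered_as_percentage=True))   # 33 pupils
```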

Having looked at the outcomes, I still have my doubts. Several schools seem to be predicting sizeable increases in the percentages working at the expected standard, and several seem to be predicting exactly the same outcomes as in 2015. I just think schools are fumbling in the dark. Thus, I confidently pronounce this data nonsense.

So, huge caveats aside, when I looked at the data I could pick out the following details:

  • As a whole, the data set appeared to come from slightly higher-attaining schools than average: the 2015 results entered were higher than the national figures at both Level 4 and Level 5 (88% vs 87%, and 37% vs 36%)
  • Across the whole set of data shared, the proportion of children working at the Expected Standard or above was 66% (compared to the 88% of this sample achieving L4+ in 2015)
  • Across the whole set of data shared, the proportion of children working at Greater Depth within the expected standard was 12% (compared to the 37% achieving L5+ in 2015)
  • The median percentage of pupils working at Expected or above was 68%
  • There appears to be little correlation between how schools performed in levels last year and their predictions for this year (Excel gives a correlation coefficient of 0.23 – a quick sketch of that calculation follows this list)
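
For anyone wanting to reproduce that last figure: Excel’s CORREL computes Pearson’s r, which recent versions of Python can also do directly. The lists below are made-up stand-ins, not the real school-level data:

```python
# Recomputing the quoted correlation as Excel's CORREL does (Pearson's r).
# The two lists are hypothetical stand-ins for the real school-level data.
from statistics import correlation  # available from Python 3.10

l4_2015 = [88, 75, 100, 60, 92]         # % at Level 4+ in 2015 (illustrative)
expected_2016 = [70, 68, 55, 62, 80]    # % estimated at Expected+ in 2016

print(round(correlation(l4_2015, expected_2016), 2))
```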

The graph below shows the scatter plot of results, with the x-axis showing 2015 percentage values for Level 4+ attainment, and the y-axis showing estimates for 2016 outcomes at Expected Standard or higher. Make of it what you will!

[Graph: writing]

It’s worth noting that of schools who achieved 100% Level 4+ in 2015, the predictions for 2016 range from 20% to over 90% at the expected standard. Also, of the six schools predicting/estimating 100% attainment at Expected+ for 2016, two have cohorts of fewer than 10 pupils.

If anybody else with a mathematical eye would like the full data set to explore, I’d be happy to share it – drop me a line via the About page above, or on twitter at @michaelt1979.

The summary of my analysis of KS2 sample test data is here.

KS1 data – some initial outcomes


[Image: ks1warning]

As with the KS2 data, I haven’t got room for all the caveats that this needs, so let me cover it all by saying that this could all be nonsense. However, since when did ‘being nonsense’ ever stop something happening in education?

Having collected data from teachers kind enough to share their sample test results, I now have around 1,000 results for each subject – enough to make some very broad generalisations about the scores that may be of interest to Year 2 teachers and school leaders.

So, what can we see so far?

Reading

Looking at the data from tests taken this term alone (as earlier tests are obviously likely to show lower attainment), the following can be said of a sample of 985 pupils’ test scores (a sketch of the underlying calculations follows the list):

  • The mean average score in the sample was 20 marks
  • The median score in the sample was 21 marks
  • The middle 50% of students scored between 11 and 28 marks
  • If the passmark were set at the 50% mark (i.e. 20/40), then 53% of pupils would have reached the expected standard
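
As promised, here is a rough illustration of how figures like these fall out of the raw marks; the scores are entirely made up, standing in for the real 985 results:

```python
# Minimal sketch of the summary statistics quoted above, computed
# from made-up marks out of 40 (purely illustrative data).
from statistics import mean, median, quantiles

scores = [8, 11, 14, 18, 20, 21, 23, 26, 28, 31, 35]   # illustrative marks

q1, _, q3 = quantiles(scores, n=4)      # quartile cut points
pass_mark = 20                          # 50% of the 40 available marks
pass_rate = sum(s >= pass_mark for s in scores) / len(scores)

print(f"mean {mean(scores):.1f}, median {median(scores)}")
print(f"middle 50% scored roughly {q1:.0f}-{q3:.0f} marks")
print(f"{pass_rate:.0%} at or above the notional pass mark")
```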

Obviously there is still some time until May and so children’s scores will unquestionably improve.

The graph shows the spread of results in the sample:

[Graph: ks1read]


Grammar, Punctuation & Spelling

Looking at the data from tests taken this term alone, the following can be said of a sample of 1277 test scores:

  • The mean average score in the sample was 17 marks
  • The median score in the sample was 16 marks
  • The middle 50% of students scored between and 25 marks
  • If the passmark were set at the 50% mark (i.e. 20/40), then 40% of pupils would have reached the expected standard

Obviously there is still some time until May, and so – again – children’s scores will unquestionably improve.

The graph shows the spread of results in the sample:

[Graph: ks1gps]

Mathematics

Looking at the data from tests taken this term alone, the following can be said of a sample of 1489 test scores:

  • The mean average score in the sample was 29 marks
  • The median score in the sample was 28 marks
  • The middle 50% of students scored between 18 and 39 marks
  • If the passmark were set at the 50% mark (i.e. 30/60), then 47% of pupils would have reached the expected standard

Obviously there is still some time until test week, and so children’s scores will unquestionably improve.

The graph shows the spread of results in the sample:

[Graph: ks1maths]

Compare your own data

Thanks to Tarjinder Gill, you can now access an interactive spreadsheet that allows you to enter your own class’s data and compare it to the dataset through some simple graphs. Read more about these sheets at this blog post.

More Caveats

I will have made mistakes here. Some people may have made mistakes when entering data. Someone may have made up their data completely. Some of the data will be from the first week of January while some will be from this week. Some teachers will have marked more generously than the test markers might; others will have been more strict. The schools who submitted data were entirely self-selecting. The tests are still weeks away, and practice will be going on up and down the country. And frankly, there is no predicting what the DfE might have in mind.

Nevertheless… if teachers can take some comfort from the fact that their data is not horrendously different from everyone else’s, then our work here has been worthwhile.

More data please

We’ll never have anything like a random sample, or even a well-chosen one, but the more data the better, surely. Please do continue to contribute by sharing your data. See my previous post for more details.

KS2 Maths – Question Level Analysis

As so many schools have evidently used the sample tests to help ascertain their pupils’ progress towards the expected standard (whatever that might be), I’m sure many will welcome the opportunity to analyse the outcomes.

[Image: mathsqla]

Emily Hobson (@miss_hobson) of Oasis Academies has kindly agreed to share the template she put together for analysing the KS2 tests.

The spreadsheet can be downloaded below; data can then be entered to scrutinise your pupils’ progress in the main areas and on each question.

Question Level Analysis (Sample Material) – Mathematics

Names need only be entered on the first page; these will then carry across to later pages.

You can also adjust the % thresholds on the first page, and these will be reflected in the colour bands marked for each pupil.
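
Under the hood, that kind of conditional colour-banding is just a threshold lookup. A minimal sketch, with hypothetical thresholds and band names rather than the template’s actual settings:

```python
# Hedged sketch of threshold-based colour banding. The thresholds and
# band names are illustrative assumptions, not the template's settings.
def band(percentage, thresholds=(40, 60, 80)):
    """Map a pupil's % score to a colour band, given ascending thresholds."""
    low, mid, high = thresholds
    if percentage >= high:
        return "green"
    if percentage >= mid:
        return "yellow"
    if percentage >= low:
        return "orange"
    return "red"

print(band(85))   # green
print(band(45))   # orange
```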


KS2 Writing Estimates

Lots of people have asked me to do this, and although it makes me slightly nervous, I’m going to give it a go.

I am attempting to collect people’s estimates of the numbers of pupils reaching each of the three main standards in KS2 Writing this year:

  • Working Towards
  • Working At
  • Working At Greater Depth

I’ve decided to do this via a completely anonymous Google form, so that people don’t have to worry about their results being publicly identifiable (particularly for schools with notably small or large numbers, for example). If you feel you can contribute, please complete the 2-page form below, and I shall endeavour to provide some summary data in the coming week or two.

Possibly useful analysis of Sample tests

Last week I started collecting data from schools who had used the KS1/2 sample tests. I have been overwhelmed by the response, with the data for over 5000 pupils now available.

I’ve tried to draw some very simple conclusions in my blogpost here, but really what teachers want is to be able to compare their school to the national picture. Enter Tarjinder Gill, who has put some considerable effort into turning the available data into something useful for schools to do exactly that.

Accessing the spreadsheets below will allow you to enter your own school’s data. Once you do, you can see how your scores compare to the collected data from the schools who have so willingly shared their results. You can also see simple graphs that compare your scores both at mark-level and in bands, so that you might spot patterns.
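
In spirit, the band comparison amounts to bucketing raw marks and comparing the proportions in each bucket. Here is a rough Python equivalent of what the spreadsheet’s formulae do, with made-up marks:

```python
# Rough sketch of the mark-band comparison: bucket raw marks into
# ten-mark bands and report the percentage of pupils in each band.
# The marks below are made up purely for illustration.
from collections import Counter

def band_distribution(scores, band_width=10):
    """Percentage of pupils whose mark falls in each band (0-9, 10-19, ...)."""
    bands = Counter(score // band_width for score in scores)
    return {f"{b * band_width}-{b * band_width + band_width - 1}":
                round(100 * n / len(scores), 1)
            for b, n in sorted(bands.items())}

school_marks = [12, 18, 25, 31, 33, 40, 44]   # illustrative school data
print(band_distribution(school_marks))
# {'10-19': 28.6, '20-29': 14.3, '30-39': 28.6, '40-49': 28.6}
```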

[Graph: readinggraph]

Made-up data, I’m afraid, just to illustrate the graph option.

The school shown in the example here seems to be significantly under-performing compared to the national picture indicated by my very dodgy collection of data.

Please be aware of all the usual caveats: this isn’t a reliable set of data, it isn’t a random sample, it hasn’t been checked for accuracy, it isn’t from a cross-section of schools, etc., etc. It’s just all we have at the moment, I’m afraid.

To access the spreadsheets, you can find them here on Google:

Key Stage 1 data comparison spreadsheet (NB: very limited national data so far)

Key Stage 2 data comparison spreadsheet

You will not be able to edit these versions. To enter your own data, users with their own Google account can make a copy of the spreadsheets. If you do not have a Google account at all, then it is possible to download the spreadsheets, and then use them in Excel. However, these versions will only include a comparison to the data collected up to the point at which you downloaded the sheet.

If you find these sheets useful, and you haven’t already shared your school’s data with me to add to the “accuracy” of the project, please read here about how you can contribute.

Also, please thank Tarjinder Gill, as the work on these spreadsheets is all hers!