Monthly Archives: July 2016

Some thoughts on KS2 Progress

Caveats first: these conclusions, such as they are, are drawn from a small sample of a little over 50 schools. That sample of schools isn’t representative: indeed, it has slightly higher attainment than the national picture, both in terms of KS2 outcomes, and in KS1 starting points. However, with over 2000 pupils’ data, it shows some interesting initial patterns – particularly when comparing the three subject areas.

Firstly, on Maths – the least controversial of the three subjects. It seems that – in this sample – pupils who achieved Level 2c at KS1 had an approximately 40% chance of reaching the new expected standard (i.e. a scaled score of 100+). That leaps to around 66% for those achieving L2b at KS1 (i.e. just short of the national average).


The orange bar shows the average of this sample, which is slightly higher than the national average of 70%.

It’s important to note, though, that progress measures will not be based on subject levels, but on the combined APS score at Key Stage 1. The graph for these comparisons follows a similar pattern, as you’d expect:


Where fewer than 10 pupils' data were available for any given APS score, those scores have been omitted.

There is an interesting step here: pupils in this sample with an APS of 13 or less have a 40% or lower chance of reaching the expected standard, while those scoring 13.5 or more have a greater than 50% chance of achieving it. (The dip at 12.5 APS points relates to pupils who scored Level 2s in Maths and one English subject, but a Level 1 in the other, highlighting the importance of good literacy for achievement in KS2 Maths.)
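For anyone wanting to check their own cohort's figures, the combined APS calculation can be sketched in a few lines. This is a minimal illustration assuming the commonly quoted KS1 point values (W = 3, Level 1 = 9, 2c = 13, 2b = 15, 2a = 17, Level 3 = 21) – do check these against the DfE's published mapping before relying on them:

```python
# Assumed KS1 point values -- verify against the DfE's published mapping.
KS1_POINTS = {"W": 3, "1": 9, "2c": 13, "2b": 15, "2a": 17, "3": 21}

def ks1_aps(reading: str, writing: str, maths: str) -> float:
    """Combined KS1 APS: the mean of the three subjects' point scores."""
    points = [KS1_POINTS[level] for level in (reading, writing, maths)]
    return round(sum(points) / len(points), 2)

# A pupil with 2b across the board has an APS of 15.0 ...
print(ks1_aps("2b", "2b", "2b"))   # 15.0
# ... while a Level 1 in one English subject drags the combined APS down:
print(ks1_aps("2b", "1", "2b"))    # 13.0
```

Because the measure averages across the three subjects, a single low level at KS1 pulls a pupil's starting point – and hence their comparison group – down, which is exactly what produces the dips in the graphs above.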

For Reading, the graphs look broadly similar in shape.


The blue bar shows the average of this sample at 67%, which is slightly higher than the national average of 66%.

Interestingly, pupils who scored Level 2c here still have only a 40% chance of meeting the expected standard, but those achieving 2b have a lower chance of reaching it than in Maths (58% compared to 66%).

When looking at the APS starting points, there is something of a plateau at the right-hand end of the graph. The numbers of pupils involved here are relatively small (as few as 31 pupils in some columns). Interestingly, the dip at 18.5 APS points represents the smallest sample group shown, made up of pupils who scored 2a/2b in the two English subjects, but a Level 3 in Maths at KS1. This will be of comfort to teachers who have been concerned about the negative effect of such patterns on progress measures: it seems likely that we will still be comparing like with like in this respect.


It is in Writing that the differences become more notable – perhaps an artefact of the unusual use of Teacher Assessment to measure progress. Compared to just 40% in Reading and Maths, some 50% of pupils attaining L2c at KS1 went on to achieve the new expected standard in Writing – and this against a backdrop of teachers concerned that the expected standard was too high in English. Similarly, over 3/4 of those achieving Level 2b reached the standard (cf. 58% in Reading, 66% in Maths).


In contrast to the other subjects, attainment in this sample appears notably lower in Writing than the national average (at 70% compared to 74% nationally).

With the APS comparisons, there are again slight dips at certain APS points, including 18.5 and 19.5 points. In the latter case, this reflects the groups of pupils who achieved Level 3s in both Reading and Maths, but only a 2b in Writing at KS1, suggesting again that the progress measure does a good job of separating out different abilities, even using combined APS scores.


Of course, this is all of interest (if you’re interested in such things), but the real progress measures will be based on the average score of each pupil with each KS1 APS score. I’d really like to collect some more data to try to get a more reliable estimate of those figures, so if you would be willing to contribute your school’s KS1 and KS2 data, please see my previous blog here.

Spread of data

Following a request in the comments, below, I’ve also attached a table showing the proportions of pupils achieving each scaled score for the two tests. This is now based on around 2800-2900 pupils, and again it’s important to note that this is not a representative sample.



A few words on the 65% floor standard

There’s been much discussion about this in the last few days, so I thought I’d summarise a few thoughts.

Firstly, many people seem to think that the government will be forced to review the use of a 65% floor standard in light of the fact that only 53% of pupils nationally met the combined requirements. In fact, I’d argue the opposite: the fact that so few schools exceed the attainment element of the floor standard is no bad thing. Indeed, I’d prefer it if no such attainment element existed.

There will be schools for whom reaching 65% combined Reading, Writing & Maths attainment did not require an inordinate amount of work – and won’t necessarily represent great progress. Why should those schools escape further scrutiny just because they had well-prepared intakes? Of course, there will be others who have met the standard through outstanding teaching and learning… but they will have great progress measures too. The 65% threshold is inherently unfair on those schools working with the most challenging intakes and has no good purpose.

That's why I welcomed the new progress measures. Yes, it's technical, and yes, it's annoying that we won't have it for another couple of months, but it is a fairer representation of how well a school has done in educating its pupils – regardless of their prior attainment.

That said, there will be schools fretting about their low combined Reading, Writing & Maths scores. I carried out a survey immediately after results were released, and so far 548 schools have responded, sharing their combined RWM scores. From that (entirely unscientific self-selecting) group, just 28% of schools had reached the 65% attainment threshold. And the spread of results is quite broad – including schools at both 0% and 100%.

The graph below shows the spread of results with each colour showing a band of 1/5th of schools in the survey. Half of schools fell between 44% and 66%.
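For anyone curious how those colour bands are derived, here is a small sketch of splitting survey results into five equal-sized bands (quintiles). The figures in the list are invented, purely for illustration – they are not the actual survey responses:

```python
import statistics

# Invented combined RWM percentages from a hypothetical survey of schools.
survey = [0, 31, 38, 44, 47, 50, 52, 55, 58, 60,
          62, 64, 66, 69, 73, 78, 81, 85, 92, 100]

# Four cut points that divide the schools into five equal-sized bands.
bands = statistics.quantiles(survey, n=5, method="inclusive")
print(bands)

# The 2nd and 4th cut points bracket the central 2/5 of schools; widening
# to the 1st and 4th gives a range containing the middle half -- analogous
# to the 44%-66% range quoted above.
```

The `method="inclusive"` option treats the data as the whole population rather than a sample, which seems the right choice when every surveyed school is in the list.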

Combined attainment


As I said on the day the results were published – for a huge number of schools, the progress measure will become all important this year. And for that, we just have to wait.


Since posting, a few people have quite rightly raised the issue of junior/middle schools, who have far less control over the KS1 judgements (and indeed in middle schools, don’t even have control over the whole Key Stage). There are significant issues here about the comparability of KS1 data between infant/first schools and through primary schools (although not necessarily with the obvious conclusions). I do think that it’s a real problem that needs addressing: but I don’t think that the attainment floor standard does anything to address it, so it’s a separate – albeit important – issue.

A little data experiment

Right, let me be clear up-front: I cannot predict your school’s progress scores. I can’t even pretend to estimate a prediction of it. There is just no way to find it, without knowing the full national picture of data – and even the DfE don’t have that finalised yet.

However, we do know how the progress measure works (see the video here if you don’t), so it would be possible to recreate the process based on a sample of data. It’s really little more than a thought experiment, but it may be of interest all the same.

To get even close to that, though, it will need lots of data, from lots of schools, in lots of detail. Where in the past I have tried to collect summary data, for this experiment I need real data from schools that includes both KS1 and KS2 results for their Year 6 cohorts. My plan, then, is to collate the data and find the average progress made by pupils with common starting points within the sample. I will then share the resulting progress calculations with schools who have submitted data.
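The process described above can be sketched roughly as follows. The pupil records are invented, and this is a deliberate simplification of whatever the final DfE methodology turns out to be (which may well involve grouping and confidence intervals), but it shows the shape of the calculation:

```python
from collections import defaultdict
from statistics import mean

# Invented pupil records across the sample: (ks1_aps, ks2_scaled_score).
pupils = [
    (13.0, 98), (13.0, 100), (13.0, 102),
    (15.0, 100), (15.0, 104), (15.0, 108),
    (17.0, 106), (17.0, 110),
]

# Step 1: average KS2 scaled score for each KS1 APS starting point.
by_aps = defaultdict(list)
for aps, score in pupils:
    by_aps[aps].append(score)
benchmarks = {aps: mean(scores) for aps, scores in by_aps.items()}

# Step 2: a pupil's progress score is their own scaled score minus the
# benchmark for pupils with the same starting point.
def progress(aps, score):
    return score - benchmarks[aps]

# Step 3: a school's progress measure is the mean of its pupils' scores.
school = [(13.0, 102), (15.0, 104), (17.0, 110)]
school_progress = mean(progress(aps, s) for aps, s in school)
print(school_progress)
```

Note that the benchmarks here come only from the pupils in the sample, which is precisely why the experiment needs as many schools' data as possible: a small sample makes the "average for each starting point" unstable.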

Because this needs very specific and accurate data, it won’t be possible to collect this using an open spreadsheet. Instead, below is a master spreadsheet which can be downloaded by anybody who wants to take part. If you wish to join in, please download the form, add your own data, and then return it to

Just don’t base your career decisions on the results 🙂

To take part, please download, complete and return the following spreadsheet:



Am I overstretching it…?

What are people’s thoughts?

Everyone wants to know about progress measures, but we won’t have the national data until September. We can’t work it out in advance… but is it worth trying to estimate?

I collected data on Tuesday night about the SATs results, and my sample was within 1 percentage point of the final national figures, which wasn’t bad. However, this would be a much more significant project.

To get anything close to an estimate of national progress measures, we would need a substantial number of schools to share their school’s data at pupil level. It would mean schools sharing their KS1 and scaled score results for every pupil – anonymised of course, but detailed school data all the same.

My thinking at this stage is that I’d initially only share any findings with the schools that were able to contribute. It would be a small sample, but it might give us a very rough idea. Very rough.

Would it be useful… and do people think they would be able to contribute?

KS2 Results – Frequently Asked Questions

After a late night, and reasonably early morning, there are a few common questions coming up, so here’s my attempt to answer them:

What’s the national data like?


Full set of data available from website here:

What about the floor standard?

We don’t really know much yet, so it’s too soon to panic! The floor standard is based on two elements: attainment and progress.

Nationally 53% of pupils met the combined Reading, Writing & Maths standards. However, that doesn’t tell us much about how many schools have met that floor standard. Large numbers of those 53% will be in the same schools – and some schools will have none of them. I’d expect the final figure for schools to be a much lower percentage reaching the combined attainment floor standard.

Because of that, this year the floor standard will come down to progress more than ever. We won’t know anything about progress data until September, so for many headteachers it could feel like a long summer holiday – and not in a good way!

Where are the ‘Greater Depth’ thresholds?

There is no threshold for ‘Greater Depth’. Indeed, for tested subjects, there is no ‘Greater Depth’ at all. The scaled score indicates how far above the expected standard a child is, so there is no need for a label. After all, where would the benefit be in saying that a child who scored 117 is high achieving but a child who scored 116 isn’t, when clearly they both are?

There will be an accountability measure that schools have to publish later that is linked to “high scores”, but this is likely to be a combined measure – for example, the proportion of pupils who achieved over 115 in all 3 tested subjects. Note that I’ve just picked 115 out of thin air. We don’t know what will be counted as a high score, and for the purposes of reporting to parents and children, we don’t need to know.

How on earth do we report this to parents?

Schools are required to share test results and teacher assessment judgements with parents. I suspect that MIS suppliers will provide a template that allows you to print a separate sheet to give to parents alongside reports. I’ve also written a free leaflet which can be downloaded from the Rising Stars website that helps to explain the results to parents:



How do we calculate our combined RWM score?

The key thing to note is that this is based on the number of individual children who have met the expected standard in all 3 subjects, not an average of the three subjects’ results. To find your combined RWM score, you need to look at each child and decide whether or not they have met all 3 Expected Standards. For example, in this small cohort, 5 of the 8 pupils met all 3 standards, so the combined RWM score is 62.5%.
Note that the grammar score does not contribute to the floor standards.
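That calculation can be sketched in a couple of lines – the cohort of 8 below is invented to mirror the example above:

```python
# Each record marks whether a pupil met the expected standard in
# (reading, writing, maths). Invented cohort of 8 pupils.
cohort = [
    (True, True, True), (True, True, True), (True, True, True),
    (True, True, True), (True, True, True),
    (True, False, True), (False, True, True), (True, True, False),
]

# Count children who met ALL three standards -- not the subject averages.
met_all_three = sum(all(pupil) for pupil in cohort)
combined_rwm = 100 * met_all_three / len(cohort)
print(combined_rwm)   # 5 of 8 pupils -> 62.5
```

Notice that every pupil in this cohort met at least two standards, so averaging the three subject percentages would give a much higher figure – which is exactly the mistake to avoid.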


What does CA mean on my results?

This means that special consideration has been granted. More details can be found on the STA website here:
Notably, it won’t affect the scaled score at this stage, but will be accounted for when it comes to accountability. Schools have to choose how to explain that to parents.

Can I appeal if they got a scaled score of 99?

You can still apply for marking reviews this year. There is no charge if the review moves the scaled score across the expected standard threshold, or if it changes the raw score by more than 3 marks. You can submit review requests via NCA tools within the next 10 days. More details online at

Where are the markschemes?

All the markschemes can be downloaded from the website here:

KS2 test results – my early thoughts

Ignore all this: national data is now available here:


Here are my thoughts at 10.30am…

Individual Subjects

We don’t have the national picture yet, but based on a sample of 900 schools (usual caveat here – it’s not a representative sample, will contain errors, can’t be trusted, etc., etc.), the average percentage of pupils achieving the Expected Standard or higher looks like this:

  • Reading: 65%
  • Grammar, Punctuation & Spelling: 73%
  • Mathematics: 72%

You can submit your own results into the mix here:

Reading seems hardest hit. Across the 900 schools, the average drop is 22 percentage points. This compares to a 15-point drop in Maths, and just an 8-9 point drop in GPS.

Floor Standard

When it comes to the combined RWM floor standard, based on a quick survey with only 200 responses (further caveats: tiny unrepresentative sample here), only around 1/3 of schools reach the 65% Combined attainment target.

You can submit your own combined RWM figure here:

Teacher Assessment

Again, no national data available yet, but a Local Authority source shared partial data collection (around half of schools nationally) which suggested that Writing could be at around 74-75% at the Expected Standard.

Could it really be the case that Writing will leap from being lowest to highest attainment this year?



What are you going to do about workload?

The famous JFK inauguration speech encouraged Americans to “ask not what your country can do for you – ask what you can do for your country“. A typically American message of independence and self-reliance, but words which resonated more widely.

Nicky Morgan probably wouldn’t get away with the same sort of message, not least because her department has a lot to answer for when it comes to workload, especially in recent years. But the point is salient. Much of the workload that exists in schools exists only because we make it.

The DfE actually has fairly limited control over what happens in any given school, and Ofsted has gone to some considerable lengths to try to reduce its impact on workload; what remains is the doubt and fear in schools about whether we can really break free. From here on in, if we want to see a reduction in workload, then it is schools who are going to have to lead the way.

The department tried to give some direction with the publication earlier this year of its three workload reports. True, they got it wrong by publishing them in the Easter holidays, but there were some useful pointers in them and some key points that are worth holding onto when looking at what we do in schools.

On Marking:

We believe that three principles underpin effective marking: it should be meaningful, manageable and motivating

Senior leaders and governors are responsible for the effective deployment of all resources in the school. They should take into account the hours teachers spend on marking and have regard to the work-life balance of their staff.

Feedback can take the form of spoken or written marking, peer marking and self-assessment. If the hours spent do not have the commensurate impact on pupil progress: stop it.

On Planning:

Planning a sequence of lessons is more important than writing individual lesson plans

SLT should review demands made on teachers in relation to planning to ensure that minimum requirements to be effective are made.

On Data:

Formative assessment data should not routinely be collected at school level, because of the additional burden it creates.

Summative data should not normally be collected more than three times a year per pupil.

The trouble with reports like these is that they are often quite bland and written in department-speak, when what school leaders want to know is what other schools are doing that works, and how people are ensuring that reducing workload doesn’t impact on quality of provision. It’s why I always aim to do less, but better.

I’ve tried to share specific examples of my own, including an outline of my school’s new feedback policy. I wrote recently too about the risk of burdensome school reports that don’t warrant the time spent on them.

Perhaps one of the most effective ways that we can all help to reduce one another’s workload is by sharing what we’ve found works, so that we can learn from each other rather than reinventing the wheel – taking pride in the ways in which we’ve reduced teachers’ work, rather than only in those which demonstrate how hard we are working.

But what we must not do is wait for the Department for Education to solve these problems for us. They’ve got enough to sort out!

Ask yourself not how the DfE is going to reduce your workload this year, but what you can do as a school to reduce workload yourselves.

I’ll be speaking in more detail about my school’s approach to feedback – and a policy that tackles the problems of ‘evidence’ for external observers – at the Rising Stars Eliminating Unnecessary Workload conference in September.