Getting started with FFT data for KS2

School leaders are used to dealing with change, not least when it comes to assessment data, but this year is in a league of its own. With changes to all the tests, teacher assessment, scaled scores and accountability measures, headteachers would be forgiven for despairing of any attempt to make sense of it.

Even when Raise becomes available, there’s no saying how easy it will be to interpret, not least because of all the changes this year. However, the FFT Summary Dashboard is available from today (Wednesday 14th), allowing you to make headway into that first stage of data analysis to evaluate your school’s strengths, and pick out areas for further development. In today’s climate, any help with that will be welcome!

Your first glance at the dashboard will give you a very quick visual summary of your key headline figures – attainment and progress – relating to those that will feature in performance tables and be published on your school website. In FFT these are represented as comparison gauges:

gauges.png

Comparison gauges that show key figures at a glance

The beauty of these is the clarity they provide compared to the complexity of the published data and its confidence intervals. In short: the middle white zone shows that you’re broadly in line with national outcomes; the red and green bands at either end suggest significantly lower or higher results. This will be particularly helpful for governors who are either shocked by changes in numbers from the old system, or who are concerned about small negative values on the progress measures.


The dashboard offers more clarity, too, about specific groups within your school. With a changing landscape it can be hard to know what to expect, but the pupil group analysis will quickly tell you which specific groups – girls, middle attainers, free school meals – have performed particularly well, and which seem not to be keeping up. It’s a simple overview that makes a good starting point for further investigation.

groups

Quick identification of groups that have done particularly well, or poorly (green plus symbols show significant values)

It’s worth remembering, though, that some groups may be very small in your school: if you’ve only got a handful of girls, then don’t get too worked up over variations!

The dashboard also helps to pick out trends over time – another challenge when all the goalposts seem to have moved. By comparing the national results to previous years, FFT have been able to plot a trajectory showing how attainment and progress might have looked in 2014 and 2015 under the current system. As a result, you can begin to see whether your school has improved compared to the national picture.

time.png

The time series shows your previous results adjusted to bring them more closely into line with the new frameworks. Not perfect, but a very telling ‘starter for ten’!

A caveat here: this is much more difficult with the writing judgements which are much less precise than the scaled scores. Take that alongside the evident variation in writing outcomes this year, and you may want to look deeper into those figures before making any quick judgements.

vulgps

Groups analysed

Further into the summary dashboard itself, we get into the detail of vulnerable groups and of the separate subjects. Again, you get an overview that helps to pinpoint areas to look into further. Specific groups remain a clear focus for Ofsted and other inspections, so this information will be vital to leaders. The further breakdown of subjects will be of interest too, and of particular use in schools where writing has been affected by the national inconsistencies. Again these sections allow you to compare your attainment and progress to the national picture, and also to reflect on how your results may have changed over time.

No doubt, by the time school leaders and governors have begun to look at their summary overview, there will be many more questions asked. That’s where the FFT Aspire platform can help. Using your summary as a starting point, you can explore each element in greater detail, filtering your results for different groups, or subjects – even down to the level of individual pupils. It will help you to unpick the measures that are likely to feature on your Raise Online profile when it arrives, and with others too, including using contextual information about your pupils to compare to similar groups elsewhere.  Alongside the target-setting and other elements of FFT, you have a wealth of information at your fingertips that can be used to focus your school improvement planning – the summary dashboard is just the start.



This post was written with the support of FFT in preparation for the launch of the new dashboards on 14th September 2016.

Writing for a Purpose (or 4!)

For some time now I have been working on a model of teaching Writing built around the idea of longer blocks focusing on fewer things. Previously I have written about a model I used in my previous school, and since then have had many requests for more information.

This year I have finally produced some notes about the model I use, based on 4 Writing Purposes. My view is that rather than trying to teach children 10 or more different ‘genres’ or ‘text types’ as we used to do in the days of the Writing Test, it is better to focus on what those types have in common. It means that at my school we use 4 main types of writing across KS1 and KS2: Writing to entertain; to inform; to persuade; and to discuss.*

purposes

The 4 main writing purposes, and some of the ‘text types’ that could fall under each.

Importantly, by the end of KS2 I’d hope to see children recognise things like the fact that newspaper articles could actually fall under any or all of the 4 headings: they’re not a distinct type in themselves, really.

As a very rough rule, I’d expect around half of curriculum time to be taken up by “Writing to entertain”, with the non-fiction elements sharing the remaining time. Notably, in KS1 the non-fiction focus is only on Writing to inform.

sample

Example guidance note

To support structuring the curriculum in this way, I have now compiled some guidance notes for each category. I say compiled, rather than written, because much of the legwork on these notes was done by my wife – @TemplarWilson – as she rolls out a similar model in her own school.

The guidance notes attempt to offer some indications of National Curriculum content that might be covered in each section. This includes some elements of whole-text ideas, suggestions for sentences and grammar, notes on punctuation to include, and also some examples of conjunctions and adverbials.

They’re not exhaustive, nothing radical, but as ever, if they’re of use to people, then I’m happy to share:
4 Writing Purposes – guidance (click to download)

Alongside the guidance sheets, I also have the large versions of the 4 main roadsign images, and an example text for each of the four purposes. The example texts are probably of more use at the upper end of KS2, and could almost certainly be improved, but they are a starting point for teaching and analysis by the children to draw out key features, etc. Both can be downloaded here:

4 Writing Purposes – Roadsign Images

4 Writing Purposes – Example Texts


*Secondary English teachers may recognise these as being loosely linked to the old writing triplets at GCSE level.

Teachers aren’t that special

We’re a funny lot, teachers.

It’s different to most jobs I guess. For a start, we get 13 weeks’ holiday a year. We also work in strange circumstances that are simultaneously both very public and quite private.

We also seem to have an on-going struggle with what it means to be a profession, one that doesn’t seem to affect other roles. Or rather, an on-going clamour to be considered a profession, without being clear about what that means.

The College of Teaching has served to highlight some of those troubles, but also one other: we seem to have reached a point in the profession where “leaders” can be lumped together as a “them” who are not in any way connected to “us” at the chalkface. (Disclaimer: I don’t know which group I end up in according to those determined to divide in this way)

I suspect that this is based, in part, on a truth: some school leaders are awful. Some who reach the position of headteacher (or Executive Head for that matter, I suspect), probably weren’t very good classroom teachers, and aren’t very good leaders. They can damage schools, teachers and pupils in the process. But to presume that such negative experiences mean that all those who have a leadership responsibility are in opposition to those who teach in classrooms is childish. Not least because it fails to account for the huge number of people – particularly in primary schools – who manage both leadership roles and considerable classroom teaching commitments.

This has come to a head from the small group of vocal opponents to the College of Teaching, particularly since the appointment of a very experienced headteacher to the role of Chief Executive. For some, led by Andrew Smith (@oldandrewuk), only a practising classroom teacher would have been acceptable to lead an organisation that they don’t even think should exist.

The problem with that argument is clear: what experience does the average classroom teacher have that would equip them to lead a significant organisation? There will, of course, be a handful of classroom teachers who have prior experience in other roles that might match the job description, but they are rare. And often such people would quickly take on leadership roles within schools, hence disqualifying them from this very narrow field.

What’s more, I’d argue that being the CEO of a large organisation doesn’t require the skills of a classroom teacher, any more than running British Airways would require you to be a trained pilot. Running large organisations requires a specific skill-set, and if the College is to be a success, then it needs the right people with those skills at its head. The fact that within teaching we have excellent school leaders who have the appropriate skills means we are able to appoint the combination of leadership and teaching experience.

Looking at other professional organisations, there is a mix  when it comes to the CEO role: the CEO of the Law Society is a trained solicitor with considerable leadership experience; the CEO of the Royal College of GPs has a background in social work and charities and isn’t medically trained at all; the CEO of the Royal Institute of Chartered Surveyors has a background in marketing. I haven’t yet found a single professional body that has an entry-level professional at its head.

The reality is, teachers aren’t some superhuman species imbued with a professional brilliance that makes them better than GPs or Chartered Surveyors. We are trained for a job. And while some of those teachers also acquire the skills to lead large organisations, it is great that we can have a qualified and experienced teacher at the head of a professional body; but let’s be serious: it’s not a talent for imparting phonics knowledge that is required to manage a large charity.

Of course, the real issue here is not the appointment of  the CEO. Those who are wholeheartedly opposed to the College – or who object to the way it has been developed – would likely have opposed any appointment, just as those who object to the existence of the BBC would never welcome a new Director General.

For those of us who would like to see if this thing can work, it strikes me that you would struggle to find a better starting point as CEO than Dame Alison Peacock – an experienced teacher and headteacher, a strong figurehead who is widely supported by the profession, and someone who has publicly spoken in the past against proposals from government.

Some will always be happy to throw stones, just as there are those who continue to criticise the BBC. Personally I hope that both groups are proven to be in a minority.

A foolish consistency – the Primary School disease?

Let me start by saying that I think consistency is vital in schools. Pupils need to know that the behaviour policy will apply equally to everyone, and be applied equally by everyone. If a school has a uniform, then rules about it should be fairly and consistently applied to all. Children in Year 4 are entitled to teaching just as good as that received by children in Year 6.

But there are limits. And it seems that too many primary headteachers cross them, to my mind. Not all, of course, but too many. On Twitter today a perfect example was shared by Rosie Watson (@Trundling17):

There is a headteacher – or senior leadership team – somewhere that thought it a good use of its time to come up with a list of 30 “must haves” that include how the classroom door must be signed, and that pegs must be labelled in week 1.

I wasn’t even that surprised when I saw it, because I’ve known far too many schools get caught up in such nonsense. Display policies can sometimes be the most read in a primary school, and I’ve known them include things like:

  • drapes must be used to soften the edges of displays
  • all work should be double-mounted
  • topic boards must be changed at least every 2 weeks
  • all classrooms must display a hundred square
  • all staples must point in the same direction

The point is that none of these things is necessarily a bad thing. Indeed, the one about staples appeals to my slightly frenzied mind. But to dictate it to a staff of highly-trained professionals? To expect teachers to spend their time and energy on such things rather than planning and preparing for learning strikes me as crazy.

What surprised me most about Rosie’s post, though, was not the content –  I fear that’s all too common – but the fact that some headteachers then tried to defend such approaches. The claims were that it was a useful reminder, or helpful for new teachers.

I have two issues with this. Firstly, the list is very clearly presented as a list of expectations to be met and judged against – not just helpful reminders. Secondly, these are not all good uses of someone’s time. If they were recommendations that I was free to ignore (and believe me, I would ignore a good number of them), then that’s fine, but that’s clearly not the case here.

If a school is insistent that its classroom doors have name labels in a certain style, then it should organise this administrative task, not simply demand it of teachers. Teachers’ time should be spent on things that directly impact teaching and learning, and precious few of these do.

Sadly, such “non-negotiables” seem to have become something of a norm in schools, with headteachers thinking that the way they ran their classrooms is the way that everyone else should now run theirs. But it’s madness.

Headteachers are well aware of the strategic/operational divide between governors and heads, but they should consider a similar separation from involvement in classrooms. Absolutely it is the place of the headteacher to lead on matters of curriculum and learning, and even to set the broad principles and expectations for the “learning environment” (oh, how I hate that term!), but that’s not the same as specifying the date by which your pegs are labelled.

The only other argument that was tentatively put forward was for schools which are in “a category”. Now here, I have some sympathy with heads who take on a school where things are a mess. Sometimes a clear list of expectations helps to bring things out of a pit – but that clearly isn’t the case here. If classrooms are untidy, it’s reasonable to expect that they be tidy; if disorganised cloakrooms are delaying learning, then it’s reasonable to expect something to be done about it. But no school was ever put in Special Measures because boards were backed with ‘inappropriate’ colours, or because a Year 6 classroom didn’t have a carpet area.

And if  a school is in measures, then it probably shouldn’t be wasting its attention on how the classroom door is labelled! Both the leadership team and the teachers more widely should be focusing on the things that make the most difference to teaching and learning. Of course expectations should be raised, but that doesn’t need to be done through a foolish consistency.

Headteachers and Senior Leadership teams: you are busy enough – don’t sweat the small stuff, and certainly don’t make others sweat it for you!

(P.S. I’m a real rebel: I don’t label pegs at all!)


For an indication of some of the mad things that are dictated in primary schools, take a look at this Storify in response to my tweet:

Some thoughts on KS2 Progress

Caveats first: these conclusions, such as they are, are drawn from a small sample of a little over 50 schools. That sample of schools isn’t representative: indeed, it has slightly higher attainment than the national picture, both in terms of KS2 outcomes, and in KS1 starting points. However, with over 2000 pupils’ data, it shows some interesting initial patterns – particularly when comparing the three subject areas.

Firstly, on Maths – the least controversial of the three subjects. It seems that – in this sample – pupils who achieved Level 2c at KS1 had an approximately 40% chance of reaching the new expected standard (i.e. a scaled score of 100+). That leaps to around 66% for those achieving L2b at KS1 (i.e. just short of the national average).

mathslevels

The orange bar shows the average of this sample, which is slightly higher than the national average of 70%

It’s important to note, though, that progress measures will not be based on subject levels, but on the combined APS score at Key Stage 1. The graph for these comparisons follows a similar pattern, as you’d expect:

mathsaps

Where fewer than 10 pupils’ data was available for any given APS score, those scores have been omitted.

There is an interesting step here between pupils in this sample on APS of 13 (or less) who have a chance of 40% or less of reaching the expected standard, while those scoring 13.5 or more have a greater than 50% chance of achieving the standard. (The dip at 12.5 APS points relates to pupils who scored Level 2s in Maths and one English subject, but a Level 1 in the other, highlighting the importance of good literacy for achievement in KS2 Maths.)

For Reading, the graphs look broadly similar in shape:

readinglevels

Blue bar shows average of this sample at 67%, which is slightly higher than national average of 66%

Interestingly, here the Level 2c scorers still have only a 40% chance of meeting the expected standard, but those achieving 2b have a lower chance than in Maths of reaching it (58% compared to 66% for Maths).

When looking at the APS starting points, there is something of a plateau at the right-hand end of the graph. The numbers of pupils involved here are relatively few (as few as 31 pupils in some columns). Interestingly, the dip at 18.5 APS points represents the smallest sample group shown, made up of pupils who scored 2a/2b in the two English subjects, but a Level 3 in Maths at KS1. This will be of comfort to teachers who have been concerned about the negative effect of such patterns on progress measures: it seems likely that we will still be comparing like with like in this respect.

readingaps

It is in Writing that the differences become more notable – perhaps an artefact of the unusual use of Teacher Assessment to measure progress. Compared to just 40% of pupils attaining L2c in Reading or Maths achieving the new expected standard, some 50% of those in Writing managed to make the conversion, and this against a backdrop of teachers concerned that the expected standard was too high in English. Similarly, over 3/4 of those achieving Level 2b managed to reach the standard (cf. 58% for Reading, 66% for Maths).

writinglevels

In contrast to the other subjects, attainment in this sample appears notably lower in Writing than the national average (at 70% compared to 74% nationally)

With the APS comparisons, there are again slight dips at certain APS points, including 18.5 and 19.5 points. In the latter case, this reflects the groups of pupils who achieved Level 3s in both Reading and Maths, but only a 2b in Writing at KS1, suggesting again that the progress measure does a good job of separating out different abilities, even using combined APS scores.

writingaps

Of course, this is all of interest (if you’re interested in such things), but the real progress measures will be based on the average score of each pupil with each KS1 APS score. I’d really like to collect some more data to try to get a more reliable estimate of those figures, so if you would be willing to contribute your school’s KS1 and KS2 data, please see my previous blog here.


Spread of data

Following a request in the comments, below, I’ve also attached a table showing the proportions of pupils achieving each scaled score for the two tests. This is now based on around 2800-2900 pupils, and again it’s important to note that this is not a representative sample.

proportions

A few words on the 65% floor standard

There’s been much discussion about this in the last few days, so I thought I’d summarise a few thoughts.

Firstly, many people seem to think that the government will be forced to review the use of a 65% floor standard in light of the fact that only 53% of pupils nationally met the combined requirements. In fact, I’d argue the opposite: the fact that so few schools exceed the attainment element of the floor standard is no bad thing. Indeed, I’d prefer it if no such attainment element existed.

There will be schools for whom reaching 65% combined Reading, Writing & Maths attainment did not require an inordinate amount of work – and won’t necessarily represent great progress. Why should those schools escape further scrutiny just because they had well-prepared intakes? Of course, there will be others who have met the standard through outstanding teaching and learning… but they will have great progress measures too. The 65% threshold is inherently unfair on those schools working with the most challenging intakes and has no good purpose.

That’s why I welcomed the new progress measures. Yes it’s technical, and yes it’s annoying that we won’t have it for another couple of months, but it is a fairer representation of how well a school has achieved in educating its pupils – regardless of their prior attainment.

That said, there will be schools fretting about their low combined Reading, Writing & Maths scores. I carried out a survey immediately after results were released, and so far 548 schools have responded, sharing their combined RWM scores. From that (entirely unscientific self-selecting) group, just 28% of schools had reached the 65% attainment threshold. And the spread of results is quite broad – including schools at both 0% and 100%.

The graph below shows the spread of results with each colour showing a band of 1/5th of schools in the survey. Half of schools fell between 44% and 66%.

Combined attainment

Click to see full-size version
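For the curious, the banding itself is straightforward to reproduce. Here is a rough Python sketch of the approach – sort the responses, then chop them into fifths – using invented figures rather than the real survey data:

```python
def quintile_bands(results):
    """Split school RWM percentages into five equal-sized bands,
    returning each band's (lowest, highest) result."""
    ordered = sorted(results)
    n = len(ordered)
    return [
        # Band i covers the i-th fifth of the sorted list.
        (ordered[i * n // 5], ordered[(i + 1) * n // 5 - 1])
        for i in range(5)
    ]

# Invented results for ten schools, at 10-point intervals:
print(quintile_bands(list(range(0, 100, 10))))
# [(0, 10), (20, 30), (40, 50), (60, 70), (80, 90)]
```

With the real 548 responses, the boundaries of the middle bands are what give the 44%–66% spread for the central half of schools.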

As I said on the day the results were published – for a huge number of schools, the progress measure will become all important this year. And for that, we just have to wait.

Edit:

Since posting, a few people have quite rightly raised the issue of junior/middle schools, who have far less control over the KS1 judgements (and indeed in middle schools, don’t even have control over the whole Key Stage). There are significant issues here about the comparability of KS1 data between infant/first schools and through primary schools (although not necessarily with the obvious conclusions). I do think that it’s a real problem that needs addressing: but I don’t think that the attainment floor standard does anything to address it, so it’s a separate – albeit important – issue.

A little data experiment

Right, let me be clear up-front: I cannot predict your school’s progress scores. I can’t even pretend to estimate a prediction of it. There is just no way to find it, without knowing the full national picture of data – and even the DfE don’t have that finalised yet.

However, we do know how the progress measure works (see the video here if you don’t), so it would be possible to recreate the process based on a sample of data. It’s really little more than a thought experiment, but it may be of interest all the same.

To get even close to that, though, it will need lots of data, from lots of schools in lots of detail. Where in the past I have tried to collect summary data, for this experiment I would need real data from schools that includes both KS1 and KS2 results for their Year 6 cohorts. My plan, then, is to collate the data and find the average progress made by pupils with common starting points within the sample.  I will then share the resulting progress calculations with schools who have submitted data.
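In rough terms, the calculation I have in mind looks like this – a Python sketch only, with made-up figures, and pupil records simplified to (KS1 APS, KS2 scaled score) pairs:

```python
from collections import defaultdict

def progress_scores(pupils):
    """pupils: list of (ks1_aps, ks2_scaled_score) pairs pooled
    across all contributing schools."""
    # Group pupils by their KS1 starting point...
    groups = defaultdict(list)
    for aps, score in pupils:
        groups[aps].append(score)
    # ...and take each group's average KS2 score as its benchmark.
    benchmark = {aps: sum(s) / len(s) for aps, s in groups.items()}
    # A pupil's progress is their own score minus the benchmark for
    # pupils with the same starting point (mirroring the value-added method).
    return [(aps, score - benchmark[aps]) for aps, score in pupils]

sample = [(15.0, 101), (15.0, 99), (17.0, 106), (17.0, 104)]
print(progress_scores(sample))
# [(15.0, 1.0), (15.0, -1.0), (17.0, 1.0), (17.0, -1.0)]
```

A school's overall progress score would then simply be the average of its own pupils' figures – though with a sample this size, the benchmarks will only ever be rough.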

Because this needs very specific and accurate data, it won’t be possible to collect this using an open spreadsheet. Instead, below is a master spreadsheet which can be downloaded by anybody who wants to take part. If you wish to join in, please download the form, add your own data, and then return it to info@primarycurriculum.me.uk

Just don’t base your career decisions on the results🙂

To take part, please download, complete and return the following spreadsheet:


Am I overstretching it…?

What are people’s thoughts?

Everyone  wants to know about progress measures, but we won’t have the national data until September. We can’t work it out in advance… but is it worth trying to estimate?

I collected data on Tuesday night about the SATs results, and my sample was within 1 percentage point of the final national figures, which wasn’t bad. However, this would be a much more significant project.

To get anything close to an estimate of national progress measures, we would need a substantial number of schools to share their school’s data at pupil level. It would mean schools sharing their KS1 and scaled score results for every pupil – anonymised of course, but detailed school data all the same.

My thinking at this stage is that I’d initially only share any findings with the schools that were able to contribute. It would be a small sample, but it might give us a very rough idea. Very rough.

Would it be useful… and do people think they would be able to contribute?

KS2 Results – Frequently Asked Questions

After a late night, and reasonably early morning, there are a few common questions coming up, so here’s my attempt to answer them:

What’s the national data like?

national

Full set of data available from gov.uk website here: https://www.gov.uk/government/statistics/national-curriculum-assessments-key-stage-2-2016-interim

What about the floor standard?

We don’t really know much yet, so it’s too soon to panic! The floor standard is based on two elements: attainment and progress.

Nationally 53% of pupils met the combined Reading, Writing & Maths standards. However, that doesn’t tell us much about how many schools have met the floor standard. Large numbers of those 53% will be in the same schools – and some schools will have none of them. I’d expect the percentage of schools reaching the combined attainment element of the floor standard to be much lower.

Because of that, this year the floor standard will come down to progress more than ever. We won’t know anything about progress data until September, so for many headteachers it could feel like a long summer holiday – and not in a good way!

Where are the ‘Greater Depth’ thresholds?

There is no threshold for ‘Greater Depth’. Indeed, for tested subjects, there is no ‘Greater Depth’ at all. The scaled score indicates how far above the expected standard a child is, so there is no need for a label. After all, where would the benefit be in saying that a child who scored 117 is high achieving but a child who scored 116 isn’t, when clearly they both are?

There will be an accountability measure that schools have to publish later that is linked to “high scores”, but this is likely to be a combined measure, for example, the proportion of pupils who achieved over 115 in all 3 tested subjects. Note that I’ve just picked 115 out of thin air. We don’t know what will be counted as a high score, and for the purposes of reporting to parents and children, we don’t need to know.

How on earth do we report this to parents?

parentguide

Schools are required to share test results and teacher assessment judgements with parents. I suspect that MIS suppliers will provide a template that allows you to print a separate sheet to give to parents alongside reports. I’ve also written a free leaflet which can be downloaded from the Rising Stars website that helps to explain the results to parents:


How do we calculate our combined RWM score?

The key thing to note with this is that it is based on the number of individual children who have met the standard in all 3 subjects, not an average of the three subjects’ results. To find your combined RWM score, you need to look at each child to decide whether or not they have met the Expected Standard in all 3 subjects. For example, in this small cohort, 5 out of the 8 pupils did meet all 3 standards, so the combined RWM score will be 62.5%.
Note that the grammar score does not contribute to the floor standards.

data
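For anyone wanting to check their own figure, the per-pupil logic can be sketched in a few lines of Python. The cohort below is hypothetical, mirroring the 5-out-of-8 example above (True meaning the Expected Standard was met in that subject):

```python
def combined_rwm(pupils):
    """Percentage of pupils meeting the Expected Standard in all
    three of Reading, Writing and Maths (per pupil, not averaged)."""
    met_all = sum(1 for p in pupils if all(p.values()))
    return 100 * met_all / len(pupils)

# Hypothetical cohort of 8 pupils: 5 meet all three standards.
cohort = [
    {"reading": True,  "writing": True,  "maths": True},
    {"reading": True,  "writing": True,  "maths": True},
    {"reading": True,  "writing": True,  "maths": True},
    {"reading": True,  "writing": True,  "maths": True},
    {"reading": True,  "writing": True,  "maths": True},
    {"reading": True,  "writing": False, "maths": True},
    {"reading": False, "writing": True,  "maths": True},
    {"reading": True,  "writing": True,  "maths": False},
]

print(combined_rwm(cohort))  # 62.5
```

Note how averaging the three subject percentages instead (87.5%, 87.5% and 87.5% here) would give a misleadingly higher figure than the correct 62.5%.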

What does CA mean on my results?

This means that special consideration has been granted. More details can be found on the STA website here: https://www.gov.uk/guidance/key-stage-2-tests-applying-special-consideration-to-results
Notably, it won’t affect the scaled score at this stage, but will be accounted for when it comes to accountability. Schools have to choose how to explain that to parents.

Can I appeal if they got a scaled score of 99?

You can still apply for marking reviews this year. If a review changes the scaled score by moving it across the expected standard threshold, or if it changes the raw score by more than 3 marks, then there is no charge. You can submit review requests via NCA tools within the next 10 days. More details online at https://www.gov.uk/guidance/key-stage-2-tests-how-to-apply-for-a-review-of-key-stage-2-results

Where are the mark schemes?

All the mark schemes can be downloaded from the gov.uk website here:

https://www.gov.uk/government/collections/key-stage-2-tests-past-papers

KS2 test results – my early thoughts

Ignore all this: national data is now available here:

https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/534573/SFR30_2016_text.pdf


Here are my thoughts at 10.30am…

Individual Subjects

We don’t have the national picture yet, but based on a sample of 900 schools (usual caveat here – it’s not a representative sample, will contain errors, can’t be trusted, etc., etc.) the average percentage of pupils achieving the Expected Standard or higher is looking like this:

  • Reading: 65%
  • Grammar, Punctuation & Spelling: 73%
  • Mathematics: 72%

You can submit your own results into the mix here: http://goo.gl/forms/fmuCRuj5jyS91ost1

Reading seems hardest hit. From the 900 schools, the average drop is 22 percentage points. This compares to a 15-point drop in Maths, and just an 8-9 point drop in GPS.

Floor Standard

When it comes to the combined RWM floor standard, based on a quick survey with only 200 responses (further caveats: tiny unrepresentative sample here), only around 1/3 of schools reached the 65% combined attainment target.

You can submit your own combined RWM figure here: http://goo.gl/forms/SqUUWx9ftIDm5ERW2

Teacher Assessment

Again, no national data available yet, but a Local Authority source shared partial data collection (around half of schools nationally) which suggested that Writing could be at around 74-75% at the Expected Standard.

Could it really be the case that Writing will leap from being lowest to highest attainment this year?