
Primary Curriculum Timetabling

As I look to timetabling in the new school year, I reflected on the work Tom Sherrington did a few years ago about secondary timetables. Unfortunately, the primary curriculum timetable is not so easy to analyse, given that very few schools stick to a simple programme of x lessons of equal length per day, and few teach every lesson every week – or even every fortnight, as would be common in secondary.

Because of this, it’s much harder to get a sense of how much time schools are giving over to each subject, particularly given the changes of recent years and those on the horizon. So, I set out to try to find out as much as I could, through another of my Google surveys.

It’s impossible to present all of that information tidily, since every school’s situation is unique, but here I’ve tried to draw out some key things.

Weekly subject hours

Different schools take different approaches. Three different schools might each offer 36 hours of Art a year: one with a weekly 1-hour lesson, another with two hours every other week, and a third with two-hour lessons every week but only every other half-term. Yet another might mainly use Art days each term to reach its quota. So we’re not comparing like with like here, but the table below attempts to show the average number of hours taught for each subject if evened out over 36 weeks of term (allowing a couple of weeks for being off-timetable) – all rounded to the nearest 5 minutes.
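The evening-out calculation can be sketched as follows (a minimal illustration; the figures here are made up for the example, not drawn from the survey data):

```python
def weekly_minutes(annual_hours, weeks=36, round_to=5):
    """Average an annual subject allocation over the teaching year,
    returning minutes per week rounded to the nearest 5 minutes."""
    minutes = annual_hours * 60 / weeks
    return round_to * round(minutes / round_to)

# All three Art arrangements described above total 36 hours a year,
# so they all even out to the same weekly figure:
print(weekly_minutes(36))   # 60 minutes per week
print(weekly_minutes(18))   # 30 minutes per week
```

However the hours are clustered across the year, only the annual total affects the figure, which is exactly why the table can compare such different timetables.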

[Table: average weekly National Curriculum time by subject]
I don’t imagine anyone being massively surprised by any of those figures, but they certainly give an indication of the narrowing of the primary curriculum. When the QCA last recommended teaching hours in 2002, it suggested an average of 55 minutes a week for the majority of foundation subjects. We’re now struggling to get above 30 minutes a week for Geography!

 

Perhaps more surprising is the fact that although there is more time given to the tested subjects in Year 6, the decline in ‘breadth’ is not huge. It seems that the curriculum is fairly limited across the whole of primary.

[Chart: curriculum range by year group]

The greatest breadth in curriculum, at least in timetable terms, appears to be in Year 3.

Regularity

Some years ago there was a clear government target for primary pupils to have at least 2 hours of timetabled PE each week. That target seems to have achieved something, as PE is the only foundation subject which has ended up with significantly more than its previously recommended amount (1 hour 15 minutes in 2002). It’s also one of the few subjects with regular weekly slots: 98% of respondents said they taught PE every week, and nearly 90% timetable more than 90 minutes of it.

[Chart: frequency of PE lessons]

The only subject that comes close to such regular weekly slots is Science, with around two-thirds of respondents saying they taught Science every week.

[Chart: frequency of Science lessons]

At the other end of the scale, Design & Technology is very rarely taught on a weekly schedule. This is perhaps not surprising given the amount of resource required for the subject. Nearly a third of schools appear to use standalone days each term or half-term for the subject instead:

[Chart: frequency of Design & Technology lessons]

Exceptional Cases

I collected data in categories rather than exact figures, so for schools which said they had 7½ or more hours each week of English or Maths, I could only count 7½ hours. In the end, more than half of responses (52%) gave an answer of 7½ hours a week or more for English. It seems, therefore, that if anything the figures above are under-estimates of the time given over to English.

Only about 10% of schools gave a similarly high answer for Maths, but this is still quite a significant number. Those figures rise to 57% and 16% respectively for pupils in Year 6.
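The under-estimate arises because the top category is open-ended: counting each “7½ hours or more” answer as exactly 7½ hours can only pull the average down, so the computed mean is a floor on the true figure. A sketch with made-up responses:

```python
# Illustrative only: banded survey answers in hours per week, where
# "7.5+" is the open-ended top category.
responses = [5.0, 6.0, 7.0, "7.5+", "7.5+", 7.0, "7.5+"]

# Treating each capped answer as exactly 7.5 hours means the true
# average can only be equal to or higher than this figure.
as_numbers = [7.5 if r == "7.5+" else r for r in responses]
lower_bound_mean = sum(as_numbers) / len(as_numbers)
print(round(lower_bound_mean, 2))   # a floor on the true mean
```

Any school in the top band that actually teaches 8 or 9 hours would raise the real average above this.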

At the other end of the scale, approximately 5% of responses said that they gave over no time to PSHE. The subject is not yet statutory, so presumably that figure will fall over the coming year or two. Around 4% of responses said they taught no Computing at all; I wonder if that’s more a confidence issue than a planned decision. Who knows?

 

 


Multiplication Tables Check Comparison Data

As ever with such things, it is important to point out that this data is not a scientific sample, has not been verified, and could be completely meaningless. However, in the absence of any comparative data from the DfE, it is an attempt to give some vague indication of the national picture of schools that took part in the MTC sample.

At the time of writing, some 211 sets of data had been submitted to the open spreadsheet online. Because it’s an open spreadsheet, there’s no guarantee that it doesn’t have errors, or that some data hasn’t been damaged, or even completely made up. With that in mind, I have completed some very simple calculations based on the data to give some idea of indicative figures.

Overall Averages

The mean average of all pupils’ results was 18.4.

The mean average of all schools’ averages was also 18.4.
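The two averages are computed differently: the pupil mean weights every pupil equally, while the mean of school averages weights every school equally, so they only coincide by chance (as they happened to here, at 18.4). A sketch with invented scores:

```python
# Invented scores for two schools of different sizes (not the MTC data).
schools = {
    "A": [25, 24, 23],            # small, high-scoring school
    "B": [15, 16, 14, 17, 18],    # larger, lower-scoring school
}

# Mean over all pupils: every pupil counts once.
all_pupils = [score for scores in schools.values() for score in scores]
pupil_mean = sum(all_pupils) / len(all_pupils)

# Mean of school averages: every school counts once.
school_means = [sum(v) / len(v) for v in schools.values()]
mean_of_means = sum(school_means) / len(school_means)

# With unequal school sizes the two figures diverge:
print(pupil_mean)      # 19.0
print(mean_of_means)   # 20.0
```

That the two figures agreed in the sample suggests school size wasn’t strongly related to average score, though with self-selected data it proves little.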

The following table shows the approximate cut-off points when comparing schools’ averages, to place schools into bands.

[Table: bands of schools’ average scores]

Perfect Scores

There was talk at one point of full marks being the expected threshold. It’s no longer clear that this is the case, or even that there will be a pass mark of any sort, but within the sample:

Overall proportion scoring 25/25: 17.4%

Bands for proportion scoring 25/25:

[Table: bands for the proportion scoring 25/25]

Pupil Scores

More pupils scored full marks than any other individual score, with results clearly skewed towards the top end of the scale.

[Chart: distribution of pupil scores]

School Averages

The majority of schools had an average score of between 16 and 20.

[Chart: distribution of school average scores]

Does any of this mean anything? Not really… it’s a tiny sample from a voluntary pilot of a new test with no clear expectations, hastily compiled from questionable data. But some of it is at least slightly interesting.

Annual reporting to parents – our approach

Having shared our annual report template with a few interested teachers, I thought it was worth sharing the main template more widely. If you’re not interested in reading about it, then feel free to scroll to the bottom just to download the template… I’ll never know 🙂

It’s always struck me as odd that we seem to have contradictory wisdom about the main forms of report to parents. New teachers are always told that there should be “no surprises” at a parents evening. If children are falling behind, or misbehaving, or perhaps failing to complete homework, then parents should already know this rather than finding out in their 10-minute slot.

Why is it, then, that so many seem to presume the opposite for report-writing, as though parents know nothing of their child’s learning and need everything spelling out in detail? In truth, most parents receive broadly similar reports year after year, because children don’t change that much.

The need to fill extra lines of content means either repeating the banal detail of what has been taught (regardless of how well it has been learned), or of trying to find minutiae to discuss.

So, when it came to re-working the report template for my current school, I had a few things in mind:

  • I wanted to minimise the amount teachers have to write, while leaving room for comments about the important personal & social detail (the bit parents are really interested in!)
  • I wanted to be clear about where children met – or failed to meet – expectations, and to set clear expectations for excellence.
  • I wanted to give an opportunity to reflect on attainment in all subjects.

So, our report is made up of a number of sections (after the introductory statement):


[Image: teacher comment section of the template]

This is clearly the most important part of the report, not least because this is the section memories are made of. In my school I do ask teachers to write a comment which incorporates the personal/social elements as well as some reference to attainment in the key subjects of English & Maths. It’s also the place to add in detail about any particular skill or expertise in other subject areas.
The whole box takes up to 10 lines – roughly 180 words max.


[Image: subject attainment section of the template]

The subject attainment section is very brief in terms of outcomes, but quite clear for parents. I’m not a fan of the vocabulary of ‘Greater Depth’, but given its use in the statutory assessments, it seems to make sense to use it consistently across the school. Invariably these descriptors are not a great surprise to parents (mine, for example, were never going to expect me to achieve great things in PE!), but where they do highlight something, then parents can of course raise that at the open afternoon that follows shortly after reports are issued.


[Image: at-a-glance indicators section of the template]

This section is something I brought with me from a previous school, and we had taken the idea from another school – so if your Nottinghamshire school was the originator, do let me know!
I like it because it’s a clear at-a-glance indicator of key areas of interest, including attendance which can sometimes come as a surprise to parents. I also like the clarity that “Good” is good, but that to be exceptional is, well, exceptional.


[Image: pupil comment and targets section of the template]

There is no doubt that adding a pupil comment creates additional work. I keep it as much because it gives pupils and families something to look back on in years to come as because it offers an insight into their current achievements. It’s also a useful reflective opportunity for older pupils. (Pupils don’t see the rest of the report first; juniors type their entries and they get added electronically; infants write on smaller sheets of paper which are pasted into the template – achievable in a 1fe school).

As for the targets, I don’t expect anything in-depth or insightful. For most children it’s at least one English and one Maths target, often linked to key skills that can be practised at home, such as number bonds, key word spellings or regular reading. There might also be a personal/social target if appropriate, or a behaviour target in some cases. As I say to my staff, though, sometimes it’s also appropriate to set a target that just says “keep up the great work!”


[Image: headteacher comment section of the template]

I do manage a headteacher comment for every pupil, but as we only have 200 that’s perhaps more manageable than in some schools. (I haven’t pointed out to my staff that this means I actually write more for reports than any one teacher; I’m not sure the point would go down too well given all the other demands on them!)


Presenting the report

I’m always conscious that school reports are often kept for years, if not generations, and try to present them accordingly. Our template is set up as a 4-page document, which we print onto A3 white card and fold into A4 size. The front cover consists mainly of the large (attractive) logo and pupil name, and the back cover is pretty blank, but I think it makes the whole thing look a whole lot nicer.

As a school, we also currently track Key Performance Indicators in key subjects across the year, and so printed those out to accompany the report last year. I may take soundings from parents this year to see if they value that level of detail; I’m not clear that they would.

I also include a covering letter with reply slip. In theory this helps us to track receipt, but more importantly I hope it gives parents an opportunity to send positive responses and thanks to teachers which they might not otherwise have the opportunity to convey. I still keep some report reply slips from my teaching days – and I ditch others!


The Template

Well done if you read this far. No credit if you just skipped my words of wisdom. I have stripped out the school-specific content from the template (logos, etc.) and uploaded a version here which you are welcome to download, adopt, edit and re-share as you wish. No need to add any credit on the report (it’d look odd for a start!), but I’d be glad to hear if you found it useful.

Okay, I’ll stop… just download the Report Template!

On the importance of vocabulary

Just a quick blog, inspired by this much more detailed and challenging one by Solomon Kingsnorth:

I think he has a point about the importance of vocabulary, and it’s something we can easily underestimate. It’s also something we can worry that we’ll never be able to resolve, because there’s no way of knowing what vocabulary will come up in any given text or test.

So I took a look at this year’s KS2 Reading test paper and tried to identify some of the vocabulary required to answer each question. It’s not every word in the texts, but nor is it just the 10 marks theoretically set aside for vocabulary. In fact, I think there were 80 or more examples of vocabulary which might not have been met by pupils who don’t read regularly:

Q1 approximately, survive
Q2 disguise
Q3 razor-like, powerful
Q4 majority
Q5 develops, newborn
Q6 hibernate
Q7 captivity, territory
Q8 puzzling
Q9 vital, essential
Q10 extinction, survive, supplies, diminishing, poaching, territory
Q11 adopt, reserve
Q12 challenge
Q13  
Q14  
Q15 fascinating
Q16 protective, enfold
Q17 punished
Q18 mountainous, praised, lavishly
Q19 wounded, lame, circumstance
Q20 seized
Q21  
Q22 vividly recall
Q23 frail, hobbled
Q24 hobbled, hesitate, peered
Q25  
Q26 lit up
Q27 amusing, shocking, puzzling, comforting
Q28 arrives, injured
Q29 verses
Q30 suggests, bothered, basins, smelt
Q31 lifeless, ancestors
Q32 guardian
Q33 devices (left to my own devices)
Q34 recesses
Q35 dawned (dawned on me)
Q36 assorted, debris, network, grime
Q37 determination, thorough
Q38 impression, evidence, frightening, intensity, cautiously
Q39 justice, efforts
Q40 inspect, fashioned, ought

The only questions that are counted as vocabulary marks are the 10 written in italics. And all those ones in bold? They’re listed as inference questions in the mark schemes. The challenge of inference is often as much about interpreting complex language as it is about guessing what the writer intended.

Perhaps more importantly, very few of those words are technically specific to the texts they appeared in. Even in the case of the non-fiction text about pandas, much of the apparently technical vocabulary is applicable to plenty of other contexts that children meet in the course of the curriculum.

The link here to ‘tier two’ vocabulary is clear: there is plenty of vocabulary here that would come up in a number of different contexts, both through fiction and non-fiction reading.

Which rather makes me think that Solomon is on to something important: a significant part of teaching reading is about getting children reading and reading to them.

What is a “particular weakness” anyway?

In DfE terms, it’s early days for being able to make decisions about KS2 Writing outcomes. After all, it wasn’t so long ago that we were reaching February without any exemplification at all, so for the STA to have released its “particular weakness” scenarios as early as mid-January is progress!

However, publishing the materials is one thing. Providing the clarity that a high stakes statutory assessment process dearly needs is quite another. The example scenarios offer some insight into the thinking at the STA about this new ‘flexibility’, but seem to have deliberately skirted round the key issues that keep coming up, such as dyslexia!

In an effort to get a sense of the interpretations out there, I put together some very brief scenarios of my own, and asked Y6 teachers to say whether or not they thought such pupils would be awarded the expected standard. As I feared, there is a real lack of clarity out there. The six example scenarios follow, accompanied by pie charts showing teachers’ decisions. In each case, the blue represents those who would award EXS (based on a sample of 668 responses).

Scenario 1

[Pie chart: Scenario 1 responses]

77% award EXS

Edith has shown herself to be a fluent and confident writer. She adapts her writing for a variety of purposes, and in many cases has evidence of elements of working at Greater Depth. However, there are no examples of the passive voice used in any of her writing, except through planned tasks.

Scenario 2

[Pie chart: Scenario 2 responses]

67% award EXS

Beowulf is a good writer, who meets almost all of the requirements for EXS. However, he has been identified as being at high risk of dyslexia. In his writing he has shown that he can use some of the Y5/6 words accurately. However, he struggles with some of the regular spelling patterns from the curriculum, and his work contains several errors, particularly for the more complex patterns.

Scenario 3

[Pie chart: Scenario 3 responses]

36% award EXS

Ethelred writes effectively for a range of audiences and purposes, with sound grammatical accuracy. He uses inverted commas correctly to mark speech, but does not yet consistently include punctuation within the inverted commas.

Scenario 4

[Pie chart: Scenario 4 responses]

71% award EXS

Boudicca writes well, showing an interesting range of language, sentence type and punctuation. However, she has developed a largely un-joined style of writing, which although clearly legible does not include the usual diagonal or horizontal strokes.

Scenario 5

[Pie chart: Scenario 5 responses]

55% award EXS

Cleopatra is a confident writer, who shows good grasp of technical aspects and a beautiful joined style of writing. She enjoys writing fiction and can develop good plot, with writing that flows well. However, in non-fiction texts she is not always able to use the devices that enable cohesion between paragraphs. There are some examples of stock phrases used (On the other hand, Another reason, etc.) when writing in a formal style, but these are not consistent across the non-fiction texts she writes.

Scenario 6

 

[Pie chart: Scenario 6 responses]

92% award EXS

Englebert is a technically sound writer. He is able to adapt writing for fiction and non-fiction purposes and uses a variety of language and punctuation techniques. His spelling of common patterns is generally good. However, there are a number of examples of words from the Y5/6 lists which are mis-spelt in his writing generally. His teacher has shown that he could spell these words correctly when tested in the context of dictated sentences throughout the year.

 

Notably, all but one of the results were within 5 percentage points of the figures above when looking only at those who said they had had some training provided on this topic. The biggest difference came for scenario 4 (handwriting) where only 61% of those who said they’d been trained would award EXS compared to 71% of the full sample.

 

It’s hard to say what I expected when I set up these little scenarios. I certainly don’t know what any “correct” responses might be. I think I imagined that some would be fairly evenly split – as with the case of Cleopatra’s weak use of cohesive devices.

Scenario 6 has genuinely surprised me. I don’t know what a moderator would say, but my fear about dictated sentences would be that children could easily be tested on a handful of words each week, learned for Friday’s test, and then quickly forgotten. Is that sufficient to say they can spell at the Expected Standard? Who knows? (That’s not to say that I think ‘no’ is the correct answer either; I’m not persuaded that the importance of spelling those particular words is as great as the system might suggest).

I’m equally surprised at scenario 3. Is it really right that speech punctuation is so important that two-thirds of teachers would deny a pupil an EXS judgement on this alone – even when so many are happy to overlook spelling or handwriting failures?

As I say – I don’t have any answers. If any moderator – or perhaps an STA representative would like to give a definitive response, I’d be glad of it. I suspect that as close as we’d get to an official answer is that a moderator would have more evidence upon which to make a decision. Which is all well and good. For the 3-4% of pupils whose work gets moderated. For everyone else, we have to hope that teachers have got it right. And judging by these results, that’s not that easy!

Writing Moderation materials

Just a quick post to share the moderation support materials that were shared by the STA today. For some reason, they have only been shared via the password-protected NCA Tools website. However, there is no indication that they should be maintained under any conditions of secrecy, and no indication that they are not covered by the usual Crown Copyright rules… so here they are:

KS1_standardisation_training_presentation_1

KS1_teacher_assessment_moderation_training_pack_1

KS2_standardisation_training_presentation_1

KS2_teacher_assessment_moderation_training_pack_1

The presentations include clarifications about some of the criteria included in the assessment frameworks.

The moderation training packs include the examples that are meant to help illustrate what counts as an exception when you want to overlook one of the criteria.

See if you find it at all helpful…

National Curriculum test videos

With the introduction of the new style National Curriculum tests in 2016, I made some short informative videos for parents about each set of tests. Since then, I’ve updated them each year to reflect changes such as this year’s timetable changes at KS2. The videos last around 5 minutes and are ideal for sharing on school websites, twitter feeds, facebook pages, etc.

To help schools use them most effectively, I have provided links below in each of the main formats so they can easily be shared. Please feel free to share or download the videos and use them for your school:

Key Stage 2 tests

Links: YouTube | Facebook | Twitter | MP4 download

Key Stage 1 tests – including Grammar, Punctuation & Spelling

 

Links: YouTube | Facebook | Twitter | MP4 download

Key Stage 1 tests – without GPS

Links: YouTube | Facebook | Twitter | MP4 download

Primary Assessment changes… again!

First of all, let me say that I’m pleased that primary assessment is changing again, because it’s been a disaster in so many ways. So here is a summary of the changes at each key stage – with my thoughts about each.

Early Years Foundation Stage Profile

  • The EYFS Profile will stay, but will be updated to bring it into line with the new national curriculum and take account of current knowledge & research. I’ve never been a huge fan of the profile, but I know most EY practitioners have been, so that seems a sensible move.
  • A proposed change would reduce the number of reported Early Learning Goals to focus on the prime areas and Literacy/Maths.
  • The ‘emerging’ band may be divided to offer greater clarity of information, particularly for lower-attaining pupils.
  • An advisory panel will be set up to advise on changes to the profile and ELGs. Membership of that could be contentious.

Reception Baseline

  • New Reception baseline to be introduced from 2020 (with proper trialling beforehand this time, one presumes!) to take place in the first 6 weeks of school.
  • Won’t be a ‘test’, but also won’t be observational over time. Suspect something more like the current CEM model, perhaps?
  • Will focus on literacy & numeracy, and potentially a ‘self-regulation’ element, as good predictors for attainment in KS2
  • Data won’t be used for any judgements about Reception, but will be used at cohort level to judge progress by the end of KS2.
  • The intention is for the assessment to provide some narrative formative information about children’s next steps.

Key Stage 1

  • The KS1 Grammar, Punctuation & Spelling test will remain optional.
  • Statutory Assessment will remain until at least 2023 (to allow for a year of overlap with the first cohort to be assessed using Reception baseline).
  • A new framework for Teacher Assessment of Writing has been published for this year only. Exemplification will follow this term.
  • DfE will continue to make assessments available (perhaps through an assessment bank if that ever gets off the ground!) after 2023, to help schools to benchmark attainment.
  • After 2023, tests and statutory teacher assessment will become optional for through-primary schools.
  • There is more work to be done to find a system which works well for infant/junior and first/middle schools. This will be done with those in the sectors.

Key Stage 2

  • A multiplication check will be introduced at the end of Year 4. (Although, of course, whether the end means July or May remains to be seen).
  • School-level data on the multiplication check won’t be published.
  • This will be the last year that teachers have to make Teacher Assessment judgements for Reading and Maths.
  • A new framework for Teacher Assessment of Writing has been published for this year only. Exemplification will follow this term.
  • DfE will continue to evaluate other options for the future, but is not really committing to anything yet.
  • Small trials of peer-to-peer moderation will take place this summer.
  • Science Teacher Assessment frameworks will be updated next year.
  • The Reading test will not be timetabled for Monday of SATs week any more (hurrah!)
  • The DfE aims to link the reading content of the tests more closely to the curriculum to ensure children are drawing on their knowledge.

My thoughts

Overall, I’m pleased. Most of these changes are to be welcomed. The Reception baseline is a sensible idea (just a shame it was so badly implemented the first time round), as is scrapping KS1 assessments. The Early Years changes seem reasonable given the popularity of the current setup. The improvements to the KS2 Reading test are positive, as is the removal of pointless Teacher Assessment judgements.

On Writing, I fear we haven’t gone far enough. The current system is a joke, and it seems like the interim solution we’ll have to replace the old interim solution will just aim to make it less awful without really fixing the problem. It’s a shame that there is no obvious answer on the horizon. Perhaps the department has had its fingers burnt by rushing into quick fixes in the past and is prepared to bide its time.

In the interim, the updated expectations for Writing seem more manageable both in terms of achieving and assessing them. Of course, the devil is in the detail. If we get another exemplification book that breaks down single statements into several tick-boxes then we may be back at square one. Equally, of course, we can expect proportions of pupils meeting the expected standard to rise again substantially this year. Surely we have to be honest now and say that we really cannot use this data for accountability purposes? Mind you, perhaps it won’t matter – if we’re all getting 90% in Writing, it’ll only be the tested subjects that will make a difference to the accountability!

There are some other changes I would have liked to see. I really don’t think the “expected standard” label is helpful, particularly in subjects where scaled scores are used; it’s a shame we’ve not seen the back of it.

We’re not out of the woods yet. But we’re heading in the right direction, and credit is due to those at the department for listening. Let’s just hope they keep listening until we all get it right.

Stop moaning about tests!

Today marked the end of 4 short days of testing. For Year 6 pupils everywhere, they’ll have spent less than 5 hours on tests – probably not for the first time this year – and later in the year we’ll find out how they did.

Now, I’m the first to complain when assessment isn’t working, and there are lots of problems with KS2 assessment. Statutory Teacher Assessment is a joke; the stakes for schools – and especially headteachers – are ridiculously high; the grammar test is unnecessary for accountability and unnecessarily prescriptive. I certainly couldn’t be seen as an apologist for the DfE. And yet…

For some reason it appears that many primary teachers (particularly in Facebook groups, it seems) are cross that some of the tests contained hard questions. I’ve genuinely seen someone complain that their low-ability children can’t reach the expected standard. Surely that’s the very reason they’re defining them as low ability?

Plenty of people seem annoyed that some of the questions on the maths test were very challenging. Except, of course, we know that some children will score 100% each year, so the level of challenge seems fair. There were also plenty of easier, more accessible questions that allowed those less confident mathematicians to show what they can do. It’s worth remembering that to reach the expected standard last year, just 55% of marks were needed.

But the thing that annoys me most is the number of people seemingly complaining that the contexts for problem-solving questions make the questions too difficult. Of course they do, that’s the point: real maths doesn’t come in lists of questions on a page that follow a straightforward pattern. What makes it all the more irritating is that many of those bemoaning the contexts of problems are exactly the same sort who moan about a tables test, complaining that knowing facts isn’t worthwhile unless you can apply them.

Well guess what: kids need both. Arithmetic knowledge and skills need to be secure to allow children to focus their energies on tackling those more complex mathematical problems. You can’t campaign against the former, and then complain about the latter.

The tests need to – as much as possible – allow children across the ability range to demonstrate their skill, while differentiating between those who are more and less confident. That’s where last year’s reading test fell down: too few accessible elements and too many which almost no children could access. This year’s tests were fair and did a broadly good job of catering for that spread. For those complaining about the level of literacy required, it’s worth remembering that questions can be read to children, and indeed many will have had a 1:1 reader throughout.

No test will be perfect, and there are plenty of reasons to be aggrieved about the chaos that is primary assessment at the moment, but blaming tests because not all children can answer all questions is a nonsense, and we’d do well to pick our battles more carefully!

KS2 Writing: Moderated & Unmoderated Results

After the chaos of last year’s writing assessment arrangements, there have been many questions hanging over the results, one of which has been the difference between the results of schools which had their judgements moderated, and those which did not.

When the question was first raised, I was doubtful that it would show much difference. Indeed, back in July when questioned about it, I said as much:

At the time, I was of the view that LAs each trained teachers in their own authorities about how to apply the interim frameworks, and so most teachers within an LA would be working to the same expectations. As a result, while variations between LAs were to be expected (and clearly emerged), the variation within each authority should be less.

At a national level, it seems that the difference is relatively small. Having submitted Freedom of Information Requests to 151 Local Authorities in England, I now have responses from all but one of them. Among those results, the differences are around 3-4 percentage points:

moderated

Now, these results are not negligible, but it is worth bearing in mind that Local Authorities deliberately select schools for moderation based on their knowledge of them, so it may be reasonable to presume that a larger number of lower-attaining schools might form part of the moderated group.

The detail that has surprised me is the variation between authorities in the consistency of their results. Some Local Authority areas have substantial differences between the moderated and unmoderated schools. As Helen Ward has reported in her TES article this week, the large majority of authorities have results which were lower in moderated schools. Indeed, in 11 authorities, the difference is 10 or more percentage points for pupils working at the Expected Standard. By contrast, in a small number, it seems that moderated schools have ended up with higher results than their unmoderated neighbours.

What can we learn from this? Probably not a great deal that we didn’t already know. It’s hard to blame the Local Authorities: they can’t be responsible for the judgements made in schools they haven’t visited, and nor is it their fault that we were all left with such an unclear and unhelpful assessment system. All this data highlights is the chaos we all suffered – and may well suffer again in 2017.

To see how your Local Authority results compare, view the full table* of data here. It shows the proportions of pupils across the LA who were judged as working at the Expected and Greater Depth Standards in both moderated and unmoderated schools.


*Liverpool local authority claimed a right not to release their data on the grounds of commercial sensitivity, which I am appealing. I fully expect this to be released in due course and for it to be added here.