Category Archives: Curriculum

Will we see a leap in Writing attainment?

I’ve long been clear that I think that the current system of assessing writing at KS2 (and at KS1 for that matter) is so flawed as to be completely useless. The guidance on independence is so vague and open to interpretation and abuse, the framework so strictly applied (at least in theory), and moderation so ineffective at identifying any poor practice, that frankly you could make up your results by playing lottery numbers and nobody would be any the wiser.

One clear sign of its flaws came last year: despite Writing having for years been the lowest-scoring area of attainment, and despite the new, very stringent criteria which almost all teachers seem to dislike, we somehow ended up with more children achieving the expected standard in Writing than in any other subject area.

My fear now is that we will see that odd situation continue, as teachers get wise to the flaws in the framework and exploit them. I’m not arguing that teachers are cheating (although I’m sure some are), but rather that the system is so hopelessly constructed that the best a teacher can do for their pupils is to teach to the framework and ensure that every opportunity is provided for children to show the few skills required to reach the standard. There is no merit now in focusing on high quality writing; only in meeting the criteria. Results will rise, with no corresponding increase in the quality of writing needed.

For that reason, I suspect that we will see a substantial increase in the number of schools having more pupils reaching the expected standard. At Greater Depth level I suspect the picture will be more varied, as different LAs give contradictory messages about how easy it should be to achieve, and different moderators appear to apply different expectations.

In an effort to get a sense of the direction of travel, I asked teachers – via social media – to share their writing data for last year, and their intended judgements for this year. Now, perhaps unsurprisingly, more teachers from schools with lower attainment last year have shared their data, so along with all the usual caveats of what a small sample this is, it’s worth noting that it’s certainly not representative. But it might be indicative.

Over 250 responses were given, of which just over 10 had to be ignored (because it seems that some teachers can’t grasp percentages, or can’t read questions!). Of the 240 responses used, the average figure for 2016 was 71% achieving EXS and 11% achieving GDS. Both of these figures are lower than last year’s national figures (74% / 15%) – which themselves seemed quite high, considering that just 5 years before, a similar percentage had managed to reach the old (apparently easier) Level 4 standard. Consequently, we might reasonably expect a greater increase in these schools’ results in 2017 – as the lower-attaining schools strive to get closer to last year’s averages.

Nevertheless, it does appear that the rise could be quite substantial. Across the group as a whole, the percentage of pupils achieving the expected standard rose by 4 percentage points (to just above last year’s national average), with the percentage achieving greater depth rising by a very similar amount (again, to just above last year’s national average).

We might expect some regression towards the mean, and certainly that seems evident. Among those schools who fell short of the 74% last year, the median increase in the percentage achieving expected was 8 percentage points; by contrast, for those who exceeded the 74% figure last year, the median change was a fall of 1 percentage point.
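To make that split concrete, here is a minimal sketch of the calculation in Python. The figures below are made up for illustration (chosen so the medians happen to match those quoted above); they are not the actual survey responses.

```python
from statistics import median

# Illustrative only: invented school results (% achieving EXS in 2016 and 2017),
# NOT the real survey data.
schools = [
    {"exs_2016": 60, "exs_2017": 70},
    {"exs_2016": 68, "exs_2017": 74},
    {"exs_2016": 71, "exs_2017": 79},
    {"exs_2016": 74, "exs_2017": 73},
    {"exs_2016": 80, "exs_2017": 79},
    {"exs_2016": 85, "exs_2017": 84},
]

NATIONAL_2016 = 74  # last year's national EXS figure for Writing

# Change in percentage points, split by whether the school fell short of
# the national figure in 2016 or matched/exceeded it.
below = [s["exs_2017"] - s["exs_2016"] for s in schools if s["exs_2016"] < NATIONAL_2016]
above = [s["exs_2017"] - s["exs_2016"] for s in schools if s["exs_2016"] >= NATIONAL_2016]

print(median(below))  # median change for schools below the national figure
print(median(above))  # median change for schools at or above it
```

With these invented figures, the lower-attaining group shows a median rise of 8 points and the higher-attaining group a median fall of 1 point – the same pattern as in the survey.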

Now again, let me emphasise the caveats. This isn’t a representative sample at all – just a self-selecting group. And maybe if you’re in a school which did poorly last year and has pulled out all the stops this year, you’d be more likely to have responded, so it’s perfectly possible that this overestimates the national increase.

But equally, it’s possible that we’ll see an increase in teacher assessment scores which outstrips the increases in tested subjects – even though it’s already starting from a higher (some might say inflated) base.

I’m making a stab in the dark and predicting that we might see the proportion of children – nationally – reaching the Expected Standard in Writing reach 79% this year. Which is surely bonkers?


On Knowledge Organisers

When Jon Brunskill recently agreed to share his work on Knowledge Organisers in primary school, I was excited to see what he came up with. I wasn’t disappointed, and I’m sure many others have been looking with interest. I think there’s a lot of merit in the model, but inevitably I think there is some refining to do.

I say this not as an expert – far from it, I’ve cobbled together one Knowledge Organiser in my life and remain unhappy with it. However, having spoken briefly to Jon about his, I think we both agree that there is merit in unpicking the model further.

Firstly, with Jon’s permission, let me share an image of the organiser he shared (I highly recommend reading the accompanying blog before continuing further with mine!)

At first glance, it looks like a lot of content to learn. I think that’s partly because most of us have spent a good many years teaching broad ideas, and not expecting children to learn detail off by heart. I think there are also very few of us who could hand-on-heart say we know all this content to recall. But I think that represents the shift we need to make rather than something to fear.

That led me to question the purpose behind the Knowledge Organiser. I haven’t spent enough time thinking about them, and certainly not enough time using them, but when I have, I’ve usually considered it a vehicle for outlining the key information that I expect students to learn and retain for the longer term. Often over longer units of work these might include key ideas which are integral to later understanding, whether that’s later in the school year, or later in their education career.

By way of illustration of my thinking, let me share a knowledge organiser I constructed a couple of years ago for my Year 5/6 class:


My first attempt at a Knowledge Organiser in 2015

The differences are quickly obvious. For a start, mine is clearly based on a wider period of teaching, and perhaps more indicative of a basic revision guide, rather than providing content in advance of a unit. I think perhaps that’s also its biggest downfall. It’s worth noting that it’s something I tried and didn’t come back to.

But I think there is maybe a useful middle ground. In Jon’s case, much of the content set out – particularly on the timeline – is content that is useful for the purposes of writing an information text about the event itself (a task which Jon plans to do in his Y2 class). However, I don’t think he expects those students to secure that detail in the very long term. Arguably, this brings the organiser closer to the cramming model of revision than to the more successful spaced practice approach.

Ruth Smith posted a comment on Jon’s blog saying she could imagine the organiser being used as a prompt during writing. While I can see the merits, I do think that the risk then – as Jon would rightly say – is that we replace the value of knowledge with the reliance on someone/something else to do the work for you. That’s not the aim here.

It leaves me wondering what the function of a Knowledge Organiser should be. I’m not persuaded of the value of knowing the date of leaving quarantine after the lunar landing. That said, learning the word ‘quarantine’ is something I think is highly valuable.

The question for me becomes one of later testing (and let me be honest, I’m only at the very beginning of this journey; don’t for a second presume that I’m an expert. I’m a way behind Jon on this!) In a knowledge-rich curriculum, I think one of the key functions of a Knowledge Organiser is to set out the key knowledge that I want students to retain and that I will test for.

We know of the great merit of spaced testing to aid learning, and it strikes me that a Knowledge Organiser should aim to set out that content which would likely later form part of such tests. In the context of Jon’s organiser, I could see merit in testing much of the vocabulary, the date of the landing, and perhaps the names of the crew. However, I’d also want to include some wider context – perhaps a bit more detail behind the Space Race, mention of JFK’s 1961 aim, etc. Might these replace some of the less significant dates of 1969?

Of course, we’re talking about 7-year-olds in Jon’s context. They will lack much of the wider historical knowledge to place events in context, and so there is a risk of expecting too much. But equally, if we train children that knowledge is to be learned, then ought we not be training them to learn it for the long term?

The content I think* I’d like to see on Knowledge Organisers is the detail that I would also expect to use in a brief pop quiz a week later, but also on a test mid-year drawing on prior units, and again at the end of the academic year, or in the first days of the following September. There is a risk that using Knowledge Organisers to aim for short-term recall of detail that is later lost will develop a cramming ethos, rather than one of long-term storage of information.

What does this mean for Jon’s example? I’m not sure. Maybe a separation of the content that he expects children to retain in the long term from information which would be useful in this context? There is certainly some merit in having this timeline clear in the child’s mind as they are writing – not least because it helps to build a narrative, which is a great learning technique – but is it necessary for it to be stored in long-term memory? Indeed, is a two-week unit even long enough for such a transfer to be made?

Yet there is unquestionably information here which would be re-used in future that would allow such a long-term retention.

More thinking to do… but well worth doing, I think.

*I say I think, because I am not entirely sure that I won’t think completely differently in six months’ time.

If you haven’t already, I again recommend reading Jon’s original post here.

You’re not still teaching that are you?

This has become something of a recurring refrain over my teaching career, and it always – always – frustrates me.

Nobody ever says it about Science: “Oh, you’re not still teaching solids, liquids and gases, are you?”. Or music: “Oh, you’re not still teaching standard notation, are you?” And yet for some reason it seems to abound in other areas – especially English. (Even maths seemed to go through a phase where the standard basics were frowned upon!) But such decisions are often distinctly personal.

The first time I read Holes by Louis Sachar, I couldn’t wait to get planning for it, and was desperate to start teaching it. Now, having taught it too many times for my own liking, I’m tired of it. I suspect that this will be my last year of tackling it because I’ve lost my love for it. But for my class this year, it was their first time approaching it. It was fresh for them. The only reason to abandon it is that my waning love for it risks coming through in the teaching.

But that won’t stop somebody somewhere from saying “Oh, but you’re not still teaching Holes, are you?”

It happens too often.

Tonight I’ve seen the same said of both The Highwayman and the animation The Piano. Now for sure they’ve both had more than their fair share of glory, but there was a reason why they were chosen in the first place. I’m all in favour of people moving away from them, finding better alternatives, mixing things up a bit. But they don’t cease to be excellent texts just because they’ve been done before. Every Year 5 child who comes to them does so for the first time.

I’ve heard the same said before of The Lighthouse Keeper’s Lunch at KS1 – as though somehow the fact that a topic has worked brilliantly in the past should be ignored simply because a consultant is over-familiar with it.

Of course, there are reasons to ditch texts. Sometimes they become outdated. Sometimes they cease to match the curriculum. Sometimes the ability of the children demands more stretch. Sometimes something much better comes along. Sometimes you’re just sick of them.

I’ve never cared for Street Child even though it’s wildly popular. I’ve always found Morpurgo’s work irritating. But if others find them thrilling, and get great results with their classes, then so be it. Who am I to prevent them teaching them?

As somebody also responded on Twitter this evening: the best “hook” is the teacher. If a teacher feels passionately about a poem, a book, or a topic, then it can be a great vehicle for the teaching that surrounds it. And if we make them all ditch those popular classics merely because they’re popular, then you’d better have a damned good replacement lined up to offer them!

Writing for a Purpose (or 4!)

For some time now I have been working on a model of teaching Writing built around the idea of longer blocks focusing on fewer things. I have written before about a model I used in my previous school, and since then have had many requests for more information.

This year I have finally produced some notes about the model I use, based on 4 Writing Purposes. My view is that rather than trying to teach children 10 or more different ‘genres’ or ‘text types’ as we used to do in the days of the Writing Test, it is better to focus on what those types have in common. It means that at my school we use 4 main types of writing across KS1 and KS2: Writing to entertain; to inform; to persuade; and to discuss.*


The 4 main writing purposes, and some of the ‘text types’ that could fall under each.

Importantly, by the end of KS2 I’d hope to see children recognise things like the fact that newspaper articles could actually fall under any or all of the 4 headings: they’re not a distinct type in themselves, really.

As a very rough rule, I’d expect around half of curriculum time to be taken up by “Writing to entertain”, with the remaining non-fiction elements sharing the remaining time. Notably in KS1 the non-fiction focus is only on Writing to inform.


Example guidance note

To support structuring the curriculum in this way, I have now compiled some guidance notes for each category. I say compiled, rather than written, because much of the legwork on these notes was done by my wife – @TemplarWilson – as she rolls out a similar model in her own school.

The guidance notes attempt to offer some indications of National Curriculum content that might be covered in each section. This includes some elements of whole-text ideas, suggestions for sentences and grammar, notes on punctuation to include, and also some examples of conjunctions and adverbials.

They’re not exhaustive, nothing radical, but as ever, if they’re of use to people, then I’m happy to share:
4 Writing Purposes – guidance (click to download)

Alongside the guidance sheets, I also have the large versions of the 4 main roadsign images, and an example text for each of the four purposes. The example texts are probably of more use at the upper end of KS2, and could almost certainly be improved, but they are a starting point for teaching and analysis by the children to draw out key features, etc. Both can be downloaded here:

4 Writing Purposes – Roadsign Images

4 Writing Purposes – Example Texts

*Secondary English teachers may recognise these as being loosely linked to the old writing triplets at GCSE level.

One-page markscheme for KS2 GPS test

Just a quick post to share a resource.

As I plough through marking the 49 questions of the KS2 sample Grammar test, I find all the flicking back and forth in the booklet a nuisance, so I’ve condensed the markscheme into a single-page document.

You’ll still want the markscheme to hand for those fiddly queries, but it means a quicker race through for the majority of easy-to-mark questions. For each question, where there are tickboxes I’ve just indicated which number box should be ticked; where words should be circled/underlined I’ve noted the relevant words. For grid questions, I’ve copied a miniature grid into the markscheme.

Feel free to share: One-page GPS markscheme

Of course, once you’ve marked the tests, please also share your data with me so we can start to build a picture of the national spread of results – see my previous blog.

KS2 Maths – Question Level Analysis

As so many schools have evidently used the sample tests to help ascertain their pupils’ progress towards the expected standard (whatever that might be), I’m sure many will welcome the opportunity to analyse the outcomes.

Emily Hobson (@miss_hobson) of Oasis Academies has kindly agreed to share the template she put together for analysing the KS2 tests.

The spreadsheet can be downloaded below; data can then be entered to scrutinise your pupils’ progress in the main areas, and for each question.

Question Level Analysis (Sample Material) – Mathematics

Names need only be entered onto the first page; these will then carry across to later pages.

You can also adjust the % thresholds on the first page, and these will be reflected in the colour bands marked for each pupil.
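For anyone curious about what’s going on under the hood, the banding logic described above – adjustable thresholds mapping each pupil’s percentage to a colour band – can be sketched like this. The threshold values and band names here are purely illustrative; they are not taken from the actual template.

```python
# A sketch of threshold-based colour banding, as a spreadsheet's conditional
# formatting might apply it. Cut-offs and band names are illustrative only.

def band(score_pct, thresholds=(80, 60, 40)):
    """Return a colour band for a pupil's percentage score.

    thresholds: (green, amber, red) cut-offs, highest first.
    Scores below the lowest cut-off fall into a final "grey" band.
    """
    green, amber, red = thresholds
    if score_pct >= green:
        return "green"
    if score_pct >= amber:
        return "amber"
    if score_pct >= red:
        return "red"
    return "grey"

print(band(85))                             # green with the default cut-offs
print(band(55, thresholds=(75, 50, 25)))    # amber once the cut-offs are lowered
```

Adjusting the thresholds simply shifts which band each pupil falls into, which is exactly the effect of editing the % thresholds on the spreadsheet’s first page.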


More Teacher Assessment confusion…

I’m never happy.

Months of moaning about the delays to the delivery of exemplification for Writing Teacher Assessment, and now it arrives I’m still not happy.

But then… it is a bloody mess!

The exemplification published today demonstrates what many of us feared about the new interim teacher assessment framework: expectations have rocketed. I appreciate (probably more than most) that direct comparisons are not ideal, but certainly having been told that the new expected standard would be broadly in line with an old Level 4b, I know I feel cheated.

The discussions in this household about the “expected standard” exemplification were not about whether or not the work was in line with a 4b, but whether or not it would have achieved a Level 5. That represents, of course, an additional 2 years of learning under the old system. We’re expecting 11-year-olds to write like 13-year-olds.

In fact, the only time where 4b ever came into the conversation was in our browse through the new “Working towards” exemplification. It seems that a child who used to meet the expected standard in 2015 would now be lucky even to reach ‘working towards’.

What this will mean for national data this year, who knows? If schools are honest, and moderation robust, could we see a new “expected standard” proportion somewhere in the mid-30% range, like we used to with Level 5s?

Among all this, though, is another confusing element. For while in the old exemplification materials for levels in years gone by we were told that “All writing is independent and is in first draft form” (my emphasis), it seems that now this message is not so clear. Informal feedback from the meetings held at STA on Thursday and Friday last week seemed to bring up some surprises about what constituted independent writing, including the scope for using dictionaries, following success criteria, and even responding to teacher feedback.

So now we have what looks like horrendously difficult expectations for a majority of pupils who have had barely two years of a new National Curriculum instead of six, and a lack of clarity, once again, about what is actually expected.

Is it really too much to ask?


For those who haven’t yet had the pleasure, the KS1 and KS2 Writing exemplification documents are available here:




For whom do we toil?

When I was a young(er) teacher, I learnt a few tricks of observations.

As an NQT, I knew that my mentor wanted to see a calm environment; she saw it as an indicator of the all-important behaviour management. And so I obliged.

In later years I had a Head of Year for whom I always included something that engendered good engagement – all the better if it was on coloured paper. Another subject leader rated talk partners, and so they always appeared in lessons in which I was observed.

When marking became the thing, I’d always ensure that I grouped children in my observed lessons according to the work I’d marked the night before. Rarely did I do it at any other time, but it ticked the box.

And then it was progress in the lesson. So every observed lesson, I ensured that I asked children to do something at the start of the lesson (often giving them rather too little time or guidance), before teaching them some new skill and asking them to try the task again, with evident improvement clear for the observer to see.

A cynic might suggest that these things didn’t help children make progress, but rather that they created the illusion of progress for the observer.

And now it’s progress over time. But I’m a cynic.

The latest craze seems to be for hot and cold tasks and the like. Now I’m sure there are many arguments for this approach in some cases, but it seems that the main reason put forward is for its ability to “demonstrate progress over time”.

It’s the drawn-out version of my “progress in a lesson” trick, to show progress over a period of days or weeks. It offers the evidence on a plate to our external judges; it stops them from ‘catching us out’ on that tickbox in the Ofsted framework.

But frankly, if an inspector can’t see progress over time by looking in books, then either there is something very wrong with the books… or with the inspector!

Progress over time is when children go from using simple multiplication facts to being able to use the standard written method.

Progress over time is when children who use repetitive sentence structures in September are showing more variety by January.

Progress over time is a well-planned curriculum that builds on prior learning and extends pupils’ experiences.

We shouldn’t be finding ways of making progress over time evident; we need to be finding ways to make progress over time happen. The evidence will come. And if that means dragging the inspector to see it, then so be it.

Spot the Difference

The two following extracts are taken from entirely different documents. Before I start ranting, take a look at them, and try to discern which demands the greater challenge:

The two sets of statements, side by side

Now, chances are that you recognised at least one set of statements, but putting that knowledge aside, how clear is it which is the more demanding?

For example, if I were to point out that one is the expected standard for KS2 (i.e. a list of things that a child must be able to do at age 11 to reach the expected standard – for which 85% of children are meant to be aiming) and the other is the writing descriptors for a mid- to high-range GCSE grade (i.e. an outline of the expectations of a student somewhere around the expected standard at age 16), would it be clear which was which?

I’d imagine that some of the expectations give it away: handwriting is unlikely to be mentioned in the GCSE criteria, and for some reason choosing appropriate content doesn’t seem to matter at age 11. But is the demand really different enough to reflect a further 5 years of education?

If a child is able to make some use of semi-colons, dashes, colons and hyphens at age 11, is it really any different to be using a range of punctuation at age 16?

Is spelling irregular words correctly any different from generally accurate spelling (bearing in mind that the list of words for KS2 includes accommodate, embarrass, mischievous and yacht)?

And how is it that accurate sentence demarcation falls in the upper range of GCSE performance, but is only “working towards the expected standard” for 11-year-olds?

Now, you might argue that the GCSE criteria are too simple. I might even agree with you. But overall – given the higher level of demand of the task and other things – it seems like it might be a reasonable aim for a majority of 16-year-olds, given that 65% are expected to meet the new Grade 4 standard initially. But is the list of expectations for 11-year-olds really realistic?

So much for 4b-equivalence

When the new National Curriculum assessments were initially explained, we were told that the expectation for 11-year-olds would rise to be in line with what had previously been a “good level 4”, or a level 4b. This list for KS2 bears no resemblance to such a list.

As we’ve changed from a best-fit measure to a non-negotiable one, it strikes me that the most straightforward way of drawing a comparison would be to look at the old Level 4 writing criteria. In the past it would be enough to secure ‘most’ of these elements to reach a Level 4, so presumably to be a ‘good Level 4’ you would expect to see all of them. But what sort of a list would that leave us with?

I’d suggest something like this:


And it’s notable that of those children who were securing this ‘good level’, some 72% of them were going on to get 5 good GCSEs including English and Maths. That seems like a pretty good figure to me, if the current aim is for around 65%. So why the massive ramping up of demand?

An effort to be seen to be raising the bar?

A scheme to force more primary schools to convert to academy status?

Or just sheer incompetence?

For those interested, the GCSE criteria are taken from the AQA Specimen Markscheme for its new-style GCSE, and can be found at

I should thank @sputniksteve for bringing the document to my attention.

Why we’ve got planning and marking all wrong (part 2)

On Thursday I published a post that largely focussed on why I think we are expending too much effort on written marking. Today I want to pick up on why one of the worst costs of that excessive use of time is the lack of time left to devote to planning.

Many people responded to both my recent polls stating that they consider marking and planning to be synonymous, or intertwined, or in some way part of the same thing. I argued previously that actually I think it is the looking at work that has the greatest impact on future teaching, not the written comments that get added to it.

It seems, though, that the “informing future planning” argument has become well-used to justify the massive volume of marking. Unfortunately, as with so much else about marking, the credit it is given outweighs its actual value, in my opinion. For while undoubtedly there is power in good formative marking, I’d argue there is much more in good planning. And interestingly, it seems that a majority of people instinctively agree with me. Another simplistic poll suggested that a significant majority of teachers feel that planning has a greater impact on pupils’ progress than marking:


Which rather raises the question: why did the poll on where the most time is spent show it to be the other way around? Why are 2/3 of us spending more time on marking, when most of us feel that planning would be more beneficial?

What’s more, I think that most people are basing that on the relatively narrow idea of planning that we currently use in teaching. I suspect that many of those who thought marking more valuable are in schools where the burdens of recording planning detract from its benefits. It’s still common to hear of schools where detailed daily and weekly plans must be submitted in advance, or where every lesson must be planned using a given pro forma with endless boxes.

But it’s not this that I mean by planning. Too often we still think in short term lumps when it comes to planning – even to the point of separating out learning into separate single-hour lessons. Bodil Isaksen has written well about this in the past in her blog: A Lesson is the Wrong Unit of Time.

I think the historical focus on progress is partly to blame. When Ofsted were looking to see progress within a 20-minute window, of course it was necessary to have at least one new objective every lesson. But in reality we know that learning doesn’t work like that. One lesson on subordinate clauses will not make high quality complex sentences abound in children’s writing. There is a long progression of understanding to pass through to reach that point.

The problem is, I don’t know what it is. I’ve got some thoughts, but I haven’t given enough time over to thinking it through clearly enough. I haven’t spent the time planning what the curriculum should look like if my goal is to ensure children can use complex sentences well. There was always too much else to do.


My mantra

My drive for more planning time is not about more filling in of pro formas. Quite the opposite, it’s about the thinking time to develop meaningful sequences of learning. It’s about setting a small number of key learning goals to be achieved over a period, and then developing the sequence of learning experiences that will guide students towards that aim. It’s about doing less, but better.

And inevitably that means that in some one-hour lessons, children won’t evidently be any closer to achieving that outcome than when they began.

But as I’ve advocated in the past, by spending longer periods of time building up a narrower range of objectives, we can develop meaningful sequences of learning that provide opportunities for practice, for application, for making links, and for exploring in greater depth.

The current reality is very different. Particularly in primary schools, long-term planning (if it exists at all) tends to consist of the ‘sharing out’ of topics, with medium-term planning often focussing on links between subjects and contexts for work. Very rarely do I see a medium-term plan which clearly sets out the handful of things that children will be expected to really understand by the end of the unit.

Probably partly because of all the marking.

If you’re marking 30 books in school every day, and taking another 30 home, and saving the topic books for the weekend, when do you have any serious time to sit down and think about – or better still, talk about – the direction of the curriculum? Is it any wonder that we get trapped in the short-term cycle of planning lessons for the next few days? And given the detail in which we often plan in the short term, is it any wonder that our longer-term plans are inevitably brief?

Now of course, there will be arguments that planning needs to be done immediately before teaching so that you can respond to assessment in the prior lesson. Again, I think that’s a nonsense. The only reason planning needs significant adaptation is if it is too detailed. If you plan every lesson down to the last minute (as once we might have been expected to do), then of course, any slight twist in the lesson will mean re-writing the plan. But if, rather, we have thought about the long-term goals, and planned a likely sequence of reaching them, then the minor variations along the way are incorporated into that “responsive teaching” that I mentioned in the last post.

And just think – if we significantly reduced the volume of written marking, and the detail of short-term planning, how much time would we free up to really explore the very best ways of teaching new content and skills over time; to assess children’s understanding more fully; and to respond to that feedback to adapt our teaching to ensure the best possible progress, in line with our medium- and longer-term aims?

In fact, if there’s one thing that unites the problems of planning and feedback, it seems to be that we spend too much time on recording those things, and not enough actually doing them.