Monthly Archives: August 2014

Key Objectives for KS1/2

The Key Objectives for Reading, Writing, Maths and Science for KS1/2 can now be found in the Free Resources section.

Curriculum Cock-Ups?

Teachers, school leaders and experts across the land have been only too keen to point out that the latest changes to the National Curriculum seem rushed. Only this week we saw the ATL survey showing that eight in ten teachers don’t feel they’ve had enough time to prepare for it.

What is becoming increasingly evident is just what an impact such a rush can have. I present just three examples of what I consider to be cock-ups that should have been ironed out before the curriculum documentation reached its final stage.

Exhibit A

The first is my particular favourite. At first glance, the spelling requirements for Y5/6 English don’t seem too ridiculous, until you look more closely at the fifth and sixth bullet points:

[Image: the fifth and sixth bullet points of the Y5/6 spelling requirements, both concerning the use of dictionaries]

I have yet to find anyone who can rationally explain to me the difference between these two requirements. Clearly, the wording is different, but is it only me who feels that this is just an error, where someone forgot to delete one example?


Exhibit B

Now, I’m no history expert, so I’ve been trying to pull together some history ‘Cheat Sheets’ to help both me and others with the new strands of the curriculum. The area presenting the most challenge has been the new Non-European Study section. By far the most popular option has been the study of the Mayan Civilization, but some schools – particularly those with many West African students, or with some expertise in the area – would reasonably opt for Benin. A quick glance at the Wikipedia article on the kingdom suggests that it was at its most significant from 1440, yet for some reason the National Curriculum proposes that we study an alternative period:

[Image: the National Curriculum’s Benin study requirement, specifying the period AD 900-1300]

This seems all the more strange when you consider that, as the Historical Association puts it, “Benin didn’t really exist in 900 AD”, and that the most famous historical artefacts from Benin – the Benin Bronzes – date from the twelfth century onwards, with the most significant falling in the fifteenth century, well outside the proposed period.

So what was the rationale behind the very precisely defined 900-1300 period? Just another cock-up?


Exhibit C

Given that the new curriculum is meant to be “the best which has been thought and said”, you’d think that plenty of experts would have been involved in the development of the programmes of study – and given the great interest in the content of the History curriculum, perhaps nowhere more so than in this subject. Yet it was here that I found a further error. Again, to the untrained eye, nothing seems amiss with the third British History unit:

[Image: the third British History unit, including a bullet point on the Scots invasions]

However, even the most cursory research into that second bullet point – about the Scots invasions – suggests that perhaps too few experts were involved. For some time now, there has been considerable doubt about the idea that the Scots ever “invaded” at all. In fact, it seems more likely that the Gaels on the Scottish west coast were part of the same group as those in Northern Ireland; there is no archaeological evidence for an invasion. It is, at best, a contested issue.

Interestingly, when I raised the point with the Scottish historian Mark Jardine, he described it as a classic “myth history in chronicles vs. history” debate, and “way, way too complex” for lower KS2!


Of course, it could be argued that these mistakes are not errors at all, just… unusual choices. But given the very short time allowed for drafting, redrafting and publishing the curriculum, is it any surprise that errors slipped through? Doubtless there are more in the Secondary subjects, which I haven’t even begun to look at.

Can they seriously argue that this wasn’t rushed?

The rocky road of progress

I’m a big fan of analogies, but they can be dangerous things.

The latest big thing in educational assessment is Ladders. As schools struggle to work out how to move forward with assessment in the world after levels, many have adopted ‘ladder’ schemes, including two of the three primary schemes in the Assessment Innovation Fund.

This post is not intended to criticise those schemes – far from it. However, the ladder analogy risks implying a straightforward progression: it is too easy to look at a linear list of objectives and presume that each follows on from the last.

I can say with certainty that the team behind the Hiltingbury Learning Ladders scheme have been quite clear about this whenever I’ve heard them speak: the linear presentation shouldn’t imply linear progression through the objectives. Unfortunately, this message won’t necessarily get through to all parties – particularly those who are not teachers themselves.

[Image: a climbing wall, by Romary, licensed under the Creative Commons Attribution-Share Alike 3.0 Unported licence]

Although ladders make for a nice presentation, I had a conversation this week about an alternative analogy: the climbing wall.

There are several significant differences between ladders and walls, not least that with a climbing wall, each step may require a different capability or skill.

Significantly, different people reaching the same vertical height have not necessarily mastered comparable skills (cf. National Curriculum levels).

There may be one common route up a wall, but that doesn’t prevent people from taking different paths to the same goal – and significantly, achieving the same overall goal (i.e. the top of the wall) can be more difficult if some paths are chosen over others.

In academic terms, my reference of choice is always the knowledge of multiplication tables. In the past, knowledge of tables was a key part of Level 4 number criteria, and yet many students happily reached significantly higher levels on tests without securing this knowledge. It simply isn’t possible to say that the knowledge of tables has a fixed place in a sequence of teaching and learning.

It’s important that teachers understand this, but it’s equally important that parents, data managers and, dare I say it, even Ofsted inspectors understand it too. Learning Ladders are great… so long as we recognise that they’re just an analogy, with weaknesses.

Whose data is it anyway?

I caused a bit of an upset today. As too easily happens, I saw a conversation via Twitter that raised concerns with me, and I rushed in with 140 characters of ill-thought-through response.

Some very knowledgeable experts in the field of school data management were trying – quite understandably – to get their heads round how a life after levels will look in terms of managing data and tracking in schools. As David Pott (@NoMoreLevels) put it: “trying to translate complex ideas into useable systems”.

My concern is that in too many cases, data experts are being forced to try to find their own way through all this, without the expert guidance of school leaders (or perhaps more importantly, system leaders) to highlight the pitfalls, and guide the direction of future developments. That’s not to say that the experts are working blind, but rather that they are being forced to try to work out a whole system of which they are only a part.

Of course, the problem is that without first-hand knowledge of some of those areas, the data experts are forced to rely on their knowledge of what went before. And as seems to be the case in so many situations at the moment, we run the risk of creating a system that simply mirrors the old one, flaws and all. We need to step back and look at the systems we actually need to help our schools to work better in the future. And as with all good design projects, it pays to consider the needs of the end user. Inevitably, with school data, there are always too many users!

Therefore, here is my attempt – very much from a primary perspective, although I daresay there are many parallels in secondary – to consider who the users of data and tracking are, and what their needs might be in our brave new world.

The Classroom Teacher

This is the person who should be at the centre of all discussions about data collection. If it doesn’t end up linking back to action in the classroom, then it is merely graph-plotters plotting graphs.

In the past, the sub-level has been the lot of the classroom teacher. Those meaningless subdivisions which tell us virtually nothing about the progress of students, but everything about the way in which data has come to drive the system.

As a classroom teacher, I need to know two things: which children in my class can do ‘X’, and which cannot. Everything else I deal with is about teaching and learning – curriculum, lesson planning, marking and feedback, everything. My involvement in the data system should be about assessment, not tracking. I have spoken many times about this: Tracking ≠ Assessment.

Of course, at key points my assessment should feed into the tracking system – otherwise we will find ourselves creating more work. But whether that happens termly, half-termly or every fortnight, the collection of data for tracking should be based on my existing assessment records, not made in addition to them.

We have been fed a myth that teachers need to “know their data” to help their students make progress. This is, of course, nonsense. Knowing your data is meaningless if you don’t know the assessments that underpin it. Knowing that James is a 4b tells you nothing about what he needs to do to reach a 4a. A teacher needs to know their assessments: whether or not James knows his tables, or can carry out column subtraction, or understands how to use speech marks. None of this is encapsulated in the data; it is obscured by it.

My proposal is that classroom teachers use a Key Objectives model for assessing against specific objectives. Pleasingly, the NAHT appear to agree with me.

Students

Children do not need to know where they are on a relative scale compared to their peers, or to other schools nationally. What matters to children in classrooms is that they know what they can do, what they need to do next, and how to do that. All of that comes directly from teachers’ assessments, and should have no bearing on data and tracking (or perhaps, more importantly, the methods of tracking should have no bearing on a child’s understanding of their own attainment).

Too many schools have taken the message about students knowing where they are and what to do next as an indication that they should be told their sub-level. This doesn’t tell children anything about where they are, and much less about what to do next.

The School Leader

Whether for a head of department, a year-group leader or a senior leader, it is very rarely feasible for any one person to have a handle on the assessment outcomes of individual students; that is not their role.

This is the level at which regular tracking becomes important. It makes sense for a tracking system to highlight the numbers of children in any class group who are on-track – however that might be measured. It might also highlight those who are below expectations, those who are above, or those who have made slower progress. It should be possible, again, for all of this to come from the original assessments made by teachers in collated form.

For example, if using the Key Objectives approach, collation might indicate that in one class after half a term, 85% of students have achieved at least 20% of the key objectives, while a further 10% have achieved only 15% of the objectives, and some 5% are showing as achieving less than that. This would highlight the groups of children who are falling behind. It might be appropriate to “label” groups who are meeting, exceeding or falling below the expected level, but this is not a publication matter; it is for school tracking.

There is nothing uncovered here that a classroom teacher doesn’t already know from his/her assessments, and nothing demonstrated here that impacts on teaching and learning in classrooms. It may, however, highlight system concerns – for example, where one class is underperforming, or where sub-groups such as those receiving the pupil premium are underperforming. Once these are identified, the focus should move back to the assessment.
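To make that collation concrete, here is a minimal sketch – in Python purely for brevity, though a spreadsheet formula would do the same job – of how teachers’ existing assessment records might be rolled up into tracking bands. The pupil names, objective counts and band thresholds are all illustrative assumptions, not a prescribed system.

```python
from collections import Counter

# Illustrative only: each pupil's count of Key Objectives achieved so far,
# taken from the teacher's existing assessment records (names hypothetical).
TOTAL_OBJECTIVES = 20
achieved = {"James": 5, "Amina": 4, "Tom": 3, "Priya": 2, "Sam": 1}

def band(count: int) -> str:
    """Assign a tracking band from the proportion of objectives achieved.

    The 20%/15% thresholds mirror the worked example above; in practice
    a school would set its own.
    """
    proportion = count / TOTAL_OBJECTIVES
    if proportion >= 0.20:
        return "on track"
    if proportion >= 0.15:
        return "below expectations"
    return "falling behind"

# Collate the class: how many pupils fall into each band?
summary = Counter(band(n) for n in achieved.values())
for label in ("on track", "below expectations", "falling behind"):
    share = summary[label] / len(achieved)
    print(f"{label}: {summary[label]} pupils ({share:.0%})")
```

The point, of course, is that nothing new is collected: the summary is derived entirely from the assessments the teacher already holds.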

In the past, the temptation was to highlight the percentage of children achieving, say, L4, and then to set a target to increase that percentage, without any consideration of why those children were not yet achieving the level. All such targets and statements must come back to the assessment and the classroom teacher.

Of course, senior leaders will also want to know the number of children who are “on-track” to meet end-of-key-stage expectations. Again, it should be possible to collate this based on the assessment processes undertaken in the classroom.

What is *not* required is a new levelling system. There is no advantage in new labels to replace the old levels. There is no need for a “3b” or a “3.5” or any other indicator to show that a student is working at the expected level for Year 3. Nobody needs this information; we have seen how meaningless such subdivisions become.

Of course, the devil is in the detail. What percentage of objectives would need to be met to consider a child to be “on track” or working at “age-related expectations”? Those are professional questions, and it is for that reason that it is all the more important that school and system leaders are driving these discussions, rather than waiting for data experts to provide ready-made solutions.

Ofsted

Frankly, we shouldn’t really need to consider Ofsted as a user of data, but the reality is that we currently still do. That said, their needs should be no different from those of school leaders. They will already have the headline data for end-of-key-stage assessments. All they should need to know from internal tracking and assessment is:

  1. Is the school appropriately assessing progress to further guide teaching and learning?
  2. Is the school appropriately tracking progress to identify students who need further support or challenge?

The details of the systems should be of no concern to Ofsted, so long as schools can satisfy those two needs. There should be no requirement to produce the data in any set form or at any specific frequency. The demands of the past – that schools produce half-termly (or more frequent!) tracking spreadsheets of levels – cannot be allowed to return under the new post-levels systems.

Parents

Parents were clearly always the lost party in the old system, and whether or not you agree with the DfE’s assessment that parents found levels confusing, the reality is that the old system was obscure at best. It told parents only roughly where their child stood in comparison to other students. It gave no indication of the skills their child had, or of the gaps in their learning.

For the most part, the information a parent needs about their child’s learning is much the same as that which their child needs: the knowledge of what they can and can’t do, and what their next steps are. Of course, parents may be interested in their child’s attainment relative to his/her age, and that ought to be evident from the assessment. Equally, they may like to see how their child has progressed, and again, assessment against key objectives demonstrates that amply.

So where next?

We are fortunate in English schools to be supported by so many data experts with experience of the school system. However, they should not – indeed, they must not – be left to sort out this sorry mess alone. School leaders and system leaders need to take the lead. Schools and their leaders must take control of the professional discussions about what we measure when we’re assessing, and about what we consider to be appropriate attainment based on those assessments. Only then can the data experts who support our schools really create the systems we need to deliver on those intentions.

NAHT Assessment Framework – A Review

Last month I reviewed the three main assessment schemes offered for primary schools as part of the DfE’s Assessment Innovation Fund programme. Later in July, with no fanfare whatsoever as far as I can tell, the NAHT released its own suggested framework for primary schools. This followed the recommendation of its Assessment Commission that the NAHT should “develop and promote a set of model assessment criteria”. Consequently, I thought it only appropriate to offer the same treatment.

I was pleased to see the NAHT report back in February, and have been equally pleased to see the general outcome of their expert group’s suggestions for a new assessment framework – not least because it very closely mirrors my own!

The panel appears to have acted upon the very good advice of the Expert Panel set up to review the National Curriculum, by setting out an assessment framework which clearly and directly links assessment outcomes to specific learning criteria. The approach is the same as the one I used to derive the Key Objectives: the group has reviewed the objectives for each year group and stage, and highlighted those which it considers to be Key Performance Indicators (KPIs). Alongside this, for each year group they have written a short performance statement to support a broader understanding of expectations, and to help in sharing both expectations and progress with parents.

It is notable that the NAHT framework appears to record slightly fewer objectives from the curriculum as Key Performance Indicators than I selected for my Key Objectives. This is particularly interesting because, when I was constructing the Key Objective lists, I was aware that some colleagues were concerned that leaving out the smaller objectives lost an important element. Arguably the NAHT team might say that I didn’t leave out enough! By way of example, I chose 15 Key Objectives for Y1 Maths, where the NAHT framework selects 11 – although some are combined or separated slightly differently from my own list. By Y6 the gap widens slightly (30 vs. around 20 objectives), although again the organisation is slightly altered. I feel that my model offers slightly more clarity… but then I would say that, wouldn’t I?

The materials accompanying the framework quite rightly suggest that schools should adopt the methodology of the framework rather than simply transplanting the KPIs without evaluating them. I think that’s important, because if there is one shortfall I would identify in the KPIs chosen by the NAHT team, it is that they have underestimated the importance of some of the fractions statements in the curriculum documentation: schools will need to ensure that their students are confident in the use of fractions in line with the objectives of the National Curriculum, yet very few of these appear to have been selected as Key Performance Indicators. Similarly, in Y5/6 Reading I felt that the very important objective of “identifying how language, structure and presentation contribute to meaning” was not selected. It’s clearly important that schools consider selecting their own criteria based on the model.

In addition to the Key Performance Indicators, the NAHT has also suggested that the information could be recorded using spreadsheets, both to track progress and to provide summarised data. They provide simple screenshots of such an approach in the accompanying video, but no templates or similar. However, this is entirely in line with the spreadsheets initially created by Tim Clarke to track progress against the Key Objectives I set out. If schools choose to adopt the NAHT model, they could adapt the spreadsheets from this site to record and track progress.
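By way of illustration, here is a minimal sketch of the kind of grid such a spreadsheet might hold – pupils as rows, objectives as columns – together with the two summaries described above: each pupil’s progress, and whole-class coverage of each objective. Again it is written in Python only for compactness; the objective codes and pupil names are my own hypothetical stand-ins, not taken from the NAHT materials.

```python
# A hypothetical tracking grid: True marks a Key Performance Indicator
# as achieved. Objective codes (M1-M4) and names are illustrative only.
grid = {
    "James": {"M1": True,  "M2": True,  "M3": False, "M4": False},
    "Amina": {"M1": True,  "M2": False, "M3": False, "M4": False},
    "Tom":   {"M1": False, "M2": False, "M3": False, "M4": False},
}
objectives = ["M1", "M2", "M3", "M4"]

# Row summary: each pupil's progress through the objectives.
for pupil, marks in grid.items():
    achieved = sum(marks[o] for o in objectives)
    print(f"{pupil}: {achieved}/{len(objectives)} objectives achieved")

# Column summary: how much of the class has secured each objective.
for o in objectives:
    coverage = sum(grid[p][o] for p in grid) / len(grid)
    print(f"{o}: {coverage:.0%} of class")
```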

Overall, I can’t help but be positive about the NAHT’s work – but then, it fits so closely with the model I have been suggesting, and meets the criteria set out by the National Curriculum Expert Panel, that it could hardly go far wrong.


For School Leaders who are members of the NAHT, the link to download the assessment framework materials is: www.naht.org.uk/welcome/news-and-media/key-topics/assessment/naht-assessment-framework-materials/

For schools wishing to consider using my Key Objectives model, the resources can all be downloaded from this link. For each subject there is a booklet listing the Key Objectives (broadly similar to the NAHT’s Key Performance Indicators) and a spreadsheet for tracking progress against them (which the NAHT seems only to suggest; as far as I can tell, they haven’t actually shared one).

Assessment & Tracking Resource Pack