How many schools have spent the last couple of years telling the DfE that designing a new assessment system from scratch isn’t as easy as it looks? Now it seems we’ve reached the time where we can say:
We told you so!
For in all the rush and difficulty of trying to put together a teacher assessment approach, one can’t help but wonder whether the DfE have forgotten altogether what the point of it was! When I look at the new exemplification of the interim assessment framework (there’s that wretched interim word again), it doesn’t strike me that the current incarnation is any good for anything.
After all, what purpose might summative assessment judgements serve?
To inform parents?
Just hopeless. While the objective-level information might be of some interest, schools are much better equipped to provide information about pupils’ progress and attainment based on their on-going assessments. The simplistic category decision based on a fairly arbitrary list of criteria is not much use to parents at all.
Frankly, what’s on my mind at the moment is how we share with parents the real information that sits behind the useless labels they’ll get from the proposed arrangements.
To inform secondary schools?
When a change to teacher assessment was first on the cards, I suggested the idea of a simple list of criteria about which a teacher could make binary decisions. My thinking, though, was to limit those decisions to perhaps a maximum of ten key areas that would be useful for secondary teachers to know. Instead of a simple Below/At/Above judgement (or the rather more complex labels of the current monstrosity), such a system could offer a simple list of basic Level 4-type criteria that would inform future teachers. Imagine if for each child starting in Y7, a teacher could see a profile which read something like:
Adapt writing to various purposes/audiences ✓
Use paragraphs to organise ideas in writing ✓
Use appropriate sentence demarcation ✓
Surely this would be more useful information for transition – and would need no laborious evidence-collection to accompany it. Much better, at least, than receiving children graded as “Working towards the expected standard” which tells us nothing.
To differentiate between pupils?
This is where the level of expectation is all wrong. There seems to be almost universal agreement that the ‘expected standard’ descriptor looks more like an old Level 5 writer than the Level 4b we were promised. Bear in mind that just 36% of pupils achieved Level 5 or higher last year. That suggests that at least half of all pupils will be lumped into the “Working towards” standard, with many of the rest falling below even that. Hardly a good differentiator. It seems all the odder when we consider that the top band (Working at Greater Depth), which seems to align broadly with an old Level 5a, would likely cover only a tiny percentage of pupils.
To hold schools to account?
How on earth can schools be held to account when 50% of pupils are likely to be lumped into the same group? What hope is there for the progress measure, comparing pupils with similar starting points, if everyone has similar ending points? How can this possibly be of any use?
Except, of course, this is exactly what is proposed. It may well be the case that half of pupils end up in the same band – with many pupils working at Level 2 in KS2 falling into that group. Do they all get the same progress score? Will a school’s progress measure come down to the luck of quite how close your children were to borderlines in Year 2?
It’s a farce. And the whole system seems to have lost its way.
The department needs to think again – and fast!
NAHT members should note that the organisation has given the DfE a week to address its significant concerns about workload and expectation, before proposing action in some form. Members are invited to pledge their support for action via the NAHT website.