Monthly Archives: February 2016

Year 6 Sample Tests – early data

 


I haven’t got room for all the caveats that this needs, so let me cover it all by saying that this could all be nonsense. However, since when did ‘being nonsense’ ever stop something happening in education?

A couple of weeks ago I put out a request for people to share their test data. Today, I have the raw scores of around 7000 Year 6 pupils on the sample tests, and can begin to draw some very, very dubious conclusions. (See below if you want to add your pupils to the sample!)

So, what can we see so far?

Updated 26th March 2016

Reading

Looking at the data from tests taken this term alone (as earlier tests are obviously likely to show lower attainment), the following can be said of a sample of 6426 pupils’ test scores (there’s a quick sketch of the calculation after the list):

  • The mean average score in the sample was 29 marks
  • The median score in the sample was 29 marks
  • The middle 50% of students scored between 23 and 36 marks
  • If the pass mark were set at the 50% mark (i.e. 25/50), then 70% of pupils would have reached the expected standard
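
If you want to run the same summary over your own class’s raw scores, here’s a minimal sketch of the calculation in Python (the scores listed are placeholders, not the collected data):

    # Summary statistics as reported above: mean, median, interquartile
    # range, and the proportion reaching a notional pass mark of half
    # the available marks.
    from statistics import mean, median, quantiles

    scores = [29, 23, 36, 41, 18, 30, 25, 33]   # placeholder raw scores out of 50
    pass_mark = 25                               # i.e. 50% of the 50 marks available

    q1, _, q3 = quantiles(scores, n=4)           # the middle 50% lies between q1 and q3
    pct_expected = 100 * sum(s >= pass_mark for s in scores) / len(scores)

    print(f"mean = {mean(scores):.1f}, median = {median(scores)}")
    print(f"middle 50% of scores between {q1} and {q3}")
    print(f"{pct_expected:.0f}% reached the notional pass mark")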

Obviously there is still some time until test week, and so children’s scores will unquestionably improve.

The graph shows the spread of results in the sample:

[Graph: distribution of Reading scores]

Grammar, Punctuation & Spelling

Looking at the data from tests taken this term alone, the following can be said of a sample of 5489 test scores:

  • The mean average score in the sample was 35 marks
  • The median score in the sample was 35 marks
  • The middle 50% of students scored between 26 and 44 marks
  • If the pass mark were set at the 50% mark (i.e. 35/70), then 53% of pupils would have reached the expected standard

Obviously there is still some time until test week, and so children’s scores will unquestionably improve.

The graph shows the spread of results in the sample:

[Graph: distribution of Grammar, Punctuation & Spelling scores]

Mathematics

Looking at the data from tests taken this term alone, the following can be said of a sample of 6926 test scores:

  • The mean average score in the sample was 59 marks
  • The median score in the sample was 59 marks
  • The middle 50% of students scored between 39 and 79 marks
  • If the pass mark were set at the 50% mark (i.e. 55/110), then 55% of pupils would have reached the expected standard

Obviously there is still some time until test week, and so children’s scores will unquestionably improve.

The graph shows the spread of results in the sample:

[Graph: distribution of Mathematics scores]

More Caveats

I will have made mistakes here. Some people may have made mistakes when entering data. Someone may have made up their data completely. Some of the data will be from the first week of January while some will be from this week. Some teachers will have marked more generously than the test markers might; others will have been more strict. The schools who submitted data were entirely self-selecting. The tests are still weeks away, and booster groups will be going on up and down the country. And frankly, there is no predicting what the DfE might have in mind.

Nevertheless… if teachers can take some comfort from the fact that their data is not horrendously different from everyone else’s, then our work here has been worthwhile.

More data please

We’ll never have anything like a random sample, or even a well-chosen one, but the more data the better, surely. Please do continue to contribute by sharing your data. See my previous post for more details. (And yes… I’m collecting data for a KS1 version too if you’re interested.)

KS1/2 Sample tests – data collection

All over the country, Year 2 and Year 6 pupils have been sitting the sample tests that were published last summer… and then finding very little out as a result about how they’re progressing towards the expected standard.

Of course, there are benefits in practising for the style of the test and identifying gaps in knowledge, but teachers are desperate for an indication of how their pupils compare nationally. In an effort to help, I am hoping to collect data from as many schools as possible so that we can draw some comparisons.

I’m asking Year 2/Year 6 teachers who have used the sample tests to share the raw score data for their children with me. No names, no links to schools… just a set of raw scores for each of the three tests.

If enough schools contribute, then hopefully we can start to build a picture that will give us some idea of how children are faring. We’ll quickly be able to see an approximate mean score for the group, and we’ll start to see what percentage of children are scoring more than half marks, or what proportion are reaching a 65% threshold, say.

It won’t provide all the answers, but it might just help us feel less in the dark!

If you’ve completed the tests with your pupils, please do share your results. You can enter them directly onto the Google Spreadsheets at

Key Stage 1:
https://docs.google.com/spreadsheets/d/1CkE17ixOGIWwSaexl-m9iJJVzGUrqCR0pFngoVwfygk/edit?usp=sharing

Key Stage 2:
https://docs.google.com/spreadsheets/d/1OupodGAMPVCqHRXwi4YkzXX1usvGH2YSCp-dgkavwAc/edit?usp=sharing

Or you can email them to me at info@primarycurriculum.me.uk and I’ll happily collate.

I’m hoping that in a week or so I might be able to start picking out some key messages which I’ll share here and via Twitter.

The Problems with Teacher Assessment

The following statements all came from the 2015 report from the Commission on Assessment without Levels. They all appeared under the heading “The problems with levels”. Are there any of these issues that don’t also apply in exactly the same way to the new interim teacher assessment frameworks?

  • [They] required aggregating a wide variety of data into a single number
  • Too often levels became viewed as thresholds
  • Teaching became focused on getting pupils across the next threshold instead of ensuring they were secure in the knowledge and understanding
  • In reality, the difference between pupils on either side of a boundary might have been very slight, while the difference between pupils within the same level might have been very different
  • Although levels were intended to define common standards of attainment, the level descriptors were open to interpretation
  • Different teachers could make different judgements
  • The information secondary schools received from primary schools was sometimes felt to be unreliable or unhelpful.
  • Teachers planned lessons which would allow pupils to learn or demonstrate the requirements for specific levels
  • The drive for progress across levels also led teachers to focus their attention disproportionately on pupils just below level boundaries
  • Pupils compared themselves to others and often labelled themselves according to the level they were at
  • The disconnect between levels and the content of the national curriculum also meant that telling a parent his or her child was level 4b did not provide meaningful information about what that child knew and understood or needed to know to progress.
  • The expectation to collect data in efforts to track pupils’ progress towards target levels considerably increased teachers’ workload.

It’s also worth noting that when Mr Gove first made the announcement, the webpage that explained it said:

  • We believe this system is complicated and difficult to understand, especially for parents.

Just saying.

10 questions for the DfE / STA

Sadly, I’ll be busy at 2pm on Friday (funnily enough), so I won’t be able to take part in the DfE’s webinar about KS2 assessment. I suspect that much of it will re-cover old ground because there’s so much confusion around. I consider myself to be pretty clued-up about things, but still there are plenty of questions I have, either because clarity has not been forthcoming, or in some cases because no information has been given at all.

So here are my ten questions for the DfE/STA to answer this Friday:

  • What counts as “independent” work for the purposes of Writing assessment?
  • Will self-corrected spellings, using dictionaries, count as evidence for writing assessment?
  • If so, how are moderators meant to differentiate between spellings which have been self-corrected and those where teachers have directed correction?
  • When will Reading & Science exemplification be published? (We were told they would be out by the end of January!)
  • When will moderation happen?
  • When will schools be informed that they are to be moderated?
  • When will the clarification document about teacher assessment be provided?
  • When will we see whatever is to replace the interim TA frameworks in 2017? (We were promised at least one year’s notice of significant changes!)
  • Was any effort made to align the new Writing expectations to the old Level 4b standard, as we had been told was the intention?
  • Will pupils assessed against the pre-key stage frameworks be awarded a nominal score for progress measure purposes?

And one for luck:

  • Why are webinar events like this being held during the teaching day? (Again!)

 

With love

Michael

Just Plain Wrong.

When I posted this [embedded content], I was joking really. But in my absence, a video and bizarre supporting webpage emerged from the DfE that have further raised the profession’s collective hackles. So here are my thoughts. They can best be summarised as follows: mostly the DfE is right, but where they are wrong, they’re just plain wrong.

Five things you need to know

  1. Setting a higher bar
    The government has the right to set different standards and a curriculum, and to set new tests. Although they did not quite meet their own deadline of providing information about the new-style tests a year in advance, they weren’t far off. This part of the department doesn’t seem too chaotic to me, which is a marvel considering what ministers have thrown at it.
  2. Preparing for the new tests
    I’ve said before that I agree that the thresholds can’t be set until after the tests have been taken. And the fact that this is the case means that they can reasonably be set at around the old 4b standard, for which the government has offered its evidence. (It happens not to be evidence I agree with, but that’s another matter.) I think that this will actually make the tests acceptable. That some people disagree with the content of the tests doesn’t mean they’re chaotic.
  3. Getting it right
    Ironically, this is where it starts going all wrong. They talk about the frameworks being published later than they’d like. In fact, they were published back in Autumn 2014, which was a reasonable time. Except that they were so awful that they had to be scrapped and started from scratch: the delay was caused by the DfE. Then the exemplification, which is so vital to the interpretation of such discredited approaches to assessment, was so late that some of it still hasn’t arrived. If there is any disingenuity here, then it is undoubtedly on the part of the person writing this statement. If there were anything but chaos about this, then we would at least by now have some vague idea of what Teacher Assessment will look like in 2017, yet there is none. The fact that we are having these battles just 10 weeks before the tests is shocking.
    I’d note, too, that we were promised the exemplification by the end of January (which itself was far too late). For the DfE to claim some magnanimity in moving the KS1 submission date back by 2 weeks, when the materials are already over 3 weeks past that deadline, is worse than disingenuous. It once again shows their complete disregard for the profession, and is unacceptable.
  4. Teachers won’t have to fill out 6,120 check boxes
    The exemplification guidance is quite clear that teachers must “check and record whether there is sufficient evidence for each of the statements within the standard”. That could not be more plain. For the secretary of state to claim that this is scaremongering is ignorant. If the document is erroneous in stating that, then she ought to apologise for the confusion caused by her department, not blame others (as @theprimaryhead points out in his blog).
    As it is, I still fail to see how this can be got around if teachers are to ensure that moderators cannot find any gaps in evidence and thus potentially report teachers or schools for maladministration. Again, if this is not the intention, then the secretary of state ought to apologise for the confusion caused, not blame the unions.
    Currently, as a teacher of Year 6 pupils, I can see no other way of preparing myself for the high-stakes moderation process than to collect an evidence trail of ticks to assure the moderator (and myself). And as a deputy headteacher, I can see no other way of preparing my school for the high-stakes league tables and floor standards than to corrupt good teaching to try to tick those boxes.
    Lindsey Thomas’s excellent blog addresses this point well, and in greater detail.
  5. A new floor standard which sets high expectations for all
    Firstly, it annoys me that the department continues to claim that the new standards challenge schools with able intakes. The attainment element of the standard removes challenge for the schools with the easiest catchments.
    That aside, I’ve said before that it’s reasonable to set the expected progress measure after the results are in. The problems come from the ridiculous way in which Writing attainment will be measured. The secretary of state seems to have missed the point made by the unions: that the new expected standards in Writing at both key stages are far in excess of those originally proposed. The secretary of state refers to children “mastering the basics”; I’ve yet to find anyone who thinks that use of semi-colons counts as a ‘basic’.
    The original argument was that pupils who achieve 4b have a far greater chance of achieving 5 good GCSE grades, and so the new threshold should be set at this level. The interim frameworks and exemplification documents clearly show that the DfE has far overshot the bar with its new materials. Setting such a high bar will lead to huge proportions of children being deemed to have failed to meet the standard, and of schools being described as failing. I genuinely believe that this is more a cock-up than a conspiracy, but it’s one that the department seems determined to deny. It is this approach which leads to claims of chaos in the department, and to the claim that the DfE don’t know what they’re doing.

 

The irony in the repeated use of the word “disingenuous” won’t have been lost on teachers.

Local Authorities: the real barrier to Assessment without Levels?

I have done a lot of work all over the country for the past couple of years trying to help schools to move forward with the new curriculum, and particularly with venturing into the brave new world of life after levels.

Having initially been reticent, I have been a keen advocate of schools taking control of assessment so that it matches their curriculum, and moving away from points-based systems which expect linear progress. No teacher has ever argued that a linear model makes sense; we all know that it was meaningless. In many cases schools and their leaders feel liberated to focus on what matters for their children. Teachers are freed up to focus on real assessment that drives teaching and learning, helping children to make better progress.

Except there’s that word.

For the first year or so, headteachers would often ask me: but how do you show progress? As Ofsted have told us time and again: progress can be seen in books, by talking to children, and occasionally by some summative assessment. The Assessment Commission report was clear in its use of Good Practice examples where schools recorded data only annually. In its draft form, it even pointed out that collecting data too frequently could actually be damaging. There should be no need for schools to be falling over themselves with data every 6 or 12 weeks!

So why is it that local authorities up and down the country are still pestering schools for termly data that shows measurable progress for each term? Why are so many trying to re-label old measures such as “expected progress”, just calling them “sufficient progress” or “good progress” instead?

The fact that my own authority still has a form entitled “Termly Evaluation of progress of current cohorts” is bad enough. The fact that it takes up 5 pages to record various percentages and proportions for each year group is very frustrating. The fact that any of the numbers that could possibly be put into it would be essentially meaningless is a shocking waste of everyone’s time. The idea that conclusions might be drawn from such data: positively worrying.

So it seems like local authorities could do with some advice. I’ll draw upon that most useful of documents: the final report of the Commission on Assessment without Levels. So perhaps the leaders in local authorities up and down the country could take heed of the following:

The expectation to collect data in efforts to track pupils’ progress towards target levels considerably increased teachers’ workload. The Commission hopes that teachers will now build their confidence in using a range of formative assessment techniques as an integral part of their teaching, without the burden of unnecessary recording and tracking. For this approach to be adopted effectively, it is essential that it is supported by school leaders.

Notice: that last bit includes you, local authority experts!

Another extract:

In-school summative assessment is not designed to support comparisons between schools

The summative information schools use should be for their own information. My school’s meaning of ‘on-track’ in December may be very different to another school’s. The data isn’t designed to be comparable, and so shouldn’t be used to compare!

Perhaps Local Authority teams should take their lead from Ofsted:

Inspectors will want to know how schools are assessing whether their pupils are making progress which is appropriate for their age and ability and is sufficiently challenging. Inspectors will gather information from observations in lessons, pupils’ work, discussions with pupils about their understanding and acquisition of knowledge, and the school’s own records. However, Ofsted will not expect any particular data outputs from a school’s assessment system.

It’s time to stop asking for the same old data, and start asking the right questions. Every time you insist schools complete a pro forma that just tweaks the old levels-based style of data, not only are you not helping: you may well be stopping the school from moving forwards with more effective assessment.


In defence of those who work for Local Authorities who are not making such demands, it’s worth sharing this tweet from @clivetaylor915

What purpose Teacher Assessment?

 

How many schools have spent the last couple of years telling the DfE that designing a new assessment system from scratch isn’t as easy as it looks? Now it seems we’ve reached the time where we can say:

We told you so!

For in all the rush and difficulty of trying to put together a teacher assessment approach, one can’t help but wonder whether the DfE have forgotten altogether what the point of it was! When I look at the new exemplification of the interim assessment framework (there’s that wretched interim word again), it doesn’t strike me that the current incarnation is any good for anything.

After all, what purpose might summative assessment judgements serve?

To inform parents?

Just hopeless. While the objective-level information might be of some interest, schools are much better equipped to provide information about pupils’ progress and attainment based on their on-going assessments. The simplistic category decision based on a fairly arbitrary list of criteria is not much use to parents at all.

Frankly, what’s on my mind at the moment is the issue of how we inform parents of the real information that underlies the useless information they’ll get from the proposed arrangements.

To inform secondary schools?

When a change to Teacher Assessment was first on the cards, I suggested the idea of a simple list of criteria that a teacher could make binary decisions about. My thoughts, though, were to limit those decisions to perhaps a maximum of 10 key areas that would be useful for secondary teachers to know. Instead of a simple Below/At/Above decision (or the rather more complex labels of the current monstrosity), it could offer a simple list of basic Level 4-type criteria that would inform future teachers. Imagine if for each child starting in Y7, a teacher could see a profile which read something like:

Adapt writing to various purposes/audiences     ✓
Use paragraphs to organise ideas in writing       ✓
Use appropriate sentence demarcation               ✓
…etc.

Surely this would be more useful information for transition – and would need no laborious evidence-collection to accompany it. Much better, at least, than receiving children graded as “Working towards the expected standard”, which tells us nothing.

To differentiate between pupils

This is where the level of expectation is all wrong. There seems to be almost universal agreement that the ‘expected standard’ descriptor looks more like an old Level 5 Writer than the Level 4b we were promised. Bear in mind that just 36% of pupils achieved Level 5 or higher last year, leaving around 64% below it. That suggests that at least half of all pupils will be lumped into the “Working towards” standard, with many of the rest falling below even that. Hardly a good differentiator. It seems all the more odd when we consider that the top band (Working at Greater Depth), which seems to align broadly with an old L5a, would likely cover only a tiny percentage of pupils.

To hold schools to account

How on earth can schools be held to account when 50% of pupils are likely to be lumped into the same group? What hope is there for the progress measure, comparing pupils with similar starting points, if everyone has similar ending points? How can this possibly be of any use?

Except, of course, the problem is that this is exactly what is proposed. It may well be the case that half of pupils end up in the same band – with many pupils working at Level 2 in KS2 falling into that group. Do we all have the same progress score? Will a school’s progress measure come down to the luck of quite how close your children were to borderlines in Year 2?

It’s a farce. And the whole system seems to have lost its way.

The department needs to think again – and fast!


NAHT members should note that the organisation has given the DfE a week to address its significant concerns about workload and expectation, before proposing action in some form. Members are invited to pledge their support for action via the NAHT website.

 

More Teacher Assessment confusion…

I’m never happy.

Months of moaning about the delays to the delivery of exemplification for Writing Teacher Assessment, and now that it has arrived I’m still not happy.

But then… it is a bloody mess!

The exemplification published today demonstrates what many of us feared about the new interim teacher assessment framework: expectations have rocketed. I appreciate (probably more than most) that direct comparisons are not ideal, but certainly having been told that the new expected standard would be broadly in line with an old Level 4b, I know I feel cheated.

The discussions in this household about the “expected standard” exemplification were not about whether or not the work was in line with a 4b, but whether or not it would have achieved a Level 5. That represents, of course, an additional 2 years of learning under the old system. We’re expecting 11-year-olds to write like 13-year-olds.

In fact, the only time 4b ever came into the conversation was in our browse through the new “Working towards” exemplification. It seems that a child who met the expected standard in 2015 would now be lucky even to reach ‘working towards’.

What this will mean for national data this year, who knows? If schools are honest, and moderation robust, could we see a new “expected standard” proportion somewhere in the mid-30% range, like we used to with Level 5s?

Among all this, though, is another confusing element. For while in the old exemplification materials for levels in years gone by we were told that “All writing is independent and is in first draft form” (my emphasis), it seems that now this message is not so clear. Informal feedback from the meetings held at STA on Thursday and Friday last week seemed to bring up some surprises about what constituted independent writing, including the scope for using dictionaries, following success criteria, and even responding to teacher feedback.

So now we have what looks like horrendously difficult expectations for a majority of pupils who have had barely two years of a new National Curriculum instead of six, and a lack of clarity, once again, about what is actually expected.

Is it really too much to ask?


 

For those who haven’t yet had the pleasure, the KS1 and KS2 Writing exemplification documents are available here:

KS1: https://www.gov.uk/government/publications/2016-teacher-assessment-exemplification-ks1-english-writing

KS2: https://www.gov.uk/government/publications/2016-teacher-assessment-exemplification-ks2-english-writing

 

Primary Progress, Floor Standards & Coasting…

In the last week we’ve had a bit more detail about progress calculations (at last) and can now be a little clearer on how these fit with the new floor and coasting standards. There are a few changes to be aware of, which I’ll try to highlight, along with the explainer videos I’ve made.

Progress Measures from 2016

There has been one slight change to progress calculations since I made the video below back in the autumn. KS1 APS scores (which are effectively the baseline measure for progress until 2019) will be calculated slightly differently from originally planned. Rather than all three subjects simply being added together and divided by 3, Maths will now count double to balance out the fact that English has two scored strands (Reading & Writing).
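
As a minimal sketch, assuming the old KS1 point scores (Level 1 = 9, Level 2c = 13, 2b = 15, 2a = 17, Level 3 = 21), the revised calculation looks something like this:

    # Revised KS1 APS: Maths is double-weighted so that it balances
    # the two scored English strands (Reading and Writing).
    POINTS = {"L1": 9, "L2c": 13, "L2b": 15, "L2a": 17, "L3": 21}

    def ks1_aps(reading, writing, maths):
        return (POINTS[reading] + POINTS[writing] + 2 * POINTS[maths]) / 4

    print(ks1_aps("L2b", "L2b", "L2b"))   # 15.0 for an all-2b pupil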

One thing to note, which has caused a little confusion: the “group of pupils with the same KS1 APS score” means all pupils in the country who had that score, not just other pupils in the same school/group. The video has been updated to reflect this.

Floor & Coasting Standards

Again, there is a video to explain this here, with a few additional notes below:

Some things just to highlight:

  • There is a change from the old floor standards, in that schools who do not meet the 65% attainment threshold must instead meet all three progress measures, whereas until 2015 schools only had to be above at least one measure.
  • Similarly, for the coasting standard, while in 2014/15 it was enough to be above one measure, from 2016 schools must be above the necessary progress thresholds, unless they are already above the 85% attainment level.
  • The progress measures are school-level measures, as outlined in the first video. So a school where children, on average, made the same progress as others with similar starting points would end up with a progress score of 0. Those doing slightly better than average with their pupils overall would have a positive score; those with more pupils doing less well than average would have a negative score. (There’s a rough sketch of the calculation after this list.)
  • Remember that progress scores are based on comparing pupils who had the same APS score at KS1, so a child who scored L2c Reading, L1 Writing and L3 Maths would be considered to have the same starting point as one who had L3 Reading, L2a Writing and L2c Maths.
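
To make the mechanics concrete, here’s a rough sketch of the progress-score calculation and the floor check described above. Every figure in it is an invented placeholder – not least the progress thresholds, which won’t be set until after the tests:

    # Progress score: each pupil's KS2 outcome is compared with the national
    # average outcome of all pupils who shared their KS1 APS; the school's
    # score is the mean of those differences. All figures are invented.
    national_avg = {14.0: 98.5, 16.0: 101.0, 18.0: 103.5}   # KS1 APS -> national average KS2 scaled score

    pupils = [(16.0, 103), (16.0, 99), (18.0, 104)]          # (KS1 APS, KS2 scaled score) per pupil
    diffs = [ks2 - national_avg[aps] for aps, ks2 in pupils]
    progress_score = sum(diffs) / len(diffs)
    print(round(progress_score, 2))   # 0 would mean 'in line with similar pupils nationally'

    # Floor check, per the first bullet above: meet the 65% attainment
    # threshold, or failing that, meet all three progress measures.
    def above_floor(pct_expected, reading, writing, maths, thresholds=(0.0, 0.0, 0.0)):
        if pct_expected >= 65:
            return True
        return all(score >= t for score, t in zip((reading, writing, maths), thresholds))

(Incidentally, on those old point scores the two pupils in the final bullet above both work out at an APS of 16, which is exactly why they count as the same starting point.)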

Nick Gibb is confused…

If you haven’t already spat out your cornflakes this morning over Nick Gibb’s ridiculous claims that the blame for confusion about primary assessment should be laid at journalists’ doors, then start by reading his article in today’s Schools Week:

[Image: Nick Gibb’s column in Schools Week]

Now let me quickly contest some of his points:

  • He seems to have missed the point about the commas fiasco*: the two main arguments were that re-introducing an old-fashioned unscientific approach was not helpful (especially for learners with EAL), and that the supposed clarification from the STA only confused matters because it was unclear about how answers would be marked. When your department’s clarification notes cause confusion, the right thing to do is to apologise and seek to add clarity, not to blame.
  • The concern about the 30+ updates to assessment materials is not that they were all major, but that schools don’t get told whether they’re significant or not – we simply have to check the documentation ourselves, repeatedly.
  • The line about an “alteration designed to bring the document in line with guidance” is disingenuous, when that alteration brought forward a teacher assessment submission deadline by a month (and all while we’re still waiting for materials to assess against!)
  • He says that the department “have been clear” that the new expected standard is broadly equivalent to a level 4b. He clearly doesn’t have any idea about the shift in Writing expectations.
  • Finally, he says that the department is “working hard” to make sure that information is given in an accurate and timely fashion.
    On that I can only say: “must try harder”

*Incidentally, one can’t help but suspect that the re-introduction of commas was probably exactly one of those cases that Lucy Powell was referring to this week when she complained about government ministers’ excessive involvement in the curriculum according to their whims.