Monthly Archives: June 2016

My concerns about primary assessment – a letter to my MP

Dear Mr Bridgen,

I feel compelled to write to you to raise my concerns about the forthcoming publication of Key Stage 1 and Key Stage 2 assessment data. I must apologise for the length of my email, but feel that it is necessary to convey the many significant concerns I have.

While I recognise – and indeed value – the need for schools to be held to account for the progress of pupils in their care, I really do feel that the results which will be published this year at both a local and national level will be unhelpful and indeed misleading on so many fronts that they present a genuine risk to the quality of education provision in the near future.

You may be aware of the delays that have beset the statutory assessment processes since the introduction of the new National Curriculum and the removal of levels. Primary and middle schools have worked valiantly over the past two years to introduce the new curriculum to their pupils, and to prepare them as best they can for the statutory assessments at the end of Year 6, but the challenges have been manifold.

Issues have particularly arisen in the area of statutory Teacher Assessment at Key Stage 1, and in Writing at Key Stage 2, which form part of the accountability processes. The system used this year – an interim solution following the removal of well-understood National Curriculum levels – has replaced a simple number system (in which Level 3 was simply better than Level 2) with a system built on complex codes and accompanying descriptors. The descriptors were gradually made available to schools around Christmas, with the exemplification to support them not available until as late as Easter in some cases.

Teachers have been asked to apply these new standards with little training, guidance or support, and the messages and repeated clarifications coming from the Standards & Testing Agency have often been unclear. As a result, on the day that teachers are required to submit their final judgements, it is my view – based on my considerable connection with teachers nationally and through social networks – that there is still considerable misunderstanding about how these standards should be applied. Consequently, the published results are likely to present an incomplete picture, based in many cases on inaccurate data.

In theory, this problem should be mitigated through the use of the Local Authority moderation processes. However, it is clear from my discussions with teachers in many LA areas that this process has been inconsistent both within and between authorities, with a lack of clarity about expectations on key matters such as what constitutes independent work. If you are familiar with the many issues raised about the reliability of coursework assessment at GCSE, you will now find these replicated almost exactly in primary schools this summer.

Furthermore, up to 75% of schools will not have received a moderation visit at all. Thus these schools will have had virtually no support in interpreting the frameworks, nor in making accurate judgements. Notably, from my own discussions it has also been clear that many teachers are not fully aware of the full structure against which they should be assessing children. This is particularly an issue at Key Stage 2 where the framework is very complex.

For example, in Writing a child can be awarded one of seven different judgements, only three of which form part of the moderation processes. Worryingly, many teachers seem unaware of the lower judgements, and so some children will erroneously be graded at a far higher level than is accurate. By contrast, in Science only two judgements are available to teachers, while in Mathematics there are either two or four possible judgements, depending on whether or not the child sat the statutory maths test; none of these provides recognition for pupils working at a higher level than the national expectations.

I have long been concerned that these complex systems of judgements will mean that parents – who are surely one of the key stakeholders in the assessment system – will find it all but impossible to understand how their child’s progress and attainment compares to those of others nationally or locally. Schools will do what they can to mitigate this, and support parents. However, my concern now is for the capacity of the system itself to provide meaningful judgements at all.

With different interpretations of the guidance, significant inconsistencies between authorities in moderation, and many misunderstandings of the frameworks by the professionals involved, it seems almost inevitable that the high stakes nature of assessment at Key Stages 1 and 2 will lead to mistakes being made, and poor decisions being taken. Indeed, the lack of clarity surrounding the Teacher Assessment framework has meant that ‘gaming’ of results is very easy to achieve, even when a school is moderated.

I am particularly concerned that in the current climate it seems not uncommon to hear of schools where the whole process has been either deliberately or accidentally misinterpreted to an extent which would have a significant impact on final published results. I hear from colleagues of advice from Local Authorities or Academy chains that a ‘best-fit’ approach should be taken for some subjects, or that some requirements can be treated flexibly, when this clearly contradicts the statutory guidance; I am aware of approaches which teachers have been instructed to follow which, while stretching the spirit of the guidance, could easily be argued to be within its letter; I know from my own research that there are wildly different interpretations of the guidance on what constitutes a child’s own independent work, and have no doubt that such interpretations could easily be used to inflate a school’s results, while a more conservative interpretation might have a significant negative impact.

As you will recognise from my email, the concerns about this year’s published data are both plentiful and significant. Therefore, while I recognise the importance of making pupils’ individual information available to parents, I am asking that you urge the Secretary of State to publish only national-level data on attainment and progress this year, until such time as the accuracy and validity of any school-level data can be investigated. I feel that a full investigation into this year’s processes ought to be set up at the STA in order that the usefulness of the data can be evaluated, and changes made to future years accordingly.

I would also ask that you remind the Secretary of State of the Department for Education’s protocol which states that significant changes to policy will be communicated to schools with a lead-in time of at least one year. I note this because this year’s flawed system has been provided as an “interim” solution, but no permanent solution has yet been shared with schools, despite the fact that Teacher Assessment judgements will again be due in less than a year’s time. I really would be most concerned if we were to see a repeat of the delays schools have faced this year.

Should you wish to explore the matter in greater depth, I would be happy to meet with you to discuss the detail of my concerns. I look forward to your timely response.

Please be aware that I have placed a copy of this email on my Teaching blog and would like to publish your response there also with your permission. I have also copied this email to the member for the constituency in which my school is based.

Yours sincerely,

Whose work is it anyway?

Imagine a Year 6 classroom in the run-up to the end of the year. A teacher, trying to gather the last of the evidence to support Teacher Assessment on the 30th June, decides to set one final task: writing a fictional narrative.

This isn’t a “cold task”. Instead, the children watch a film that tells the story of a child who ‘learns his lesson’ about unkindness, through some incident of kindness that happens to him. The children are all to write a similar tale, giving their own incident as the turning point.

Before the children start to write, they discuss the features of a narrative, and record these as success criteria on the classroom display board (a practice that is “sensibly permitted” according to 82% of teachers). Children collectively draw up a list of features that marries well with the interim assessment framework, and are reminded of their own target areas by referring to the ticklists taped to their tables.

After some time planning the basic outline of the story using a planning frame that outlines the key elements, the group collects back together again to discuss the sorts of “wow words” they might use to describe the character both before and after the incident. The teacher adds this vocabulary to the classroom display as a prompt for everyone (“sensibly permitted” according to 68%). Of course, being an experienced Year 6 teacher, she checks that words from the Y5/6 word list, like ‘aggressive’ and ‘mischievous’, make an appearance.

The following lesson, before starting out on the story, the children have a reminder lesson about the punctuation of speech in a narrative. As a plenary, they each record a few sentences that might show how the character is before and after the ‘transformation’.

Finally it comes to the big event: the writing of the story. The children beaver away for 45 minutes, drawing on help from peers when they fancy, helping themselves to a dictionary to check spellings (“sensibly permitted”: 93%). Occasionally, as the lesson proceeds, the teacher reads aloud from the particularly good examples around the room (59%):

“Just listen to this from Arriety – ‘The boy was astounded: he couldn’t believe what had happened’. Isn’t that a great way of using a colon, everyone? She’s told us how he felt, and then after the colon she told us why. I wonder if anybody else could include a sentence like that?”

Eventually, the work is done.

Until the next lesson. Over the weekend, the teacher takes away the books and makes notes on areas to improve, either generally or for specific children. Where spelling isn’t quite up to the standard needed, she writes a little “sp.” note in the margin of the appropriate line, so the child can check again (50%). In a few cases, where there are several trickier words on a line, to save time she underlines the misspelt word for the child to check in the dictionary (20%).

During the next lesson, the children get started on their corrections. In one or two cases, children who struggle with spelling are paired with more able children who give them feedback on the words they need to correct (34%; a further 29% think it’s unreasonable, but still permitted).

One child has written at length, without any sense of paragraphing. The teacher sits with the child and discusses again with him when paragraph changes are needed. They read through the text together discussing where the child now thinks paragraph shifts should be added (50%).

The following evening the teacher looks over the work again. Some pieces she is happy with. Others she sets aside and finds time to discuss the work again with the children. A few need reminding to look again at how the speech is punctuated. One or two could do with improving the ending. Some still have spellings to double-check.

She meets briefly with each of the remaining pupils, pointing out gaps on their ticklist of criteria. In one case she points out that a child has very little evidence of semi-colons. She suggests that he adds a sentence or two to his story somewhere that includes a semi-colon (21%), referring to the example posters on the wall display.

By Friday, most of the class are doing art with the teaching assistant, while the teacher works with the last couple of individuals. Edward can’t quite believe that he’s onto his fifth draft of the same piece of work (20%). He’s had so many conversations with his teacher, dictionary sessions with Max, the best speller in the class, and pointers to posters and prompts, that he hardly recognises the story he started with. Funnily enough it’s not work that his Year 7 teacher will ever recognise as his either.

He’s on track to meet the nationally expected standard.

But is it still his own work?

 

Consistency in Teacher Assessment?

I posted a survey with 10 hypothetical – but not uncommon – situations in which writing might take place in a classroom, and asked teachers to say whether or not they are permitted under the current guidance for “independence” when it comes to statutory assessment. It seems that mostly, we can’t agree:

[Image: chart of the survey responses (around 1,000 replies), showing the split of answers for each scenario]

 

The burden of Teacher Assessment

I’m not normally one for openly personal posts. Although my wife will happily say that much of my reason for blogging is either ego or arrogance, I usually try to keep to pragmatic matters. But this week I’m troubled. And I’m troubled on behalf of others, too.

I’ve made no secret of my loathing of the current Interim Assessment framework and its expectations (both narrow and high), its tardiness, or its errors. But as our LA moderation lead said to us last week: whether we like it or not, for this year we’re stuck with it.

And ‘stuck’ is exactly how it feels. I’ve said at a few events held since the framework first came out that the Writing descriptors are literally what keeps me up at night. And with barely a fortnight until the Teacher Assessment deadline, and only a week until our external moderation, that remains the case.

As I look at the collections of work from my class – work of which I am in many cases very proud, and work which demonstrates great progress – I worry about the final figures I’m going to submit. I worry because they reflect on me, my team and my school, and I just don’t know where the land lies.

I worry, too, because everywhere I look I see people doing their best to push children over the hurdles. Not teaching, just ticking off the boxes. It means I’ve heard, from all directions, of the strategies people are using to provide evidence for skills that their children almost certainly don’t have. It means I’ve heard of pupils re-drafting work five times until it ticks the right boxes. It means I’ve heard of directed teaching and overly-structured work that will allow teachers to show the evidence that they think will give children the credit they deserve for their learning. And I know how much additional work I’ve put in – and my class have – in these last few weeks to try to tick those boxes myself.

But as the judgement day looms, I want those figures to be higher. I worry that my refusal to stretch the rules to breaking point may disadvantage my school, and may well reflect badly on me.

And what really worries me about that is that I’m a confident teacher. I know where I have made choices that are for principled reasons. I know that my headteacher isn’t some unscrupulous bully who will demand results with menaces. I know that I have taught my class well this year, and I know what progress looks like, and could happily show it to anyone. Indeed, just 6 weeks ago, Ofsted came in and agreed with it.

But what of the newly-qualified teacher in Year 6? What of the teacher in a school where the headteacher has no qualms about bullying staff to get the results? What of the teacher who knows that one year of bad data could be them out of a job? What of the teacher already struggling with something outside of school who now has to work all this out?

If I’m having sleepless nights, how must it be for them?

There are those who would happily see the back of SATs tests, who argue that teacher assessment is the way forward. Personally, I think it’s the stakes that matter. And just about the only thing worse for a teacher than a high-stakes test is high-stakes Teacher Assessment.

It’s like being taken to the hangman’s noose… and then being asked to make your own gallows.


Before you worry too much about me, rest assured that on proof-reading this, my wife said “I don’t think you’re arrogant…. all the time.” Praise indeed.

Collecting KS2 data on Teacher Assessment

Having had over 100 schools respond to my plea to share data from KS1 Scaled Score tests, the next big issue on the horizon is the submission of Teacher Assessment data at the end of June.

In the hope of providing some sort of indication of a wider picture, I am now asking schools with Year 6 cohorts to share their data for Teacher Assessment this year, as well as comparison data for 2015. As with all the previous data collections, it won’t be conclusive, or even slightly reliable… but it will be something other than the vacuum that currently exists.

So, if you have a Year 6 cohort, please do share your Teacher Assessment judgements via the survey below:

 

Some initial thoughts on KS1 data


I started collecting data from test scores and teacher assessment judgements earlier this week. So far, around 80 schools (3500+ pupils) have shared their KS1 test scaled scores, and nearly 60 (nearly 2500 pupils) have shared their Teacher Assessment judgements (in the main three bands). So, what does it show?

Scaled Score Data

Despite – or perhaps because of – the concerns about the difficulty of the reading tests, it is Reading which has the highest “pass rate”, with 65.5% of pupils achieving 100 or greater. (Similarly, the median rate for schools was just over 65%.)

Maths was not far behind, with 64.2% of pupils achieving 100 or greater, although the median score was slightly higher for schools, again at 65%. The results for GPS were lower (at around 57%), but this was based on a far smaller sample of schools, as many did not use the tests.

The spread of results can be seen, approximately, in the proportion of schools falling within each band in the table below.

[Table image: proportion of schools in each band of pupils scoring 100+, by subject]

For example, just 2% of schools had more than 90% of children achieving a scaled score of 100 or more in Reading, while 43% of schools had between 60% and 69% of children scoring 100+.

Notably, the range in Maths results is slightly broader than in Reading.
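For anyone curious about how these headline figures are produced from the submitted scores, here is a minimal sketch of the calculations: the overall percentage of pupils at 100+, the median school rate, and the count of schools in each 10-point band. Everything in it (the school names, the field layout and the example scores) is invented for illustration and is not the format of the actual spreadsheet.

```python
# A minimal sketch of the summary calculations, using invented example data.
# In reality each submission holds one school's set of pupil scaled scores.
from statistics import median

schools = [  # hypothetical submissions, one entry per school
    {"school": "A", "scores": [97, 100, 103, 110, 99]},
    {"school": "B", "scores": [95, 101, 102, 98, 104, 107]},
    {"school": "C", "scores": [88, 92, 100, 101]},
]

# Overall "pass rate": percentage of all pupils scoring 100 or more.
all_scores = [s for school in schools for s in school["scores"]]
overall = 100 * sum(s >= 100 for s in all_scores) / len(all_scores)

# Per-school pass rates and their median.
school_rates = [
    100 * sum(s >= 100 for s in sc["scores"]) / len(sc["scores"]) for sc in schools
]
median_rate = median(school_rates)

# Spread of results: count schools falling within each 10-point band.
bands = {}
for rate in school_rates:
    band = 100 if rate == 100 else int(rate // 10) * 10  # e.g. 65.5% -> 60-69% band
    bands[band] = bands.get(band, 0) + 1

print(f"Pupils at 100+: {overall:.1f}%   Median school rate: {median_rate:.1f}%")
for band in sorted(bands):
    label = "100%" if band == 100 else f"{band}-{band + 9}%"
    print(f"{label}: {bands[band]} school(s)")
```

The same approach would give the Teacher Assessment figures below, simply swapping the 100+ threshold for whether a pupil was judged at the expected standard or above.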

Teacher Assessment Judgements

The order of success in the subjects remains the same in the collection of Teacher Assessment judgements, with Reading having the highest proportion of pupils reaching the expected standard or greater, closely followed by Maths – and Writing trailing some way behind. However, perhaps the most surprising difference (or perhaps not) is the fact that the proportions are all around 10 percentage points higher than in the scaled score data.

According to teachers’ own assessment judgements, some 74% of pupils are reaching the expected standard or above in Reading, 73% in Maths, and around 68% in Writing.

Similarly, the spread of teacher assessment judgements shows more schools achieving higher proportions of children at the expected level – and includes one or two small schools achieving 100% at expected or above.

[Table image: proportion of schools in each band of pupils at the expected standard or above, by subject]

There are notable shifts at the bottom end. For example, 16% of schools had fewer than half of their children achieve 100+ in Maths, whereas only 4% of schools had fewer than half of their children reach the expected standard in the Teacher Assessment judgements.

It’s important to note that the data is not from the same schools, so any such comparisons are very unlikely to be accurate, but it does raise some interesting questions.

Greater Depth

Have I said we’re dealing with a small sample, etc? Just checking.

But of that small sample, the proportions of pupils being judged as “Working at Greater Depth within the Expected Standard” are:

Reading: 17%
Maths: 16%
Writing: 11%

More Data

Obviously there are many flaws with collecting data in this way, but it is of some interest while we await the national data. If you have a Year 2 cohort, please do consider sharing your data anonymously via the two forms below:

Collect Key Stage 1 data

By popular request, I am collecting data about both test scores and teacher assessment judgements for Key Stage 1. The intention is to provide colleagues with some very approximate indicative information about the spread of results in other schools.

As with previous exercises like this, it is important to warn that there is no real validity to this data. It isn’t a random sample of schools, it won’t be representative, it is easily corrupted, mistakes will often slip through… etc., etc.
But in the absence of anything else, please do share your data.

I am collecting data in two forms. Firstly, test score data using scaled scores. These can be entered into the spreadsheet as with previous sample test score data. Please enter only scaled scores for your children. The spreadsheet can be accessed (without a password, etc.) at this link:

Share Key Stage 1 Test Scaled Score Data

I am also collecting schools’ data on Teacher Assessment Judgements. To simplify this, I am collecting only percentages of children working at each of the three main bands in Key Stage 1. I am not collecting P-scales, pre-Key Stage or other data. For this, I have put together a Google Form which can be completed here:

Share Key Stage 1 Teacher Assessment Data

Please do read the instructions carefully on each form (you’d be amazed at how many foolish errors have been submitted previously through not doing so!)