Back in 1998, Andrew Wakefield caused a stir when he produced a long-since discredited article suggesting a link between the MMR vaccine and autism. What resulted included a long-running debate, the retraction of the article by the publishing journal, and the striking off of the surgeon behind it.
But the damage was done. Striking off the doctor behind the report has not erased its effects. The reality is that when an ‘expert’ speaks, the average listener does not question their credentials or undertake their own research: we rely on those offered up as experts to do this work for us and to guide the rest of us.
Now, my point is far less serious than that posed by the MMR controversy, and the mention of it merely illustrative. But there is an issue with ‘experts’ in our profession offering solutions which may actually cause more harm than good.
This week I attended a conference in London hosted by The Key, at which I spoke about my approach to assessment without levels. Other schools were represented too, sharing their own models, some of which I thought brilliant, others of which I thought awful. If nothing else, such events serve to ground me and make clear that I am no oracle.
However, at the same event, David Driscoll – an education consultant who also works as an additional inspector and as an “associate education expert” for The Key – was asked to speak on the topic of “Inspection of Assessment”. He was listed in the brochure as an expert, and the intention was clearly to provide leaders with guidance on managing new assessment systems in an Ofsted-friendly way.
Now, the fact that such reassurance is needed suggests that Ofsted’s messages about not having a pre-determined view on what assessment systems should look like are not yet trusted by the profession; the presentation given by this particular inspector demonstrated exactly why that is the case!
To give Mr Driscoll his due, he did at least state the official Ofsted view once. However, he then proceeded to tell delegates that data ought to be presented in standard forms, and that schools would be best advised to keep levels and simply re-write the descriptors.
I was astounded.
He went on to explain that schools needed to choose a starting and an end measure and define a fixed measure of expected progress, saying that “you need a number”. Perhaps this reflected his own limited understanding of how assessment and progress work, but it certainly isn’t a message that fits with the current direction of travel in education. Or at least, it ought not to be. But, of course, the problem is that he is “the expert”. And every headteacher there will have had, at the back of their mind, the realisation that he could be the lead inspector on their next Ofsted visit.
Worryingly, he also stated with some aplomb that there was only one statutory requirement for reporting to parents (namely that we report on the progress a child has made). Clearly he isn’t familiar with The Education (Pupil Information) (England) Regulations 2005.
This was different to the usual doubts I might have about another school’s approach to assessment without levels. This was not some practising school leader musing on his current thinking (in fact, it appears from his website that Mr Driscoll hasn’t taught for over 25 years). This was someone presented as an expert, offering guidance on how data ought to be managed and presented for the purposes of Ofsted. It was advice that was likely to take priority over much of the other content of the day (including excellent presentations from people such as Katharine Bailey of the CEM at Durham).
It’s true, the problem is not a patch on the risks of poor advice about vaccinations or such things. But the root of the problem is the same: ‘experts’ with a poor message can present more danger than no message at all.
I live in hope that the Assessment Commission set up before the election will soon bring some guidance to the profession, quashing the nonsense spouted by ‘experts’ such as this and ensuring that Ofsted is supported in keeping its inspectors in line!
For teachers unsure of how best to move forward with assessment, I cannot recommend strongly enough the article by Dylan Wiliam in Teach Primary magazine from last autumn: