“What is the impact of that?” A reflection on our Ofsted visit.

Last week we got the call. I have been through a few Ofsted inspections, but this was my first as a middle manager. I had always wondered what happened in those meetings with the inspector that my head of department feared as much as we feared the classroom door opening. They were spoken of in hushed tones and, whatever the result, someone was always crying afterwards. Yet I was convinced it would be ok. I had a lot to say about what we were doing, and I was willing to talk honestly about some of the challenges.

The meeting began with a discussion of the data. We had obviously prepared for this, and started by showing that our achievement in maths and English was just above the national average for the sector, a big improvement on previous years. This was a mistake. Were we so lacking in ambition, so complacent about low sector averages, that we were willing to accept such appalling success rates?

“Well, no,” we replied. We have all sorts of plans in train to improve things further, but for some years now the national average has been the standard benchmark. We made that comparison for them, not for us.

“So why are you talking about national rates?” she asked. “This isn’t about other colleges, this is about you.”

We were dumbfounded. There was only one reason to talk about national averages. There is only one reason anyone ever talks about national averages. Ofsted. It was Ofsted who started comparing everyone to the national average. It was Ofsted who initiated a Sisyphean bid to make all schools better than the mean. And so, for Ofsted, we prepared data making the comparisons we knew they would draw. No one gave us the memo that said “stop doing that; it will now be used as evidence that you lack aspiration.”

We drew her attention to the value-added progress rate of our students, which is low but positive. She appeared momentarily surprised that the sector as a whole has negative progress for maths. Then she tried to tell us our data was wrong because negative progress was not possible. I watched in amazement as my manager explained to her, as tactfully as possible, how the value-added figure was reached. Flustered, she fell back on her earlier insistence that it was “not good enough to be better than the sector, because the sector is so low.”
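For anyone outside the sector wondering how progress can be negative: in simplified form (the real methodology weights and adjusts further, so treat this as a sketch), value-added averages the gap between each student’s achieved grade and their prior attainment:

\[
\text{VA} = \frac{1}{n}\sum_{i=1}^{n}\left(g_i^{\text{achieved}} - g_i^{\text{prior}}\right)
\]

A GCSE resit cohort in which most students slip from a grade 3 to a grade 2 averages out below zero, which is the sector-wide picture for maths; a cohort that on balance holds or improves its grades comes out positive, even if only just.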

We tried another tack and began showing her our plans for this year.  We led with a new assessment policy focussing on mastery of the basics rather than exam cramming.   

“What’s the impact of that?” she asked.

“We’ve only been doing it for half a term,” we replied. “We’re at the stage of ensuring it’s implemented consistently.”

“So, you’ve done no analysis of its impact?” 

“How would we analyse impact after half a term?” we asked. 

Silence.

“Well, let’s look at the next policy,” she said, with an air of tiredness, exhaustion even, at so stupid a question.

We introduced a new homework policy for maths. “Impact” is the new buzzword, I thought, so I led with that. I showed her that in the first half term almost 80% of students had done some homework. Don’t compare this figure to what you’re used to in secondary. Getting maths homework off drama students is about as easy as getting mathematicians to dance at a disco, and a lot of colleges have effectively given up. This was a huge improvement for us and, I thought, a demonstrable impact. She looked over her glasses at me and said with a smile,

“So, what’s the impact of that?”  Her voice dripped with such derision that it was almost a rhetorical question. 

“Well,” I said, “What I am showing you is impact.  We have a new policy, we have done a lot of chasing up and engagement with homework has gone up dramatically.” 

“Yes,” she said, now removing her glasses, “but what is the impact of that?” 

“You mean, what is the impact of them doing homework?” I never thought I would have to explain to an Ofsted inspector why students doing homework was a good thing. 

She nodded with the smile of a patient teacher, whose weakest student is finally beginning to understand. 

“I hope the impact will be improved knowledge of maths and better results...” 

“So, you don’t know if it’s had any impact at all. That hasn’t been very successful, has it?”

She began to move on to another point, but I could not resist pushing her on this. 

“Sorry,” I said, “but just so I know for next time, how would you demonstrate that students doing homework has had an impact?  What would that evidence look like?” 

“Well...” She had a similar look on her face to when my manager explained mean progress scores to her. “Well...” She thought a little longer. “Look, you don’t have to do anything in particular to show the evidence. We’re not asking for folders of evidence. If something is working your students will just be full of it.” Once more in her stride, she threw her hands up in the air in a gleeful demonstration of a student being full of the joys of maths homework.

“But why do you think the policy is not working?” 

“Well, it’s not working, is it?” I could feel my colleague’s irritation transferring itself from her to me as I kept pressing.

“Why is it not working?  I want to understand what you have seen that tells you it’s not working.” 

At this point she finally talked about the only maths lesson she had observed. One maths lesson. In a college where at least 20 take place every day. And in that one maths lesson, she did not see or hear anyone talking about homework. The teacher did not explicitly refer to it in her planning. So that was it: it wasn’t working. Never mind that I could show her the actual homework. Someone had to say “my, that homework I did last night really had an impact on my learning” in front of her, or it had not happened.

The meeting went on in more or less the same vein. Our principal was sanguine about it, and apparently did not much care what the inspector thought. Her own theory was that the inspector did not understand the sector, or the data, and compensated for it with aggression. It meant a lot to have her support, and I can only imagine how we would all have felt if we had not had it.

We came out ok. We got a reasonable grade. Other areas got a reasonable grade too, some of which I know have serious flaws that went unnoticed. In the end, she just said “yeah, it’s all ok” without really analysing or understanding anything. I got the impression she thought her job was to make us all as miserable as possible, before benignly telling us to carry on as before.

This experience, along with similar stories from colleagues, is why I cannot yet get excited about the latest noises coming from Ofsted. Yes, they are saying all the right things. They don’t want to see obsessive, useless data gathering. They want to see workload being tackled. They will no longer focus narrowly on outcomes. When I listen to Amanda Spielman talk, it sounds wonderful.

The problem is that it won’t be Amanda Spielman who comes to inspect my workplace. It will be someone else from Ofsted, who may or may not be interested in the latest debates. Who may or may not have nodded off during their last CPD session. Who may or may not have gone into the inspectorate to escape their own woes in the classroom. Of course, there are great inspectors too. But it’s the luck of the draw whether you get one.

It is why I am particularly worried about Ofsted looking for “integrity” in schools. How will integrity be measured? Will it be obvious? It may be, to diligent, intelligent and experienced inspectors at the top of their game. What about the others? It won’t be long before mediocre schools are training their staff to show integrity and, sometimes, fooling mediocre inspectors.

The question Ofsted needs to ask itself about any new policy is not just “What will the impact of this be on schools and colleges?” but “What will this look like when it is interpreted and applied by the least interested, least competent members of our team?” Because somewhere, in someone’s workplace, it will be.