Intent, Results and the Right Inspector

This is the first of a series of short pieces exploring themes that emerged from my recent Ofsted inspection. You may want to catch up on the last inspection first (“What is the impact of that?”) for some context.

The main difference between this inspection and the last was the quality of my inspector. It was, of course, still stressful. It was a great relief, however, that the stress was not compounded by a feeling of being unfairly dealt with. Talking to an inspector who is determined to find fault and enjoying their position of (enormous) power is grim. It reminded me that our experience of Ofsted is determined at least as much by the team implementing the framework as by the framework itself. This time, I was lucky.

Whereas last year the defining experience (for me) was a surreal conversation about “impact” with an inspector who did not appear to fully understand the statistics she was grilling us on, this time the conversations revolved around teaching methods and evidence of whether students were learning or not. Where my inspector was critical, I felt (in most cases) that the criticism was fair. That is certainly an improvement, though there were still contradictions and frustrations during the visit.

One thing was clear from the beginning. They pointedly did not wish to see internal data. My own ‘deep diver’ at least gave me the opportunity to present her with some data, but I got the impression it did not feature heavily in her decision-making. Another colleague’s deep diver put it more bluntly.

“We’re not looking at data,” he said to my fellow manager, who was trying to present him with a ring binder full of impressive results.

Having read about the new framework, I was not surprised to find that data had dropped in importance. However, I expected them to make some concession to the very advice they gave us when we were last inspected, only one year ago. They told us to improve certain results relative to the national average, improve internal data gathering and make it easier to see, “at the click of a button,” how individuals and courses were performing. We worked hard on realising those goals. We presented Ofsted with the results of a year’s work to those ends, and they weren’t interested. We were even told that we should not focus too much on “just getting the results.”

Well. Schools and colleges up and down the country did not start obsessing over national averages for no reason. In the long run, I welcome the reduced focus on data. I could not help wondering, though, what Ofsted would make of it if I told a class to go away and revise algebra for a week, and then tested them on geometry when they returned. To complete the analogy, I could feign shock at my students’ inevitable outrage and respond with a gasp,

“But you weren’t only revising algebra because I told you to, were you? I mean, you really should have been revising geometry too, because you love the subject so much.”

What were they looking at, then, if they weren’t interested in data? Curriculum planning. They were keen to understand why we put the curriculum together in the order we chose. Fortunately, we have given that a lot of thought. Our students come to us having already sat through maths and English several times, so it’s sensible for us to think differently about sequencing. I did wonder what this conversation is like for my counterparts in secondary schools, though. Does a head of maths in Yorkshire need to have clear reasons for their sequencing which are different from those of, say, a head of maths in Sussex? Is there something in the air in different regions that means mathematics needs to be approached in a different way? Is it so bad to download the exam board’s recommended scheme of work and go with that? I’m not convinced it is.

I tried to imagine how the ‘curriculum intent’ conversation might go for a head of maths who said,

“For four thousand years, people have needed to know about finding factors before they could cancel fractions, so we do factors before we do fractions.”

It seems odd to me that Ofsted want every school to reinvent the wheel and come up with a unique curriculum sequence, when we are all delivering the same knowledge. If there is a specific order in which it’s best to do things... could they just tell us what it is? Perhaps this conversation makes more sense in subjects like history and geography, which might indeed be thoughtfully modified to account for location.

Their other interest was what students could remember. They pulled them out of lessons and asked them maths questions, sometimes about topics they had studied three months ago. I quite liked that. What better way to test whether they are learning? Where the inspector deemed learning to be poor, it usually matched my judgement. My only quibble is that it places huge stakes on a small number of conversations. Where such importance is placed on anecdotal evidence, it is bound to lead, sometimes, to inaccurate judgements. It is also easy to steer these “conversations” in such a way that they provide confirmatory evidence of a judgement the inspector has already made. This brings me back to the quality of the inspector. In my case, she was open to changing her mind and being directed to additional evidence. She seemed admirably determined to make a fair, objective judgement. But what if the same inspector as last time had been conducting those conversations? It doesn’t bear thinking about.

Finally, there was one contradiction running through the whole inspection. Do results matter now or not? The inspectors made it clear they were not interested in data, either internal or external. They also insisted that they did not want to see “exam hot-housing” and “teaching to the test.” However, when we asked why they had chosen the courses they did for a “deep dive,” the answer was that attainment figures had recently slipped back. So, if your exam results are not good enough, they will come in to check that you are not focussing too much on exam results. There is a clear contradiction here, and it highlights my feeling that Ofsted have not yet decided for themselves whether, and how much, results matter.