FE Labyrinth

Documenting a journey through the labyrinth of further education – @FeLabyrinth on twitter – email FE-Labyrinth@mail.com

Could our different inspectors please talk to each other?

Or, at the very least, could they appear to know of one another’s existence?

Halfway through the second day of the inspection, I visited a class of ten functional skills students with my designated “deep dive” inspector. They ranged in ability from Entry Level 1 up to Level 1. If you teach functional skills yourself, you’ll know this means it’s a challenging group to pitch a lesson to. After we left, the inspector asked me, with a knowing smile,

“Do you think that is a viable group?”

“In what sense?” I asked, thinking ahead like a chess player trying to work out where the trap was being laid.

“In terms of the range of abilities and needs in the session.”

“It’s certainly a challenge,” I replied, remaining non-committal until I knew where this round was going.

“Are you under pressure from SLT to avoid splitting the class? Is it about saving money?” The question seemed, to me at least, to imply that saving money was a grubby motive. It irked me a little and my game slipped.

“Yes, of course.” She looked surprised, as though I had willingly confessed to a crime. “We’re always under pressure to keep class sizes up. But not from SLT. When we’re audited, we’ll be challenged if they are too low.”

“Oh,” she said, “who audits you?”

What a question! Who audits us? The FE commissioner’s inspections are almost as stressful as Ofsted inspections, and the ramifications of a poor outcome are every bit as serious for the staff and students. The annual internal audits are a huge part of management’s workload at the end of the year.

This may have seemed like a small thing to the (Ofsted) inspection team we dealt with. Indeed, the off-hand comment was quickly forgotten and we moved on to other conversations, but it highlighted to me one of the fundamental tensions running through the heart of FE. We are a public service acting like a business. Or is it the other way around? It probably depends on whether you ask a curriculum manager or the head of finance.

We struggle, under decades of funding neglect, to provide certain basics for our students. The financial pressure means we are constantly pushing for new enrolments (for new business) and pushing back at the boundaries of what we should accept in classroom standards.

The way we are inspected in FE reflects the duality of our mission. The FE Commissioner inspects the college from a perspective of financial competence. They want to know whether we are running well as a business. Their concerns about classes are limited to whether teachers are being well utilised. Are classes too small? If they are, they could be merged, and the college would make a saving. Are teachers given too much “free” time for planning and development? If so, they could be reduced in number. It’s about making sure public money is being spent efficiently.

Ofsted, however, have a different agenda. They are focussed purely on the educational experience of the students. They don’t care how much an intervention costs if it’s good for learning. An Ofsted inspector sees one-to-one sessions with struggling students and gives a big thumbs up. An auditor looks at the same thing and sees a waste of money.

As funding gets tighter, these two agendas come into increasing conflict and it would be nice if the two organisations would speak to each other. At best, they could agree on certain approaches to give a consistent message. At the least, they could make sure they both know the other one exists.

Intent, Results and the Right Inspector

This is the first of a series of short pieces exploring themes that emerged from my recent Ofsted inspection. You may want to catch up on the last inspection first (“What is the impact of that?”) for some context.

The main difference between this inspection and the last was the quality of my inspector. It was, of course, still stressful. That the stress was not added to by a feeling of being unfairly dealt with was a great relief, however. Talking to an inspector who is determined to find fault and enjoying their position of (enormous) power is grim. It reminded me that our experience of Ofsted is determined at least as much by the team implementing the framework as by the framework itself. This time, I was lucky.

Whereas last year the defining experience (for me) was a surreal conversation about “impact” with an inspector who did not appear to fully understand the statistics she was grilling us on, this time the conversations revolved around teaching methods and evidence of whether students were learning or not. Where my inspector was critical, I felt (in most cases) that the criticism was fair. That is certainly an improvement, though there were still contradictions and frustrations during the visit.

One thing was clear from the beginning. They pointedly did not wish to see internal data. My own ‘deep diver’ at least gave me the opportunity to present her with some data, but I got the impression it did not feature heavily in her decision making. Another colleague’s deep diver put it more bluntly.

“We’re not looking at data,” he said to my fellow manager, who was trying to present him with a ring binder full of impressive results.

Having read about the new framework, I was not surprised to find that data had dropped in importance. However, I expected them to make some concession to the very advice they gave us when we were last inspected, only one year ago. They told us to improve certain results relative to the national average, improve internal data gathering and make it easier to see “at the click of a button,” how individuals and courses were performing. We worked hard on realising those goals. We presented Ofsted with the result of a year’s work to those ends, and they weren’t interested. We were even told that we should not focus too much on “just getting the results.” Well. Schools and colleges up and down the country did not start obsessing over national averages for no reason. In the long run, I welcome the reduced focus on data. I could not help wondering, though, what Ofsted would make of it if I told a class to go away and revise algebra for a week, and then tested them on geometry when they returned. To complete the analogy, I could feign shock at my students’ inevitable outrage and respond with a gasp,

“But you weren’t only revising algebra because I told you to, were you? I mean, you really should have been revising geometry too, because you love the subject so much.”

What were they looking at, then, if they weren’t interested in data? Curriculum planning. They were keen to understand why we put the curriculum together in the order we chose. Fortunately, we have given that a lot of thought. Our students come to us having already sat through maths and English several times, so it’s sensible for us to think differently about sequencing. I did wonder what this conversation is like for my counterparts in secondary schools, though. Does a head of maths in Yorkshire need to have clear reasons for their sequencing which are different, say, from a head of maths in Sussex? Is there something in the air in different regions that means mathematics needs to be approached in a different way? Is it so bad to download the exam board’s recommended scheme of work and go with that? I’m not convinced it is.

I tried to imagine how the ‘curriculum intent’ conversation might go for a head of maths who said,

“For four thousand years, people have needed to know about finding factors before they could cancel fractions, so we do factors before we do fractions.”

It seems odd to me that Ofsted want every school to reinvent the wheel and come up with a unique curriculum sequence, when we are all delivering the same knowledge. If there is a specific order in which it’s best to do things... could they just tell us what it is? Perhaps this conversation makes more sense in subjects like history and geography, which might indeed be thoughtfully modified to account for location.

Their other interest was what students could remember. They pulled them out of lessons and asked them maths questions, sometimes about topics they studied three months ago. I quite liked that. What better way to test if they are learning? Where the inspector deemed learning to be poor, it usually matched my judgement. My only quibble is that it places huge stakes on a small number of conversations. Where such importance is placed on anecdotal evidence it is bound to lead, sometimes, to inaccurate judgements. It is also easy to steer these “conversations” in such a way they provide confirmatory evidence of a judgement the inspector has already made. This brings me back to the quality of the inspector. In my case, she was open to changing her mind and being directed to additional evidence. She seemed admirably determined to make a fair, objective judgement. But what if the same inspector as last time had been conducting those conversations? It doesn’t bear thinking about.

Finally, there was one contradiction running through the whole inspection. Do results matter now or not? The inspectors made it clear they were not interested in data, either internal or external. They also insisted that they did not want to see “exam hot-housing,” and “teaching to the test.” However, when we asked why they had chosen the courses they did for a “deep dive” it was because attainment figures had recently slipped back. So, if your exam results are not good enough, they will come in to check that you are not focussing too much on exam results. There is a clear contradiction here and it highlights my feeling that Ofsted have not yet decided for themselves whether and how much results matter.

The Infinite Resource Machine

Last week I was looking for a worksheet on the general term of a linear sequence. A quick search on the usual website turned up some interesting results. Among the various options were three resource bundles, ranging from £1 to £3, that looked suspiciously similar. Not wanting to buy them merely out of curiosity, I clicked on the preview pages. Two were identical. So at least one person was profiting from stolen work. This is quite common on resource sharing (and selling) websites. Why? There’s a bit of background to fill you in on first about why these websites are so heavily used.

When I attended the first lecture of my teacher training course, the advertised topic was “differentiation.” It amuses me to recall that, since I was training to be a maths teacher, I thought this might be a session on how to teach basic calculus. Differentiation (as a pedagogical practice) was still a new thing in those days. It wasn’t one of those fads that comes and goes in the space of five years, though. Differentiation has stuck around.

The theory behind it all sounded perfectly plausible when the lecturer revealed it to us: every student is at a different stage of development in their understanding of a given topic, and the exercises you give them need to reflect their particular needs. Of course, I had not yet taught a lesson, so I was still thinking very much in the abstract. The practical ramifications of this new philosophy only occurred to me when I began observing lessons in school.

During the first lesson I observed there were at least five different worksheets being completed at any one time (the teacher may have been trying to impress). Gone were the rugged textbooks with their bent spines and phallic illustrations from which I had worked throughout my own time in school. Everything was printed on sheets, with colour-coding at the top to identify the right level of challenge. I was struck by how much paper was being used in every lesson. “Where do you get these sheets from?” I asked. “Oh, I like to make them myself, so I know they’re appropriate for the class I’m teaching,” came the suave and terrifying reply. (He was definitely trying to impress).

When I prepared to teach my own lessons, the expectation appeared to be that I would craft my own resources, tailor-made to the precise educational requirements of the students I knew. Not only did I have to provide at least three different levels of difficulty, but it had to be appropriate “for that particular class.” In other words, if I didn’t want to justify exactly why I had chosen a specific worksheet, it had to be new. It was both disheartening and impossibly time-consuming. Preparing for a single lesson took hours and the end result was often terrible in comparison to the professional quality of a publishing company’s book.

But when I relayed these difficulties to my peers on the PGCE programme, there followed a great deal of hilarity. “You’re actually making your own worksheets?!” a friend asked me, all but wiping tears from his eyes. “Why don’t you just use TES?”

That was when I discovered what had replaced the humble textbook. It wasn’t teachers up and down the country busily writing their own worksheets for their unique students, who were confused in such unique and original ways that an entirely new set of practice questions was called for. Nor was it the “online textbooks” that have been rolled out with zero impact year after year since the digital revolution began. It was the vast repository of original, copied, edited and reprinted resources that you can find on the Times Educational Supplement website. The discovery of this cache of ready-made worksheets was a great relief to me. I have since found other reliable websites as well. Soon I was able to find what I needed for a lesson, put my name on it and print thirty copies, in under ten minutes.

The problem with this approach is that I wasn’t really tailoring my questions to the specific class in front of me. I was merely pretending to (which turned out to be just fine by my observer, actually). In fact, quite often, I was planning my lessons around what resources I could find at the last minute, in such a way that said resources would appear to have been designed for the class and lesson. It’s interesting to look at which worksheets get the most downloads. The ones with dense rows of questions, resembling a page from an old-fashioned textbook, seem particularly popular. Add an answer sheet (like the back page of a textbook has) and you’ve got a hit, top-rated resource. It’s true that Ofsted have changed their tune about textbooks in recent years, but the culture of education is too saturated now with suspicion of those basic, wonderful tools, for them to make a widespread comeback as the staple classroom resource. I’ve had colleagues who photocopied pages from books when they wanted to use them, so they could hand out sheets rather than risk appearing as the kind of teacher who “does textbook lessons.”

It is in this environment that the TES resources website has become so heavily used and, more recently, monetised. Lessons at the top end, priced up to £5, are often sold with the promise that they won an “outstanding” grade in an observation. Worksheets are always “differentiated.” Often, though, these resources are also copied. The editability of the worksheets is part of the appeal, but documents sold in easily edited formats like Word or PowerPoint can also be easily re-sold under a different name. Original creators of content have found there is very little they can do when they see someone else selling their work.

All this individual purchasing from teachers’ own pockets, copying, editing and endless photocopying is not done to improve the education of our students. It is done to circumvent the weird taboo that has emerged in teaching around relying on a textbook. Somewhere, in a parallel universe in which that taboo never arose, teachers are wasting a whole lot less time, less paper is being shunted from the copier to the bin, less money is being spent – and students are having almost exactly the same experience.

How do they get those results?

The national pass rate for GCSE maths re-sits in FE is currently around 16%. Fewer than one in six students having a second (or third, or fourth) go at achieving a grade 4 actually gets it. For some time, I’ve been looking for a shining example of a college which bucks this trend.

Last year I saw an advert for a position at a nearby rival (we’re corporations, remember – that's how we think of each other).  The advert highlighted that the successful applicant would be joining an “outstanding” maths team.  To be honest, I don’t want to work in an “outstanding” department.  I’ve done so before and hated it.  I’d rather work in a really good one, but that’s a difference to explore another time.  Though I did not apply for the job, it led me to wonder what their results were like.  I looked them up.  They were much higher than the shockingly low national rate.  How, I asked myself, have they done this?  And why are they not famous for it?  Why are they not travelling the length and breadth of the country showing the rest of us where we err? 

I did a little ferreting around and found my answer.  I am sure their staff are dedicated and capable (outstanding, even) but that is not what explained the high results.  It was when I found their enrolment strategy, passed to me by a friend whose partner worked there, that I said “Ah! That’s how they do it!”  This brings me to the first of three strategies I have come across for improving results without making anyone better at maths. 

They enrol everyone for GCSE maths, including the students who already have a grade 4.  The vast majority of those with a grade 4 or above go on to achieve exactly the same grade they already had – but it still counts in that college’s attainment figures!  This sounds like a trick too easy to get away with, but Ofsted were full of praise for the policy.  “All students are given the opportunity to improve their maths,” the report stated.  In a way, that was true.  Except teaching hours for those students were set at the bare minimum and a blind eye was turned to non-attendance figures which were (I am told on the grapevine) somewhat massaged at the end of the year.  It’s not surprising they didn’t make any progress.  They didn’t need to.  All the college required was for them to tread water and it would make them look wonderful.  It’s a cynical policy, chosen for cynical reasons – but easy to present in a positive light to an inspector on a tight schedule who likes the statistics it produces. 
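The effect on the headline figure is easy to see with some made-up numbers. Here is a minimal sketch; the cohort sizes below are my own invention, purely to show the mechanism, not any real college’s figures:

```python
def pass_rate(passes, entries):
    """Pass rate as a percentage."""
    return 100 * passes / entries

# A genuine re-sit cohort performing at roughly the national rate.
resit_entries, resit_passes = 100, 16

# Enrol 50 students who already hold a grade 4; nearly all of them
# simply repeat the grade they already had, which still counts.
holder_entries, holder_passes = 50, 48

print(f"Re-sit cohort alone: {pass_rate(resit_passes, resit_entries):.0f}%")
print(f"With grade-4 holders added: "
      f"{pass_rate(resit_passes + holder_passes, resit_entries + holder_entries):.0f}%")
```

Nobody got any better at maths, but on these invented numbers the headline rate jumps from 16% to 43%.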

Here’s another method some colleges employ to improve their results, this time in retrospect!  Suppose you have 100 students sitting GCSE maths and only 16 of them pass.  You would think that’s a pass rate of 16%, right?  Not necessarily.  A truly outstanding maths team does not give up just because the exam was two months ago.  What if 30 of the students who failed return to study with you the following year?  In this case, you can retrospectively decide that those students were on a two-year course.  That means they’ll count in next year’s attainment figures and now, all of a sudden, your 16 passes are out of 70 instead of 100.  Your pass rate just went up to 23%.  Congratulations!
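Spelled out as arithmetic, using the same numbers:

```python
entries, passes = 100, 16
print(f"Honest pass rate: {100 * passes / entries:.0f}%")

# 30 of the 84 who failed re-enrol, and are retrospectively
# reclassified as halfway through a "two-year course" - so they
# simply drop out of this year's denominator.
rollovers = 30
print(f"After the rollover trick: {100 * passes / (entries - rollovers):.0f}%")
```

The passes are unchanged; only the denominator shrinks.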

These students are called “rollovers,” and they are an administrative pain in the rear end, as they have to be enrolled on year-old codes and tracked and monitored separately even though they are in the same classes as everyone else.  It’s more grist for the mill of the vast FE bureaucracy, more man-hours and salaries spent on form filling and paper chasing that benefits none of the people using the service but is so endemic within it. 

Finally, one of the main measures of success for colleges is the general attainment rate.  This is simply the percentage of all students who pass their qualification.  It makes sense for BTECs and other vocational courses to be judged in such a way (apart from the powerful incentive that gives to fudge the in-house assessments and coursework), but it makes no sense at all that GCSEs are rolled up into that figure.  A student currently at grade 2 in GCSE could do either Level 2 Functional Skills (L2 FS) or GCSE under the new conditions of funding.  However, no head of faculty I know wants their students in a L2 FS class.  The reason?  A grade 2 student might not get a grade 4 in their exam this year, but even a grade 1 counts as a positive achievement for the department.  If you fail L2 FS, it doesn’t matter how close you were, you go down as a negative.  So, even when a student has failed GCSE three times in a row but might pass L2 FS, people often won’t risk putting them in for it.  Nobody trusts an inspector to look past the headline figure and try to work out what’s going on behind the scenes.  Having experienced a number of poor inspections, I can’t say I blame them. 
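The incentive described above can be reduced to a toy rule. This is my own simplification of how each outcome feeds the attainment figure, not an official funding formula:

```python
def counts_as_achievement(qualification, passed):
    """Toy model: does this entry go down as a positive?

    A GCSE entry produces *some* grade, so it counts whatever happens;
    L2 Functional Skills is pass/fail, so a narrow miss is a negative.
    """
    return True if qualification == "GCSE" else passed

# The grade-2 student who fails to improve this year:
print(counts_as_achievement("GCSE", False))  # even a grade 1 counts
print(counts_as_achievement("L2FS", False))  # a near-miss does not
```

Under that rule, GCSE is always the safe entry for the department, regardless of which qualification the student is more likely to pass.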

All of the above tactics are, as far as I am aware, legal if not entirely above board.  There are lots of decisions made in FE that fall into a sort of grey area, where you’ll get away with them if the college generally smells good and is not in financial trouble.   

What also boggles my mind is how easy these practices would be to stop.  You could end all three with two simple steps:

1) Stop counting GCSEs as an achievement if the student doesn’t improve their grade.   

2) Insist that all students who sat an exam count on the figures, with no exception for “rollovers.”   

I cannot fathom why these loopholes persist and are so easily exploited.  The only conclusion I can draw is that the government has simply taken its eye off FE and doesn’t care that much what happens here.  Colleges are all businesses now, so as long as they are financially stable that’s all that really matters.   

How We Make the Predicted Grade Sausage

A walkthrough for those wondering whether to ask their teams to produce one.

Now the exams are over, all that is left this year is to prepare for the next.  It’s a chance to clear out cluttered rooms, re-write schemes of work and print resources for the first weeks back – and of course, because this is the world of education, to gaze into our crystal balls and predict what will happen on results day.

Most governors and SLT like to know in advance what result the students will get.  Some are content with broad achievement predictions.  Others are keen to know how many will achieve a grade 4, how many a grade 5 and so on.   

I’ve been in colleges where the teachers have refused to guesstimate grades, and bitter feuds have arisen between the unions and SLT.  I’ve been in colleges in which the accuracy of a teacher’s predictions was used to analyse how well they “knew their students.”  I even heard of a secondary school which made its year 11 students complete, the week after they had sat it, the very GCSE paper they had just done – so the school could mark it and know in advance how well the cohort had performed!  Apparently, the students downed tools and walked out.  These are the lengths we go to in order to see two months into the future.  Why?  Perhaps because at this stage there is nothing more we can do, and nobody likes to feel as though they are doing nothing.

The following paragraph is a confessional.  It’s a day in the life of the data sausage factory.  I hope to convince you that it’s not worth predicting in advance how well your school or college will do.  You can’t know.  All you can do is create work and acrimony with staff.  Or, at least, if you are going to do it, think about why staff might be afraid to present you with the truth, and how that might affect the numbers.  Think about whether you want a realistic picture, or something to reassure you before the summer break.

It usually works like this: SLT asks upper middle management (UMM) for the performance predictions.  UMM make a beautiful spreadsheet, full of top-notch conditional formatting and delightfully complex sub-categories of grading, and ask lower middle management (LMM) to get it filled in.  LMM take it to the lecturers, who say they have no idea what the students will achieve.  LMM say we can’t tell that to UMM, just put your best guess in.

Some lecturers spend an hour on it, thinking over and over about which student will get what grade.  Some spend minutes on it (and are possibly more accurate as a result).  Others predict that everyone will fail, on the grounds that they will be held to account if they predict passes which they don’t deliver.  Some lecturers predict that everyone will pass, because it’s just easier and they are retiring next week and by the time those chickens come home to roost it will be someone else’s nest.

LMM takes this hot mess back to UMM, who declares that it simply won’t do.  You can’t predict a result that’s too low (or too high) or the next few weeks will be hell.  LMM reads between the lines and asks their teachers for another prediction with some tacit guidance on the figure they should ideally reach.  Some lecturers comply, but others refuse to fill it in a second time.  LMM changes a few 3s to 4s and 4s to 5s because it’s getting late and they want to go home and it doesn’t matter anyway.  They come across a handful of students that nobody claims to have taught and randomly select grades on the basis of how familiar their name is, or which song is on Spotify at the moment, or what they gave the last three students and how that will affect the mean.  LMM sends the sheet back to UMM, who does some further ‘tidying’ to make sure the bottom line is more or less 0.5% better than last year, and submits the sheet back to SLT.

SLT know, on some level, that the data is absolute crystal bollocks.  They know this because they were once lecturers or managers doing what it took to keep the next level up off their backs.  The Governors, perhaps, are the only ones who take it seriously, and are concerned on results day when at least one of the measures is significantly different from what was predicted and nobody else is in the least bit surprised.

I’m sure there are schools and colleges around today that don’t bother with this dance anymore, but we still do it every year.  It’s one of the curses of technology.  Just because a tool exists, doesn’t mean it has to be used.  Just because you could make a really nice spreadsheet for gathering predicted grades, does that mean you should?  It is a ritual with built in layers of decision making that exist to protect SLT from seeing something they won’t like.  It would be better to just wait until results day. 

An Example of College Funding Working as Intended

At half past eight on Wednesday morning a group of Ghanaian ladies are bringing their children to the nursery outside our college.  The ten of them are coming in for a day of lessons, which begins with English language, covers maths and a little science and ends with ICT.  It started with one woman enquiring where she could improve her English.  It ended with her and all her friends getting everything they could out of our college in a great example of an institution really trying to serve its community, not just making itself available.  And, crucially, it is one of the few experiences I’ve had which puts the ‘education as a business’ funding model in a positive light.   

There are many problems in FE which can be traced back, in part, to the funding model which has shaped its culture since 1992.  I won’t rehash arguments I have already made in the past, but I have linked it to poor behaviour management culture (https://write.as/fe-labyrinth/behaviour-in-fe-the-problem-with-the-solution), lack of SLT involvement in day to day teaching and the dominance of business departments such as marketing over curriculum (https://write.as/fe-labyrinth/the-business-of-business-planning ).  However, having experienced one positive impact of the “funding per qualification” system it seems only fair to write about that too. 

The broader context of this positive is not so good.  It’s a silver-lining situation.  The cloud is that finance recently informed us we were several thousand pounds below our projected AEB (adult education budget) and needed to find some extra money.  We scrabbled everywhere we could for it.  Anyone who passed an Entry Level test was practically begged to enrol on the next one up.  I’m not ashamed to admit we canvassed some of the less qualified staff (jobs ride on budgets, after all), but we didn’t have much luck.  That is, until we stumbled across a rich vein of untapped ESOL (English as a second or additional language) students.  Our English staff asked the first of our Ghanaian students if she would like to do the next level after she passed an exam, and got into a conversation with her about what exactly she needed.

It turned out her son would soon be attending school in the UK.  What she really wanted, more than being able to speak English, was to help him understand his lessons and homework here.  He would be taught subjects he had begun to learn in another language.  She wanted to know the maths vocabulary, what was taught in the science curriculum, how to use a computer to check his homework and so on.  What’s more, she had several friends in the same situation. 

We went to work finding a way to accommodate them.  We merged classes that had shrunk due to drop outs or passes.  We closed a couple of support sessions that were poorly attended and we worked out who was now free and what they were qualified to deliver.  Our staff come from such a wide range of backgrounds (and were so willing to help out and try something new) that we soon stitched together a bespoke curriculum for adults wanting to improve their English and learn how to support children through the English national curriculum.  It was a good thing to do, and that made everyone feel good.  It also helped that we could tie several qualifications to it and go some way to solving our AEB problem. 

I am certainly not arguing that the way to get colleges thinking creatively about serving the community is to threaten their jobs if they don’t come up with something.  It would be nice if that situation had arisen, not out of a desire to avoid losing our jobs, but simply out of a desire to do them as well as we could.  In a better funded sector, the incentive could be positive (bonuses for expanding the business, say) instead of negative.  In the long run, that would elicit the same effort without the long-term disillusionment that can set in when funding is always on such a knife edge.

However, there is clearly some benefit to local communities in incentivising colleges to reach into every nook and cranny to find out where they could help.  It forces us to go out to the community, rather than waiting for the community to come to us.  It means our staff want to help people through the bureaucracy of enrolment, rather than leaving them to figure it out for themselves or fail and walk away.

Whether these positives match the poisonous impact the funding model has on behaviour standards, academic rigour and overall efficiency is up for debate, but I thought it time to acknowledge that some positives exist which we would not wish to discard. 

Two Stories of Violence on Campus

Does your college put students first? Great. Which ones?

Early in the Autumn term, one of our teachers went on long-term sick leave.  We found a supply teacher who seemed well qualified and prepared, and he took over the classes.  Early on, we knew we had problems.  Students came to complain that they could not understand the lessons, and the tutor complained that the same students would not listen.  Leaving aside the details of the story, I want to focus on a single incident and the issues it raised.  In one lesson a girl, who I will call Kira, threw pen lids at the teacher.  My manager and I felt it was clear what should happen: that student should be removed from college.  Her head of department felt differently.  She struggled to concentrate in the badly taught lessons, he wasn’t listening to her, she had anger management issues, etc.  Everyone defending Kira’s place on the course began with the phrase, “I’m not excusing her actions, but...”

She was excused.  Though barred from maths lessons, she was kept on her main course and it was unclear what (if any) sanction was put in place.  That was not the end of it.  Several months later a new student came to my colleague Mike’s class.  She very politely explained that she had changed course and been transferred to this group, and asked if she could sit in today while the admin was being sorted out.  Mike agreed (I know, technically he shouldn’t have without confirmation, but this is FE and messy course changes happen all the time.  He was trying to be helpful).

Halfway through the lesson the “new student,” Kira, leapt out of her chair and attacked someone.   Mike tried to separate the flurry of bites, scratches and screams and, as he did so, Kira’s nails actually flayed a strip of skin from the other young woman’s arm.  After she had been taken to A&E we learned what had happened.  It was something to do with which one of them owed money to the other over some drugs they had purchased together.  Mike was almost reprimanded.  This had happened, apparently, because he allowed a student to enter his class who was not on the register.  The union objected, rightly, on the grounds that Kira should not have been permitted to remain in the college after assaulting a member of staff earlier in the year. 

In a separate incident a student was chased out of the building and attacked by three of his peers in a shopping centre near college.  It followed an argument about his past behaviour toward one of their girlfriends, with whom he had previously been in a relationship.  Both the alleged behaviour in the relationship and the assault were serious incidents, but when the various charges were all dropped by mutual agreement, the college leadership decided all four students should be allowed to remain on their courses.  Since the assault had taken place off site, it was outside our jurisdiction.  This was something of a technicality, since they had started pursuing him from the college lobby.

In the past, I have explained our tolerance of poor behaviour, terrible attendance and bad attitudes in terms of the college funding model.  For large numbers of students causing low-level disruption and regularly truanting a handful of sessions a week, this makes sense.  Most colleges could not afford to enforce the kind of standards they would like to see in an ideal classroom.  Our tolerance of serious incidents like the ones above, though, cannot be explained in this way.  Although the college is big, you could count on one hand the number of such cases in an academic year.  We could easily afford to lose two or three students.  We could easily afford to say “if you throw things at your tutors, you can’t study here.”  So, why don’t we?

I have a theory.  When we are trained as teachers, we are told it’s all about the students.  When we get a new job somewhere, we are told it’s all about the students.  When marketing produce their publicity material, they say “our college puts students first” (I think it’s a rule of college marketing teams that they literally all have to say this somewhere in the prospectus).  When governors interview for a new principal or head, they want reassurance from the candidates that they will put the students first.  Of course, it is all about the students, but the platitude is so ubiquitous we rarely think about what it actually means.  The way we usually take it is: I put whichever student is currently demanding the most attention from me first.    Now factor that into the context of college leadership teams who rarely step into a classroom – their only regular interactions with students are disciplinary meetings.  That means their only opportunity to demonstrate how much they put students first is to give whichever unruly character is currently sweet talking them a second, third or fourth chance.  The same is true for governors.  I once approached one at a lunch which had been organised to allow middle managers to meet the governors and tactfully brought up the question of what exactly our standards were.  She wrung her hands and sighed deeply, then told me with the compassionate smile of a great sage that we did have to keep standards high, but we also had to put the students first.

The problem with this, of course, is that there are more students in the college to think about than just the ones demanding the most attention.  The most disruptive, violent and dangerous students in our setting occupy the majority of our leaders’ limited contact time with the student body.  But they make up less than one percent of that body, and their continued presence in it damages all the rest, sometimes quite literally.  A second chance for the pike means bullying, distraction and possibly a trip to A&E for the minnow.

In the case of Kira, she would not have had the chance to abuse a fellow student if abusing a member of staff had been a red line.  In the case of the boys who committed the violent assault in a shopping centre... who knows what further developments are down the line?   

No one goes to work in a college because they hope one day to kick a student out of it, but an increasing number of my colleagues are looking around and asking each other “why are we allowing this?”  We wonder what message it sends to the civil majority when they see how accommodating we are of uncivil behaviour from a small minority.  Perhaps it gives them the impression they don’t need to try so hard.  Or, perhaps it makes them feel like we are not putting them first. 

Who needs numbers anymore? 

This week a student dropped in to a support session looking for help with written arithmetic.  This is more noteworthy than it should be.  Whilst all our young learners need help with arithmetic, most of them believe such “primary school stuff” to be beneath them.  Besides, they say, there are barely any questions on that in the exam anyway.  They’re right.  Even in the new “rigorous” GCSE, only one of the three papers is non-calculator.  However, he was preparing not for his GCSE maths exam, but for an armed forces aptitude test.  After a quick flick through his practice book I began to wish all our students had to sit this test.  If passing it were a prerequisite to studying for the GCSE, I bet the GCSE pass rate would be a lot higher.

I was enjoying showing him how to multiply decimals when the student provided yet another pleasant surprise.  His go-to method for multiplication turned out to be the column method, rather than the grid.  Why was this a pleasant surprise?  Once I saw that he was able to multiply in quick and efficient steps, I felt instantly reassured.  I was certain I could teach him anything else he needed to know.  As a teacher in secondary school, I rarely met a student who could use the column method.  When I did, though, they were usually among the quickest learners.

I understand why the grid method is taught in primary (and secondary) schools today.  To someone unfamiliar with either approach it looks clearer on the page.  Its proponents state that it is more “intuitive” and provides students with a deeper understanding of what is happening when you multiply.  It also has the advantage of being a bit newer, and everyone likes to be doing something new.  But basically, it’s easier.  Like “chunking” (its equivalent in the world of division), it is also painfully slow and requires only a weak grasp of place value.

In my last two years of primary school I had an ancient maths teacher who made us spend the best part of a lesson working through pages of sums in silence.  There were multiplications in there over which my students today would go on strike.  Huge ones, with multiple steps involving four and five-digit numbers!  I derived immense benefit from it.  It takes a lot of repetition to achieve fluency in a complex method.  Solving problems at speed makes us better at them.  But you cannot solve long multiplications at speed using the grid method.  Beyond three-digit by two-digit numbers it is just too clunky.  It is a gimmicky method, designed to get a student through simpler questions and allow the teacher to tick off that section of the curriculum.  Lip service is paid to “learning other methods later,” but from what I saw in secondary school that is rarely realised.
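The clunkiness is easy to quantify.  As a rough sketch (my own illustrative Python, not anything from a scheme of work), the grid method produces one partial product for every pair of non-zero place-value parts, so the amount of work multiplies as the numbers get longer:

```python
def place_value_parts(n):
    """Decompose n into its place-value parts, e.g. 47 -> [40, 7]."""
    parts = []
    power = 1
    while n:
        n, digit = divmod(n, 10)
        if digit:
            parts.append(digit * power)
        power *= 10
    return parts[::-1]  # largest part first, as on a grid

def grid_partial_products(a, b):
    """One grid cell per pair of parts: every cell must be worked out, then summed."""
    return [x * y for x in place_value_parts(a) for y in place_value_parts(b)]

# 47 x 36 needs a 2-by-2 grid: 1200, 240, 210 and 42, then an addition.
print(grid_partial_products(47, 36), sum(grid_partial_products(47, 36)))
```

For 47 × 36 that is a manageable four cells, but a five-digit by four-digit multiplication of the kind my old teacher set can generate twenty partial products to work out and then add up, which is exactly where the column method’s compact carrying pays off.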

Most of the people who observe and rate lessons today would issue a judgement of “needing improvement” to my primary school maths teacher.  Modern commentators on education decry such practices as a waste of time in the age of the smart phone.  But I am convinced the ease with which I later picked up more advanced mathematical topics is directly related to my fluency in written arithmetic.  I see evidence of it every day in the students I teach.  Our older adult learners arrive confident in long multiplication and short division, having had it drilled into them in their early years of school.  They have often never heard of stem and leaf diagrams or probability trees, and yet they pick up these supposedly more advanced concepts with remarkable ease.  Meanwhile, our young learners have been prepared by well-meaning professionals for a world in which everyone has a calculator on their phone.  They didn’t waste time drilling written arithmetic, they gave it a quick pass, got it right once and barely practised it again.  They moved on quickly to the skills and concepts that matter in the modern world.  But they never mastered them.  They come to us having spent five years learning about probability, yet still they cannot understand it.  Within months, the more numerate adults (for whom this is often brand new) have overtaken them.

Our assessment as it stands emphasises knowledge that is the tip of the iceberg.  The huge body beneath the water is the ability to apply precise algorithmic steps – and that comes with confidence in number work.  You cannot teach it once and move on, terrified of going back to something you’ve already covered because you won’t be showing progress.  It needs to be practised over and over, and then constantly maintained.  It’s the key to unlocking everything else.  It just doesn’t look very exciting to the man at the back of the room with a tick sheet on his clipboard.  In FE we are as obsessed as everyone else with showing progress in lessons.  Ever with one eye on our retention figures, we are even more squeamish than schools about making students do things they don’t want to.  Woe betide you if you are caught spending a term making grade 3 learners practise how to add, subtract, multiply and divide.  “Why are you teaching them primary school topics?” is a question that is asked far more readily than “Why on earth can’t they do this?”  So, we continue plastering over the cracks, ignoring the fact that the whole edifice is built on shaky foundations.  Even if we get it passed by the building inspector (a grade 4 in this thinly stretched analogy) we know it won’t hold up in the long run.

Amidst the recent talk of scrapping GCSEs, I would like to propose instead adding a new compulsory qualification.  Make everyone sit and pass the armed forces aptitude test before they are even allowed near an algebraic expression.  Call it the GCSE readiness test.  Make them renew it in year nine before starting the KS4 curriculum.  Make schools and colleges give numeracy the importance it deserves, not as a set of antiquated methods rendered obsolete by modern technology, but as the key to unlocking everything else we want our students to learn in their maths lessons. 

Why Functional Skills Can’t Be What People Want It To Be

Every now and again someone outside the world of education bemoans the lack of numeracy qualifications for post-sixteens.  There are many students, they argue, who need to improve their maths but don’t need to learn about trigonometry and surds.  They need a course on using numbers practically, something that emphasises the relevance of mathematics both in everyday life and in technical jobs.  They need something, essentially, like functional skills (FS). 

A clamour then follows from the profession that such qualifications already exist.  If only employers valued them more, we could release our resit students from doing a course which they have failed already (often more than once) and see as less relevant to them with every passing year. 

This complaint, that people do not adequately respect functional skills, often comes across as though the fault lies purely with those who will not accept a Level 2 certificate instead of a GCSE.  People should respect FS and if they don’t it is because they are ignorant of its content, the argument goes.  All we need do is raise employers’ awareness of the course.  This has been tried and has not worked.  I think it is worth considering another, less palatable, reason why the qualifications are not well regarded. 

Last year we had a student in one of our FS maths classes (let’s call him Chris), who struggled at the beginning with adding two-digit numbers.  He was not disruptive or lazy and he was making good progress in his main course.  He paid attention and worked diligently, but he found it really, really hard.  Over the year he made a little progress.  We have a couple of very experienced Entry Level teachers and they used every trick in the book to get him doing some basic arithmetic.  At the end of the year he sat an Entry Level 3 exam and passed it.  We gave him a Level 1 paper to see where he was with it, but he would clearly need a lot more teaching before he could attempt the exam.

However, his course leader was not happy.  Chris could not progress without a full Level 1.  (For those unfamiliar with this level of qualification, Entry Levels 1, 2 and 3 come first, then Level 1 and Level 2.  Level 2 is, in theory, an alternative to a good GCSE.  Many BTEC courses require a Level 1 certificate in Maths and English to progress from year 1 to year 2).  We explained that we had entered him for the hardest test he could pass this year.  He could try Level 1 next year after further studies (this would mean resitting his first year, or moving onto a bespoke programme doing more hours of maths and keeping his main course on the back burner).  We thought that was the end of it.

This year we enrolled Chris in a Level 1 class, only to be told by our auditing team that he was in breach of funding.  He could not do Level 1 FS, they said, because he already had that qualification.  We checked.  It was true.  He had been booked into Level 1 FS in the early summer, brought in for an “intensive maths course” (not with us), and had miraculously passed the exam.  So now he had to do GCSE.

There is obviously a great deal to be said about the impact this will have on the student.  I want to focus, though, on the impact on the overall qualification.  In other words, the impact on all students who have fairly obtained the certificate.  Chris (and other students like him) will one day arrive at an apprenticeship having shown a Level 1 FS certificate at the interview.  At some point the employer will set him a task requiring a rudimentary level of numeracy.  He will not be able to do it.  The next time someone comes with an FS certificate, the employer will say “go back and get a GCSE.”

I don’t know how widespread this malpractice is, but I seem to meet a lot of students in GCSE classes who have done Level 1 FS either in school or college, yet cannot do basic arithmetic.  And I can see why it happens.  In our profession, both institutions and individuals are held to account for what results they achieve.  Combine that with informal, internally invigilated exams and you have created the perfect conditions for the erosion of integrity. 

There are grey areas here, too.  It’s not all out and out cheating.  Some staff find themselves alone, invigilating a small group of students who they know personally.  Perhaps one of them struggles to read the question and the invigilator puts a little more emphasis on certain words than they should do.  There are people who will be aware of this happening, but know that looking the other way and celebrating the high attainment figures is in their own interest, too.  And of course, once people realise this goes on in other institutions, why should they be the ones to suffer at the hands of Ofsted because of their own honesty? 

I am not making excuses for this behaviour.  I find it unprofessional and depressing.  But it is worth looking at why it happens.  We will sooner change poor systems than change human nature.  And better systems lead to a better, more professional culture.  The only way I see to raise the profile of FS qualifications is to assess them with the same rigour as GCSEs:  once a year with both internal and external invigilators.   

I once put this argument to an exam board representative.  He made the valid point that the flexibility of FS is one of its attractions and a rigid, annual assessment schedule similar to the GCSEs would make it less accessible to adults.  He also seemed to disbelieve that the integrity of the exam is a real issue in determining its status.  But employers can put two and two together (even if their students can’t).  Perhaps FS would be less accessible to some who need it were it to be assessed more formally, but nobody needs a qualification that has ever-diminishing currency because it is not trusted to reliably indicate ability.

A lifetime of target setting

One rainy day in year nine I arrived a little late to form time and discovered that my teacher had arrived early and begun an activity with the class. This was highly unusual. He was one of those old-school useless-but-wonderful types. Form time for us was a loose twenty minutes in which we chatted to whoever was next in the alphabetic seating plan before lessons started. I know the best teachers and leaders around today would decry that twenty minutes as a horrible waste of time, but I can’t help feeling sorry for its loss. Not every moment of every day needed to be measured and accounted for. Form time now is a fraught affair of hurried uniform checks, maths and English revision, money management, British values and all the bits and pieces that get dropped into the curriculum because someone stands up in parliament and says “schools should be teaching this.” It was during form time of year nine that I forged one of my most enduring friendships swapping science fiction books with the boy next to me. On this particular morning though, there would be no talk of time travel, the ethics of meat eating or evolution. I arrived to find a sheet of paper on my desk with three empty boxes on it.

“What’s this, Sir?” I asked.

“You need to write targets, Colin.”

“What?” I asked the boy next to me.

“Don’t know what he means either,” he muttered.

Having overheard us, our form tutor explained,

“You have to think about what you could do better and write that down, and at the end of the year you’ll evaluate it on the back.”

This was my first encounter with setting myself targets. I did not know that I would eventually learn to write them as quickly, absently and uselessly as tying a tie.

“What happens if we don’t achieve them?” my friend asked.

“Nothing,” replied our form tutor, candidly.

“What happens if we do achieve them?”

“Nothing.”

See, I told you he was wonderful.

I wrote: 1. Improve my handwriting 2. Revise more 3. Daydream less. Although I got better at the second (several years later, at university), I never made much progress on the other two. Luckily, I never had to complete my evaluation either. Those first targets went the same way as all the rest that followed them. A drawer somewhere, and then the bin when the drawer space was needed for someone else’s targets.

This was my first experience of a practice that is now so widespread we can barely imagine what life was like before people regularly articulated what they thought other people might want them to want, as a sort of professional ritual. I became adept at writing the twaddle that was deemed acceptable. “I will practice using the accusative in German. I will revise trigonometry. I will read more than just science fiction.” In other words, I will be in school. In the first semester of university I had to write ‘personal development targets’ which included making new friends and trying new activities (i.e., being in university). In my first job as a mortgage complaints handler I set myself targets which, apparently, my pay progression depended on meeting. It later transpired that nobody cared about them and promotion depended on being able to do the job. When my boss realised that I was competent, he changed my targets to ensure that I had already met them.

When I went into teaching and started a PGCE I discovered that I would now have to inflict target setting on other people. Indeed, given the amount of lecture time dedicated to this topic (as opposed to, say, behaviour management) it seemed it would be a central aspect of my role... with a difference. This time they would be meaningful. If I believed that writing targets had previously been a complete waste of time, paper and human spirit, I was right! How foolish we all were, laughed the lecturers... How foolish we were not to have used ‘SMART’ targets! Those were the bad old days. The Dark Ages. Education had moved on. This was the decade of smart phones, smart cookers, smart children’s toys and now SMART targets, too. SMART targets would make a real impact on our students’ lives. Why? Because they were – Specific. Manageable? Achievable (I know that’s right). Realistic, or recognisable? Timely, timed... something about time...

Whatever.

I never remembered what all the letters stood for because they were obviously just five words chosen to make the acronym ‘SMART’ and in practice were no different from what I first experienced in year nine. You just had to write more and jump through more linguistic hoops. It was no longer enough to say “I will revise more.” You had to say “I will revise for 10 minutes longer each day over the next month by maximising the efficiency of my time management skills in a blah blah blah you get the idea...” Bonus points if you could shoehorn the word ‘synergise’ in.

Of course, there can be great value in writing down our personal priorities and clarifying our thoughts on what is important to us. When I started this job, I set myself some goals for the end of the year. I kept them in a Word document on my desktop and regularly reminded myself what I thought mattered when admin was cluttering my day. It helped keep me focussed on a manageable set of priorities. A bit like targets are meant to do. But the difference between that and the guff I wrote in my official appraisal is that it was private. It was not written to tick someone’s box. And that makes all the difference. One of my goals was to make sure staff were teaching the full lesson rather than finishing early, which was common. To write that on a formal document would have been inviting disaster. Another was to change the culture around equipment and have the students bring their own calculator. That looked too trivial for SLT, but it’s a battle we’re still fighting and it matters on so many levels: personal responsibility, smooth-running lessons, not to mention actual performance in the exam.

The problem with making people write targets, whether they are SMART targets, SMARTER targets (yes, they exist), SMARTAA, CLEAR or even PURE targets is precisely that we are making them do it and looking over their shoulder at what they write. No matter what acronyms we come up with, the practice is empty unless a person is doing it honestly for themselves. If we wanted them to get something out of it, we’d tell them to do it privately, but of course that cannot be tracked and ticked off.

So, who are targets written for if they are not written for the employee or student? In the private sector they are for HR, who need to create the appearance they are constantly adding value to employees. In education they are written for SLT and the governors. “Do all the students have SMART targets?” someone will ask in a meeting. If the reply is yes, those present feel reassured that important conversations are happening and students know what they are supposed to be doing. But what they have actually ensured is that good lesson time is wasted measuring (inaccurately) whether or not lesson time is being put to good use. It’s one of those things, like collecting meaningless predicted grades, that makes people feel in control of the unknown. A better use of their time would be simply walking into classrooms, talking to lecturers and students and seeing for themselves what is happening.

NOTE: Some people are more versed than I in the history of SMART targets. Sam Shepherd sent me a piece he wrote on it which I found very interesting (you can find it here: https://samuelshep.wordpress.com/2011/02/02/whats-so-smart-about-targets-arguments-against/).