Monday, 11 January 2021

Department for Education - Requires improvement or Inadequate?

Like many other sectors, the English exam system has been thrown into turmoil by the impact of the Covid-19 pandemic. For the first time ever, exams were cancelled in the summer of 2020, and now they have been cancelled again for 2021. All of the changes made as part of these cancellations have been made under the direction and leadership of the Department for Education (DfE), with Gavin Williamson, the Secretary of State for Education, having overall responsibility.

With the cancellation of the 2021 summer exam series for A-levels and GCSEs, the schools and colleges responsible for exams are once again in a state of confusion, and another cohort of students is uncertain about how their qualifications will be awarded. The exam system has now been in turmoil and uncertainty for ten months, and that is set to continue at least until GCSE and A-level results are released in August 2021.

Of course there have been impacts in many sectors, and the perhaps overused but appropriate labelling of 2020 and 2021 as “unprecedented times” may excuse some aspects of this uncertainty. However, when the changes to exams for 2020 and 2021 are looked at as a whole, a pattern emerges of missed opportunities and ignored warnings. Let’s start in March 2020…

Closure of schools in March 2020

When Gavin Williamson announced on 18th March that schools would close to the majority of pupils from 20th March, it instantly raised the question of what would happen to those students with exams scheduled at the end of the academic year. (announcement)

What the DfE could have done is place exams “under review”, at least for a few weeks, while the impact of school closures was evaluated and consultations progressed on possible routes for providing grades. Keeping exams under review would have given students in exam years a clear reason to stay engaged with schools and complete their courses. Through review and consultation with schools and colleges, alongside Ofqual and the exam boards, it may have been possible to establish some form of structured, standardised, moderated assessment that could be conducted later in the academic year.

What the DfE actually did was immediately state on 18th March that all exams were cancelled for summer 2020. With schools closed and no exams, it quickly became clear that large numbers of Year 11 and 13 students were disengaging from their school and college courses altogether.

In April 2020 Ofqual issued guidance making it clear that any work done by students after 20th March must be treated with extreme caution (see here). This instantly made any further work set for, or done by, examined students pointless, and therefore most schools and colleges stopped setting any at all for exam years.

The impact of these decisions was that many Year 11 and 13 students disengaged from school or college altogether, and as a result many of them will not actually have completed the courses in which they now hold GCSE or A-level qualifications.

Keeping exams under review in the early days of school closures would certainly have increased pressures on schools, which would have had to include provision for exam years as part of their hurried moves to remote learning. However, ruling out exams altogether at such an early stage certainly harmed the integrity of the qualifications that those students now have.

Method of awarding grades in 2020

What the DfE could have done is use Centre Assessed Grades (CAGs) alongside statistical models, and in fact this is what they said they were going to do (see below). This should have ensured there was a process to compare CAGs with the grades expected from statistical models, with allowable variances built in to recognise that professional judgements of the 2020 cohort might not conform perfectly to historical statistics. Where CAGs were clearly out of line with the statistical models, supporting justifications could have been sought from schools and colleges (in fact the Ofqual guidance issued in April 2020 [here] stated that centres should retain records of supporting evidence “… in case exam boards have queries about the data”). If the justifications were accepted then the CAGs could stand; where justifications were insufficient the statistical model might overrule, or, where there were large differences, there could be an element of splitting the difference.

This would of course have required discussion of contentious grades between exam boards and exam centres before results days; however, there was plenty of time to plan the processes for this. Alternatively, if the results could not have been discussed in advance, a fuller appeals process could have been established to allow schools and students to submit convincing evidence where an individual was badly served by the statistical model.

Included in all of this could have been the general principle that the statistical model would never raise grades awarded by centres. While it is logical, natural even, for teachers to err on the side of optimism when awarding grades, if a professional has stated that a student should receive a B grade at A-level, it is likely because they feel the student genuinely isn’t of sufficient ability to secure an A grade.
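
To make the proposal concrete, here is a minimal sketch of how such a reconciliation might have worked. To be clear, this is my own illustration, not anything the DfE or Ofqual specified: the numeric grade scale, the one-grade tolerance and the function names are all assumptions.

```python
# Hypothetical sketch of a CAG/statistical-model reconciliation.
# Grades are numeric (e.g. GCSE 9-1); all names and thresholds are illustrative.

TOLERANCE = 1  # allowable variance between CAG and model before a query is raised

def moderate_grade(cag: int, model_grade: int, justification_accepted: bool) -> int:
    """Return the awarded grade for one student."""
    # General principle: the statistical model never raises a centre's grade.
    if model_grade >= cag:
        return cag
    # Small departures from the model stand as professional judgement.
    if cag - model_grade <= TOLERANCE:
        return cag
    # Larger departures trigger a query to the centre for supporting evidence.
    if justification_accepted:
        return cag
    # Where the evidence is insufficient, split the difference rather than
    # letting the model overrule the teacher outright.
    return (cag + model_grade + 1) // 2  # rounds towards the CAG

# Example: a CAG of 7 against a model grade of 4, justification rejected -> 6
print(moderate_grade(7, 4, justification_accepted=False))
```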

What the DfE actually did was design an algorithm that only applied statistics to the rank orders provided by centres and allowed no substantive scope for appeal.

While it was stated in the Ofqual documentation that CAGs would be considered, the algorithm only considered the rank order of students, with the CAGs completely ignored, even when there were multiple grades of difference between the CAG and the statistical expectation.

Despite stating that schools should keep suitable evidence to support grading decisions, there were no processes in place to use this evidence as part of any appeal, and no proactive action was taken to query discrepancies between CAGs and the algorithm. Where the algorithm assigned a grade that differed from the grade given by teachers, the algorithm was taken as correct without question.

The algorithm artificially raised some grades when it calculated that the school was entitled to more high grades, even if the teachers knew for a fact that the students did not deserve those grades. The algorithm also lowered other grades from the CAGs when the statistics didn’t fit the individuals, completely ignoring the grades that teachers had assigned, with no scope for hard evidence to be submitted in challenge.

The process even forced schools to rank students differently when their performance was academically identical on all school measures. When the algorithm was applied, students with the same CAG but different ranks were treated very differently, meaning students on the same CAG were awarded different grades, sometimes 2 or 3 grades apart.

Early warning of the problems with an algorithmic approach emerged with the issue of results in Scotland, where the devolved Scottish government rapidly U-turned and reverted to awarding CAGs. This was ignored and dismissed by the DfE for England, and A-level results were issued via the algorithm on 13th August regardless.

The appeals process established was based only on challenging factual errors in the application of the algorithm, or cases where there were large-scale statistical differences that the algorithm had failed to account for. There was no scope to appeal small-scale local departures from the statistics, or clearly unfair outcomes at an individual level.

On 17th August, just four days after A-level results were issued, the decision in England was overturned, reverting to CAGs while also retaining any grade uplifts from the algorithm. This meant that not only was there an element of grade inflation from the CAGs, but this was actually increased via the algorithmically uplifted grades.

On 27th August 2020 Boris Johnson placed the blame for the resulting fiasco on a “mutant algorithm” (here). As an A-level Further Mathematics teacher who teaches a unit on “Algorithms”, I can assure you that algorithms do not mutate. The key point of an algorithm is that it does specifically and precisely what it is designed to do and nothing else. The fact that the outcome did not deliver the desired results is entirely the responsibility of those who designed, tested and quality-controlled the algorithm.

The fundamental flaw was that the algorithm, and all the related quality assurance processes, focused only on the headline statistical level. Nationally the algorithm worked, in that it produced broadly the right pass rates; the same can even be said at whole-school level. However, because the algorithm paid absolutely no notice to the CAGs as judged by professionals, and ignored the impact at an individual student level, it completely failed to build in year-to-year or student-to-student variation.
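
A deliberately simplified sketch shows the failure mode. This is not the actual Ofqual model, and the distribution and student data below are invented; the point is that an allocation driven purely by rank order can match a historical distribution perfectly, so aggregate-level checks pass, while never reading the CAGs at all:

```python
# Simplified illustration (not the real Ofqual model): grades are allocated
# purely by forcing a historical grade distribution onto the centre's rank order.

def allocate_by_rank(ranked_students, historical_distribution):
    """Assign grades down the rank order to match a historical distribution."""
    awarded = {}
    i = 0
    for grade, count in historical_distribution:
        for student in ranked_students[i:i + count]:
            awarded[student] = grade
        i += count
    return awarded

# Ten students as ranked by the centre, with their CAGs shown for comparison.
# Note: the allocation above never reads the CAGs.
ranked = [f"S{n}" for n in range(1, 11)]
cags = dict(zip(ranked, ["A", "A", "A", "B", "B", "B", "B", "C", "C", "C"]))

# Invented historical distribution for this centre: 1 A, 4 Bs, 5 Cs.
awarded = allocate_by_rank(ranked, [("A", 1), ("B", 4), ("C", 5)])

for s in ranked:
    print(s, "CAG:", cags[s], "-> awarded:", awarded[s])
# S2 and S3 have the same CAG as S1 but are awarded a B purely by rank,
# yet the headline grade distribution matches history exactly, so
# aggregate-level quality assurance sees nothing wrong.
```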

The impact of these decisions was massive grade inflation across GCSEs and A-levels for 2020, combined with public uproar and a loss of faith in the exam system. The delay to the decision meant that A-level students whose grades changed missed out on university places in 2020 because courses had already been filled by other students.

Start of the school year in September 2020

What the DfE could have done is state that the intention was to run exams in the summer of 2021, but ensure that a parallel system was in place to guide schools in building a strong evidence base to support potential CAG use in case of disruption.

Possible examples of a parallel system include clear national standards for the timing and nature of mock exams, perhaps even specifying common exam papers to use nationally so that there is a central reference point across schools. Alongside this, or alternatively, the exam boards could have established some core common assessments for each subject, to be taken in schools at key points during the year to provide a common basis of comparison.

Schools could have been given clear and early direction on “Special Consideration” and how it applies to disruption due to the ongoing Covid-19 pandemic (special consideration is the process through which schools make requests to exam boards to account for pupil illness on the day of an exam, or other significant disruption to their exam or preparation). This guidance, for example, might have laid out how many days of isolation or remote learning would qualify for special consideration. Perhaps it could have given clarity on whether it matters when in the year isolation or remote learning occurs, or whether specialist teacher absence should or could be accounted for. If a student is made unwell by Covid-19 at any point in the year, should a consideration be made? If a student is affected by Covid-19-related bereavements of family, close friends or teachers, should special consideration be given? And should account be taken of household impacts such as illness, loss of income, or the furlough of parents?

The above guidance could have been put in place at any point during the autumn term, or at least plans established to put it in place, with a clear process of consultation and a defined point at which a decision would be made. Particularly as Covid-19 cases rose nationally from September through December, and school attendance was being substantially affected, there should have been some response along these lines.

What the DfE actually did was repeatedly state the intention to run exams.

On 12th October Gavin Williamson announced a delay to the exam season to allow students more time to prepare (here).  As part of this he committed that the exams would be “underpinned by contingencies for all possible scenarios”, with a promise to consult further with stakeholders and publish more detail later in the autumn term.

On 3rd December Gavin Williamson announced some further measures to support exams in 2021 (here). These extra measures included a suggestion that students would be told some exam topics in advance, and more help in terms of reference materials in exams; however, no actual detail was given at this point, so schools and students could not respond in any way other than preparing for exams as normal. He also announced that grading would be more generous than normal, which simply means that the grade inflation of 2020 would be carried across into 2021. While he announced an expert group to monitor the variation in learning across the country, he gave no actual detail that schools could work with and plan around. There was no confirmation of any form of contingency planning.

When Scotland cancelled its National 5 exams in October (here) and later cancelled its Higher exams in December (here), and other home nations announced moves away from exams, the DfE resisted all calls to consider alternatives and put contingencies in place for England. This position continued to be publicly reinforced, including the insistence that exams would go ahead, right up to the end of December (here). On 30th December Gavin Williamson emphasised that exam years would be at the “head of the queue” for rapid Covid testing as part of schools reopening in January (here). This position remained the case right up to the point when schools were closed as part of the January 2021 lockdown (here).

The 4th January lockdown announcement (here) stated that “it is not possible for exams in the summer to go ahead as planned.” It also stated that the DfE would “be working with Ofqual to consult rapidly to put in place alternative arrangements that will allow students to progress fairly.”

On 6th January Gavin Williamson went further by announcing that all exams “will not go ahead this summer” (here). This is despite schools being expected to reopen and welcome exam years back well before the end of the academic year. Stating in January that all summer exams are cancelled closes the door on formal standardised testing in any subject, instantly reducing the potential for reliable comparisons across exam centres. The statement included the claim that “The department and Ofqual had already worked up a range of contingency options.” However, it gave proper detail on none of these options, despite the apparent work on all possible contingencies since 12th October. The only fragment of detail given was confirmation that he wishes “to use a form of teacher-assessed grades, with training and support provided to ensure these are awarded fairly and consistently.” So on the face of it the plan for this year is to do the same as last year, overseen by exactly the same people and regulators as last year. The only difference, apparently, is that this year they intend to train teachers better to make sure that they don’t make the mistakes that the algorithm designers made last year. In 2020 a similar statement was made that teachers and schools would be given guidance on how to award CAGs fairly… We know how that went.

Even with lockdown in place the government also allowed the January vocational exams to go ahead. When questioned on this, it advised schools that they could run the exams “where they judge it right to do so” (here), but gave no guidance at all on how to judge whether it was right or not. There is no definition of what would constitute “safe” or “unsafe” conditions for running an exam in a lockdown, and the DfE has resisted all calls to provide one.

Throughout all of this no guidance has been given on how “special consideration” might relate to Covid, and no timeline has been given for when guidance might be shared. In fact there has been no indication that any account will be taken of the November exam series, the January vocational exams that did run, or any other assessments that have happened or will still happen this year.

The impact of these actions is that, despite statements that contingency plans were being made, the exams have been cancelled again at short notice with no detail at all on what those contingencies will be. At every stage the DfE, Ofqual and the exam boards have been completely unprepared for disruption to the year, leaving schools without guidance from these vital organisations. While it has been obvious since the first school had to isolate its first child in September that there would be some form of disruption to the 2021 exams, the lack of available detail, or of action to safeguard comparable assessments during the year, shows a complete lack of foresight.

Schools, colleges, and students in exam years have once again been left with no guidance on what is to become of their qualifications. There are now genuine fears that we will have a repeat of the grades debacle of last time, as exactly the same people are at the helm and absolutely nothing has changed in the meantime to provide more robust preparations for CAGs. Schools are again left unclear on the guidance to give to parents and students.

Overall then….

At every decision point through the pandemic there was a route that would have given a more strategic response, clearer guidance and better outcomes for students and the wider education sector. These more strategic responses did not require hindsight; they simply required foresight and an understanding of the education and exams sector. If staff at the DfE and at Ofqual do not have this foresight then they need to acknowledge it and consult those who do as a matter of routine, ideally educating themselves to the point where they can provide it too.

Despite there being opportunities to find a better path at every stage, the DfE has chosen to take actions that actively reduce guidance, limit the opportunities for robust assessment and harm outcomes for students. Every decision has been taken too late, and right up to the point of each U-turn the DfE and wider government have continued to actively brief and reinforce the previous line, sometimes right up to the point at which a different decision is announced. This is so consistently the case that it could almost be viewed as a systematic attack on the education and examination system, with last-minute changes enacted specifically to destabilise the entire sector.

To make one or two mistakes or errors of judgement during the unprecedented times of Covid I could understand. To make mistakes and errors of judgement at EVERY stage, however, requires active incompetence. To ignore the likelihood of disruption to the 2020-21 academic year and make no contingency plans at all is a sign of complete negligence and a lack of leadership. From the point at which the 2020 exams were cancelled, plans should have been made to ensure we had better information to work with for 2021, with clear parallel processes in place.

Let us remember here, this is Gavin Williamson’s job, alongside all of those who work for him in the Department for Education and Ofqual (Ofqual has an annual budget of the order of £17.5 million). Williamson is paid a ministerial salary of £67,500 per year (in addition to his basic MP’s salary of £74,962) to do this role. Given the lack of action and the negligence evident above, I have to ask what he, the DfE and Ofqual have actually been doing to improve the exam and qualification system since March 2020 (or indeed since he assumed office in July 2019).

They say “to err is human”… To repeatedly and systematically ignore advice and fail to make adequate contingency plans is at best incompetent, certainly negligent, and at worst deliberately undermining. As a country we simply cannot continue treating students in exam years, their parents and their schools like this. The Department for Education has now been complicit in the undermining of qualifications two years in a row; in Ofsted parlance it wouldn’t just “require improvement”, it would be “inadequate”. A school this badly run would have been placed in special measures and its leadership team replaced.

Despite this catalogue of failings (and that doesn’t even scratch the surface of the turmoil in the Primary, Early Years and University sectors for which he is also responsible), Gavin Williamson then had the temerity to suggest that parents should complain to Ofsted if their remote learning isn’t up to scratch, just two days after the government made the U-turn to close schools (here).

This isn’t party political – it’s about leadership, or more specifically the lack of leadership of the DfE, and that is true regardless of the political affiliations of those involved.

Reflecting on all of the above, I make you this offer: pay me £67,500, and it doesn’t even need to be in addition to my teaching salary… in fact just pay me a basic teacher’s salary and I guarantee that I will make substantially more impact on improving education in my first 3 months than Gavin Williamson has in the 17 months he has been in post…

We cannot continue with an inadequate DfE...


Saturday, 9 March 2019

Data isn't always a demon

I've seen and heard quite a lot recently about data being a bad thing and driving high workloads. Indeed Ofsted nod towards this in their most recent framework, emphasising that they plan to ignore internal data about current students as part of their inspections (more here).


There are also loads of blogs and other comments about data being bad, a waste of time and a massive source of workload, an example is here.


It's now my 5th year as the SLT member responsible for data at my school. As such, if I believed all of the hype I could be viewed as the source of all evil in my school, but that misses the point substantially. As with most things, we need to seek balance.


Misuse and mismanagement is the demon
Data as a thing is not the issue. The issue is when data is over-collected, over-interpreted and its importance over-inflated, so that it takes a disproportionate amount of time and resources to create and monitor.


Weighing the pig more often does not make it grow more quickly. Equally never weighing the pig leaves the farmer uncertain of whether the pig is growing at the appropriate rate. The farmer needs some data about the pig to work out if it is being fed enough, if it is healthy, and when it is ready to go to market.


A world without data would be a journey without reference points
If we take the "data is bad" knee-jerk reaction to its logical conclusion we would never assess anything, never measure anything, and never have an opportunity to step back to see if our actions are working.


Sailing hopefully without reference to instruments or other guidance is a great way to get lost. As sailors have discovered since we first took to the seas, taking appropriate measurements is a great way to stay on course. The important bit is that the measures are appropriate, taken at suitable time intervals, and the appropriate corrective action is taken in response.


Well you would say that wouldn't you
Of course, I'm a maths teacher, with a past career in Engineering, and have responsibility for data in the school - I'm bound to say it's important as it's part of my job.


However it's not just my perspective... having the right data around you is part of managing anything effectively - that applies to every business, in every sector. Managers need data to determine whether their organisation is working effectively, where there may be strengths and weaknesses, and where there might be opportunities or risks.


We need to fight misuse of data, not just all data
By railing against data without focus we run the risk of branding it all as bad, and that's simply not true. Lots of data linked to schools is extremely useful, both at a leadership/management level and at a classroom teacher level, but if we continue to brand it all as bad we undermine or ignore the good uses.


Misuse or mismanagement of data occurs for various reasons:
  1. It's collected too often - things take time to develop in schools, so collecting data more frequently than every 6 weeks is highly unlikely to capture meaningful changes, and even 6 weeks may be too often depending on what is being measured - appropriate frequency is vital. A good manager will try to match the frequency to the measure and its intended use.
  2. It's collected but not used - there is absolutely no point in collecting data that is going to be ignored, or if nothing happens as a result of it being collected. It's a waste of time for all involved.
  3. Its accuracy is over-estimated. Any data generated internally by schools is subject to errors, particularly assessment data for students. There will be subjective elements to assessments that vary across teachers, and there will be judgements applied to turn raw marks into grades that may shape those grades inappropriately. Just because it's data doesn't mean it produces an accurate summary of a cohort. And just because a particular approach to data appeared accurate and appropriate one year does not mean it will always be accurate in future years.
  4. It's biased. Intentionally or accidentally, it is very easy to bias a set of data. For example, when asked to forecast a GCSE grade teachers will often bias their predictions based on the message they want the student to hear - some will over-estimate to motivate and encourage, some will under-estimate to spur the student on to do better. Both are potentially incorrect, and either one can have an impact on the student that is the exact opposite of what was intended. When rolled up at a department or school level these predictions can be wildly inaccurate (see the toy simulation after this list).
  5. Summaries such as percentages are used inappropriately (see this post about percentages and their misuse)
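
To illustrate point 4, here is a toy simulation with entirely invented numbers; the 9-1 grade scale, the class size and the bias values are assumptions made purely for the sketch:

```python
# Toy simulation with invented numbers: every student's "true" grade is 5
# (on a 9-1 scale). Optimistic teachers forecast one grade high, pessimistic
# teachers one grade low; each forecast is only one grade out, but the
# department-level summary drifts with whichever bias dominates.

def department_mean(teacher_biases, true_grade=5, class_size=30):
    forecasts = [true_grade + b for b in teacher_biases for _ in range(class_size)]
    return sum(forecasts) / len(forecasts)

print(department_mean([+1, +1, +1, 0]))  # mostly optimists -> 5.75 vs true 5.0
print(department_mean([-1, -1, 0, 0]))   # mostly pessimists -> 4.50 vs true 5.0
```
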
There are other reasons data can be misused, but probably the biggest, over and above those listed, is using data as the end of a conversation. In a professional environment data should be part of the professional journey, used to inform alongside other sources of information and judgement. If we resort to using a single piece of data to sum up an individual's professional worth, or a student's educational achievements, then we have missed the point entirely - it's an evidence base to use as part of a much wider dialogue.


With any data we need to ask 2 questions:
  1. Do we believe what it's telling us?
    1. If so, why do we believe it - what other information supports it?
    2. If not, why don't we believe it - what other information conflicts with it?
  2. What do we need to do next? - we have this data, "so what?"
So what am I saying?
I suppose the bottom line here is not all data is bad, but misuse or mismanagement of it is very bad.


Effective use of assessment, and the data it generates, is a central part of being an effective teacher at a class level, and we mustn't try to hide from this behind a "data is bad, just let me get on and teach" defence. Doing that would cut off a vital source of information for the class teacher. If class teachers don't understand their class's data then we need to train them in how to use it and work with it in a useful way - that's a CPD need, not an issue with the data.


Similarly effective use of data is a key part of leading and managing a school at all levels, but the operative word is "effective". I'll acknowledge that too often in education data is used ineffectively and inappropriately - that is what we need to fight against. But we mustn't throw away the vital information that data can give us alongside this fight. Data isn't a stick to bash people over the head with, it's a tool to be used skilfully to help manage an organisation alongside all the other tools that we need.


We need more people using data as a force for good...

Monday, 21 January 2019

Learning lessons from a pro cyclist

A return post after ages away from blogging... In the time since I last posted I’ve been working on a book about teaching and finding a good work/life balance; it’s now finished and on the way to publication (you can order a copy here: bit.ly/TeachLYII). So now I’m back, one way or another, and plan to do a bit more blogging now the book is done, though perhaps not as prolifically as before - for more on that see the book....

The first thing I want to reflect on in my return to the blog is something that occurred to me recently while in Girona, Spain. I was on a cycling trip at the start of half term and happened to get the chance to chat to a professional cyclist for a while, and amongst various things I was just blown away by his focus and the attention to detail that the teams apply in chasing victory. While we were chatting, a fellow amateur asked Tom (the pro) what advice he could give about training to help us amateurs get the best from it and get faster. Tom’s key advice was “make sure every ride has a purpose.”

Every ride has a purpose
In cycling you can get reasonably fit and fast by just riding your bike a lot; however, this will eventually reach a plateau where improvements do not come without some more specific thought. If rides are unstructured they can easily become “junk miles”: they may well count as miles under the wheels and hours in the saddle, but they do not actually improve your fitness because they don’t challenge your limits enough. Tom’s advice is to know what each ride is for. It could be a hard workout to stress a certain type of muscle or body system, it could be a deliberately easy ride to allow recovery from another effort, or it could even be just a gentle spin with family or friends. The important thing is to know what you are aiming to do with each ride so that you get the intensity and type of exercise right, and therefore get the changes you want in your body.

Every "thing" has a purpose
It occurred to me that there are some parallels to Tom's advice that could easily be applied to teaching. It is easy to plough on through a term counting down lessons to exams or ticking off weeks before a holiday, and as part of that it is all too easy to lose track of what each lesson, day, week, or term is “for”. Lessons normally have objectives, so it is clear what the aim is, but perhaps it is worth reflecting on why that should be the aim.
  • Is it the right aim for your students or is it just the aim that happens at that point in the scheme of work?
  • How does it fit with progress to date?
  • How does that link to the bigger aims of the current unit of work or for the year?
From a whole school perspective what is week 18 “for”? What needs to be accomplished in that week to make sure things are on track? How does the activity in week 10 differ from week 20, or week 30? Is that a deliberate difference that targets a particular outcome, or is it purely accidental, depending on the energy levels and recent inspirations of the staff involved?

Beyond all of this it’s worth a thought about what a particular year in school is "for"?
Some years have a clearer, more obvious purpose...
  • Reception, where the purpose is to settle the students into school. 
  • Year 6 is where SATs are completed and the transition to secondary school starts. 
  • Year 7 is about settling in to secondary school. 
  • Year 11 is about successfully completing exams and choosing the post 16 routes.
  • and so on...
However what are the other years “for”? Are Years 1-5 essentially waiting to be Year 6? Are Years 8-10 just biding time until Year 11? With reports like Ofsted's "Key Stage 3: the wasted years?" it is clear that there is the potential for this time to be the educational equivalent of Tom's junk miles.

In these intermediate years what are the consequences of having a good or bad year at a student, teacher, department or school level? How do we measure or assess success in these years? Should we measure purely academic progress, or is it more about engagement and inspiration? With the direction that Ofsted are taking for 2019, in terms of the primacy of curriculum breadth and opportunity, it is probably time for us to consider whether we are doing some of these things deliberately or not.

Digging deeper, there are the other things we do in school, sometimes just because it's what we do, not necessarily to fulfil a firm purpose. Some examples:
  • What is an assembly “for”?
    • More specifically what is this assembly for?
    • How do we know it has the impact we want it to have?
    • Is there a better way to get the same or even more impact?
  • What is a particular meeting “for”?
    • Is a morning briefing "just another briefing" or does it have a purpose beyond the mundane?
  • Does the parents' evening have a purpose beyond just something we always do?
I could go on and on with this list... For each of these and all of the other things we do I think we should ask ourselves whether it is done with purpose or just done. Do we really need to do it? Is there a way to get more from the events so that the routine becomes something that delivers real added value for very little extra effort?

A reflective start
I suppose this is quite a reflective return to blogging - I'm not necessarily offering any great ideas or innovations here either. It's more about questioning what we do and why we do it. Pro cyclists advise against junk miles, we need to guard against junk days, weeks and years.

Keen to hear your thoughts - it's been a while!


Tuesday, 13 October 2015

School's own data in the Ofsted inspection data dashboard

Hopefully if you are involved in secondary school data you will already be aware of the publication of the Ofsted "Inspection dashboards" which are available via the RAISEonline website.

However there is an issue with the data that will be shown in these dashboards: for many schools in 2014 there is a difference between best and first entry, meaning that the dashboards may distort the picture. For some schools this may also be the case for 2015.

Furthermore, the dashboards published so far do not allow comparison with the latest data - potentially imminent in its release - so it's useful to know what to expect.

To try to overcome some of these issues while still presenting the data in a format very similar to the official dashboard I have thrown together a spreadsheet that emulates the Ofsted layout as much as I can (given Excel's limitations).

The file can be downloaded here: http://bit.ly/Inspdashboard.


The file as shared contains dummy data (mostly derived from the anonymised version of the dashboard that is available in the RAISEonline library), which you can overtype with data from your school. Obviously you'll need to do the actual calculations for the numbers yourselves, or with whatever package your school uses to process/estimate the stats for your figures. The sheet simply presents the data in a way that is similar to Ofsted's official version. Note that this was made so that I could present data within my school, so I'm afraid some of the sheets are a bit fiddly to use - I've not had the chance to make it slick, but it does work if you take the time to enter your data! For most of the sheets it's fairly obvious where the data entry goes - I'm sure you'll figure the rest out with a bit of trial and error.

My preferred way to print the file is to get Excel to print the entire workbook. It does come out with a blank page at page 6 when done this way - I've not been able to fix that, so either ignore it or don't print that page.

Hope this is useful!

Saturday, 11 April 2015

The Pareto principle and the great divide

I just scared myself looking at my last post...it was at the end of January! So far this academic year has seen just 5 posts to my blog - in previous years I averaged 1 post per week during term time. In fact I've just passed the 2 year anniversary of starting this blog and I've now written less since September 2014 than I have at any time since I started it. I think it's about time I explained why...

SLT role
Good grief, it's busy on SLT! I wrote a post (here) back in October about my first month on SLT and the multitude of unexpected pressures that suck away at your time. In all honesty I've had to stop posting to my blog so regularly because I've simply been so busy that I couldn't justify any more time sat in front of a computer writing a blog post each week. I've wanted to post; in fact I've started a decent number of posts and then saved them as drafts because I wasn't able to finish them to a point where I'd be happy to share them publicly. Indeed, half of this post was written months ago, but it seems to fit with what I wanted to say today.

This isn't intended to be a whinge about workload or that kind of thing so don't worry about digging out the violins, however I do have some observations about the tensions caused by the SLT role.

Is there a great divide?
As I got closer to, and then started, my new role as assistant head I became increasingly aware of the perceived divide between SLT and middle leaders/mainscale teachers. I know it's not just at my school - I've seen it at every school I've been in - but it becomes really clear as you get up close to it and then jump across into SLT. I think it's particularly visible to me because I stayed at the same school.

It started with the jibes from my fellow Friday afternoon staff footballers who were counting down the number of games I was likely to be at before I become "too important" to play with them. I'm pleased to say this has stopped as I've carried on playing!

It continues with discussions about timetables: as an assistant head I teach less than half the number of lessons I did as a head of department, leading to jokey conversations about me having plenty of time to think up jobs and initiatives to keep the middle leaders and mainscale teachers busy.

The thing is, the structure of a week for a member of SLT is so massively different to that of a middle leader that they look like completely different jobs. Compare SLT to a classroom teacher and it's even more different - I actually teach only slightly more lessons than a main scale teacher gets as PPA time.

I'm not necessarily saying it's wrong, but it changes your perspective of the job massively. A marking policy that is completely manageable on an SLT timetable can become tough to manage as a middle leader, and completely impossible as a main scale teacher. Teaching a difficult group when you have loads of time to prepare and recover, plus a level of seniority to trade with, is very different to having a similar group amidst full days of teaching.

Of course much of the time not teaching as SLT is spent doing management & leadership functions, so the "loads of time" is a fallacy, but the perception is there from those outside of SLT even if it's not the reality.

I like to think I'm fairly approachable; I try to make sure that I spend time talking to colleagues at all levels of the organisation, and have great working relationships across the school. A great part of that, plus the fact that I wasn't always SLT at this school, means that I think some are more open and honest with me than they might be with the other members of the leadership team. I know for a fact that some will say things to me that they wouldn't dream of saying to others in our SLT. This gives an interesting insight at times...

No time for perfection
I think the biggest actual disconnect I've discovered as part of all of this is the perception from many staff that all actions from SLT are deliberate, and that SLT have complete control over everything that happens in the school. Now I'm not suggesting that we go around blundering into things and have no control over what goes on, but we are a team of humans, and with that come limitations. Similarly, we work in a setting that has a massive number of stakeholders with a vast array of needs and agendas, and we are subject to legislative, judgemental, procedural and time constraints that limit or shape things in all manner of ways.

Sometimes the size of a team means that not everyone can be consulted properly in advance. Sometimes the speed at which a decision needs to be made means that only the most central consequences can be considered (see the comments about Pareto below). Sometimes an element of planning falls between areas of responsibility, meaning it gets missed. Sometimes a task is simply not done quite as perfectly as it might have been, because a human did it or because they ran out of time. All of these are issues to be guarded against, and even planned for and mitigated, but none would be done deliberately. However, I know from experience that the consequences of what I know to be a small human error at SLT level can be seen as a direct decision to undermine or cause issues for other staff.

I've seen a situation where a given member of SLT has been working as hard as they could on something, but due to the realities of life in schools it ended up being delivered to a wider audience slightly before it was completely finished. The reception from others in the school was frosty: "it's a half-baked idea", "they've not considered the impact on...", "why couldn't they have told us that weeks ago?" and so on. The expectation from the school as a body is that SLT have all the answers and have the time to plan everything out fully. The reality is that there is so much going on that there is too often no time to complete any task fully; sometimes it just has to be good enough to get the job done.

Pareto principle for leadership
If you've not heard of the Pareto principle, it stems from the observation by the Italian economist Vilfredo Pareto that 20% of Italians owned 80% of the land. This has been extended to business in various ways, with assertions that 80% of profit comes from 20% of your clients, or 80% of sales come from 20% of the sales team. It is also used in health and safety, where 80% of injuries come from the 20% most common incidents, and so on.

In my experience it can fairly safely be said that about 80% of the behaviour incidents in a school come from just 20% of the students (you know which ones!). Similarly, about 80% of the high grades come from 20% of the students, and 80% of absences come from about 20% of staff. I could go on...

Furthermore, another aspect of Pareto in terms of leadership is that if you consult 20% of the stakeholders in a given decision then you'll probably expose 80% of the potential issues. Due to time pressures and general day-to-day management constraints it is common for leaders to have to revert to this 80/20 rule in all sorts of situations in order to see the wood for the trees. Many leaders do this unknowingly, some knowingly, but rarely is it applied in a cold, deliberate way; however, the Pareto principle can be used to prioritise all sorts of things in all sorts of ways. Yes, it's a rule of thumb, but it actually fits reasonably well in many situations, and it forces a level of perspective that can help get the job done (there's a quick illustration below).
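
As a quick illustration of checking that 80/20 pattern, here is a short sketch run against an invented set of behaviour-incident counts; the student IDs and numbers are made up purely for the example:

```python
from collections import Counter

# Invented behaviour-incident counts for ten students (illustrative only).
incidents = Counter({"stu01": 55, "stu02": 25, "stu03": 5, "stu04": 4,
                     "stu05": 3, "stu06": 3, "stu07": 2, "stu08": 1,
                     "stu09": 1, "stu10": 1})

total = sum(incidents.values())
top_fifth = max(1, len(incidents) // 5)  # the top 20% of students

# Share of all incidents accounted for by the top 20% of students.
share = sum(count for _, count in incidents.most_common(top_fifth)) / total
print(f"Top 20% of students account for {share:.0%} of incidents")  # -> 80%
```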

Of course I'm not saying it is a good thing to be forced to prioritise like this, and certainly if you're one of the 80% of stakeholders not consulted who then raises one of the 20% of issues, I completely understand that you'd feel aggrieved - but that's not really what this is about. What I'm trying to say is that SLT sometimes set themselves up a little as infallible, and are often expected to perform that way by the wider staff. However, often what they are doing is their best given the situation they are in and the conflicting demands on their time. Often this means prioritising, and that then leads to some people feeling overlooked.

For SLT the key thing is to acknowledge that we're doing this and to communicate more clearly with those involved. Be willing to acknowledge that at the bottom of every decision is a human who has done their best, but may not have done it perfectly. For those outside of SLT looking in, perhaps consider whether it was physically possible to do it better, or whether the compromise you need to make is actually the right one for overall progress (are you one of the 80% or the 20%?).

Yes, SLT are paid more in order to make decisions; yes, SLT have more time in the week in order to make decisions; but that doesn't change the laws of physics, and it certainly doesn't make anyone perfect or infallible. Time is finite for all of us. We all have constraints to manage, and differing perspectives.

I think the biggest thing I've learnt as a member of SLT is that I can't always take the time to be a perfectionist, but I can do the best I can given the time and resources I have available.

No, I'm wrong... the BIGGEST thing I've learnt is that once I've had to commit to something that I know might not be perfect I need to avoid beating myself up over it and constantly revisiting it. I've spent too long doing that at various points during the year and it's done me no favours. You don't need to be perfect in order to get it right the vast majority of the time; the important thing is to make sure that even if something isn't quite right it is still one of the least bad options!

This is just me rambling again - sharing the thoughts bouncing round my head - comments always welcome!

Saturday, 31 January 2015

RAG123 as defined by the students

Our head of maths has done a piece of work recently on RAG123 that I think I just have to share...

Firstly if you've never heard of RAG123 then look here for my original post on it, and then see here for all the others I've done...

Pupil voice
So far with RAG123 I've seen teacher definitions of what R, A, G mean, and what 1, 2, 3 mean, and we've occasionally had a go at producing a student-oriented set of descriptors. However, it surprises me to admit that we've never previously asked the students to define it themselves!

With my stepping up to SLT this year I have had the pleasure of welcoming and line managing a new HoD to run the maths department. Simon Preston arrived at our school, inherited RAG123 from me and then he embraced it and used it for a while. Then he had a brainwave that is so obvious I don't know why nobody had thought of it before... He asked the students to define their understanding of all the RAG123 ratings...

Simon did this by issuing a sheet like the one below and asking the students to fill in the boxes with a description of the types of work or attitudes that warranted each rating. Notably he didn't just ask them to define R, A, G and then 1, 2, 3; he got them to define all 9 combinations of letters and numbers.

What did the students say?
I have been fascinated by the responses that the students gave. Having collated their inputs and drawn together the common themes Simon has compiled the following grid, which for me now seems to be the definitive RAG123 rating grid.

I think the nuance that the students have highlighted between R2 and R3, in the root cause of the low effort, is really interesting. I also like the A1 "just enough". Overall I am really pleased by the clear linkage between effort and understanding. It all comes back to the basic position that students on 3 for understanding need clear input from the teacher to move them on, while those on R for effort need a decision from the student to improve.

Involving parents
Also this week we held a parent information evening for our Year 11 students, where we briefed parents on revision techniques and ideas to improve home-school partnerships. The RAG123 grid was shared with parents and students in this session. We suggested that parents could work with students to RAG123 their revision sessions at home, to help figure out whether a session was effective or not. This was really well received, and we have had several positive comments from parents about it giving them the tools to help review progress with revision, particularly in subjects where they have no expertise.

Have you done something similar?
The idea of asking the students is so obvious I'm amazed that I or someone else hasn't already done it - does anyone else have a similar student perspective on RAG123? If you have, I'd be really keen to see it.

Once again - if you've not tried RAG123 you don't know what you're missing in terms of building linkage between marking and planning, building dialogue with students, and promoting growth-mindset-style links between effort and progress. Give it a try and let me know how it goes!

Saturday, 3 January 2015

SOLO mats - a 1 page lesson

It's been ages since I did a blog post about a teaching resource, so here we go...

I was looking to build SOLO (for more posts on SOLO see here), some independence and also some structure into a lesson that also included an amount of differentiation. I try wherever possible to aim my lessons at the most able in the group, with scaffolding back for lower ability, so that the more able aren't always differentiated for simply by being given more work to do.

I mucked about with various ideas and finally landed on this one-page lesson structure to try. The basic idea is that it incorporates a starter, core learning points and extension all in one place. It is possible for the strongest students to progress through the whole sheet with relatively little teacher input, prompted to reflect on what they've noticed. Students are given an A3 print of the sheet to work on, but can choose to make other notes or even do all of the work in their books if they want to (and some asked for squared paper for plotting the graph in the extending knowledge section).

Clearly this is a VERY maths based example - but I see no reason why this basic approach couldn't be used for any subject/topic that is looking to build on and combine prior knowledge in new ways.

(I should note that this was done for a very high ability group of Year 11 students; it assumes quite a lot of knowledge and is certainly not a "start from scratch" position for this topic)

Powerpoint version available here.


The assumption is that the students start broadly in the top left, progress down the left-hand side and then the right-hand side, and finish with a RAG123 assessment and comment in the bottom right (for more posts on RAG123 see here). Some got stuck straight in and progressed from one box to another fairly independently; others needed more support in the lesson (sometimes delivered by me, sometimes by directing them to another student to discuss it); and some needed prompting to move on to the next box or to make links beyond what was immediately in front of them.

At the end of the lesson I collected in the sheets to review and complete the RAG123 comments. In the next lesson I issued the next sheet as follows:

Powerpoint version here.

This second sheet builds on the info I knew they had picked up in the first lesson and then structures some extension.

Reflections on using it
I was really taken with this approach and the majority of the students seemed to find the sheets useful. The more inquisitive students came up with interesting ways to navigate through them and made links readily, often pooling ideas to find solutions.

The lessons were very much of the form "here's your sheet, off you go" - I did very little discussion at a whole-class level; in fact for the second lesson the sheets were already out on the desks and the students just came in and got started, as they knew what to do. During the lessons my interactions with students were focused on removing barriers to them making links and progressing with the sheets. Sometimes I would add a line to a diagram to help them spot the right-angled triangle, sometimes I would re-phrase what they told me verbally into algebraic form, and sometimes I would ask a question to open the door to the next box on the sheet. For those making the most progress independently I would occasionally draw them back to earlier boxes to explore the reasoning behind answers or particular approaches, to make sure they had seen the more general patterns among their specific answers.

The lack of formal instruction in a particular method or rule did expose some weaknesses in students who are otherwise strong performers; for some it was simply discomfort with working with algebraic variables, for others it was a reluctance, or lack of practice, in linking up different mathematical topics.

The most negative responses tended to come from those students who are diligent in making notes when a method is explained explicitly, but who then tend to apply this as a procedure to follow rather than understanding the underlying concept. In particular I had a group of girls who will probably get A* grades at GCSE (indeed they have already done so in mocks), who got stuck at every stage because the work was presented in a way that didn't signpost a method to apply, and sometimes there was no single clear answer to give. As a result they were fairly difficult to motivate through the lessons; however, I still think it was a worthwhile experience for them.

For maximum benefit across the class I did use a final plenary to draw together all of the central key points with a few more formal notes, and then we spent a lesson applying this knowledge to exam-type questions to check the security of the concepts in different ways.

Other difficulties came with storing the sheets afterwards - A3 is not a convenient size to tuck into a small exercise book, but that's not a reason not to use them - I'll certainly use this approach again.

The group I was working with are generally well motivated and would get on with much of this independently, and they also had a large amount of prior knowledge to work with. Using this approach with a weaker group, or with a group prone to behaviour challenges, would need some thought, as the lack of structure opens the door to classroom management issues if too many get stuck. I do think it could be used with weaker or more challenging groups, but it would need more planning.

I think this basic approach could be used with almost any topic; it just needs a bit of thought. You also need to know the class well in order to know what knowledge you can assume. There is also no reason why this approach couldn't be used beyond maths.

So there it is - plan and deliver your lesson on a single page...

All thoughts welcome as always.