Wednesday, 29 May 2013

Minding the Gap - Using SOLO to help structure questioning

We're having a real push in our school on the quality of questioning. This sparked various debates across the staff that started along the lines of "our subject is different because...", and frankly those discussions are mostly hot air. Good questions can be used to extend learning across all subjects. The form a question takes, and the subject matter it links to, will change for obvious reasons, but the core idea of using good questions to assess and extend learning applies regardless of subject, and Maths is no different...

The gap... Closed to open / specific to general
In maths lessons without proper planning we can too often allow students to get caught up chasing down "the" right answer, and therefore get stuck with apparently closed questioning. However, maths is a subject that leaps from closed questions to open questions, and from specific cases to general cases, with huge regularity.

For example we can step from the specific 3+6=9, to the less specific x+6=9, to the general x+y=9, and beyond into the even more general x+y=z. Sometimes we ask students to make this leap in the space of a single lesson, and too often we do it with students who don't fully understand what the letters represent (for more on levels of understanding of algebra see chapter 8 of this book - it's fairly old, but well worth a read if you've not done so before, and can provide real insight into the barriers to understanding algebra). For those of us with a sound grasp of algebra who are happy to use letters as variables, the step from specific to general is fairly trivial. For those without that grasp it is too often a step that presents a real challenge, or that can be made only by copying procedures without deeper understanding of the concepts underneath.
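To make the widening explicit, here's one way to annotate that chain (the annotations are mine, not the book's):

```latex
\begin{align*}
3 + 6 &= 9  && \text{closed and specific: one fact, one right answer} \\
x + 6 &= 9  && \text{one unknown, but still a single answer } (x = 3) \\
x + y &= 9  && \text{a relation: infinitely many solution pairs} \\
x + y &= z  && \text{fully general: a statement about any sum}
\end{align*}
```

Each step enlarges the space of acceptable answers, which is exactly where students without a secure grasp of variables lose their footing.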

Vitally, though, if we don't plan and structure our questioning properly it can also make these big leaps from closed and specific all the way up to open and general. That leaves the potential for a huge gap in between, where misconceptions and misunderstandings can sneak through undetected.

It was while I was contemplating this gap that I stumbled upon the SOLO taxonomy and realised that it helps me to explain the issue. In terms of SOLO levels of understanding, this gap can require a student to take a leap from Prestructural or Unistructural understanding all the way to Relational or Extended Abstract understanding without making any links on the way. (When I found out about SOLO I was amazed that it wasn't part of my basic teacher training; I think it's really powerful. If you've never heard of SOLO before then take a couple of minutes to have a look at this video as it explains all of the key points - also linked here on YouTube if the embedded video doesn't work.)

Using existing good practice to help fill the gap
I know from observations that there is a range of excellent practice in questioning within the maths department, but even the best of us leave this gap open at times in our rush towards generalisation. I thought that with some tweaks we could put something in place to help us bridge it.

To capture examples of the existing good practice I asked each member of the department to send me 2 or 3 examples of questions that they had found effective during a particular week. I then took these questions, added a few more, and tried to align them with the SOLO taxonomy. The result is this sheet:
You can get a PDF version here: LINK

What's the point?
The idea of the sheet is to give us a quick-reference sentence starter or framework to build a good question around, depending on the topic. By aligning the questions with SOLO we can work our way up (or down) the levels of understanding as needed with a particular class or individual. Importantly, it places several questions in the Multistructural and Relational areas to help us avoid making jumps that are too big and losing students in the gap.
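To picture the structure of the sheet, here's a minimal sketch of its shape as data - the stems below are illustrative placeholders, not the actual questions the department submitted:

```python
# Illustrative sketch only: question stems organised by SOLO level.
# These stems are placeholders, not the department's actual sheet.
SOLO_QUESTION_STEMS = {
    "Unistructural": [
        "What is the answer to ...?",
        "What does the term ... mean?",
    ],
    "Multistructural": [
        "Can you describe the steps you used to ...?",
        "What other examples fit this rule?",
    ],
    "Relational": [
        "How does ... relate to ...?",
        "Why does this method work?",
    ],
    "Extended Abstract": [
        "Can you generalise this result?",
        "What happens if we change the conditions to ...?",
    ],
}

def stems_for(level: str) -> list[str]:
    """Return the prompt stems for a given SOLO level."""
    return SOLO_QUESTION_STEMS.get(level, [])

print(stems_for("Relational"))
```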

Of course those fortunate enough to be really gifted teachers can always form the perfect question at exactly the right time; for the rest of us mortals it can be useful to have the occasional prompt, and there are other uses for a sheet such as this (see below).

Not a definitive list - all about context
This sheet is only intended as a prompt; it's not all-encompassing, and it should be used with due professional judgement as to whether a question is suitable for a given class or student. Similarly the detail of a question can be tweaked as needed, and extended with further questions (ref Pose, pause, pounce, bounce by @TeacherToolkit).

Of course you could debate where I've placed some of the questions in the SOLO taxonomy - this was my first attempt and some are far from clear cut. In fact one of the key points of SOLO is that the response to a question can be at a different level to the question itself. For example the deceptively simple "What is a fraction?" could be Prestructural if we're talking to a student who has only a vague concept of what a fraction is, and it could rise as high as Extended Abstract if we start making links beyond decimals and percentages into algebraic fractions, gradients or differentiation, etc. This is where the context of the question and professional skill/judgement come into it.

How are we using the sheet?
1) Copies in departmental planners to help form questions at lesson planning stage
2) Copies available in the classroom (soon to be on the wall) to give a quick prompt for the teacher as part of a plenary or mid-lesson review (particularly useful if the lesson hasn't gone exactly as planned so any questions planned in advance can't be used)
3) Whole sheet, or sections of it, given to students as part of a lesson to encourage them to ask challenging questions of themselves or each other as part of group or discussion work.

What's the impact?
Early days really, but having the sheet for reference is undoubtedly useful as part of the planning process. Similarly, just asking the department for examples of their good practice provokes reflection and review over and above normal day-to-day practice. However, what is more interesting is where this could take us next...

What next?
The sheet will be kept as a live, developing document. I plan to review it in department meetings over the coming 12 months or so and update it with more questions and refinements, to make sure it is as useful as possible to the department both for planning and for use during lessons.

I also see a route towards questioning structures linked to schemes of work. This link shows an example that @TeacherToolkit has posted using Bloom's Taxonomy to differentiate questioning for a Design Technology topic, and I see no reason why we can't develop a similar approach in Maths using this SOLO structure.

I am also keen to try to embed this approach alongside our work on establishing a common language for feedback (see this earlier post). If the students can start spotting patterns in our questioning and make links to how this helps to develop understanding then I suspect there is a meta-cognitive benefit to be had. However we've not tried that yet so it's nothing more than speculation for now...

All thoughts welcome
As always I'm keen to know your thoughts. Is this useful? Do you have anything similar? Do you have anything completely different that does the same job? Do you think I'm wasting my time (if so - please say why)?

Thanks for reading. Look me up on twitter... @ListerKev

Saturday, 18 May 2013

Using measures to improve performance

An effect I've noticed managing teams both in industry and education...

To get something to improve just find a way to measure it and make the results visible to the team that are responsible.

I've found that you rarely have to make a big deal of the measurements - just make sure the team knows they exist and make it possible for the team to access and compare them for themselves. Out of professional pride and natural competitiveness, most team members will use the available data to make comparisons within the team without being asked. If they identify that their performance seems to be below others' then they will usually take action to improve, again usually without being prompted.

Over time this effect can raise the performance of the whole team without any specific input at a management level other than celebrating successes.

Sounds too easy?
Actually it is a little more complicated, because the most important thing is the selection of what to measure... and that is far from straightforward.

Vitally, you need to measure exactly what you really want to improve, and that is not necessarily the same as what is easy to measure!

Using the wrong measure will give you the wrong improvement
For example, in a bid to improve standards schools were told that the percentage of students achieving 5 GCSEs (or equivalents) at grades A*-C was important. So a large number of schools started using a range of equivalent qualifications that were perceived as easier to deliver than a traditional GCSE, as these helped them to improve their scores in this measure. The schools then got criticised for "cheating" the measure. However the real problem was that the measure selected was not closely enough linked to the improvement that was wanted. If a measure allows a "shortcut" then someone will find it and take it.

Similarly, even when a student leaves school with a fantastic set of grades we still get employers or higher education institutions complaining that they don't have the right skills once they arrive. In this case it is clear that the exam system that assessed them as high achievers hasn't assessed the skills that the next institution wants to see. So is it the teacher's fault for not teaching these apparently missing skills, or the fault of the curriculum and assessment structure for not detecting and highlighting the lack of them, first to the teacher and later to the next institution?

Processes naturally tend towards optimisation
Over time any system will become optimised to deliver the outputs it is assessed against with the minimum effort. This is true in all sectors and all industries where humans are able to tweak the process. We naturally, and often unconsciously, tweak a system to make it as easy as possible to achieve the required output. It is true when an operator skips a process to save time on a production line, it is true when we cut across the grass rather than walk around the path, and it is true in a classroom.

If an exam structure remains in place for some time it is natural for grades to rise as teachers become more used to preparing students for it. It doesn't mean that students have got cleverer, or that grades are being inflated; just that the system delivering them to the exam has become more efficient and effective.

It's not even really about teaching to the test (I don't think there are large numbers of teachers choosing to teach only what is examined and nothing more). However, when curriculum time is limited and you have a choice between doing something that directly contributes to exam success (which you and the school are measured on) and something that doesn't (which you're not measured on), most people will gravitate towards the former simply because the results are more tangible later on.

Performance pay
I'm not looking to debate government policy in this blog; however, if you make a link between exam results and salary levels/progression, all you will do is increase the likelihood of more people teaching explicitly to the test. I'm not going to say any more on this, but an interesting perspective on drive and payment can be found in this video.

Implications for managing a department
Back to my real reason for writing this - how can this principle of measurement be used to improve a maths department? I think there are 6 basic steps...

1) Decide what you want to improve
2) Find a way to measure it (which may mean collecting new data or processing data in a different way) - this is the most important part - don't rush it!
3) Make the analysis of the measurement data easily available to those responsible for it - talk about headline figures, but not specifics at this stage (a short sketch of steps 2 and 3 follows this list).
4) Give the team time to draw their own conclusions on it - in my experience the majority of professionals will do the analysis and take action themselves, and the results will improve.
5) Keep measuring over time and celebrate all improvements as publicly as possible.
6) If (and only if) nothing changes over a period of time then use the data to challenge under-performance (you'll need to use judgement on what time period to assign to this - it depends on what you're measuring and how quickly you NEED to see a change)
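As a rough illustration of steps 2 and 3, here's a minimal sketch - all the data and names are invented:

```python
# Minimal sketch of steps 2-3: measure, then share headline figures only.
# Data and names are invented for illustration.
results = {
    "Class A": {"measured": 78, "target": 80},
    "Class B": {"measured": 85, "target": 80},
    "Class C": {"measured": 72, "target": 80},
}

# Step 2: compute the measure (here, percentage points vs target).
gaps = {name: r["measured"] - r["target"] for name, r in results.items()}

# Step 3: publish the headline figure, not the specifics - the team can
# look up their own class in the shared data if they want the detail.
average_gap = sum(gaps.values()) / len(gaps)
print(f"Department headline: {average_gap:+.1f} points vs target on average")
```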

From my perspective the key benefit of this approach is that it brings about improvements that are organic and sustainable over the longer term; the drive for improvement comes from within the team as the result of self-reflection and analysis. It also builds professional pride, as team members can claim ownership of the improvements.

By contrast, improvements made as a result of specific management intervention (as per point 6) are imposed externally and therefore may lapse once the intervention is removed. Clearly this type of action does need to happen from time to time, but it shouldn't be the day-to-day approach (i.e. it's a tactic, not a strategy).

Examples:
Want KS3 test results to improve?
Ensure everyone is doing comparable and equivalent tests and collect the data centrally (we do the same tests at the same time across our KS3 year groups). Make comparisons of results vs student targets automatically available on the central spreadsheet (they need to be clearly visible for easy comparison by class and by teacher). Just make sure the department knows the data is there - you don't need to point out who has the highest or lowest scores. Then watch as the results begin to improve over time.
[Potentially this could have even more impact if you can make the data visible to the students as well - they are responsible for the results as well as the teacher - like we did with the KS4 results in point 1 of this earlier post - a next step for us is to make this kind of information visible to KS3 as well]
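If the central spreadsheet were rebuilt as a script, the comparison might look something like this - a sketch only, with all column names and figures invented:

```python
import pandas as pd

# Hypothetical layout: one row per student, column names invented.
df = pd.DataFrame({
    "student": ["A1", "A2", "B1", "B2"],
    "class":   ["7X", "7X", "7Y", "7Y"],
    "teacher": ["KL", "KL", "JS", "JS"],
    "target":  [65, 70, 60, 75],
    "result":  [62, 74, 66, 71],
})

df["vs_target"] = df["result"] - df["target"]

# Visible by class and by teacher, without singling anyone out in meetings.
print(df.groupby("class")["vs_target"].mean())
print(df.groupby("teacher")["vs_target"].mean())
```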

Want to increase the percentage of exercise books marked with feedback to a particular standard?
Make sure the whole team knows what is needed to conform to that standard. Collect in samples of books and simply count how many met the standard. Report the findings at a department level - just highlight how many met the standard and how many didn't. Don't target individuals - if you've been clear enough about the standard then they will know which category they fall into. Do it again after a few weeks and you should see an improvement.

Fundamentally "If you can not measure it, you can not improve it" (Lord Kelvin)
Also remember that if you stop measuring once an improvement has happened, you should expect a decline over time. Regardless of how professional and dedicated your team is, their focus will naturally drift to what is being measured.

Thoughts welcome
As ever I'm keen to know any thoughts on this - do you agree or disagree? Do you have examples where it wouldn't work? Could you apply it elsewhere? Do you need to share ideas about what to measure for a particular improvement? Leave a comment or find me on twitter: @ListerKev

Saturday, 11 May 2013

Things we've done to help year 11

This post has nothing to do with managing variability, other than that these are actions taken across the department during the past year or so, and as such are shared core practice within our team... I just wanted to record all of the things we've done with our current year 11 that have contributed to a substantial improvement in results this year.

Some may seem obvious, others less so, but all have made a contribution.

1) Establish a progress measure and make it visible to staff and students
We started mock exams with full papers early in the academic year, assigning grades and sharing them with the students and staff. We then ran these regularly throughout the year, progressing through different past papers. Early scores were low due to content not yet covered, but we could also identify gaps in knowledge of content we had already covered. All mocks were accompanied by formative feedback and self-assessment as discussed in my earlier post. This is in addition to feedback given on book work, which is also the subject of an earlier post.

Tracking this data centrally made the department staff aware of where work was required with particular students. You can see the progress made at a headline level during the year below, but the data was held at an individual student and class level. We could cut it to look specifically at SEN, FSM or other RAISE groups as well as general progress:
(You may be thinking that these results don't look that impressive? Yes, I know we're still below the target line, but we've not finished the year yet - we've got more students within reach of a C taking exams in June, which should close the gap and take us up to, and hopefully even past, the target line. I'm also keen to point out that it's not all about C grades either: we had a target of just 2 A* grades this year but have already recorded 10 so far.

I'll agree that we're not yet posting results at a level that would put us at the top of league tables, however the school has been posting results in the mid-60% range for the last 6 years, with similar target levels to this year! Therefore to take a step into the mid-to-high 70%, or even 80%, range by the time the summer results are in will be a big improvement, and the best the school has ever delivered.)
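For a flavour of how those cuts by student group might work in a tracking sheet, here's a minimal sketch, assuming one row per student per mock (all names, grades and flags invented):

```python
import pandas as pd

# Hypothetical tracking data: one row per student per mock.
mocks = pd.DataFrame({
    "student": ["S1", "S1", "S2", "S2", "S3", "S3"],
    "mock":    [1, 2, 1, 2, 1, 2],
    "grade_points": [3, 4, 5, 5, 2, 4],   # numeric points per grade
    "fsm":     [True, True, False, False, True, True],
})

# Headline progress by mock, then the same cut for the FSM group only.
print(mocks.groupby("mock")["grade_points"].mean())
print(mocks[mocks["fsm"]].groupby("mock")["grade_points"].mean())
```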


We had actually run this regular mock process with this year group during year 10, in preparation for their earlier unit exams, so they were used to the idea of seeing their progress develop during the year.

As well as nicely sloping graphs we posted visual summaries of individual student performance vs target grades in classrooms and talked about a path to improvement. Note that because some students were concerned about publicly displaying low targets or low grades we didn't talk about actual grades, just position vs their personal target via colour coding. This is an example of what they look like:
By keeping these sheets visible in every classroom and running the regular mocks we were emphasising that the important thing to see during the year is progress towards targets, not necessarily step changes to target. Students were always keen to see how their colours developed as the mocks progressed.

I acknowledge it's not neat and tidy and doesn't show a flawless progression from red to green/blue, but real data rarely is. However it is clear for the students to see the progress they and their peers have made during the year, which is substantial.

2) Carefully targeted revision support
Where groups of students shared a common revision need we re-grouped them to maximise focus on those areas of weakness - this happened within classes as part of differentiation, and also across classes, where students were grouped with a particular teacher for a short time according to need.

We also selected some students for 1:1 withdrawal during lessons (including selected extraction from other subjects for those most in need), and others for short 1:1 sessions during morning registration.

3) Be clear about what is required to reach or exceed targets
We used analysis of past grade boundaries and conversions to recommend the minimum marks required to achieve both their target grade and the grade above. The students responded really well to knowing that they needed a particular score, and again this helped them to judge progress in mocks. For example, if Johnny needed at least 55 for an overall grade B, and scored 34 then 45 in successive mocks, he could see progression towards his personal target far more clearly than two grade Cs would have shown.
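The recommendation itself is simple arithmetic once you've settled on boundary estimates. Here's a sketch, with the boundary marks invented for illustration:

```python
# Sketch with invented boundary marks; our real recommendations came from
# analysis of past papers' grade boundaries.
boundaries = {"A": 70, "B": 55, "C": 40, "D": 25}

def marks_needed(target_grade: str) -> dict:
    """Minimum mark for the target grade and for the grade above it."""
    grades = list(boundaries)
    above = grades[max(grades.index(target_grade) - 1, 0)]
    return {"target": boundaries[target_grade], "stretch": boundaries[above]}

# Johnny's target is a B: he needs 55, and 70 would secure the grade above.
print(marks_needed("B"))   # {'target': 55, 'stretch': 70}

# His mock scores of 34 then 45 show clear progress towards 55,
# where two raw grade Cs would have hidden it.
for score in (34, 45):
    print(f"{score}/55: {score / 55:.0%} of the way to target")
```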

4) Parental involvement
I've already mentioned parental involvement in homeworks in an earlier post, and this did help to push the visibility of maths as a subject at home and in class.

In addition to the homework information, communications home included notification of after-school and half-term/holiday revision sessions, plus early details of exam dates and the equipment expected.

Ahead of the exams, revision packs of questions were sent home with students, but the answers were e-mailed to parents to help the parents help the students.

Parents have been really positive about the level and types of information sent home.

5) Maximise access to revision materials
We offered revision guides and revision DVDs for sale at a reduced cost via the school at various points during the year. Approximately 65% of the year took us up on this. We also regularly shared revision website information with the students.

6) Use a range of revision lessons
This is still under development as we think of and find more ideas (it will be the subject of a further, more detailed post), but once we get to revision time it is important to give students a varied diet of activities. Things we've used are:



Is this just teaching to the test?
No, it's not all about teaching to the test. However there comes a time when we have to seek to maximise the results that the students can deliver. In the long term that is in both their interest and the school's.

Anything else?
In amongst this were other actions, such as selected re-takes and some students switching from modular to linear exams, but with the changes to the exam structure in England these will not be possible in future years, so they're not worth discussing in detail.

What about next year and the switch to linear exams?
Our approach is intended to be very similar. We have already started using full GCSE papers with our current year 10 and will use the same tools for sharing the data to show progress through the year. Early indications suggest we're starting in a similar place on the curve as the current year 11. We might selectively enter some students in November if we think they will benefit from it, but we'll hold off if there is a chance it means they might not achieve their full potential in the end.

Any thoughts?
I'm keen to know if you've done anything similar, or anything different that has had a beneficial effect on your students. Any suggestions for revision lessons? How are you managing the change to the linear specification?

Monday, 6 May 2013

Parent power improving homework completion rates & quality

Having just read back through this, it actually seems fairly obvious, but it's made a difference to us.

The problem
Too many students were not completing their homework, or were coming up with some kind of feeble excuse. Responses to incomplete homework varied from class to class, and we generally had too many students viewing homeworks as optional.

The solution
First, one of the department analysed pupil performance in tests against homework completion and found a perfect correlation: the students who did more homework got better grades. Sharing this with their class on its own spurred some students into doing more homeworks, but we wanted to ensure a clear and rigorous approach was taken across the department.
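That analysis is quick to reproduce if you hold completion counts and test scores side by side - here's a sketch with invented figures:

```python
import pandas as pd

# Invented figures: homeworks completed vs test score, one row per pupil.
data = pd.DataFrame({
    "homeworks_completed": [2, 5, 7, 9, 10],
    "test_score":          [34, 48, 61, 70, 78],
})

# Pearson correlation coefficient: +1.0 would be a perfect positive link.
print(data["homeworks_completed"].corr(data["test_score"]))
```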

We decided to get parents involved. First we tried it in one class: we e-mailed PDF copies of the homework worksheets direct to parents each week, alongside giving printed copies to the students. We also asked for a parental signature on the bottom of the homeworks to indicate that the student had shown it to their parents. This substantially improved both completion rate and quality. It had such an effect on this one class that we decided to roll it out across the whole of year 11 (we would have done more year groups but for the admin involved in e-mailing home).
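For anyone curious about the mechanics, the weekly e-mail can be scripted with nothing beyond Python's standard library. A sketch, with the server details, addresses and file path all invented:

```python
import smtplib
from email.message import EmailMessage
from pathlib import Path

# All details below are invented placeholders for illustration.
SMTP_HOST = "mail.example-school.sch.uk"
SENDER = "maths@example-school.sch.uk"

def send_homework(parent_email: str, pdf_path: str) -> None:
    """E-mail one homework PDF to one parent."""
    msg = EmailMessage()
    msg["Subject"] = "This week's maths homework"
    msg["From"] = SENDER
    msg["To"] = parent_email
    msg.set_content("Please find this week's homework sheet attached.")
    pdf = Path(pdf_path)
    msg.add_attachment(pdf.read_bytes(), maintype="application",
                       subtype="pdf", filename=pdf.name)
    with smtplib.SMTP(SMTP_HOST) as server:
        server.send_message(msg)

# Example: one call per parent on the class list.
# send_homework("parent@example.com", "week12_homework.pdf")
```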

In the first week of this new process we had 17% of students fail to hand in homework - they all received a detention in line with the school behaviour policy. In the second week we had just 4% fail to hand in, and the figure has hovered somewhere below 4% since then.

All of the homeworks are marked and assigned a grade within a week of being handed in, alongside appropriate formative feedback.

Noticeably, the quality of homeworks increased dramatically when we were firm about parental signatures. We even had calls from a few parents to tell us that they had refused to sign in a particular week because they felt the homework was not done to a suitable standard, meaning we were forewarned when the sheets were presented by the students in question and could take appropriate action.

Having the PDF copies e-mailed home means that students have no excuse - lost sheets, or "wasn't here when it was set" - for not doing the work, and it is surprising how many end up handing in a sheet printed at home rather than the one handed out to them.

While we could simply upload sheets to a shared drive or publish them on a VLE or Google Docs-type platform, that would require the students or their parents to go looking for the work. E-mailing it direct to parents substantially increases visibility and doesn't require them to go out of their way to find the information.

Is it worth it?
I'm well aware of the debates around homework effectiveness and whether it is actually useful. All I can say is that this year we have been much tighter on homework as a department and have seen a substantial improvement in test/exam performances. Clearly we have changed a whole range of other things as well (some already mentioned in this blog), so it is impossible to infer a true causal relationship, but I do believe it has made a worthwhile contribution to our school-record GCSE results this year.

We have also had very positive feedback from parents on this initiative, and the pupils appreciate the certainty that the process brings.

Where now?
We're looking at further developments - partly in broadening the range and types of homeworks set, while keeping this direct parental link. We are also planning to extend this across more year groups as we develop the admin systems to make that easier.

What are your thoughts?
Do you do something similar? Something different? Can you suggest anything even more effective? All thoughts and comments welcome.