TLT15 Part 1: Closing real gaps with data days

In education, we are obsessed with closing gaps: gender gaps, gaps between cohorts, the gap between where a student should be and where they currently are. But we can’t close these gaps easily, at least not directly. They are really chasms, impossible to fill because they are too abstract and not specific enough. Gaps tell you nothing about individual students. The only way we can close these massive gaps is by concentrating on the very specific things students don’t know, and the very specific things they can’t do.

The traditional way of closing these gaps is with written feedback, but it is pretty inefficient. Many others have written about the problems with treating feedback as equivalent to marking (see the links at the end of this post). In my own post on solving the problems of feedback, I tried to mitigate these problems, but if so much effort has to go into making marking more efficient, is it really the best method? If we count up the hours a typical teacher spends marking books during the week, can we say that this is an effective use of their time? Over the next couple of posts, I will outline how we can use feedback to greater effect without spending the hours it takes to write detailed comments in books. In this post, I am looking at how we use data days to close real gaps.

Data Days

At Dixons Kings Academy, we have three ‘Data Days’ throughout the year. Inspired by the book Driven by Data by Paul Bambrick-Santoyo, these are days when students don’t come in and teachers look through their class data, discussing with heads of faculty how to address any problems. These are not meetings where teachers have to justify their data, but ones where they can explore it, spot patterns and identify tangible next steps.

Bambrick-Santoyo suggests starting these meetings with ‘global’ questions, which look at the wider class picture:

How well did the class do as a whole?

What are the strengths and weaknesses? Where do we need to work the most?

How did the class do on old versus new standards? Are they forgetting or improving on old material?

How were the results across the different question types (multiple choice vs open-ended; reading vs writing)?

Who are the strong and weak students?

Then these are followed up with ‘dig-in’ questions:

Bombed questions: did students all choose the same wrong answer? Why, or why not?

Break down the standards. Did students do similarly on every question within the standard or were some questions harder? Why?

Sort data by students’ scores: are there questions that separate proficient and non-proficient students?

Look horizontally by student: are there any anomalies occurring with certain students?

Assessment design and analysis

It is no use just turning up to these meetings and looking at grades, because grades don’t tell us anything meaningful: they are not specific enough. They are a starting point, but not much more than that. These meetings are only effective if we have precise and useful information, and to get it we have to ensure that our tasks and assessments are designed with this in mind. A well-designed assessment tells you much more than the grade achieved. It can tell you in exact detail what students can or can’t do and what they do or don’t know, but only if it is designed well. Too often, assessments are arbitrary tasks which we set because we have to, or we just mark something completed in class at random: I haven’t marked their books for a while, so I’d best mark this. If we think about some of the questions above, it makes sense that a good assessment will give information at the class level, student level, question level and assessment objective level.

Assessments can be essays, a mix of different question types like a science exam, or even a multiple choice quiz. I often use Quick Key as a highly efficient way of using MCQs to get precise information about students (see here for a walkthrough). If multiple choice questions are designed effectively, they can give us exactly the information we need. In this example, showing itemised data from a quiz, we can see a common wrong answer of B. That wrong answer gives me very useful information about a common misconception. With carefully designed, plausible wrong answers, multiple choice questions like this can be incredibly rigorous and often inform you far better than an essay would. It might be more time-consuming to input question-level data manually, but it wouldn’t take more time than writing comments in books. See the links at the bottom of the page for more on question level analysis.
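The kind of common-wrong-answer spotting described above is easy to automate once you have itemised data. This is a minimal sketch, assuming a hypothetical export where each student’s responses are a mapping from question to chosen option; Quick Key’s real export format will differ, and the answer key and responses here are invented:

```python
from collections import Counter

# Hypothetical answer key and student responses for illustration only.
answer_key = {"Q1": "A", "Q2": "C", "Q3": "B"}

responses = [
    {"Q1": "A", "Q2": "C", "Q3": "D"},
    {"Q1": "B", "Q2": "C", "Q3": "D"},
    {"Q1": "A", "Q2": "A", "Q3": "D"},
]

def common_wrong_answers(responses, answer_key):
    """For each question, find the most frequently chosen wrong option."""
    wrong = {q: Counter() for q in answer_key}
    for row in responses:
        for q, correct in answer_key.items():
            choice = row.get(q)
            if choice and choice != correct:
                wrong[q][choice] += 1
    # Keep only questions somebody got wrong, with the top misconception.
    return {q: c.most_common(1)[0] for q, c in wrong.items() if c}

print(common_wrong_answers(responses, answer_key))
# {'Q1': ('B', 1), 'Q2': ('A', 1), 'Q3': ('D', 3)}
```

A question where one wrong option dominates (like Q3 here, where every error was a D) points to a shared misconception worth reteaching, rather than random guessing.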

Intervention planning

Once we have drilled down to exactly what to focus on, we spend the rest of the data day intervention planning. These plans need to start from the gaps, not from the students. If we start from the students, then we end up with a massive gap to close and then try to throw things at the gap, ultimately changing very little. By starting with specific things, the gaps can actually be closed.

I used to start with the student; this is from an old intervention plan: “[student] does not always understand the task. He can produce good work when given very clear instructions but struggles without. Seated at front of class for easy support.” The full intervention plan is littered with similarly vague and useless advice. For this student, moving to the front is my whole plan. Well, if it were as simple as that, we would design classrooms to be all front! For another, the only thing written was: “Irregular attendance. Hopefully this has been addressed.” Despite the best of intentions, no gap was being closed, at least not as a result of my plan.

Our intervention planning now starts with the gap to be closed, then the students it affects, then exactly what will be done. Nothing vague: precise gaps, specific interventions. As you can see, the interventions include different tasks, reteaching, working with other adults and so on. There is no guarantee that this closes every gap for good, but at least it is a focussed plan. There are a couple of students on that plan who are not underachieving according to their grade, but who have some basic issues that need fixing; these might not be our focus if we started just from the grade. On data day, every teacher produces one of these per class for the following three weeks. The most important thing is that we give teachers time to do it: time to ask the questions, write the plan and prepare the resources.

Even if data days didn’t exist, the process could still be followed: design effective assessments, ask questions of the data, plan and carry out specific interventions on specific gaps. I contend that the time spent doing this is as valuable as the time spent marking a set of books, and I wonder if this is a more effective proposition. Next time: in-class interventions.

Useful Links

Marking and feedback are not the same from David Didau

Marking is not the same as feedback from Toby French

Is marking the enemy of feedback? from Michael Tidd

Marking is a hornet from Joe Kirby

Exam feedback tool from Kristian Still: a timesaving question level analysis and feedback tool

The problem with levels: gaps in basic numeracy skills identified by rigorous diagnostic testing from William Emeny

Is there a place for multiple choice questions in English? Part 1 and Part 2 from Phil Stock

Question Level Analysis with Quick Key

Quick Key is a tool that I use regularly to provide me with detailed feedback on student understanding. I wrote about it briefly in this post about technology that makes my life easier, but I thought that it merited a more detailed explanation.

First of all, Quick Key is an app for iOS and Android. It is free, but there is a paid version which allows you to do more; even that is incredibly cheap compared to other tech tools. You set a multiple choice quiz, students fill in one of the tickets and then you scan the tickets in using your phone. Students don’t need a device themselves, just a pencil. It takes very little time to create quizzes and very little time to scan them in. On your phone, you can sort the questions to find the ones students struggled with, and sort students by score too. This is great, but when you go to the website, you can do much more with the data, including exporting itemised information.

Here is the question level data from a recent quiz. The quiz was taken by a Year 10 class and covered everything they have studied this year. They should be able to answer every question, as everything has been covered; that doesn’t mean it has been learnt, however. A quick explanation of what you can see: student names are down the left, scores down the right. Question numbers and correct answers are along the top. (I have shaded questions on the same topic together to help my analysis.) A dot indicates that a student answered correctly, a letter indicates a wrong answer and the option chosen, while an X means that a question wasn’t answered (or wasn’t scannable).
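The dot/letter/X encoding described above can be decoded into a per-student summary with a few lines of code. This is a sketch under assumptions: the row format here is invented for illustration, and the real export will need its own parsing:

```python
# Decode one row of the itemised grid: '.' means correct, a letter is the
# wrong option chosen, and 'X' means unanswered (or unscannable).
def decode_row(name, cells):
    """Summarise one student's row of the question-level grid."""
    correct = [i + 1 for i, c in enumerate(cells) if c == "."]
    wrong = {i + 1: c for i, c in enumerate(cells) if c not in (".", "X")}
    unanswered = [i + 1 for i, c in enumerate(cells) if c == "X"]
    score = round(100 * len(correct) / len(cells))
    return {"name": name, "score": score, "wrong": wrong, "unanswered": unanswered}

# Hypothetical row: correct on Q1, Q2 and Q4; chose B on Q3 and C on Q5;
# left Q6 blank.
row = decode_row("Student A", [".", ".", "B", ".", "C", "X"])
print(row["score"], row["wrong"], row["unanswered"])
# 50 {3: 'B', 5: 'C'} [6]
```

Once each row is in this form, it is straightforward to sort by score, group wrong answers by question, or flag unanswered runs at the end of a paper (a possible sign of running out of time).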

I have identified the two lowest scoring students here. If I were only interested in scores, I would just give each of them extra work and be done with it. However, on closer inspection I can see that the first student has performed poorly across all topics, which is quite consistent and is borne out by classwork too. But the other student has actually answered a number of the first questions correctly (1–8 were all on language terminology) and is not underperforming relative to his peers; it is the later questions where he has struggled. From this information, I can work out a couple of possibilities: 1) the student doesn’t know the poems from the anthology well, particularly Ozymandias and London; 2) perhaps the student struggled to answer all of the questions in the given time. My next step? Extra revision on these two poems for homework; extra time on the next quiz.

This question level analysis allows us to identify the differences between pupils who at first seem to be performing at an identical level. If we look at these two students who both achieved 67%, we can see that, with the exception of question 14, they answered completely different questions incorrectly. If we said that 67% equalled a C and put these two students in a C grade intervention group, neither of them would get exactly what they needed! Question level analysis of this kind allows us to really know what individual students need.
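Comparing two equal-scoring students comes down to simple set operations on the questions each got wrong. A minimal sketch, with hypothetical question numbers standing in for the real quiz:

```python
# Two students on the same percentage can have entirely different gaps.
# Hypothetical sets of wrongly answered question numbers:
student_a_wrong = {2, 5, 9, 11, 14}
student_b_wrong = {3, 6, 10, 13, 14}  # same score, different questions

shared = student_a_wrong & student_b_wrong   # gaps they have in common
only_a = student_a_wrong - student_b_wrong   # gaps unique to student A
only_b = student_b_wrong - student_a_wrong   # gaps unique to student B

print(sorted(shared))  # [14]
print(sorted(only_a))  # [2, 5, 9, 11]
print(sorted(only_b))  # [3, 6, 10, 13]
```

The shared set is what a whole-class or group reteach should target; the unique sets are what each student individually needs, which a grade-based intervention group would miss entirely.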

Now let’s move from students to questions. Questions 13 and 14 were answered poorly across the board. This identifies an obvious gap in knowledge, and in this case it tells me that students don’t really understand what the exam will look like. They were ‘taught’ this, but I obviously didn’t do as good a job as I thought. This is good information to have because it is easily addressed at whole class level.

But what about question 5? This surprised me, as the question was designed to show that students understood what a metaphor was, and in class they can do this. But a look at the question itself gives me a clue. I can make an assumption here that a number of students just saw “like or as” and thought simile. From this I can work out a couple of things: 1) my students need to read questions carefully, and 2) I need a better explanation for similes than “it is a comparison using like or as”.

I pulled on this thread and looked at question 1: why did students not answer this correctly when I expected them to? Once again, a quick check makes me think that students saw “verb” and jumped straight to C without really reading the question. It may of course simply tell me that they don’t know what a verb is and that option was an understandable guess. Both of these questions flag up that some students can identify terminology and give examples, but can’t define them.

I could delve deeper into this data, but I think the point about its usefulness is made. The quiz took maybe 30 minutes to create, 10 minutes to complete and 5 minutes to scan, yet the information it has given me is perhaps more valuable than marking a set of books, something which would have taken considerably more time.

Technology that makes my life a little easier

My old classroom had a suite of computers, an interactive whiteboard and a visualiser. I loved the visualiser, and it was great not to have to book IT suites, but did these things transform my teaching? I’m not sure. In fact, I spent a lot of time trying out sites and doing things on computers because I could rather than because I should (I still have nightmares about those ClassDojo monsters). I’m not complaining, I was very fortunate to have that classroom, but I think the technology that is most helpful to me now as a teacher is all relatively simple. So here are three pieces of technology that make my life easier.

The Clicker

I think this has been the most useful piece of technology I have ever purchased. It has a couple of very simple functions: it clicks slides back and forward, and it has a laser pointer. The greatest benefit is that I can move around the classroom when I’m teaching; I’m never stuck behind my laptop. If, when circulating, I need to revisit something with a student, I can click back through the slides. Simple but really useful. It was recommended to me by @amsammons and is available here for £7.25.

Quick Key

Quick Key is an app which scans multiple choice quizzes. Results can be broken down by student and by question on the app. You can then export the data in various forms from the website for further analysis.

It’s easiest to illustrate the benefits with an example, in this case a quick homophone activity I used with a class. The first five questions had options of your/you’re and the next five were there/their/they’re. I scanned it in and then exported the data in question level analysis form, which looked like this.

Where there is an X, the question was not answered. Where there is a letter, the student was wrong and chose that option.

With the exception of one student on one question, it is clear that the class coped well with your/you’re. The X in this case was actually a correct answer which just didn’t scan properly. Questions 7 and 10 were answered poorly, particularly Q7. Both required ‘there’ as an answer, yet students generally went for ‘they’re’ on Q7 and ‘their’ on Q10.

Q7: ___________________ are nearly 65 million people in Great Britain.

Q10: I saw him standing _______________

The three students who struggled most with homophones were given additional exercises during class, and I also spent some more time teaching them the differences. The class were then given an extended quiz on there/they’re/their, just to make sure that guessing was less likely to be a factor in success. We looked together at the specific examples, addressing why they may have used the wrong one. On Q7, it looked as though the word ‘are’ had confused them, but I was surprised by the errors on Q10.

It’s so useful to have this kind of precise individual and question level data, which can be immediately acted upon. One of my colleagues sets Quick Key quizzes as a Do Now activity, marks them straight away and then provides follow up activities and reteaching where necessary within the same lesson. With well designed multiple choice questions, Quick Key is an incredibly effective tool.

It costs around £20* for the annual Pro subscription. I’d recommend this, as the free version does much less, and you can have a free month’s trial if necessary. It’s not without frustrations: students will find all sorts of ways to fill the forms in incorrectly, and you need to avoid bright lights when scanning. You can always manually enter the answers as a last resort.

*Smartphone not included.

Evernote

I think we could do well by talking more about organisation and productivity for teachers. I am envious of those people for whom organisation seems to come naturally, so I have tried to improve my productivity in the same way that I might try to improve my questioning or my feedback.

I read a number of stories about how Evernote had transformed the way people work, had a look at what the fuss was about and am now a convert. Evernote is an app for creating, organising and storing various forms of media. It’s versatile and can be accessed on different devices and synced between them.

If someone grabs me in the corridor and asks me to do something, I can record it immediately. I can clip pages I want to read later. I can forward emails there. I can make checklists and tag them in a way that keeps me organised. I can set alarms, store documents I will need, add images and record voice reminders. By capturing things in an organised, searchable place, I don’t have to hold things in my head or fill endless notebooks. When things need doing, they are there.

I don’t think Evernote would be nearly as effective without an organised system. I use a variation of the ‘Secret Weapon’ method, which you can read about here. Without this, and the excellent advice in Getting Things Done, Evernote could just be a messy electronic version of a notebook. As with Quick Key, there’s a free and a premium version, and Evernote’s free version is pretty good.

A final point: all of these are things I have been using for a significant period of time, not merely weeks or months. I filed this post away for many months to make sure I wasn’t just talking about the latest fads (which is why Plickers is not on here). These might be just what you need, but they might not be.