This quick brain-dump is based on ideas from Hattie’s Visible Learning for Teachers, Wiliam’s Embedded Formative Assessment and the pdf of The Power of Feedback (Hattie & Timperley) linked below.
I spent much of today trying to grade a large project (Describing the Motion of the Rokko Liner, our local train), which was assessed for MYP Sciences criteria D, E, F. Based on some of our Student Learning Goal work on helping students cope with data presentation and interpretation, the lab had been broken into stages (almost all completed in-class), spread across A4 and A3 paper and GoogleDocs in Hapara.

The result: a lot of visible learning, in that I could keep track of each student, see their work in progress and comment where needed. A lot of verbal feedback was given along the way, with some worked examples for students. Breaking the large assignment into stages helped keep it authentic and manageable for students, with some days dedicated to individual strands of the assessment criteria.
The challenge: a Frankenstein’s monster of a grading pile, part paper, part digital and all over the place. After trying to put comments on the various bits of paper and Google Docs I gave up, realising that I would be there for many hours and that potentially very little would be read carefully by students or actioned in the next assignment. I turned to Hattie (and Wiliam). Visible Learning for Teachers has a very useful section on Feedback (d=0.73, though formative assessment is d=0.9), and so I spent some time making up the following document, with the aim of getting all the feedback focused and in one place for students.
It is based on the four levels of feedback: task-level, process-level, self-regulation and self. In each of the first three sections I have check-boxed a few key items, based on the things I am looking for in particular in this task and the common advice I will give after a first read through the pile. A couple of boxes will be checked for each student as specific areas for improvement, with the ‘quality’ statements explained in person. There is space under each for personal comments where needed. I fudged the ‘self’ domain a bit for the purpose of student synthesis of the feedback they are given – really making it a reflective space, geared towards the positive after the preceding three sections of constructive commentary.
Once I got the sheets ready, I chugged through the grading, paying closest attention to the descriptors in the rubric, the task-specific instructions to students and then the points for action. However, I put very little annotation directly on the student work, focusing instead on this coversheet. It was marginally quicker to grade overall than the same task would have been normally, but the feedback this time is more focused. The double-sided sheet was given to them in class, attached to the paper components of their work, with the feedback facing out and the rubrics with grades hidden behind. This is a deliberate attempt to put feedback first. We spent about 25 minutes explaining and thinking through this in class.
Importantly, students were given time to think carefully about why certain notes had been made and boxes checked on their sheet. I asked them to respond to the feedback in the ‘self’ section, and to make additional notes in the three sections of task-level, process-level and self-regulation. In discussion with individual students, we identified which were most pertinent – some higher-achieving students can take more detailed action at the task level, whereas others need to focus more on self-regulation. At the end of the lesson, the sheets and work were collected back, so I can read the feedback and use it to inform my next teaching of lab skills.
The purpose of all this is to make it explicit where students need to focus their efforts next time, without having to wade through pages of notes. It hopefully serves to make the “discrepancy between the current and desired” performance manageable; a sea of marking on their work would not help with this. I will need to frame this carefully with students – some need work on many elements, but I will not check or note them all, focusing instead on the few that are most important right now. Incidentally, it also allows me to more quickly spot trends and potentially form readiness groupings based on clusters of students needing work on individual elements in the following lab.
At the end of the task I asked students for feedback on the process. They generally found the presentation of feedback in this way easier to manage than sifting through multiple multimedia components, and will keep this document as a reference for next time. A couple of higher-achieving students asked for more detailed feedback by section in their work, which is something I can do on request, rather than by default; I know these students will value and take action on it.
Here’s the doc embedded. If it looks as ugly on your computer as it does on mine, click here to open it.
If you’ve used something like this, or can suggest ways to improve it without it growing beyond one side per section, I’d love to hear your thoughts in the comments or on Twitter. I’ll add to the post once I’ve done the lesson with the students.
UPDATE (2 December): Feedback-first, peer-generated
Having read that adding grades to feedback weakens the effect of the feedback, I’ve been thinking about ways to get students to pay more attention to the feedback first. For this task, a pretty basic spring-extension data-processing lab, I checked the labs over the weekend and wrote down the scores on paper. In class I put students in groups of three and asked them to share the GoogleDoc of the lab with their partners. They then completed a feedback circle, using the coversheet below to identify and check off specific areas for improvement. If they could suggest an improvement (e.g. a better graph title), they could add this as a comment.
This took about 15-20 minutes, after which students completed the process-level and self-regulation sections and returned the form to me, before continuing with the day’s tasks. Before the next class, I’ll add their grades to the form (rubrics are on the reverse of the copy I gave students) and log them in Powerschool. Delaying communication of the grade this way should, I hope, have helped students engage more effectively with the feedback – especially since I learned last week that making changes in Powerschool triggers automatic emails to students.
I was wary of doing this first thing on a Monday, but the kids were great and enjoyed giving and receiving feedback from peers. Of course some goofed off a little, but they were easy to get back on track. For the high-flyers who enjoyed the method less the first time, this gave them a chance to really pick through each other’s work to give specific feedback for improvement.
Here is the document:
……….o0O0o……….
Hattie, J. & Timperley, H. (2007). The Power of Feedback (pdf). Review of Educational Research, 77(1), 81–112. DOI: 10.3102/003465430298487
Thank you for your comments.