Wayfinder Learning Lab

"Learning is about living, and as such is lifelong." Elkjaer.


Making Feedback Visible: Four Levels in Action

Five years ago I was starting to become concerned with the difference between marking and feedback. What was making a difference to my students’ learning and was the effort I was putting into detailed marking worth it in terms of their improvement? In reading Hattie’s Visible Learning for Teachers, Wiliam’s Embedded Formative Assessment and the pdf of The Power of Feedback (Hattie & Timperley), I developed a four-levels feedback template for use on student work.

This post is to share an updated version – I still really like this method of giving timely, actionable, goal-focused and student-owned feedback. It definitely saves me time, but puts the focus of feedback on what’s most important for the student to take the next step. I’ll keep updating, editing and adding to this post.

When giving feedback on a piece of work, I paste this at the top of the student’s assignment, give some comments in the work and check their self-assessed rubric. Before we open individual feedback, I summarise whole-class feedback.

A copyable GoogleDoc version of the grid (and teacher explanation) is here, and to export it as pdf, click here.


Why present feedback this way?


Hattie & Timperley, Four Levels of Feedback. Click for the pdf of ‘The Power of Feedback.’

Feedback addresses three questions:

  • Where am I going?
  • How am I going?
  • Where to next?

Feedback should be timely and actionable, and it needs to be more work for the learner than for the teacher.

  • Clarity of achievement so far: goal-referenced, tangible & transparent.
  • Understanding “the gap” between where the learner is and where they need to go next (not necessarily the top bands).
  • Timely. Using a system like this saves time in grading/giving feedback, makes it more accessible to digest (is user-friendly) and can be easily reviewed the next time the student works towards similar goals.
  • Feedback first, then grades. Not presented together, to enforce student reflection & action.


Making The Four Levels Work

  1. Goals and outcomes need to be clear – do students & teachers have a shared understanding of what success looks like at different levels of achievement?
  2. Feedback needs to be ongoing. Students are taught to self-assess in the drafting stages and feedback (not grading) given on the drafts with plenty of time to take action before submission.
  3. Students self-assess before submission. Even better – they can peer-assess and give feedback. If tasks are differentiated, this does not present a collusion challenge.
  4. Teacher gives feedback in the grid, on the front page of the work (or in an accessible place):
    1. Check the student’s self-assessment against descriptors
    2. Check the assignment, making comments only on actionable next steps – not an overwhelming number, as this can increase the perceived “gap” for students. Students who want and will take action on very detailed marking can request this in follow-up.
    3. Summarize feedback in the grid: task-level, process level and self-regulation level.
    4. Link to support resources where appropriate
    5. Record grades out of sight of student.
  5. Teacher places value on interaction with feedback by giving class time to digest & reflect
    1. Give “whole class” feedback on common issues and note needs for later workshops
    2. Students read their feedback: table and comments.
    3. Students synthesise this into a “feed-forwards” note to self. The grade is released only once they have shown this to the teacher and agreed the next steps, not before.
  6. Next time the task type is attempted, the first thing students do is open the feedback and set achievable, specific goals to “level up” based on the feedback & feed forwards.


Reflections in Practice

I worried initially that the pushback from students would be that I wasn’t grading enough. This didn’t happen for a couple of reasons:

  1. We made explicit the reason for doing this and I keep no secrets about the “magic” of learning from students. I explain and demonstrate what works in learning and why we do things this way.
  2. Most students like seeing the next steps really clearly. We’re not all aiming for top levels right away – we’re aiming for progress upwards.
  3. We talk about “the gap” a lot, and our quest to close the gap in prep.
  4. I already know what the grades are likely to be, as we invest time in class for drafting, feedback and conferencing. I expect students to show their work and take action on feedback.
  5. I will happily take a piece of work back and sit with a student, giving really detailed marking and justification if they request it. This rarely happens, and it is usually one or two who are working at the very top of the rubric. This is far more efficient and effective than doing acres of marking for large classes, the bulk of which won’t have an impact.


Hattie, J. & Timperley, H. (2007) The Power of Feedback. Review of Educational Research. Vol. 77 no 1 (pp 81-112). https://www.jstor.org/stable/4624888 (includes diagram above)

Wiggins, G. (2012) Seven Keys to Effective Feedback. Educational Leadership Magazine. Vol. 70 no. 1. (pp 10-16). www.ascd.org/publications/educational-leadership/sept12/vol70/num01/Seven-Keys-to-Effective-Feedback.aspx   (and related: EL Takeaways Poster http://inservice.ascd.org/seven-things-to-remember-about-feedback )

Dylan Wiliam Centre: Ten Feedback Techniques That Make Students Think (poster). https://www.dylanwiliamcenter.com/wp-content/uploads/2015/02/10-Feedback-Techniques.pdf


More Resources on Feedback & Grading



The Gradebook I Want

As the MYP sciences roll into the Next Chapter and we mull over the new guides, objectives and assessment criteria, we have the opportunity to reflect on our assessment practices. The IB have provided a very clear articulation between course objectives and performance standards (see image), which should make assessment and moderation a more efficient process.

There is a clear connection between the objectives of the disciplines and their assessment descriptors in all subjects in MYP: Next Chapter.


Underpinning these objectives, however, are school-designed content and skills standards. These are left up to schools for articulation so that the MYP can work in any educational context and this is great, though it does leave the challenge of essentially tracking two sets of standards (or more) in parallel: the MYP objectives and the internal (or state) content-level standards. In a unit about sustainable energy systems, for example, I might have 15-20 content-level, command term-based assessment statements, each of which could be assessed against any (or many) of the multiple strands for each of four MYP objectives.

As I read more about standards-based grading (or, more recently, standards-based learning on Twitter), I become more dissatisfied with the incumbent on-schedule practices presiding over grading and assessment. I want students to be able to demonstrate mastery of both the MYP objectives and the target content/skills, but I am left with questions:

  • If they score well on a task overall but miss the point on a couple of questions/content standards, have they really demonstrated mastery? How can I ensure that they have mastered both content and performance standards?
  • If they learn quickly from their mistakes and need another opportunity to demonstrate their mastery on a single content-level standard (or performance-level standard), do they need to do the whole assignment again? What if time has run out or there is no opportunity to do it again?
  • As we move through the calendar in an effort to cover curriculum and get enough assessment points for a best-fit, are we moving too superficially across the landscape of learning?
  • More importantly, is the single number – their grade – for the task, a true representation of what they know and can do? How can I present this more clearly, to really track growth?

My aim with all this is to encourage a classroom of genuine inquiry (defined as critical, reflective thought), in which I know that students have effectively learned a solid foundation of raw materials (the ‘standards’, if you will), upon which they can ask deeper questions, make more authentic connections and evaluate, analyse and synthesise knowledge. 

Lucky we have Rick Wormeli videos for reference. Here he is on retakes, redos, etc., and it is worth watching (and provocative). There is another part, as well. If you haven’t seen them yet, go watch them before reading the rest of this post (the videos are better, TBH).


What do I do already? 

  • Lots of formative assessment: practice, descriptors on worksheets, online quizzes.
    • In each of these, there is a rubric connecting it to MYP Objectives
      • Each question is labeled with the descriptor level and strand (e.g. L1-2a, L5-6b, c).
      • I don’t usually give a grade, though do check the work. Students should be able to cross-reference the questions with the descriptors, carry out their own ‘best fit’ and determine the grade they would get if they so desire. This puts feedback first.
    • Learning tasks usually include target content standards
  • Drafting stages through Hapara/GoogleDocs to keep track of work and give comments as we go
  • An emphasis on self-assessment against performance descriptors and content-level standards (and goals for improvement or revision).
  • I use command terms all the time: every sheet, question, lesson where possible.
  • Set deadlines with students where possible and am flexible where needed.
  • In some cases reschedule assessment or follow up with interview or retake (but not as standard practice). As Wormeli says above (part 2): “at teacher discretion.”
  • Track student learning at the criterion-level (MYP objectives), though with current systems (Powerschool), not in great detail at the objective strands (descriptors) level (e.g. A.i, A.ii, A.iii).
  • I do tests over two lessons, giving out a core section in the first, collecting and checking in-between classes. In the next session, this is supplemented with extra questions that should allow students to take at least one more step up. For example, a student struggling with Level 3-4 questions would get more opportunities to get to that level, whereas another who has shown competency will get the next level(s) up.
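The two-session test above amounts to a small routing rule: use the levels a student has already demonstrated to decide which band of questions to supplement with in the second session. A minimal sketch in Python — the band labels echo the ones I use on worksheets, but the 50% threshold and function names are purely illustrative assumptions, not a formal best-fit judgement:

```python
# Given question results labelled with MYP level bands (e.g. "L1-2", "L3-4"),
# estimate a rough 'best fit' level and suggest the next band to target in
# the second test session. Illustrative only: a real best-fit is holistic.

BANDS = ["L1-2", "L3-4", "L5-6", "L7-8"]

def best_fit(results):
    """results: dict mapping a band to a list of bools (question right/wrong).
    Best fit here = highest band where at least half the questions were right."""
    fit = None
    for band in BANDS:
        answered = results.get(band, [])
        if answered and sum(answered) / len(answered) >= 0.5:
            fit = band
    return fit

def next_target(results):
    """Band to supplement with extra questions next session: one step up
    from the best fit, capped at the top band."""
    fit = best_fit(results)
    if fit is None:
        return BANDS[0]  # still working towards the first band
    i = BANDS.index(fit)
    return BANDS[min(i + 1, len(BANDS) - 1)]

student = {"L1-2": [True, True], "L3-4": [True, False], "L5-6": [False]}
print(best_fit(student))     # "L3-4"
print(next_target(student))  # "L5-6"
```

So the student above, secure at L3-4, would get more L5-6 questions in the second session, while a student with no secure band would get more chances at L1-2.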

What do I want to do? 

  • I want to also be able to effectively track every student’s growth in the content standards and develop deeper skills in inquiry (critical reflective thinking).
  • Develop a system for better tracking learning against the individual strands within each criterion (e.g. A.i, A.ii, A.iii).
  • Better facilitate development of student mastery, allowing us to move further away from scheduled lessons and into more effective differentiation and pacing.

What would help? 

I would really like a standards-based, MYP-aligned, content-customisable gradebook and feedback system that is effective in at least three dimensions:

  • Task-level entry, in which each task might produce multiple scores, including:
    • Various target content-level standards
    • MYP objective strands at different levels of achievement
  • It would need to allow for retake/redo opportunities for any and all standards that need to be redone – not necessarily whole assessment tasks. 
  • It would have to focus student learning on descriptors and standards, not on the numbers, in order to help them move forwards effectively. Students would need to be able to access it and make sense of it intuitively so that they could decide their own next steps even before I do.
  • It would be super-duper if the system could produce really meaningful report cards that focus on growth over the terminal nature of semester grading.
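The retake-by-standard requirement above is, at heart, a data-modelling question: scores need to attach to individual standards, not to whole tasks. Here is a minimal sketch of what such a gradebook might store — all class, method and standard names are hypothetical, and the “most recent evidence wins” rule is just one possible policy, not any real system’s API:

```python
# Sketch of a standards-based gradebook: each task records scores against
# several standards (content standards and MYP objective strands), and a
# retake adds new evidence for one standard without redoing the whole task.

from collections import defaultdict

class Gradebook:
    def __init__(self):
        # (student, standard) -> list of (task, score), in chronological order
        self.evidence = defaultdict(list)

    def record(self, student, task, scores):
        """scores: dict mapping a standard (e.g. 'energy-transfer' or the
        MYP strand 'A.ii') to the level achieved on this task."""
        for standard, score in scores.items():
            self.evidence[(student, standard)].append((task, score))

    def current_level(self, student, standard):
        """Most recent evidence wins, so a retake of one standard updates
        only that standard."""
        history = self.evidence[(student, standard)]
        return history[-1][1] if history else None

gb = Gradebook()
gb.record("Aiko", "Energy unit test", {"A.ii": 4, "energy-transfer": 3})
gb.record("Aiko", "Retake: energy-transfer", {"energy-transfer": 5})
print(gb.current_level("Aiko", "energy-transfer"))  # 5
print(gb.current_level("Aiko", "A.ii"))             # 4
```

Keeping the whole evidence history (rather than overwriting) is what would let a reporting layer show growth over time instead of a single terminal number.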

What would a three-dimensional gradebook look like?


Here is Rick again, describing another approach to a 3D gradebook:


Taking It To Twitter


Give a Student a Fish…

“Give a man a fish and he’ll eat for a day. Teach him how to fish and he’ll feed his family for a lifetime.” Anne Ritchie, 1885 (maybe)

This short post, again related to Understanding Learners and Learning, Visible Learning and MYP: Mind the Gap, revolves around my (admittedly flawed) memory of an old aid advert, a bit like this:


After defining learning and thinking critically and reflectively about the nature of inquiry and why there might be a tension across the MYP-DP transition, I want to think briefly about the learner that crosses that gap, using the obvious metaphor of the fisherman as the learner and the fish as the content, skills and conceptual understandings that the student brings up from MYP to DP.

What kind of student do you want to come up to your DP class from MYP? The kid with a boatload of fish or the thinker with the ability to catch more fish?

For authentic inquiry (critical, reflective, future-focused, consequence-oriented, ‘what-if’ thought (Elkjaer)) to be successful, students need some fish in their stomachs. We can’t ask good questions of nothing, nor can we evaluate the empty. So content and skills are needed by the student moving into the Diploma Programme. But is it the MYP teacher’s job to pre-teach everything to a DP student? What is important to know and be able to do? What conceptual understandings and approaches to learning are the most advantageous to develop, to ‘clear the path’ for effective learning and success in terminal assessments?

What happens if we ‘teach’ our students too much before they get to DP? Two things concern me here: interference and motivation, both of which I need to learn more about as I continue this assignment.

The first is the known negative impact of interference: the effect of incorrect or poorly-formed conceptual understandings on future learning. This is outlined in Hattie’s Visible Learning and the Science of How We Learn, and is of particular relevance to the thoughtful science teacher; students come up to our classes with a multitude of prior learning (correct or otherwise) that can either help or hinder their learning. If they arrive with a solid understanding of the concepts of evolution (Biology) or energy (Physics), for example, they will be better able to make connections and transfer this learning as they modify existing patterns or construct new schema. Conversely, if their existing understandings are misconceptions, these need to be undone before effective learning can take place, and this is very difficult to do. These misconceptions may come from poor prior teaching, superficial learning (e.g. content cramming) or from confusion between discipline-specific and everyday use (e.g. ‘power’). I would argue here for a very carefully-constructed conceptual curriculum in the MYP years, one that emphasises not a large body of content but a highly-effective approach to constructing correct conceptual understandings.

Parallel to this is the concept of cognitive load and ego-depletion: we need to maintain a careful balance between effective learning to the point of competence and over-exertion to the point of no learning. Knowing is pleasant, but learning is uncomfortable. The ideal student coming up from MYP would be fluent in the basic skills, concepts and knowledge learned there: the basics of this core curriculum having been automatized and committed to ‘System I’, the ‘fast-thinking’ part of the memory (Kahneman), leaving cognitive load ready for the heavier lifting in higher-order thinking (‘System II’, ‘slow-thinking’). This is all described with much greater competence in Hattie’s Visible Learning and the Science of How We Learn.

This might be a challenge to teachers ‘across the gap’ as the urge to cover content can be a strong one, but perhaps we should rather think of it as developing students who can fish well over those who are paddling upstream with a boatload of rotten trout.

The second issue that concerns me is one of motivation. In a highly content-driven, test-focused, behavioural/empirical classroom we risk creating or reinforcing a culture of extrinsic motivation, in which grades are king and are used to positively or negatively reinforce learning behaviours (ego orientation). When everything is accounted for, where is the motivation to learn as a true learner, to be truly inspired to know more? In some school cultures we might say that it doesn’t matter how the students learn, as long as the results are high, but in that case are we really educating them or are we just passing them on to the next set of accountants?

In an inquiry-led, cognitive/rationalist classroom, can we develop a more intrinsic motivation to learn, to develop a greater self-efficacy as learners in order to be more critical and reflective in our thought: a mastery goal orientation? How can the MYP classroom develop students effectively through the Approaches to Learning so that they are ready to get fishing as soon as they start Diploma and are carrying with them a solid set of conceptual understandings that will help them transfer their learning and make new connections?

Finally, do we really need to pre-teach such a great deal of content in the MYP that there are no new discoveries in the Diploma Programme? How motivated are we to re-learn what we (think we) already know and what is the effect of boredom (coupled with potential interference of misconceptions) on the effectiveness and meaning-construction in what we are trying to learn?

Once again the tensions in the transition from MYP to DP represent a fine balancing act, one for which I need to do a lot more learning.


Greeno, Collins & Resnick. ‘Cognition & Learning’, in Berliner, D. & Calfee, R. (eds.), Handbook of Educational Psychology, Macmillan, New York: 15-46.

Illeris, Knud. Contemporary Theories of Learning: Learning Theorists… In Their Own Words. Chapters by Knud Illeris and Bente Elkjaer.

Hattie & Yates. Visible Learning & The Science of How We Learn.

Kahneman. Dual Process Theory.


On a parallel aid-related note, here’s a quick video from the World Food Programme on that old saying:


First Unit Reflections: Is It Working?

Today we took the opportunity in the IBBio class to reflect on the unit we have just completed, including the tasks and assessment. As always with CA students, the results were constructive, positive and useful, with a general affirmation of the value of what we are doing as a class. The feedback included our personal GoogleSites project, with most students keen on continuing and feeling it helped them learn, and with some interesting alternatives for those it did not suit.

This kind of feedback is really useful once the class has settled in. They are open enough to be able to be honest, but it is early enough to change practices where needed. We will make some adjustments, though we are generally on the right track with this group. I’m really looking forward to seeing the process and products of the students who have elected to become science writers instead of GoogleSiters.

Here are the results in a summary presentation.


Using Student Learning Data to (try to) Improve Student Data Processing: A Department Goal

This year saw the start of a school-wide push to become more data-driven in our decision-making and evaluation. As a result, one of our goals as teachers at the start of the year was the Student Learning Goal, a target set by groups or individuals based on data collected in school. As a science department we opted to investigate Criterion E: Data Processing in the MYP. We picked this criterion because we have all experienced student difficulty with it, it feeds directly into success at IBDP and, even with the Next Chapter changes, the skills and conventions are likely to remain intact as it is such a fundamental part of the scientific method. Data analysis on data analysis; that’s how we roll.


  • Our initial goal was to analyse the Criterion E scores given in 2011-12 and take actions to improve them in 2012-13, though we realised early on that there were too many uncontrolled variables within the department to make comparisons valid and reliable. Scientists, eh?
    • As a result we shifted this year’s focus to setting up a greater vertical departmental understanding of the elements on Criterion E, as well as preparing resources and exemplars for students.
    • We are confident that the semester 2 data from this year are more reliable, and so will be useful in comparison next year (2013-14) as we take action on our work.
  • To work towards this common goal we needed to have a lot of discussions about data processing, presentation and analysis. This involved:
    • Identifying and discussing common student errors and misconceptions.
    • Unpacking the rubrics to make sure we all shared a common understanding of what is expected.
      • This included a lot of vertical discussion about what elements are appropriate for MYP 1 and 3, and I think this was the most powerful part of our work.
    • Identifying commonalities and differences in expectations in MYP and DP Biology, Chemistry and Physics courses in terms of conventions for data processing.
    • Moderating exemplars of student work.
    • Looking up research, journals or articles on student issues in data processing and sharing these with the group, to further discussion and develop strategies to use in our classes.
  • Towards the end of the year we were able to produce a GoogleSlides set of resources to give common advice to students, with exemplars. This is to be copied and edited to MYP3 and MYP1, to meet the adjusted expectations of the interim objectives. As you can see, there is still some work to complete, though it will be ready for action with students in August.


Using personal GoogleSites for learning, assessment & feedback in #IBBio

Click to see an example of how the GoogleSite was set up.

This is reposted from my i-Biology.net blog. To comment, please go there.


Over the last two years, My IB Bio class have been keeping individual GoogleSites as records and reflections of their learning. Based on this experience and their feedback, I have tweaked the project to try to make it more effective as a learning tool.


With the bulk of our resources online (here on i-Biology.net, Slideshare and elsewhere), as well as a 1:1 laptop and GoogleApps environment, it doesn’t make much sense to be using too much paper. The aim of this project was to empower students to build skills and knowledge connected to the IB Biology course, whilst making their thinking visible to me as a teacher. Through this process, students are able to track their progress, stay on top of their grades and prepare at their own pace (especially if they are working ahead).



Teachers as Researchers & Engaging in Academics

Read it!

In the 1806 England of Susanna Clarke’s Jonathan Strange and Mr. Norrell, magicians are scholars of magic rather than practitioners. They study magic, wonder where it went and discuss it. Mr. Norrell is the only ‘practical magician’ in the country, a position he holds dear until along comes Jonathan Strange: first as a pupil, then as a master in his own right. I loved it – all 1,000-plus pages – and one thing that has stuck with me in the six years since I read it was the use of the term magician not as someone who ‘does magic’, but as one who studies it, just as an historian doesn’t make history.

It popped back into my head this week after reading this:

Which I was drawn to by this:

If you’re still reading this blog and haven’t read Tom Sherrington’s Guardian article, read it first.


Most science teachers aren’t scientists.

I’m certainly not, and I can live with it. I’m a teacher. I have friends who are PhD’d up and active in research: they deserve the title. Perhaps I should be called a scientician, in the Jonathan Strange and Mr. Norrell sense. As a non-practicing scientist, a teacher who is responsible for many students, most of whom won’t go into the sciences but may end up in the arts or humanities, I have no right to be a discipline snob. But I do understand the scientific method and I can apply this to education.

I love the breadth and currency of science across disciplines and sharing that with students much more than the minutiae of specialism (I think months in a fridge counting cells did that to me). I model the scientific method with my students as we learn about Biology, Chemistry, Physics and Environmental Science, but we are not adding to the canon of scientific knowledge as much as we are marching on to the assessments that will get the students out of school and (perhaps) into universities… where they most likely still won’t be real ‘scientists’ until they near the end of their degrees or become postgrads. I can live with that, too, even though I would much rather free up the curriculum to engage students in ‘real science’. We have some good opportunities in the IB MYP and DP for exploring science, and students from our school are engaged in some interesting self-directed science projects.

I hope that as science teachers we’re sowing the seeds and nurturing the roots of scientific inquiry and literacy; that as our students grow beyond our reach our influence remains in some of them and they are engaged in real science. If we have a cognitive surplus that is not engaged in genuine scientific research, then we should harness it to improve education.


Maybe one day I’ll earn my stripes as an educationalist.

I am actively engaged in my own PD, in ‘experimenting’ with learning in my class and with working on an MA in International Education. As my foundations in educational theory and research develop, these informal ‘experiments’ in my classes will hopefully develop into more controlled and reliable projects. I will admit that my inner qualifications-snob did at first shop around for an education-related masters that gave the title of MSc, but I dropped that soon enough when I realised that the international & education elements were more important to me, my values and my family.

I agree that using schools as research institutes could be incredibly powerful PD. We, as science teachers, don’t need to feel threatened by the perceived ‘soft science’ approach of educational research. Good educational research is by no means soft. Just look to the Hattie meta-analysis for evidence of this.* We could take ownership of our own professional development, draw on academic research and apply our understandings of the scientific method, reliability and validity to the work that could take place. The work we produce would be evidence-based, in our own useful context. It would be cognitively engaging and would really count as development – perhaps much more so than the passive forms of PD that tend to be ‘done to’ teachers.

There are bound to be challenges to this, though. Masters-level work or real research in schools takes a significant amount of effort. What is clear from Sherrington’s article is that his school is fostering a research environment – it has become part of the school’s values. I find it difficult to see how really effective research could develop in a high-pressure environment.

As teachers in international IB schools, we have to live up to high expectations, but I would argue that we have fewer of the significant challenges that might inhibit others: behaviour, funding, the whims of educational governance. I look forward to seeing the research that is produced in IB schools (and through their own Journal of Teaching Practice).

I would love to see the sciences lead the way.


*The Hattie meta-analysis gets its own post because, you know, blogs.


How Twitter shook my confidence as a teacher (and why that’s a good thing)

I’ve been a Twitter user for about a year and a half. That’s late to the party, I know, but at first I was skeptical. It seemed a time-suck and a frivolity: what could be worth saying in 140 characters? Then I got on board and it was a bit of fun. I shared some resources, using it in much the same way as my i-Biology Facebook page (pretty one-way). I followed a few people, learned some things about science, picked up some nice little tech ideas. But…

…as the list of people I follow grew, as I cultivated my lists and as I developed my own online PLN, the one thing I really learned was that ignorance is bliss.

That’s an interesting article. I’ll use that in class. 

There’s a cool tech tool. Maybe I can adapt that for class. 

I like that assessment idea. I wonder if that would work in my class? 

That’s a great way to give feedback. I totally need to try that in class. 

That school has an interesting approach to curriculum. Would that work in my class? 

Wow, that’s a really effective way to teach. I should incorporate that into my class. 

That’s really different to the way I’ve always done things. My way looks wrong now. What does the evidence say? OMG, what have I been doing? How can I really improve the learning in my class? 


With the stream of Twitter inspirations pumping ideas into my brain I realised that I had a lot left to learn about effective education. It is exciting, but it is exhausting and at times I feel like the Red Queen: the running to keep up never ends. I hold myself to high standards, and seeing that there are better, more effective ways to facilitate learning makes it hard to be satisfied with my practices.

I have learned that although I may have started out as a confident pseudoteacher, or a ‘good teacher‘ by Grant Wiggins’ definition, I had and will always have some way to go to be a ‘great’ teacher. I’m not the ‘cool young teacher’ any more! After almost ten years in education, this powerful and personalised professional development has been the kick up the bum I needed to keep moving forward and to stay excited about our craft.

So yes, Twitter has shaken my confidence, but it is also making me a stronger teacher – and I think that’s just fine.


Some inspirations, and real challenges to my thinking and teaching: 

  • Modeling instruction ideas, from Frank Noschese (@fnoschese) and Gary Abud (@MR_ABUD)
  • The Big Thinkers on curriculum, differentiation, homework, assessment and more
  • The endless stream of awesome ideas for tech integration (and being able to remotely keep up with tweets from conferences)

There is a lot to learn out there, and it is a great tool to develop PD. But it is probably a hard sell to tell teachers that it will make their lives harder!


Thanks to Liz Durkin (@LizDK)  for the discussions in Digital Bytes today and helping form the inspiration for this post! 


Curriculum Studies Assignment: Physics & the MYP

With permission from my tutor, here is my Curriculum Studies assignment: A critical review of a Grade 10 Introductory Physics course as part of the International Baccalaureate Middle Years Programme, examining selected aims and purposes and analyzing the extent to which these are, in my experience, achieved in practice.  Catchy title, eh?

Some quick reflections on the process and the product: 

  • In the early stages of this unit I really got into the academic reading. The resources set up by the University of Bath are excellent and you can get an idea of what was there (as well as my responses to the tasks) here.
  • It did become a bit of a slog in the writing stages: I took this unit entirely online and there was no activity on Moodle or elsewhere. As a result, I blogged all my thinking and ended up with some interesting discussion and feedback via Twitter.
  • My tutor, Mary Hayden, was a great support. I do wish I’d had the opportunity to meet with her in person for this, as we were limited in our exchanges by the asynchronicity of timezones and busy schedules. As a result the pace of my work slowed, but in the end it came out OK.
  • The importance of getting a good draft in early became apparent here. In the last unit, I had just moved to Japan and was buried in work and adjustment. The MA work suffered and the best I got in was an outline. In this unit, I was able to submit well-structured drafts and received rich and workable feedback. This is something I emphasise in my own teaching, but when the shoe’s on the other foot it is easy to fall behind. The moral of the story – get good drafts in early, front-load the effort, and the results will pay off.
  • Although I enjoyed the academic side of things in the Assessment unit, I really got into it here and this unit helped me realise that I am happiest in the teaching and curriculum side of things – as a teacher, coordinator and instructional leader.

I’m taking a couple of months off now, and will pick up another unit in March and another summer school. I think the best way for me to work would be to get started on the units early, and then come to the summer school with work formed and ready for feedback, rather than waiting to get there to get started.

Anyway, here it is.


Differentiation through a ‘Readiness Filter’?

Carrying on from my last reflection on the differentiation workshops here this week…

Some subjects have a great freedom of curriculum and are natural fits for student-driven inquiry all the way through to MYP 5 (and beyond if they exist as part of our IBDP). In their cases, one might put readiness, interest and learning profile on an equal footing. The path a student takes through the subject could be very different to their peers (with different outcomes), based on the ways in which differentiation is implemented.

Others, such as Science (my own subject) and Maths, feed into quite prescriptive Diploma Programme courses. All paths lead to the same destination – the examination room and assessment of defined outcomes. Clearly there is minimal scope for differentiation of product or content, but plenty of room for differentiation of process. This led our discussions into whether we should be using readiness as a filter* for differentiation in our classes in MYP 4-5 and IBDP.

With clearly-defined command terms linked closely to assessment rubrics and eventually grades, should we (or could we) first use readiness to pitch lessons at the right level for each student and to ensure that they are making those incremental steps towards progress?

I would love to get to the point where I am using readiness and data in most planning decisions, with learning profile and interest to differentiate further within those levels. Flexible grouping tasks would be used to make sure the same kids aren’t always stuck with each other. Lofty ideals, eh?

I’ll let this diagram I cobbled together explain the rest…


*Thanks @LizDK for the word – it fits the idea perfectly!