Literacy Online. Every child literate - a shared responsibility.
Ministry of Education.

Quick start guide

Literacy leaders around Aotearoa work in schools at very different stages in how open and reflective they are about their outcomes, systems and practices. Some will be very familiar with reflective inquiry and evaluative thinking, while others will still be developing their inquiry culture and skill set. In some schools there may be localised (or widespread) defensiveness and resistance to genuine reflection on the adequacy of learner outcomes and the effectiveness of teaching practices.

Before engaging school leaders, teachers and other staff in a reflective self-review process, it is a good idea to consider the following questions:

  • How much experience has our school had in engaging in genuine inquiry and reflection on the effectiveness of our practices and the adequacy of the outcomes for our learners? Are we relatively new to this, or is it already infused in “the way we do things around here”?
  • What's been our history with inquiry and self-review? What has happened in the past when difficult truths were highlighted? Did we see pockets of (or widespread) resistance to disappointing news, or did people generally engage in constructive problem solving to try to make improvements? If there was resistance, was it from relatively influential people or not?
  • What feedback have we had (for example, from ERO, or from external providers) about our capability for self-review, inquiry, reflection and continuous improvement?
  • Given the above, who would be the best three or four people to facilitate this self-review process? Is the principal willing to get directly involved in this role? Do we have senior literacy leaders on staff who are well respected and have the authority, credibility and experience to work through any resistance encountered? Can we keep these people professionally "safe"? Would it be better to initially work with someone external to help get the ball rolling, for example, from MOE, School Support Services, or another provider?

Our experience in piloting this self-review tool was that resistance often pops up where it's not expected. Even in schools where resistance is not anticipated, you may wish to use some of the tips presented below to help maximise the chances of buy-in and a positive, constructive inquiry process.

Tips for a successful self-review and inquiry process

This tool has been designed for schools to use for themselves rather than as a tool for professional development providers. Providers may suggest that schools use this tool and can offer support with the review process where needed.

The experiences of the various schools involved in the trial phase of the project highlighted a couple of suggestions that helped get people constructively engaged in the inquiry and reflection process:

  • Rather than beginning by showing people the rubrics, start with a series of open-ended questions to get a discussion going (we have some suggestions in this starter pack). From there, gather some evidence, graph or analyse it, then bring the group back together to consider the evidence alongside the relevant rubric(s) and come to a judgment about how well the school is doing on that aspect of meeting struggling readers' and writers' needs.
  • Rather than bringing all key stakeholders into one room for a discussion, talk separately to the different individuals or groups, get each of them to generate a rating and some reasoning behind their judgment, and then bring the groups together to discuss differences in their perspectives on effectiveness. [This helps ensure that conversations aren't overly influenced in the direction of the most senior or influential person participating and that different perspectives are well explored.]

Which rubric(s) should we start with?

Based on schools' experiences in the development process and pilot testing of the tool, the best place to start with the inquiry questions and rubrics is the following:

  • Rubric 9: Accelerated progress in literacy for students achieving below curriculum expectations in literacy

In other words, start with the biggest and most important question each school faces in this area: how well, really, are we accelerating the progress of our students achieving below curriculum expectations in literacy? This will give your school a clear picture of how it's doing overall and how urgent and serious any shortfalls might be. It's probably the most important conversation needed to get the inquiry ball rolling.

Glossary of terms

  • Accelerated progress – progress that is faster than (that is, on a steeper trajectory than) the expected rate of progress, not just faster than a particular student's previous rate of progress.
  • Assessment for learning – a two-phase process that begins with initial or diagnostic assessment prior to starting a topic to identify what a student already knows, as well as any gaps or misconceptions. As the unit progresses, the teacher and student work together to assess the student's knowledge, what she or he needs to learn to improve and extend this knowledge, and how the student can best get to that point (formative assessment). Assessment for learning occurs at all stages of the learning process. (Wikipedia)
  • Communities of practice – collaborative networks of teachers who rigorously and transparently examine their instructional techniques in order to raise student achievement.
  • Evaluation – a systematic process for determining the quality, value or effectiveness of an approach, intervention, programme, policy, service, product or other entity.
  • PLCs professional learning communities – an extended learning opportunity to foster collaborative learning among colleagues within a particular work environment or field. It is often used in schools as a way to organize teachers into working groups (Wikipedia). Effective PLCs have a focus on analysing the impact of teaching on learning and support participants to process new understandings and their implications for teaching (BES Teacher Professional Learning and Development).
  • Literacy Learning Progressions – a professional tool that describes the knowledge and skills students need in order to meet the reading and writing demands of the New Zealand Curriculum.
  • Students achieving below curriculum expectations in literacy – Students who are unable to adequately access the curriculum due to being substantially behind the reading and writing expectations for their cohort (as laid out in the NZC) AND/OR whose rate of progress in reading and writing is too slow to achieve this.
  • Transient students – students who change schools frequently and whose schooling is disrupted by this. More specific definitions exist but vary; most treat "frequent" moves as two or more changes of school every year or two.
  • The team around the child – the group of parents, teachers, other school staff, extended family and involved professionals who work together to support a child's learning and development.

Engaging teachers and leaders in the inquiry process

Facilitating an initial reflective discussion

As we mentioned earlier (under Tips for a successful self-review and inquiry process), starting with some open-ended discussion questions first can help get a genuine inquiry discussion started. Here are some you may wish to try (or adapt) to start exploring Rubric 9:

Preliminary discussion questions for Rubric 9:

  • How many students do we have who we would describe as "achieving below curriculum expectations in literacy"? Who are they? What do we know about them?
  • What proportion of our students achieving below curriculum expectations in literacy are accelerating substantially faster than the expected rate of progression? How many are accelerating fast enough to bring them up to curriculum expectations in the next year or two?
  • What proportion of our students achieving below curriculum expectations in literacy do in fact catch up to expected curriculum levels during their time at our school? How do we know? What is our evidence?
  • What does the accelerated progress pattern look like for boys compared to girls? For Māori and Pasifika students? For English language learners? For students with special learning needs and those considered "transient"? Who is getting "left behind"?
  • To what extent is there a clear shared understanding across the school (and with students and their parents/whānau) about expectations for accelerated progress?
  • How well can students articulate their progress in reading and writing? What changes are we seeing in their confidence, self-awareness, engagement and motivation?
  • To what extent are students enjoying success and reaching their potential in literacy in ways that support and build on the strengths and worldviews that reflect their family and cultural values and perspectives?

Use the following probes to stimulate and focus discussion:

  • How do we know? What is our evidence? Is the evidence robust enough?
  • Do we have a clear picture of what's going on? What else should we look at – or, how else could we look at it – to understand it better?
  • What would the parents/caregivers/whānau say? Have we asked them?
  • What would the students say about this? Have we asked them?
  • What would it look like if we were doing this really well? Are we?

Gathering and analysing evidence

Your initial discussion with key stakeholders is likely to highlight the need for some more concrete data about student progress in reading and writing. The logical next step, then, is to gather together whatever evidence you have that will help you answer the discussion questions. Examples might include:

  • results of standardised tests (such as asTTle, e-asTTle, STAR, PATs, observation surveys)
  • running records
  • overall teacher judgments in relation to the Reading and Writing Standards and Literacy Learning Progressions
  • feedback from literacy support staff
  • feedback from teachers, parents/whānau and students.

Use data from your Student Management System (SMS) to create graphs that show the progress of your students over at least two points in time, so you can get a sense of how fast they are accelerating relative to standard peer norm progress rates.
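If it helps to make the comparison concrete, a student "accelerating" is one whose gain between the two points in time exceeds the expected gain for that period. The sketch below illustrates this; all student names, scale scores and the expected gain are hypothetical, and the actual scale and expected rate will depend on the assessment tool your school uses:

```python
# Illustrative sketch only: compares each student's gain between two
# assessment points with an assumed expected gain for the same period.
# Student names, scale scores and the expected gain are all hypothetical.

def progress_summary(scores_t1, scores_t2, expected_gain):
    """Return {student: (gain, accelerating)} for students assessed at both points.

    A student is 'accelerating' when their gain between the two assessment
    points exceeds the expected rate of progress for that period.
    """
    results = {}
    for student, t1 in scores_t1.items():
        if student in scores_t2:  # skip students missing a second result
            gain = scores_t2[student] - t1
            results[student] = (gain, gain > expected_gain)
    return results

# Hypothetical scale scores at two points in time (e.g. Term 1 and Term 4)
term1 = {"Student A": 1420, "Student B": 1380, "Student C": 1450}
term4 = {"Student A": 1510, "Student B": 1410, "Student C": 1530}

summary = progress_summary(term1, term4, expected_gain=60)
for student, (gain, accelerating) in sorted(summary.items()):
    print(f"{student}: gain {gain}, accelerating: {accelerating}")
```

Graphing these gains against the expected-gain line gives a quick visual sense of who is above and below the expected trajectory.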

A useful resource when bringing together student data is the excellent (and brief) BECSI guide entitled What kind of student achievement data do we need? This covers the basics, such as which variables to export from the SMS into a spreadsheet such as Excel, which data to ask teachers to check for their classes, and which tests are appropriate for which year levels.

When analysing the data, it is best to do the following:

  • Follow the BECSI guidelines for exporting data into a spreadsheet and having teachers do an initial check for errors and typos.
  • Identify the people on staff with skills in Excel (particularly generating graphs and writing formulas to calculate difference scores, etc) and call on other support for building such skills among a critical mass of literacy leaders and other teachers.
  • Use at least two or three different sources of student achievement data (from two or more different assessment methods) so that you’re not relying on just one. Graph each one, then look across the graphs and displays to understand what each is telling you.
  • Use MOE’s guide for calculating effect sizes to give you a gauge of the size of any shifts or accelerations, and to help with interpretation. The effect size tells you how many more (or fewer) standard deviations of progress your students experienced relative to the relevant comparison. Seek out support from MOE or suitably qualified providers to help you get this part right.

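As a rough illustration of the idea behind an effect size (the mean shift expressed in standard-deviation units), here is a minimal sketch. The exact formula in the MOE guide may differ, and the scores below are invented, so treat this as illustrative only and follow the guide (with support if needed) for real analyses:

```python
import statistics

def effect_size(before, after):
    """Mean gain divided by the average of the two standard deviations.

    This is one common way of expressing a shift in standard-deviation
    units; follow the MOE guide for the formula it actually recommends.
    """
    mean_gain = statistics.mean(after) - statistics.mean(before)
    pooled_sd = (statistics.stdev(before) + statistics.stdev(after)) / 2
    return mean_gain / pooled_sd

# Hypothetical scale scores for the same group at two points in time
before = [1380, 1400, 1420, 1450, 1390]
after = [1440, 1470, 1480, 1520, 1450]

print(f"Effect size: {effect_size(before, after):.2f}")
```

A positive effect size means the group shifted upward relative to the spread of scores; interpreting whether a given value represents adequate acceleration is where the MOE guide and external support come in.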
The Literacy Learning Progressions/NZC Reading and Writing Standards set out the expectations for progress and achievement in literacy and should guide your decision making.

Summarising student progress against NZC 

In the downloadable copy of the "Quick Start Guide" (pages 8-19), a range of examples is presented to illustrate some of the possibilities for displaying data in ways that will help answer the inquiry questions and stimulate discussion about the underlying causes of successes and disappointments. As mentioned earlier, always use at least two or three complementary sources of student achievement data – no single assessment tool tells the whole story, and teacher professional judgment is an important part of the inquiry and sense-making process.

The OTJ guidelines on Te Kete Ipurangi are important here. They outline the need to combine assessment tool data with observations of student process and learning conversations with the student to arrive at an overall judgment about where achievement lies.

For more information on gathering and analysing evidence the following information is included in the Quick Start Guide.

  • Summarising student progress against NZC
  • Using a Literacy Progress Grid
  • Filling out a Literacy Progress Grid
  • Summarising the overall picture of your school
  • Interpreting the completed Progress Grid
  • Graphing student progress on specific assessment tools: Your school may also wish to plot student achievement in literacy on one or more specific assessment tools so that you can see some of the more fine-grained nuances in your student achievement data.
  • asTTle or e-asTTle

Other assessment tools: Guidelines are available online at Te Kete Ipurangi showing how to interpret student performance on:

  • e-asTTle Writing
  • e-asTTle Reading
  • STAR Reading
  • PAT Reading Comprehension and Vocabulary
  • Observation Survey
  • Vocabulary
  • Progress Trajectory Graph for junior reading
  • Filling out the Progress Trajectory Graph
  • Digging beneath average effects
  • Interpreting Effect Sizes

Using the evidence to make judgments about effectiveness

The next step in the process is to turn the discussions of the data into evaluative judgments about how effective the school has been in achieving progress for its students achieving below curriculum expectations in literacy. To do this, we use a tool called a rubric.

Rubrics have been used for years in student assessment to clarify expectations and standards, and to increase the validity and reliability (consistency) of grading essays and assignments. In evaluation, we can also use these tools to help define "how good is good" when it comes to student progress (or literacy programming, or school literacy learning culture, etc) and to judge the mix of evidence we have before us.

Using the first rubric

This step should be used after the initial reflective discussion and gathering and analysis of evidence. This includes plotting student progress relative to NZC and the National Standards using a Progress Grid for each year level (see p. 7).

Our task now is, as a group of literacy leaders (and involving other staff as appropriate), to take the analysed evidence of student progress in literacy and answer the question of “how good” those results are. We do this using an evaluative rubric, which describes what the evidence will look like if our efforts are highly effective versus minimally effective (etc) for students achieving below curriculum expectations in literacy (see p. 21).

Where to start with the rubric
The development schools experimented with two alternative "quick start" approaches to using the rubrics, once they were familiar with the content. Each approach was found useful for understanding the data and determining next steps.

Option #1: Start at "the bar"

  1. Jump straight down to the Minimally Effective description and check whether the evidence at hand meets the requirements there, more or less.
  2. Skip down to Ineffective and then to Detrimental to make sure that none of the items in those levels is evident within the school. If any are found, these are your most urgent points for swift action.
  3. If nothing Ineffective or Detrimental is found, and if the requirements under Minimally Effective are met, move up the levels (Developing Effectiveness → Consolidating Effectiveness → Highly Effective) one by one to see how high a rating seems to be justified.
  4. Remember, you are not aiming for an absolutely exact match here. The key question is: which "picture" does our evidence match most closely?

Option #2: Trawl for the "centre of gravity"

  1. Have the group work through the rubric – individually, in small groups, or as a whole group – and highlight the statements that match the evidence in any and all of the levels.
  2. Next, identify the "centre of gravity" (where most of the descriptions fit; your median and/or mode) and note this as your initial approximate rating.
  3. Finally, carefully consider exceptions in the evidence (higher and lower instances of effectiveness in particular areas). Discuss whether these are important enough to justify upgrading or downgrading the overall rating.
  4. Again, the intent here is not to look for an exact match, but to generate an overall conclusion or best fit based on where the greatest weight of evidence lies, while at the same time highlighting any particular points of strength or weakness that should be celebrated or addressed.
  5. Some schools found that their evidence was so mixed (very strong results for some students; much weaker ones for others) that it made little sense to draw an overall conclusion. Instead, they highlighted the strengths and weaknesses relative to the rubric in the outcomes for students achieving below curriculum expectations in literacy.

What next?

Inquiry into the accelerated progress question (Rubric 9) is likely to naturally lead each school toward a selection of the other inquiry questions/rubrics as areas to drill down into and understand better.

Some discussion questions literacy leaders might also use to guide the avenue of inquiry include:

  • What do we believe are the key "drivers" (or causes) of our successes and/or disappointing results on accelerated progress for students achieving below curriculum expectations in literacy? [Brainstorm with the group; look at the logic model – Figure 2 (p. 3) – for ideas.]
  • Where do we think we might have significant room for improvement and need to get a serious school-wide conversation started?
  • On which of the questions do we have disagreement within the school about how well we are doing? In which areas would it be helpful to start an inquiry process to clarify our understandings?
  • Which of the inquiry questions is an area of particular interest in our school? Which is a frequent topic of conversation?
  • Which of the questions have we never really considered – but should?
  • Which of the questions would we have some good evidence available for already?
  • Where do we think we are doing quite well and would be energised by the success stories?
