Overview: Why are continuous improvement systems critical to sustainability?
All tutoring programs, and especially new ones, need improvement. Continuous improvement systems allow you to gather, act on, and share the information needed to reach and exceed program goals and to inform and build support from stakeholders. Without continuous improvement systems intentionally embedded in your program, it will not generate the outcomes it needs, nor will you have the data to advocate for its ongoing support and growth.
How should you use data for continuous improvement?
What are data reviews for?
Routine data review and formal reflections provide a systematic and timely way to evaluate effectiveness and assess how to adjust the model or its implementation when necessary.
Routinely reviewing recent student data on a small scale (e.g., weekly reviews of school-wide data, or even daily reviews of class-wide data) will allow tutors and their supervisors to catch small gaps before they widen, then adapt implementation tactics to meet the specific needs of individual students. Formal data reflections at a larger scale (e.g., monthly, quarterly, or annual analyses of district-wide data) will allow your program to tell a clear story to all stakeholders about its impact so far, then make data-informed recommendations for changes in implementation strategy (or even revisions to the underlying program model).
Read more about Program Evaluation and Improvement on the National Student Support Accelerator website.
For each dataset you collect for your Performance Measurement Plan, outline the following:
- Who is responsible for collecting these data? When and how will they collect them? How often?
- Who is responsible for reviewing these data? When and how will they review them and distill insights?
- Who is responsible for acting on the insights distilled from the data review? What is their timeline?
- Who is responsible for supporting the people acting on the data, and what form will this support take?
- Who needs to be informed about the data, insights, and actions? Who will inform them, and by when?
Standardizing a data review process helps set a clear expectation that the objective is not simply knowledge, but action based on knowledge. Any Data Review Protocol should ensure that raw data are converted into a clear and digestible format beforehand, so that reviewers can focus on interpreting the data, not deciphering it.
The next page lays out a detailed template agenda/protocol for a data review. Your tutoring program leadership team might apply this protocol to end-of-year outcome data; a program lead might apply it to training data at the end of tutor preservice training; a leadership team might apply it to quarterly caregiver feedback. While this protocol can be used in a wide variety of contexts, the rationale for data review remains consistent:
- When: Review data as soon as possible after collection, because outdated data are less valuable and actionable.
- Why: Focus on learning and improving rather than assigning blame for shortfalls.
- Who: Empower the facilitator to guide the conversation and make sure every voice is heard.
- What: Review both aggregated data and disaggregated data. Disaggregating data by demographics and other characteristics will reveal impact across lines of difference: race, gender, IEP status, home language, school, etc.
- How: Prioritize quality over speed, and adjust the time allotted based on the scale of the review:
- Tutors reviewing daily assessment data for their students should only need about 15 minutes.
- An entire team reviewing a full year's worth of data might need an entire day.
Data-Review Protocol

| Step | Purpose | Possible Questions |
| --- | --- | --- |
| Step 1: WHAT did we want to happen? | Ensure all participants are on the same page about what the goal or intended outcome was | |
| Step 2: WHAT actually happened? | Ensure all participants are on the same page about what the actual outcome or result was. Explore the divergences between expectations and realities. | |
| Step 3: WHAT did we learn? | Reflect on successes and failures during the course of the project, activity, event, or task. Asking "Why?" generates understanding of the root causes of these successes and failures. | |
| Step 4: WHAT can we do better in the future? | Generate clear, actionable recommendations and next steps for future projects | |
| Step 5: WHAT changes do we need to make to our project and individual plans? | Incorporate key lessons into your future actions. Document all key lessons for those who may inherit this project in the future. | |