Data Use: Student Progress Measure
Quality Standard
Student Progress Measure: The program has a system for measuring individual student progress over time and responding to those results. Measures of progress include both academic growth and adaptive indicators (e.g., student engagement, student confidence).
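To make the standard concrete, the sketch below shows one way such a system might represent a single student’s progress over time, pairing an academic measure with adaptive indicators. All field names, rating scales, and values here are hypothetical illustrations, not requirements of the standard.

```python
from dataclasses import dataclass
from datetime import date
from typing import List


@dataclass
class ProgressRecord:
    """One measurement point for one student; all fields are hypothetical."""
    student_id: str
    measured_on: date
    assessment_score: float   # academic growth measure (e.g., a benchmark assessment)
    engagement_rating: int    # adaptive indicator (e.g., 1-5 tutor-rated engagement)
    confidence_rating: int    # adaptive indicator (e.g., 1-5 student self-report)


def academic_growth(records: List[ProgressRecord]) -> float:
    """Change in assessment score from the earliest to the latest measurement."""
    ordered = sorted(records, key=lambda r: r.measured_on)
    return ordered[-1].assessment_score - ordered[0].assessment_score


# One student measured at the start and end of a semester.
history = [
    ProgressRecord("s-001", date(2024, 9, 15), 41.0, 2, 2),
    ProgressRecord("s-001", date(2025, 1, 20), 58.0, 4, 3),
]
print(academic_growth(history))  # 17.0
```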
Critical Questions
- How will collected data reflect a holistic understanding of students’ experiences?
- What processes will be in place to review and act upon collected data?
- How will these review processes promote equity and reduce bias?
Implementation Checklist
- Identify who is responsible for reviewing each type of data.
- Create and routinely use protocols for reviewing data and distilling insights to inform decisions.
- Review disaggregated data to ensure equity of services.
- Set up processes for communicating data (and the insights distilled from it) to relevant stakeholders.
- Make informed decisions and take action based on data, resulting in continuous improvements.
- Establish standards for effective implementation of the tutoring model and refine those standards over time.
Implementation Tools
Higher Education Institution (HEI) Specific Tools:
From Existing Resources:
Key Insights
Frequently review data in both formal and informal ways.
- This ensures that program leaders are consistently aware of what actually makes their program effective. Staying up-to-date on data insights not only allows leaders to work with tutors on the ground to make direct improvements, but also helps them maintain a clear understanding of which model design elements are most essential to the program’s success.
Conduct an annual data review at the same time each year.
- This allows program leaders to consistently update their understanding of how effective each element of the program’s model design actually is.
- Annual data reviews are critical for programs looking to innovate, allowing leaders to distinguish between model design changes that preserve what matters most and those that abandon the core of what makes the program effective.
- For programs seeking to scale, an annual data review provides an opportunity to standardize core parts of a program so that it can be easily and faithfully replicated at scale.
Tell a clear story about the program’s impact so far, such as this example from Peer Power.
- Programs seeking to scale and generate new demand must be able to articulate the impact they have made in their communities. This story is what gets a program’s “foot in the door” with new partners at every level, from individual schools to entire school districts and statewide departments of education.
Disaggregate student data (by race, gender, IEP status, home language, etc.) to ensure equity.
- By reviewing data through demographic breakdowns, programs can identify opportunities for improvement and pinpoint training needs so that tutors can effectively serve all students (a brief illustrative sketch follows this item).
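The sketch below illustrates one way to run such a disaggregated review, assuming student-level progress data in a table; all column names, groupings, and values are hypothetical, not a prescribed schema.

```python
import pandas as pd

# Hypothetical student-level data; in practice this would come from the
# program's own progress-tracking system.
students = pd.DataFrame({
    "student_id":    ["s1", "s2", "s3", "s4", "s5", "s6"],
    "race":          ["Black", "White", "Latino", "Black", "White", "Latino"],
    "iep_status":    [True, False, False, True, False, True],
    "home_language": ["English", "English", "Spanish", "English", "English", "Spanish"],
    "score_growth":  [12.0, 15.0, 9.0, 8.0, 16.0, 11.0],
    "engagement":    [4, 5, 3, 3, 5, 4],
})

# Average growth and engagement for each demographic breakdown the program reviews.
for group_col in ["race", "iep_status", "home_language"]:
    summary = students.groupby(group_col)[["score_growth", "engagement"]].mean()
    print(f"\nAverage outcomes by {group_col}:")
    print(summary)
```

Large gaps between groups in a review like this would point to where services or tutor training may need attention.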
Consider conducting rigorous evaluations.
- Rigorous evaluations provide evidence about which aspects of a program work and which do not. These insights can improve efficacy and/or reduce costs (a simplified sketch of this kind of comparison appears after this list).
- Well-designed evaluations can provide definitive evidence that a tutoring intervention helps students, which can increase demand and the likelihood that a program receives external funding.
- Rigorous evaluations may be conducted with researchers from the tutoring program’s HEI or in partnership with another organization.
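As a rough illustration of the comparison a rigorous evaluation formalizes, the sketch below computes the difference in mean score gains between tutored and comparison students along with a standardized effect size (Cohen’s d). The numbers are hypothetical, and a real evaluation, such as a randomized controlled trial designed with HEI researchers, requires careful design, adequate samples, and proper statistical inference beyond this calculation.

```python
import numpy as np

# Hypothetical score gains for tutored and comparison students.
tutored_gains = np.array([14.0, 9.0, 17.0, 12.0, 15.0, 11.0])
comparison_gains = np.array([8.0, 10.0, 7.0, 9.0, 6.0, 11.0])

# Difference in mean gains between the two groups.
mean_diff = tutored_gains.mean() - comparison_gains.mean()

# Pooled standard deviation and Cohen's d (a standardized effect size).
n1, n2 = len(tutored_gains), len(comparison_gains)
pooled_var = ((n1 - 1) * tutored_gains.var(ddof=1)
              + (n2 - 1) * comparison_gains.var(ddof=1)) / (n1 + n2 - 2)
cohens_d = mean_diff / np.sqrt(pooled_var)

print(f"Mean difference in gains: {mean_diff:.2f}")
print(f"Cohen's d: {cohens_d:.2f}")
```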