Since 2010, SchoolWorks has surveyed all schools undergoing a SchoolWorks Quality Review approximately six months after the site visit. We call this our Impact Survey. (Another survey is sent immediately after the review to get feedback on the process and short-term impacts.) All schools are sent the impact survey, whether their quality review was a charter renewal, formative review, or accountability requirement. The purpose of the survey is to gather perception data about the lasting impact of the quality review.
An earlier blog post reports out the data from 2010 through the spring of 2013. At that time, our survey used a mix of question formats, including Likert-scale, simple yes/no, and open-response items. For the 2013-14 academic year, we revamped our survey with some outside expert assistance. We shortened the survey a bit and clarified some language. The biggest change was the adoption of a common Likert scale for all questions. We now have results from the 2013-14 school year using this improved survey.
Here at SchoolWorks, we have a lot of internal debate about what the impact should be from a quality review, which is typically a two-day event in a school. Can you link changes in student achievement measured six months down the road to a two-day review? What should be the impact of the review on how business is done in the school? We don’t believe any short-term process can be a silver bullet, but it should offer great value, influence how business is done, and be perceived as a positive step toward improving student achievement.
The first three questions on our survey delve into how the visit impacts the adults in the school. Specifically, we look for: a) the value of the insights offered to the school; b) the influence on decision-making; and c) changes in adult behaviors and attitudes. The SchoolWorks team agrees that a quality review is specifically designed to have a lasting impact on these aspects of the school. Chart 1 shows the results from the 56 schools that responded to the 2013-14 Impact Survey.
We are happy to report that schools perceive their experience with us to have provided useful insights. That is really the first step to making good use of a quality review. Through our evidence-based process, we clearly provide useful analysis. One respondent added the following comment about insights in the optional open response:
- The feedback was very helpful to make sure we addressed deficiencies within the school. We were able to support our teachers for the last three months of the year and [the feedback] was a springboard for this year. In addition, the review notes were extremely helpful when the leadership team met at the end of the year to set our imperatives for the year.
Respondents also find that our quality reviews have helped them make decisions about how to improve their schools. A quality review can do this very well by presenting a clear view of strengths and areas for growth. With the big picture brought into sharp focus by an outside team, it is much easier to move the school forward. In addition, many of our quality reviews include a half-day prioritization session, during which the team and school leadership work hand-in-hand to unpack the team’s findings and strategize next steps. Here is what a few respondents volunteered about how the quality review impacted their decision-making:
- Our priorities are further enhanced by the findings of the visit. In addition, the visit supports the visioning process around the development of effective processes and strategies to address the findings.
- [The visiting team] confirmed our belief that RTI (MTTS-Multi Tiered System of Support) needed strengthening for success in increasing achievement for our bottom 30%.
- We used information from the report to focus our work on unpacking lesson objectives to improve instruction and student outcomes.
Finally, 72% of the respondents strongly agree or agree that the quality review helped them enact changes in their educational practices. These results are positive, though not as positive as those for providing insights and supporting decisions. This variation is interesting to us. Some of the voluntary comments made by survey respondents shed more light on the difference:
- A focus on lengthened planning/instructional coaching will be implemented as a result of these findings…
- We are still getting good at using data to drive instruction. There is still a lot of data, but it requires consistent monitoring of interventions based on student learning outcomes.
- The staff members are holding each other more accountable in providing Montessori lessons. The vertical PLC meetings are occurring with more enthusiasm to support each level in providing best practices.
- The work of schools is dense and complex. These results were useful, but they competed with several other interests and responsibilities.
- After the report was given, I asked central administration for PD, and I am waiting for that to occur. Still in working stages. No idea when it will begin.
The comments reveal that progress requires hard work, resources, focus, and time. That is no earth-shattering conclusion to those working to improve schools.
We are very happy with these results. They show the vast majority of participating schools perceive that the process provided useful insights, helped them make decisions, and helped them enact change. We view these as the primary purposes of school quality reviews. Schools report these perceptions six months after the quality review, a timespan over which many investments in professional development fade.
The last three questions on our survey delve into how the visit impacts behaviors, attitudes, and student achievement. Specifically, we look for: a) changes in staff behaviors and attitudes; b) belief about impact on student achievement; and c) noticeable changes in student achievement. The SchoolWorks team has a lot of debate about how much a two-day site visit can contribute to these aspects of a school, particularly the direct attribution of changes in student achievement to a two-day process. While there is strong agreement among our team that a good quality review provides insights, helps make decisions, and drives the enactment of change, the quality review typically does not extend into helping schools with implementation. Still, we want to ask questions about results and see what schools perceive about the impact of the SchoolWorks quality review on these outcomes. Chart 2 shows the results from the 56 schools that responded to the 2013-14 Impact Survey.
Approximately 66% of respondents agree or strongly agree that the SchoolWorks quality review will have a future positive impact on student achievement. Approximately 50% of respondents agreed or strongly agreed that the quality review resulted in positive changes in staff attitudes and behaviors. Even though a quality review is a brief event with no time to support implementation, schools clearly perceive the review as a contributing factor to changing adult thinking and raising results for kids. We believe getting these results with just a two-day investment represents a great value for our clients.
Finally, when asked about noticeable positive changes in student achievement, 27% agree or strongly agree that the quality review has already had a positive impact on student achievement. A large number (57%) are neutral on the relationship. This echoes the thoughts of many of our team members, who raise valid points about correlating changes in student achievement to a single two-day event in the school. That said, we are very excited that almost 30% of schools perceive a direct connection between better student results and the quality review.
In summary, we remain very proud of the impact of the SchoolWorks quality review. It receives strong overall ratings on its primary objectives of providing insights, driving decisions, and driving the enactment of changes. Beyond these primary objectives, many schools perceive real changes in adult behaviors and student achievement directly tied to the SchoolWorks quality review.