Hi! I am Maia Elkana, the evaluation director for the Institute for School Partnership at Washington University in St. Louis, where I direct program evaluation across research-practice partnerships between teachers and administrators, education and scientific researchers from the university, and community partners.
K-12 education can be a tricky space for evaluation. On the one hand, educators are highly interested in and incentivized to pursue data-driven improvement initiatives; on the other, these initiatives can be divisive, reliant on poorly designed metrics that are not well matched to evaluative questions, or otherwise present barriers to effective, valid, and/or useful evaluation. In my work with teachers and administrators, I have learned some valuable lessons that have made a significant difference in how I approach my role as an evaluator:
Lesson Learned #1: Teachers are BUSY.
When teachers don’t follow through with data collection, it’s often because they can’t. Teachers, especially at the elementary level, get very little quiet time to plan lessons and manage the paperwork required by their institutions. It is important to look for ways to collect educational data that do not require extra work. One way to do this is by piggybacking on their ongoing assessment efforts. We can include teachers in the development of measures that also meet their assessment needs, for example, adding items to exit tickets. We can also tag along on administrator classroom observations or provide brief observation instruments for those walk-throughs. In addition to using existing assessments as evaluation vehicles, passive data collection has proven an incredibly valuable tool for me. While it can sometimes feel a little dystopian, I regularly count on website analytics to measure implementation of the ISP’s curriculum program, which teachers access online.
Lesson Learned #2: Schools have more data than they know what to do with.
Schools take in, process, and report SO MUCH data! The complexities of scheduling and attendance tracking alone are mind-boggling, not to mention efforts to track the health, nutrition, and transportation needs of minors. We must consider all of this before we consider the data involved in educating them.
As an evaluator, I have discovered that it is beneficial to work with educators to pare down the noise so that they can focus on the unique issues at hand. At the ISP, we use the principles of improvement science and tools like Plan-Do-Study-Act (PDSA) cycles to narrow our evaluations and increase utilization of findings. Even when that approach isn’t feasible, we can present data in ways that keep it clear and specific, such as interactive dashboards that allow educators to explore the results without needing to wade through the analyses.
Lesson Learned #3: Educators care deeply about doing the best they can.
Teachers are evaluators at heart. They are eager to delve into data, measure quality, and improve! Their focus on learning makes teachers phenomenal co-conspirators in evaluation projects, but it also can lead to self-critical tendencies and a punitive approach to evaluation, leaving some looking only for what they are doing wrong.
When working with educators, budget extra time to build trust and buy-in. Evaluators can harness educators’ natural inclination to improve by including them in planning and analysis, though that deeper involvement adds layers of complexity and time to the project. In school settings, evaluators must remain sensitive to tensions and context when working with district and building-level administrators and with teachers. Strong stakeholder relationships are important in every successful evaluation, but they are absolutely vital in schools.
The American Evaluation Association is hosting PreK-12 Ed Eval TIG Week with our colleagues in the PreK-12 Educational Evaluation Topical Interest Group. The contributions all this week to AEA365 come from our PreK-12 Ed Eval TIG members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.