Hello, we are Haley Johnson and Natasha Saelua, Researchers at McREL International.
This spring we were tasked with evaluating a school-wide, comprehensive, rigorous academic K-12 program. We were asked to include a cost analysis of the program compared to similar programs. A clear-cut quantitative analysis would have sufficed, but our team conducted a mixed methods evaluation to investigate the true costs and benefits of the program. Our role included interviewing principals, program coordinators, teachers, parents, and students. Although our analysis is not finished and we do not yet know what the collective findings will say, we gleaned insights from the interviews that speak to the power of a mixed methods evaluation design.
Lessons Learned
- Context is Complex: The COVID-19 pandemic defined the context of our lives for the past two years. As evaluators, we may note the trials of the pandemic with an asterisk, describing why test scores are missing in the spring of 2020. But every principal, program coordinator, and teacher shared that the costs associated with the pandemic were far more than missing test scores – it was missing a year of school socialization, it was needing to re-allocate resources to make up for lost time, it was staffing shortages, and it was dealing with student stress, anxiety, and grief. As such, our mixed methods design expanded how we thought about the effect of the pandemic on program implementation. And the pandemic was just one piece of the implementation context we learned about from our conversations.
- The Decisions We Contribute to Impact Real People: Looking only at raw numbers, we could say this program does some good. Ultimately, however, it hasn’t produced the test scores, graduation rates, and college enrollments needed to justify continued funding. And that does make some sense – school and district leaders have limited funds and do need ways to make fair decisions about resources. But by talking with the people involved in using the program, we began to understand impact more broadly. Participants described how the structure and values of the program helped students become critical thinkers, explore their identities, and practice being caring and contributing members of a community. Parents loved that the program teaches skills their 2nd graders will use for the rest of their lives. At the high school level, teachers warned us about the message it would send to students if this rigorous academic program were cut with nothing to replace it. Both were reminders that real people are impacted by decisions made based on numbers.
- People Want to Be Heard: One lesson we learn every time we conduct an interview project is how willing people are to share. Even though most people we interviewed believed we were there to justify cutting the program – which meant some of them would lose their jobs – everyone we talked to was extremely generous with their time, their experiences, and their perspectives. We were initially shocked when people frankly addressed the elephant in the room: their collective belief that our role was to justify cutting the program. Despite this perception, our conversations were always kind and respectful as participants shared their stories. We learned how much they appreciated that their nuanced experiences were heard.
Ultimately, by using a mixed methods evaluation design, and by striving to hear from those affected by programs, we can provide a picture of participants’ lived realities. In doing so, we can offer richer, deeper, and more contextualized information that urges leaders to draw on multiple sources of data and make informed decisions that consider the needs of the community.
The American Evaluation Association is hosting PreK-12 Ed Eval TIG Week with our colleagues in the PreK-12 Educational Evaluation Topical Interest Group. The contributions all this week to AEA365 come from our PreK-12 Ed Eval TIG members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.