
Allan Porowski and Heather Clawson on Conducting a Large-Scale, Mixed-Method Evaluation of a Dropout Prevention Program

Hello! We’re Allan Porowski from ICF International and Heather Clawson from Communities In Schools (CIS). We completed a five-year, comprehensive, mixed-methods evaluation of CIS, which featured several study components: three student-level randomized controlled trials; a school-level quasi-experimental study; eight case studies; a natural variation study to identify what factors distinguished the most successful CIS sites from others; and a benchmarking study to identify what lessons CIS could draw from other youth-serving organizations. We learned a lot about mixed-methods evaluations over the course of this study, and we wanted to share a few of those lessons with you.

Lessons Learned:

  • Complex research questions require complex methods. Disconnects exist between research and practice because the fundamental research question in an impact evaluation (i.e., does the intervention work?) provides little practical utility for practitioners in their daily work. CIS leadership not only wanted to know whether CIS worked, but also how it worked, why it worked, and in what situations it worked, so they could engage in evidence-informed decision making. These more nuanced research questions required a mixed-methods approach. Moreover, CIS field staff already believed in what they were doing; they wanted to know how to be more effective. Mixed-methods approaches are therefore a key prerequisite for capturing the nuance and the process evaluation findings that practitioners want.
  • Practitioners are an ideal source of information for determining how much “evaluation capital” you have. CIS serves nearly 1.3 million youth in 25 states, which means different affiliates may employ different language, processes, and even philosophies about best practice. In working with such a widespread network of affiliates, we saw the need to convene an “Implementation Task Force” of practitioners to help us set parameters around the evaluation. This group met monthly and proved incredibly helpful in (a) identifying language commonly used by CIS sites nationwide to include in our surveys, (b) reviewing surveys to ensure that they captured what was “really happening” in CIS schools, and (c) identifying how much “evaluation capital” we had at our disposal (e.g., how long surveys could take before they posed too much burden).
  • The most important message you can convey: “We’re not doing this evaluation to you; we’re doing this evaluation with you.” Although it was incumbent upon us as evaluators to be dispassionate observers, that did not preclude us from engaging the field. Evaluation, and especially mixed-methods evaluation, requires building relationships to acquire data, provide assistance, build evaluation capacity, and communicate findings. As evaluators, we share practitioners’ desire to learn what works. By including practitioners in our Implementation Task Force and our Network Evaluation Advisory Committee, we were able to ensure that we were learning together and working toward a common goal: making the evaluation’s results useful for CIS staff working directly with students.

Resources:

  • Executive Summary of CIS’s Five-Year National Evaluation
  • Communities In Schools surrounds students with a community of support, empowering them to stay in school and achieve in life. Through a school-based coordinator, CIS connects students and their families to critical community resources, tailored to local needs. Working in nearly 2,700 schools in the most challenged communities in 25 states and the District of Columbia, Communities In Schools serves nearly 1.26 million young people and their families every year.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
