NME Week: “View From Inside”: A Brief Guidebook to Internal Evaluation by Deborah Good

Hi, I’m Deborah Good, Director of Organizational Learning and Research for a nonprofit called Future Focused Education. Our organization works toward individual and system changes, always with an eye toward our city’s most vulnerable young people. We run a paid internship program for high school students, support a network of innovative schools, elevate student voices, and advocate for policies that nurture student learning.


As an internal evaluator, I have the fun of working closely with my colleagues as they design, plan, and implement initiatives, while doing my best to maintain an evaluator’s perspective. We also contract with external evaluators, who provide an invaluable point of view and frequently employ more robust methodologies than we use in-house. That said, the decision to invest in internal evaluation has proven indispensable to our organization.


I am learning lessons along the way. Here is my brief guidebook to success as an internal evaluator.

  1. Prioritize continuous improvement. I strive to use data as a flashlight, not a hammer, but it is common in the nonprofit and social service arenas to assume internal evaluation is for compliance. In a previous job where I served as the quality improvement coordinator, “improvement” appeared only in the job title. The data I collected was primarily used to demonstrate to our higher-ups and the State of New Mexico that we were in compliance with rules and regulations. Now I gently remind colleagues that grant reporting is important, but using data to improve is our North Star.
  2. Regularly revisit your evaluation questions. I have adapted Results Based Accountability (RBA) to the needs of our organization. This means my go-to research questions are (a) How much did we do? (b) How well did we do it? and (c) Is anyone better off? I am learning not to limit myself to these three questions, and to listen everywhere for interesting questions in our work. For example, today’s question is: Does students’ school attendance correlate with their attendance at internships?
  3. Get creative with your methods. RBA traditionally uses quantitative measures, but I always use varied sources, including qualitative data. In addition to surveys, interviews, and observations, what about using mobile applications like ImBlaze, facilitating discussions using unique protocols, or analyzing hand-drawn ecomaps?
  4. Lead frequent data dives with staff, partners, and young people. Who better to make sense of data than the people who live it? When leading data dives, I usually hand out graphs, tables, and qualitative data summaries without significant interpretation. I then lead a discussion on what is interesting and surprising, and what implications the data might have for planning next steps and future strategy.

For more, contact deborah@futurefocusededucation.org.

The American Evaluation Association is celebrating New Mexico (NM) Evaluators (www.nmeval.org) Week. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
