I’m Kathryn Hill, NPFTIG business co-chair with Laura Beals. To close our sponsored week for AEA365, I am focusing on internal evaluation management techniques. As a grants administrator in a nonprofit organization, I spend most of my time preparing progress and final reports on outcomes for programs with funding from foundations. This requires careful evaluation management.
Hot Tips:
Effective evaluation management ensures that timely, relevant information is available for meaningful use—whether for a report to a funder, a stakeholder group, or program decision-making by staff. Here are three tips I’ve used for internal management of the evaluative process in nonprofit settings:
Hot Tip #1: Involve program implementers in the review of all reporting requirements, especially reporting timelines and what information is needed. Review all the data that will be needed to answer the evaluation questions.
Hot Tip #2: Overlay or integrate evaluation timelines (including report due dates) with program implementation timelines. Take care of this at the front end of implementation, before data collection begins.
Hot Tip #3: Explicitly connect the data elements to the program’s theory of action (or logic model) during these conversations with staff. This gives contextual relevance to the collection of each data element. One example: if training sessions are one of the outputs, and participants are supposed to learn something (the outcome), then tracking training dates and attendance and using pre/post measures of what participants know makes good sense to everyone involved.
Lessons Learned:
We have learned this week that collaboration and the use of evaluation results for learning maximize the impact and engagement of key stakeholders and primary users in exemplary evaluations. In the same manner, I’ve learned that collaborative use of program data internally with program staff members also maximizes engagement with the collection and use of data for internal decision-making. What does this mean? When I involve others in the review of evaluation findings, they become interested in the results and impact. They question data accuracy and then become diligent about data quality. They want to ensure appropriate use, offering important process and context clarifications. They ask for different types of data for regular analysis and review, to ensure evaluation results are useful both for leadership and funder decisions and for their own programmatic decision-making. To loosely borrow a line from “Field of Dreams”: if you build a collaborative structure for evaluation use, they [the users] will come.
The American Evaluation Association is celebrating Nonprofits and Foundations Topical Interest Group (NPFTIG) Week. The contributions all this week to aea365 come from our NPFTIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.