I am Patti Patrizi, an evaluation consultant working primarily with foundations, helping them develop evaluation and learning systems. After working at The Pew Charitable Trusts, I founded The Evaluation Roundtable. My tip is an approach I used to help a large foundation develop a learning system that fosters internal learning about its strategies—an antidote to years of producing reports about results and outcomes.
Hot Tips:
- Assessing the current reporting system: We used a modified “after action review” (http://www.fireleadership.gov/documents/Learning_AAR.pdf) with a 16-person representative staff group, asking them to describe their experience with the current system (including its audience, process, questions, actual use—and by whom, gaps, and positives) and to describe their hopes for a new one. The process took two meetings of 1.5 hours each.
- Providing quick feedback: We quickly compiled their comments on a single Excel sheet and sent it back to them for review.
- Plotting out the new system: Using that information, we generated a rough outline of the major elements of a new reporting system, which the group reviewed first in one meeting and then via email. We then selected four members of the larger group to help detail the mechanics, rules, and flows of the new system.
- The core of the process: The system builds exchange between officers and their directors on each strategy. The exchange is teed up by responses to a set of questions developed to stimulate thinking and discussion on key issues. Each officer writes a note; their director reads it, convenes the group of officers working on the strategy, and then writes his or her own note. Each note represents that person’s own perspective; there are no “corrections” in the process. The group then meets with their program VP to discuss implications.
- Developing good learning questions: The old system focused on listing accomplishments. The new system centers on questions that challenge officers to think critically about the strategy and about why something happened or did not. Using data of some kind (qualitative or quantitative) is a requirement. As an example:
“Are you finding that you need to rethink the assumptions behind your theory of change, including:
- Time needed to achieve outcomes envisioned
- The extent of partnership and interest demonstrated by key stakeholders
- Availability or nature of resources needed to make a difference
- Levels of interest from external stakeholders—such as policy makers, NGOs, etc.
- Unanticipated changes in policy
- The level of capacity that exists within the relevant field(s) to carry out the work, or as it relates to the key approaches
- Other assumptions that have not materialized as you hoped?”
Last thought: This process will be only as good as the thinking it produces in the organization.
The American Evaluation Association is celebrating Nonprofits and Foundations Topical Interest Group (NPFTIG) Week. The contributions all this week to aea365 come from our NPFTIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.