My name is Dr. Shanesha Brooks-Tatum, Executive Vice President of Creative Research Solutions. Based on my conversations and interactions with small and large foundations, one of the greatest (spoken and unspoken) fears that foundations have regarding external evaluations is seemingly “bad,” “poor,” or inconclusive findings. Today, I will describe ways that evaluators and foundations can work together to peel back layers of truth-telling to reveal important process findings and diverse perspectives in the evaluation process.
No matter how well-planned monitoring and evaluation activities are, some foundations fear not only that the organization will expend resources on evaluation activities and reporting that do not produce especially “strong” findings, but also that a less-than-ideal evaluation will somehow hurt the organization’s or program’s reputation.
This is an understandable concern: no matter how aligned or well-planned a program is, some findings can be surprising. When the surprises center on less-than-ideal findings, one way to use them to our advantage is to emphasize that outcomes and progress are not defined the same way across different constituencies and contexts.
Hot Tip: Both foundations and evaluators should emphasize progress and process over finality and assumed failure to reach certain milestones. For example, in any evaluation context, we can explore questions such as: What positive changes occurred over time? What might be some promising models?
As we know from many scholars’ work, there are multiple truths in any situation. “Truths” could be defined as the core impact, the mission and vision, and/or the detailed stories behind the challenges organizations face in implementing or refining models amid constant change. Naming these truths enables an organization to speak them to the powers that be: funders, other stakeholders, and experts in the field.
While locating the ever-evolving core truths of an organization, we must be more accepting of the fact that an evaluation captures only a portion of dynamic and ever-changing truths. As evaluators, one of our roles is to peel back layers of truths to reveal the processes, progress, and promising practices of an organization or program.
Lessons Learned: Proven Strategies:
- Focus on process and implementation evaluations to greater effect.
- Perform a stakeholder assessment to better understand what voices may be missing from the truths you wish to tell.
- Be flexible about the evaluation process. Evaluators know that things change, but all stakeholders must be amenable to inevitable changes.
Hot Tip: We should be clear about which truths we are focused on, over what period of time, and from what vantage points or perspectives. We get closer to any sense of truth by incorporating diverse voices and perspectives.
The American Evaluation Association is celebrating Nonprofits and Foundations Topical Interest Group (NPFTIG) Week. The contributions all this week to aea365 come from our NPFTIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Dr. Brooks-Tatum,
Thank you for sharing your experience, understanding, and interactions within program evaluation and its utilization. I am currently working through my master’s of education and have nearly completed a course on program inquiry and evaluation. Through my course readings and my own experience planning a program evaluation, I am realizing more and more the power of the process. An evaluation is designed to collect and analyze results, but I have noticed that it is often possible to begin to make changes and adjust thinking and perspective during the evaluation itself when the community and its stakeholders are involved in the process.
I really like your ‘hot tip’: what are the truths we are looking for? Who is involved in the evaluation? Through my readings and exploration, I also feel that clear evaluation questions MUST be identified before moving forward. If we stick to those questions as evaluators and keep the answers to them at the forefront during the analysis and sharing of results, it doesn’t matter if the answers are positive or ‘less than ideal’; we have data to work with when making a plan to move forward and improve the program.
Fiona