Hi, we’re Leah Josephson (she/her) and Lauren Beriont (she/her) with Emergence Collective, and Alex Bauer with a family foundation in Nebraska. This year our two teams collaborated to support some of the foundation’s grant partners in building evaluation capacity.
For a foundation that takes a relationship-based grantmaking approach, trust and shared values and goals are crucial to grantmaking. Building those relationships in rural areas of the state can be challenging, so the foundation saw this as an opportunity to understand where grantee partners’ work aligned with its own. The process would also give grantees an asset they could use beyond the foundation’s purposes.
At Emergence Collective, we wanted our nonprofit partners to understand “evaluation” holistically – not just as data collection and reporting, but as an ongoing process of learning, growth, and curiosity. We started by designing a new self-assessment tool focused on internal learning culture.
The survey tool includes assessment items in four areas of evaluation capacity and learning culture, designed around the guiding questions below. Embedded throughout the assessment are questions focused on equitable evaluation practices and approaches. Other themes include participatory approaches, the ways team members relate to one another, and perceptions of management and leaders.
- Capacity: Are we investing people, data systems, and time in evaluation? Do we have the technical skills to do the kind of evaluation we want?
- Praxis: What does evaluation look like in practice? How are we taking time to theorize, act, and learn in real time? Can we point to concrete changes?
- Learning: Is there space to reflect? Is feedback taken seriously across the organization? Which core beliefs drive our evaluation?
- Strategy: Are our vision and mission clear? What is our evaluation plan? Does the way we measure our work make sense?
Our team at Emergence Collective didn’t draw any conclusions from the assessment results ourselves. Instead, we pulled the composite scores for each practice area, plus other key data, into a colorful one-pager. We then led partner teams through an engaging, facilitated process of interpreting the findings themselves. Based on these conversations and a brainstorming session, we offered a few ideas for next steps in evaluation capacity building, and each partner chose their own next moves, identifying the path forward for the rest of the capacity-building year.
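For readers curious about the mechanics, here is a minimal sketch of one way composite scores like these could be calculated, assuming 1–5 Likert-style items grouped by practice area and simple averaging. The item names and groupings are hypothetical illustrations, not the actual scoring of our tool.

```python
# Hypothetical sketch: average Likert-scale items (1-5) into a composite
# score per practice area. Items, groupings, and scale are assumptions,
# not the actual Culture of Learning Self Assessment scoring.
from statistics import mean

# One respondent's answers, keyed by (made-up) item IDs.
responses = {
    "cap_1": 4, "cap_2": 3, "cap_3": 5,   # Capacity items
    "prax_1": 2, "prax_2": 4,             # Praxis items
    "learn_1": 5, "learn_2": 4,           # Learning items
    "strat_1": 3, "strat_2": 3,           # Strategy items
}

# Map each practice area to its item IDs.
areas = {
    "Capacity": ["cap_1", "cap_2", "cap_3"],
    "Praxis": ["prax_1", "prax_2"],
    "Learning": ["learn_1", "learn_2"],
    "Strategy": ["strat_1", "strat_2"],
}

# Composite score = mean of an area's items for this respondent;
# team-level scores would average these composites across respondents.
composites = {
    area: round(mean(responses[item] for item in items), 2)
    for area, items in areas.items()
}
print(composites)  # e.g. {'Capacity': 4.0, 'Praxis': 3.0, ...}
```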
What set this process apart from others the foundation has seen evaluation firms use in the past is that the tool identified areas of strength and opportunities for growth rather than assuming where capacity building was needed. Emergence Collective was then able to tailor the capacity-building process to each organization’s needs.
Rad Resource:
We pored over many existing assessment resources while researching and developing our own Culture of Learning Self Assessment. One was The Performance Imperative, a free organizational assessment tool developed by Leap of Reason. Though our assessment is geared more toward curiosity and learning than the pursuit of excellence, we were inspired by this tool.
Lesson Learned:
“It was an extremely valuable process in terms of clarifying nonprofits’ goals. Now we can have productive conversations based on the theory of change that the nonprofit generated themselves.” – Tyler, program officer at the foundation
Lesson Learned:
We were pleasantly surprised by how thoughtfully our partners interpreted their own assessment findings – reminding us once again that they’re truly the experts! Throughout this evaluation capacity building (ECB) process, we have continued to see our role as, above all else, that of facilitator.
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Thank you for sharing your writing!
As a teacher who has had outside evaluators come into the schools where I have worked, and as someone who is now working on a course in program evaluation, I am doubly aware that being the subject of an evaluation can often feel like an attack rather than an opportunity for learning.
The survey tool you have described seems to be a starting point toward an approach to evaluation that builds participatory assessment and learning, which is what I would like to see in my own future evaluation efforts. I was particularly struck by the guiding questions you posed about capacity, praxis, learning, and strategy.
My hope is that the process of finding and interpreting data together, and coming up with joint recommendations, will allow the evaluation process to become one that builds learning, curiosity, and organizational capacity. Thank you again!
Hello Leah and Lauren,
I was drawn to your post to learn how you were able to build capacity for evaluation. It was further encouraging to find that your focus was not only on the nonprofit sector but also rurally minded. It is clear from your post that involving key stakeholders, learning about the programs undergoing evaluation so that resources could be tailored, and having an action plan in place for each organization led to significant involvement from those you worked with.
I am currently working through a course on program inquiry and evaluation at Queen’s University in Ontario, which is how I came across your post in the first place. It is exciting to see the themes we discuss in class implemented so faithfully in practice.
An area I have always had concerns about when thinking about evaluation is the “next steps”: what happens when the evaluator leaves and the program is left with information relevant only to that particular moment? Your plan to have the programs involved in interpreting the findings and identifying a path forward directly addresses this concern in a practical and sustainable way.
Thank you for the glimpse into real-world evaluation practice and how it can be done in a way that remains relevant to everyone involved.
Gregory
What a wonderful idea. After reading your article, the line that had the most impact on me was “we wanted our nonprofit partners to understand ‘evaluation’ holistically – not just as data collection and reporting, but as an ongoing process of learning, growth, and curiosity.”
I think this really resonated with me because when I started my current course on evaluation, I assumed the opposite: a professional evaluates, the clients are bystanders, and when all is said and done, a report is delivered. While working through the course, I have learned that it’s the total opposite, and these words solidified that for me.
I also plan to take your approach in my own classroom and have the kids go further into their own self-assessment of their learning. If I can provide them with guiding questions in this process, I believe they can become more self-aware and change their learning for the better, and I am really excited to explore the possibilities.
Thanks
Ryan