I’m Steve Patty with Dialogues In Action. Over the past decade, my team has worked with over 500 organizations, mostly nonprofits, to build their capacity to do credible self-studies of their impact. Helping people do a self-study can be brilliant. It can also be fraught with peril. These are a few lessons we’ve learned along the way:
Keep it simple. A self-study shouldn’t be complicated, or people will lose their way. Make sure the steps are clear. Keep the focus sharp and the scope contained. Don’t create too many options for indicators, instruments, or data collection methodologies. If the self-study team wants to be fancy or sophisticated, remind them that evaluation is iterative. They can always include more programs and add more complexity next time. At the start of a self-study, people’s eyes are often bigger than their stomachs. Make sure they can be successful the first time around, and they will want to do it again.
Keep it moving. Pacing is key. Believe me. We’ve gone too quickly at times and buried people. And we’ve gone too slowly at times and languished with lost momentum. There’s a sweet spot where people can keep at it and not feel overwhelmed. This also means that there needs to be an end to the self-study, even if the context is complex. If the self-study team is at it for too long, they will start to resent the process. Keep the steps snappy and the momentum moving forward.
Keep it relevant. It needs to touch people’s hearts along the way. If evaluation is too clinical, it will take away all the fun. This is why we almost always include a person-to-person qualitative interview component in our self-studies. When people start asking questions and listening to others talk, they get reconnected to the things that matter most. They see impact with their own eyes and hear it with their own ears. They are reminded why they are doing the work they do.
Keep it consequential. Collecting data for data’s sake is rarely worth it. People are too busy for that. They need it to matter. This means that the recommendations coming out of an evaluation are a critical step: adjustments to programs or strategy, communications that lead to deeper engagement or better funding, or ideas to fix something that isn’t working. In fact, the “So what?” and “What do we do now?” moments after the findings have landed may be the most important step of a self-study.
Keep it repeatable. Remember, the point of helping people do a self-study is for them to be able to do it for themselves in the future. This means that they will need to know how to do every step along the way by themselves. Don’t lose sight of this. Don’t do too much for them. Guide them. Coach them.
Keep it celebratory. Completing a self-study is a big deal. We usually end a capacity-building cohort with two things: a showcase and the release of a publication. The showcase is an event where we bring a crowd together – funders, partners, program participants, community representatives, among others – to hear the findings and to learn what the organizations will do with them. And then we make a big deal about the release of the report. Both are done in a spirit of celebration, and together they bring closure and joy.
Leading others to do a self-study is profoundly rewarding. The names and the bylines are not yours. But you will have left something lasting.
Rad Resources
- This is an example of a compendium of self-studies with California AmeriCorps programs.
- This is a resource for thinking creatively about ways to gather data.
- This is an example of how to talk about inviting engagement in data.
- This is a guide to doing a self-study.
We’re looking forward to the fall and the Evaluation 2024 conference with our colleagues in the Local Arrangements Working Group (LAWG). Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to contribute to AEA365? Review the contribution guidelines and send your draft post to AEA365@eval.org. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.