Hi folks! I’m JT Taylor, Director of Research and Evaluation at Learning for Action (LFA), and I’m here with my colleague Emily Drake, Senior Consultant and Director of Portfolio Alignment. LFA is a San Francisco-based firm that enhances the impact and sustainability of social sector organizations through evaluation, research, strategy development, and capacity-building services. Emily and I can’t wait to share an easy and reliable approach to facilitating participatory, collaborative qualitative analysis processes at this year’s AEA conference.
Lessons Learned:
Effective facilitation is essential for leading participatory and collaborative evaluation processes:
- It helps us surface and integrate a multitude of perspectives on whether, how, and to what extent a program is working for its intended beneficiaries.
- It is necessary for building and maintaining trust among stakeholders: trust that they are being heard, that their perspectives are weighted equally with others', and that their participation in the evaluation process is authentic rather than tokenized.
- It produces the stakeholder buy-in and the relevance of results that ensure evaluation findings will inform real action.
Engaging a variety of stakeholders, including program beneficiaries, in the analysis and interpretation of data in a way that authentically includes their perspectives and contributions is important, and it takes a set of facilitative skills and tools that go beyond evaluators' typical training in technical analysis. In our work implementing collaborative evaluations, we have found that the same facilitation techniques that produce great meetings and brainstorming sessions can also elicit great insights and findings from a participatory qualitative analysis process.
Hot Tip:
Use participatory analysis techniques when you want to synthesize qualitative data from multiple perspectives and/or data collectors—whether those data collectors are part of your internal team, evaluation partners, or members of the community your work involves.
- Do the work of “meaning-making” together, so that everyone is in the room to clarify observations and themes, articulate important nuances, and offer interpretation.
- Hold a 1-2 hour working meeting with all data collectors to summarize themes and pull out key insights together. Have each participant write observations from their own data collection, one per large sticky note. Then group all observations by theme on the wall, with participants clarifying or reorganizing as needed.
- Save reporting time later by asking participants to annotate their sticky-note observations with references to specific interviews, transcript page numbers, and even quotes from their data collection, making it easy to integrate examples and quotes into your report.
The American Evaluation Association is celebrating Best of aea365, an occasional series. The contributions for Best of aea365 are reposts of great blog articles from our earlier years. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association and/or any/all contributors to this site.