Hi folks! I’m JT Taylor, Director of Research and Evaluation at Learning for Action (LFA), and I’m here with my colleague Emily Drake, Senior Consultant and Director of Portfolio Alignment. LFA is a San Francisco-based firm that enhances the impact and sustainability of social sector organizations through evaluation, research, strategy development, and capacity-building services. Emily and I can’t wait to share an easy and reliable approach to facilitating participatory, collaborative qualitative analysis processes at this year’s AEA conference.
Lessons Learned: Effective facilitation is essential for leading participatory and collaborative evaluation processes:

- It helps us to surface and integrate a multitude of perspectives on whether, how, and to what extent a program is working for its intended beneficiaries.
- It is necessary for building and maintaining trust among stakeholders: trust that they are being heard, that their perspectives are weighted equally among others, and that their participation in the evaluation process is authentic and not tokenized.
- It is important for producing the stakeholder buy-in and relevance of results that ensure evaluation findings will inform real action.
Engaging a variety of stakeholders, including program beneficiaries, in the analysis and interpretation of data in a way that authentically includes their perspective and contributions is important—and takes a set of facilitative skills and tools that go beyond evaluators’ typical training in technical analysis. In our work implementing collaborative evaluations, we have found that the same facilitation techniques that produce great meetings and brainstorming sessions can also be used to elicit great insights and findings from a participatory qualitative analysis process.
Hot Tip: Use participatory analysis techniques when you want to synthesize qualitative data from multiple perspectives and/or data collectors—whether those data collectors are part of your internal team, evaluation partners, or members of the community your work involves.
- Do the work of “meaning-making” together, so that everyone is in the room to clarify observations and themes, articulate important nuances, and offer interpretation.
- Use a 1-2 hour working meeting with all data collectors to summarize themes and pull out key insights together. Have each participant write observations from their own data collection, each on a large sticky note. Then group all observations by theme on the wall, having participants clarify or re-organize as needed.
- Save reporting time later by asking participants to annotate their sticky note observations with references to specific interviews, transcript page numbers, and even quotes from their data collection to make it easy to integrate examples and quotes into your report.
This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org. Want to learn more from JT and Emily? They’ll be presenting as part of the Evaluation 2014 Conference Program, October 15-18 in Denver, Colorado.
Hello. I’m Fran Hoffman, a counseling psychology graduate student and new evaluator. I wanted to comment on your section about synthesizing qualitative data from multiple perspectives. You talked about doing “meaning-making” work together as a team. That strikes me as very important given that we want to make sure all perspectives are heard and are reflected in our work. That might be an exercise that we can do with stakeholders as well to ensure that their perspectives are prioritized. Additionally, I liked your suggestions for sticky notes as a way to organize and integrate key insights from all participants. I can see doing this with my group as well. It could both save time and keep us on track as we work through concepts together. Thank you!