Hello! My name is Zachariah Barghouti; I’m an intern with Evaluation + Learning Consulting and a graduate student in American University’s Measurement and Evaluation Program. While the pandemic has put a halt to face-to-face evaluation, our work has shifted to online methods. Even when meetings with stakeholders are hosted online, evaluative work should still be equitable, cross-culturally valid, and oriented toward participant ownership[1]. As a result, I’ve developed a step-by-step guide for virtual participatory evaluation across the four phases of evaluation: Ideation + Design, Data Collection, Data Analysis, and Action Planning.
The Ideation + Design section includes five methods to use with stakeholders to identify the purpose of evaluative work and how that work leads to change. They can aid participants in making connections between elements of the program and how they impact the big picture.
Data Collection contains a set of six alternative virtual methods to traditional methods of collecting and exploring the data needed to answer the evaluation questions.
Data Analysis covers six innovative techniques to bring stakeholders together to read, create meaning from, and draw conclusions about the collected data.
Action Planning is designed to surface actionable ideas and use them to co-construct a path forward. This section drew exclusively from Liberating Structures for its virtual retrofits of 25/10, 15% Solutions, Open Space Technology, and What, So What, Now What.
We also include a Virtual Tools section in the guide. Each participatory virtual evaluation method comes with a recommendation of free, popular technology suited to that method. All methods can be done using a combination of tools selected for their accessibility, adaptability, and cost (all have free, basic consumer versions).
You can download the guidebook here.
As a young and emerging evaluator (YEE) with limited real-world evaluation experience, I found that my background in community organizing and grassroots advocacy grounded me in shaping this guide, which I hope exemplifies equity as a principle. The guidebook’s purpose is to deepen the participatory practices of evaluators in general, and virtually in particular. I included a combination of innovative, popular, and inclusive structures and virtual tools, organized by each phase of decision-making. These methods were adapted because they can create space for ideas to emerge equally from each individual in the group and empower the whole group to shape the evaluative work.
Lessons Learned: While working on this research, I drew inspiration from many sources, including the traditions and cultures of participation of so many communities and organizations. I hope you find the guide useful, and that it inspires you in a new way if you’re already familiar with the traditional versions. As a YEE, I need feedback to grow, and this guidebook is my personal experiment in exploring a type of methodology I am increasingly passionate about. I would be very happy to hear from you, whether it’s critical feedback or a method you think should be part of the guide! Please join me in our Evaluators’ Slack Channel to share ideas, tips, thoughts, and suggestions.
We hope this work can serve as a resource to help you advance in your own work, in these times and beyond.
[1] Equitable Evaluation Framework, which is grounded in three principles.
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Can you please share the guide?
Hi Zachariah,
I really enjoyed browsing through the guidebook and making connections to methods and tools that would be useful for remote collaborative inquiry, a topic I am currently studying alongside program evaluation.
Like you, I am a graduate student, in the assessment and evaluation stream of Queen’s University’s Professional Master of Education program. Having just completed a program evaluation design for a participatory evaluation, I found your guidebook particularly relevant and insightful. I liked the clarity of the step-by-step instructions; it makes the guide easy to pick up and use. While creating my evaluation design, I also found myself becoming increasingly passionate about participatory evaluation. Why wouldn’t an evaluator want to hear from more participants?
Finally, I feel that your guidebook will be useful beyond this pandemic period. The methods you recommend may also be helpful for program evaluations where collaborators and participants are located around the globe.
Thanks for sharing!
Many thanks for this Zachariah, I’m looking forward to diving into this.
Dear Zachariah,
I really enjoyed reading through your user guide on remote participatory evaluation. COVID has forced the entire world to adapt to a new way of living. In turn, we are having to think of creative ways, using available tools, to deliver quality work. I appreciate how you were able to suggest strategies that maintained the integrity of participatory evaluation when working remotely.
I work in the field of education, and while we are not evaluating a program, many of the same theoretical frameworks apply. In particular, we look at ways to ensure that multiple voices are heard and that we facilitate our program in a culturally sensitive manner.