DEME TIG Week: Lessons Learned from Evolving Evaluation Practice in the Evaluation of Humanitarian Action by Minji Cho, Ann Marie Castleman and Scott Chaplowe

Hi, we are Minji Cho and Ann Marie Castleman, independent evaluation consultants and PhD students studying evaluation at Claremont Graduate University, and Scott Chaplowe, an evaluation, strategy, and capacity development specialist. We recently collaborated on a project with the Active Learning Network for Accountability and Performance (ALNAP) to understand past, present, and future challenges in the practice of evaluation of humanitarian action (EHA). Scott presented on this at AEA as part of the Disaster and Emergency Management Evaluation (DEME) TIG lineup. Today we would like to share strategies that humanitarian organizations used to adapt their evaluation practice during the pandemic, along with visions for the future.

Key considerations for EHA related to the pandemic

Humanitarian organizations adopted a variety of practices in response to COVID-19 (IED-OIOS, 2020; UNDP, 2020; WFP, 2020):

  • Evaluability assessments. Many organizations used evaluability assessments to determine the extent to which a program could be evaluated reliably and credibly while also ensuring staff and community safety.
  • Postponing evaluations. Although the feasibility of conducting delayed evaluations had to be weighed carefully, postponement was considered a practical strategy.
  • Alternative evaluation designs and methods. Organizations drew on existing reliable and relevant data, collected data remotely, and scaled down the scope of evaluations. Remote data collection using mobile phones, tablets, or other tools allowed evaluators to avoid face-to-face contact with communities and reduce the risk of transmitting the virus. Scaling down an evaluation's scope (e.g., reducing sample sizes or narrowing its geographical or thematic coverage) was considered appropriate in light of limited resources and capacity during the pandemic.
  • Collaborative evaluations. Conducting joint evaluations among multiple organizations reduced health risks and the extra burden on stakeholders during the pandemic.

Future considerations for EHA

We asked humanitarian evaluators from novice to experienced to forecast the future of EHA given the current context of COVID-19 and the “new normal.” Here’s what they told us they envisioned for the future:

  • Increased use of remote M&E approaches. They envisioned that humanitarian evaluators would develop skills to use innovative technology-based remote approaches to conducting M&E that are tailored to the contexts in which they are working.
  • A focus on developing local partner capacity. Evaluators envisioned a future where partners are tapped for their local knowledge, their understanding of context, and their ability to move about and reach participants who are traditionally difficult to access. This would strengthen the ability to conduct real-time evaluation and ensure greater inclusion of participants. It also has the potential to decolonize evaluation methodologies and promote evaluation that is more culturally responsive.
  • A decentralized evaluation function. Along with strengthening local partner capacity, humanitarian evaluators envisioned a future where evaluation is locally designed and implemented, enabling local leadership to promote information sharing between agencies and make use of evaluation findings.
  • An evaluation learning culture. Evaluators forecasted a future where learning from evaluation is central, where lessons learned are shared across organizations, where use of evaluation is prized over "perfect" evaluations, and where evaluators take on the role of knowledge facilitators to promote follow-through on evaluation recommendations.

Rad Resources:

If you want to learn more about this topic, check out these resources:


The American Evaluation Association is hosting Disaster and Emergency Management Evaluation (DEME) Topical Interest Group (TIG) Week. The contributions all this week to AEA365 come from our DEME TIG members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.
