
Navigating Challenges and Unveiling Insights: A Reflection on Evaluating the Young Emerging Evaluators (YEE) Program in Mongolia by Shelli Golson-Mickens, Nada Mousa, and Jennifer Ottolino

Hello, AEA365 community! Liz DiLuzio here, Lead Curator of the blog. This week is Individuals Week, which means we take a break from our themed weeks and spotlight the Hot Tips, Cool Tricks, Rad Resources and Lessons Learned from any evaluator interested in sharing. Would you like to contribute to future individuals weeks? Email me at AEA365@eval.org with an idea or a draft and we will make it happen.


Greetings! We are Shelli Golson-Mickens, Nada Mousa, and Jennifer Ottolino, alumni of American University’s Measurement and Evaluation Program, which has a partnership with the Mongolian Evaluation Association (MEA) to collaborate on evaluations, trainings, and other projects. 

In the dynamic world of international project evaluation, the journey is often as enlightening as the destination. Our recent rapid evaluation of Mongolia’s Young Emerging Evaluators (YEE) internship program unveiled the intricacies of working across borders virtually. Our main challenge was designing an evaluation in partnership with MEA, Eval Youth Mongolia, and Youth Inc given urgent time constraints. To meet an eight-week deadline while still delivering meaningful results, we used a rapid evaluation approach. Our experiences with this rapid evaluation offer lessons learned for conducting such evaluations virtually.

Lessons Learned

Supporting Contextual Data Collection: Rapid evaluations suit projects where resources and time are limited. We scaled our project to a feasible scope for data collection and analysis. However, the accelerated pace left limited room for cross-verification of data points. It is essential to strike a balance between speed and data validity; incorporating a phased approach helps ensure both efficiency and depth.

Developing Creative Engagement Strategies: The rapid nature of the evaluation also affected participant engagement, including lower response rates on surveys and interviews. This hindered our understanding of participant perspectives. Pivoting to creative and targeted engagement strategies (such as using game-related data collection tools to make things fun) and leveraging diverse communication channels (including WhatsApp for surveys) enhances the quality of collected data.

Managing Time Zone Constraints: The project included evaluators from various corners of the globe, operating in very different time zones. While including evaluators from different regions and backgrounds ensured diversity and a rich tapestry of perspectives, it also posed significant time management challenges. Establishing clear communication channels, identifying points of contact and responsibility for tasks, and employing technologies that facilitate collaboration across time zones mitigate these challenges and support participation.

Developing Cultural Competence: Exercising cultural responsiveness in evaluation expands the wealth of knowledge and the potential impact of projects. Cultural responsiveness is a form of understanding and making space for other perspectives. In evaluation, this may look like making accommodations when communicating so that written text or spoken words are more sensitive to the cultures of partner organizations and participants. In this evaluation, time constraints posed serious challenges to developing cultural competence. We focused on conducting an equity-based evaluation, where we intentionally listened, held space for various perspectives, and named our limited cultural understanding.

Communicating Virtually in an International Context: Conducting virtual evaluations in an international context introduces unique challenges. Working virtually makes it more difficult to recognize body language and communication cues and respond appropriately. Employing active listening during virtual meetings and holding meetings with partners to cross-validate information supports communication.

Conclusion

The evaluation of the YEE program in Mongolia served as a valuable learning experience for our team. As we reflect on this journey, we emerge not only with insights into the YEE program but also with a deeper understanding of the complexities inherent in evaluating projects on a global scale. Armed with these lessons, we look towards the future, ready to apply our newfound knowledge in the pursuit of more impactful evaluations.


Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.
