
Applying Digital Development Principles to Locally Contextualize Evaluations by Kim Norris

Hello, AEA365 community! Liz DiLuzio here, Lead Curator of the blog. This week is Individuals Week, which means we take a break from our themed weeks and spotlight the Hot Tips, Cool Tricks, Rad Resources and Lessons Learned from any evaluator interested in sharing. Would you like to contribute to future Individuals Weeks? Email me at AEA365@eval.org with an idea or a draft and we will make it happen.


Kim Norris

Hi, I’m Kim Norris, Monitoring, Evaluation and Learning (MEL) Director for the American Institutes for Research (AIR) International Development Division. Part of my role is to lead a MEL practice. As part of our initial strategy, our practice team decided to focus on localizing our work. For us, this means seeking out ways to increase local partnering and leadership in and around MEL efforts – from business development to MEL direction and execution. This involves local team leadership, capacity strengthening, and engagement on local terms.

We evaluators can borrow valuable lessons learned from digital development to design more effective, impactful, and locally relevant evaluations. The Principles for Digital Development are a powerful tool that can be adapted for more locally contextualized evaluations. The website was developed as a collective effort by the digital development community to share a common set of guiding principles for technology-enabled programs. Each principle represents best practices formed through shared experiences and lessons learned by digital development professionals. Rooted in user-centered design, the principles provide a framework for guiding digital development and can also be adapted to make evaluation designs more contextualized.

The nine principles encompass a range of key considerations for digital development projects. By incorporating these principles into evaluation designs and processes, organizations can ensure that evaluations are locally centered and aligned with the specific needs and realities of the communities being served. Here are some ways the principles can be applied:

  • Design with the user in mind: Engage with local stakeholders when designing evaluation methodologies, defining samples, and identifying approaches to recruit community members. This may include more user-centric techniques, such as providing cameras for image-based responses to evaluation questions, or favoring collective data-gathering techniques over individual in-depth interviews and surveys.
  • Understand the existing ecosystem: Consider local infrastructure, resources, and cultural factors that may affect evaluation processes and outcomes. Evaluators can time evaluation activities to align with people’s schedules – not just in terms of availability, but in terms of readiness to be interviewed, such as when they are away from the eyes and ears of their family members, or when they have rested after a long day of work.
  • Be data-driven: Utilize data collection and analysis methods appropriate to the local context, ensuring data are accurate, reliable, and culturally sensitive. This may include conducting formative research before designing the evaluation to ensure processes align with the context; without that step, an entire cluster of respondents may be missed.
  • Use open standards and open data: Foster transparency and accountability by making evaluation processes and data accessible to local stakeholders. Be open, too, to innovative methods and to sharing methods and information with stakeholders; participatory processes support this concept.
  • Address privacy and security: This goes beyond keeping data safely stored on screen-protected tablets and deleting it after uploading to a secure server. It may involve finding places and times to meet that allow for free exchange of information and avoiding the popular “youth focus group discussion” that can lead to unanticipated negative outcomes for frank participants. It may mean avoiding targeted data collection altogether, using crowdsourced data instead to prevent unintended “outing” of hidden populations.
  • Be collaborative: Involve local communities, government agencies, and other relevant stakeholders in the evaluation process to ensure diverse perspectives and ownership.
  • Reuse and improve: This is useful for all evaluations. Check on methods, tools, and processes during an evaluation, debrief afterward, and manage the resulting knowledge so it can provide valuable insights the next time an evaluation shares contextual similarities.

By prioritizing the needs and perspectives of local communities and community members, evaluators can ensure that designs and results are tailored to each unique context, producing more meaningful outcomes.


Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.
