Hello, I’m Ashweeta Patnaik and I work at the Ray Marshall Center (RMC) at The University of Texas at Austin. RMC has partnered with Nuru International (Nuru) to use Monitoring and Evaluation (M&E) data to evaluate the impacts of Nuru’s integrated development model. Here, I share some lessons learned.
Nuru is a social venture committed to ending extreme poverty in remote, rural areas in Africa. Nuru equips local leaders with tools and knowledge to lead their communities out of extreme poverty by integrating impact programs that address four areas of need: hunger; inability to cope with financial shocks; preventable disease and death; and lack of access to quality education for children. Nuru’s M&E team collects data routinely to measure progress and drive data-based decision making.
Lessons Learned:
- Establish a study design to measure program impact early – ideally, prior to program implementation.
Nuru has a culture where M&E is considered necessary for decision making. Nuru’s M&E team had carefully designed a robust panel study prior to program implementation: treatment and comparison households were carefully selected and surveyed with common instruments at multiple points in time. As a result, when RMC became involved at a much later stage of program implementation, we had access to high-quality data and a research design that allowed us to effectively measure program impacts.
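With treatment and comparison households observed before and after implementation, a common way to estimate impact from such a panel design is difference-in-differences (DiD). This is an illustrative sketch only, not Nuru's or RMC's actual analysis, and the outcome values are made up:

```python
# Minimal difference-in-differences sketch. All numbers are hypothetical;
# a real analysis would use a regression framework with controls.

def mean(xs):
    return sum(xs) / len(xs)

def diff_in_diff(treat_pre, treat_post, comp_pre, comp_post):
    """DiD estimate: change in the treatment group minus change in the
    comparison group, which nets out shared trends over time."""
    return (mean(treat_post) - mean(treat_pre)) - (mean(comp_post) - mean(comp_pre))

# Hypothetical outcome values (e.g., a crop-yield indicator) for the
# two groups at baseline (pre) and follow-up (post).
impact = diff_in_diff(
    treat_pre=[10, 12, 11],
    treat_post=[16, 18, 17],
    comp_pre=[10, 11, 12],
    comp_post=[12, 13, 14],
)
print(impact)  # treatment gained 6, comparison gained 2 -> 4.0
```

The comparison group's change stands in for what would have happened to treatment households without the program, which is why the baseline survey matters so much.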
- When modifying survey instruments, ensure that new or revised indicators still capture the overall program outcomes and impacts you are trying to measure.
Nuru surveyed treatment and comparison households with the same instruments at multiple time points. However, in some program areas, changes made to components of the instrument from one time point to the next led to challenges in constructing comparable indicators, limiting our ability to estimate program impact in those areas.
- Monitor and ensure quality control in data entry, either by using a customized database or by imposing rigid controls in Excel.
Nuru’s M&E data was collected in the field and later entered into Excel spreadsheets. In some cases, the use of Excel led to inconsistencies in data entry that posed challenges when using the data to analyze program impact.
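The kind of rigid controls a customized database (or a locked-down Excel template) enforces can be sketched as entry-time validation rules. The field names, valid codes, and ranges below are hypothetical placeholders, not Nuru's actual instrument:

```python
# Illustrative entry-time validation rules. A customized database would
# reject a record that fails any rule instead of silently storing it.

RULES = {
    "household_size": lambda v: isinstance(v, int) and 1 <= v <= 30,
    "district": lambda v: v in {"District A", "District B"},  # fixed code list
    "acres_planted": lambda v: isinstance(v, (int, float)) and 0 <= v <= 100,
}

def validate(record):
    """Return the names of fields that are missing or fail their rule."""
    return [field for field, ok in RULES.items()
            if field not in record or not ok(record[field])]

clean = {"household_size": 5, "district": "District A", "acres_planted": 2.5}
dirty = {"household_size": 5, "district": "district a", "acres_planted": -1}
print(validate(clean))  # []
print(validate(dirty))  # ['district', 'acres_planted']
```

Free-form Excel entry would happily accept "district a" and -1; catching them at entry time is far cheaper than reconciling them at analysis time.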
- When utilizing an integrated development model, ensure that your evaluation design also captures poverty in a holistic way.
In addition to capturing data to measure the impact of each program, Nuru was also deliberate about capturing composite programmatic impact on poverty. At the start of program implementation, Nuru elected to use the Multidimensional Poverty Index (MPI). MPI was measured at multiple time points for both treatment and comparison households using custom-built MPI assessments. This allowed RMC to measure the impact of Nuru’s integrated development model on poverty.
Hot Tip! For a more detailed discussion, be sure to visit our panel at Evaluation 2017, Fri, Nov 10, 2017 (05:30 PM – 06:15 PM) in Roosevelt 1.
The American Evaluation Association is celebrating International and Cross-Cultural (ICCE) TIG Week with our colleagues in the International and Cross-Cultural Topical Interest Group. The contributions all this week to aea365 come from our ICCE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.