This week’s posts highlight reflections from the Global Evaluation Initiative (GEI), a global network of organizations and experts working together to support the strengthening of monitoring, evaluation, and the use of evidence in developing countries. GEI uses an integrated systems-based approach and works closely with governments, evaluation professionals, and other stakeholders on efforts that are country-owned and aligned with local needs and perspectives.
Greetings! We are Lycia Lima, Gabriela Lacerda and Marina Lafer, staff members of the Center for Learning on Evaluation and Results for Lusophone Africa and Brazil, hosted at the São Paulo School of Economics of the Getulio Vargas Foundation (FGV EESP CLEAR LAB). FGV EESP CLEAR LAB is an implementing partner of the Global Evaluation Initiative (GEI), supporting GEI’s mission of strengthening monitoring and evaluation (M&E) systems and capacities in developing countries and promoting evidence-based decision-making in the public sector.
In late 2021, FGV EESP CLEAR LAB launched an online mentorship pilot program on rapid evaluations for teams in Lusophone Africa and Brazil. The Center is working with evaluation teams in Mozambique (evaluating the national water policy), Cape Verde (national land-use planning policy) and Brazil (a violence prevention, social inclusion and community development policy in the city of Recife, and a cash transfer program to increase high-school attendance in the city of Niterói). Around 10 to 12 weekly meetings are being held between FGV EESP CLEAR LAB staff members and each team.
Here are some lessons learned from the program so far:
- Set up a diverse evaluation team. A diverse evaluation team helps to leverage individual expertise, bring necessary perspectives to the table, and distribute capacity building across several agencies and organizations. Our project teams consisted of approximately eight members drawn from government agencies (the M&E department and the sector agency responsible for the intervention) and universities; this mix worked well to bring diverse points of view to the rapid evaluation process.
- Tailor your approach to local needs and goals. The call-for-proposals approach and the mentorship session model emphasized locally owned and informed perspectives. Team members developed their own evaluation plans, proposing the evaluation questions, the methods to be used, the intervention to be evaluated, and the composition of the evaluation committee. This approach strengthens ownership of the process, improves the likelihood that the resulting evidence will be used, and helps ensure that team members continue applying these practices. CLEAR LAB’s facilitation approach was based on the public policy cycle and used M&E tools such as the problem tree, objective tree, and theory of change, all tailored to each government’s technological, financial, and staff capacities.
- Commit to a participatory model. The participatory, cohort-based mentorship model enabled local, regional, and global knowledge sharing. Participatory workshops were held with special guests who had already evaluated or managed similar interventions in other countries or regions, enabling evaluation teams and guests to identify and discuss common challenges and best practices. A final workshop will be held where all four evaluation teams will share the lessons learned from undertaking what is likely their first evaluation. They will also share their evaluation findings and their plans for using the evidence in decision-making. This participatory cohort model supports the institutionalization of evaluation practice within governments and encourages the development of an evaluation community across Lusophone Africa and Brazil.
More information on FGV EESP CLEAR LAB’s mentorship program on rapid evaluations can be found on our webpage.
The American Evaluation Association is celebrating Global Evaluation Initiative (GEI) Week. The contributions all this week to aea365 come from GEI members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.