AEA365 | A Tip-a-Day by and for Evaluators

TAG | M&E

We are Gabriela Perez Yarahuan, Ana Ramírez Valencia, Indrani Barrón Illescas and Emil Salim Miyar, from CLEAR for Spanish-speaking Latin America. We’re located in Mexico City at CIDE (Center for Research and Teaching in Economics), a leading institution in the social sciences. In May 2016, we worked with other leading M&E groups in Mexico to co-organize Evaluation Week in Mexico 2016. Since many readers of AEA365 may be involved in putting together M&E conferences, we’re sharing our lessons.

Lessons Learned:

  • Showcase domestic, regional and international advances. To build our program we looked not only to what was happening in Mexico, but also within the broader Latin America region and internationally. We invited practitioner and academic experts from the region, Europe and the US to lead sessions in order to have cross-fertilization of information and ideas.
  • Join forces in organizing the event. The leading organizations – from academia, government, civil society and others – were convened for their leadership in working in M&E in Mexico and Latin America. We joined forces by making available a space to discuss, present and interchange strategies, methodologies, experiences and results at a local level. In doing so, we helped not only in building a robust and diverse program (see here, in Spanish), but also in building ownership at the local level within our expanding community.
  • Make your conference accessible through multi-city sessions and the use of technology. Mexico is a big country and not everyone has the resources to attend in person. Additionally, the Evaluation Week in Mexico 2016 had simultaneous events (more than 90 activities overall!). Recognizing this, we connected to one another through social media and also streamed many events. Two featured events during the week were a 2-day Evaluation Utilization Seminar and the Early Child Development Policy Seminar. We livestreamed both seminars and made the presentations available at our website.
  • Encourage active participation in the design of your conference and sessions. We organized many of our panels in the form of debates, with active engagement and discussion from the gathered participants. We set aside time – not just 5 minutes at the end of a session! – to have moderated whole-room discussions. We also distributed voting and texting devices (Connectors) to encourage opinions and information from those gathered.


The American Evaluation Association is celebrating Centers for Learning on Evaluation and Results (CLEAR) week. The contributions all this week to aea365 come from members of CLEAR. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

I am Lycia Lima, the executive coordinator of the newest CLEAR center, for Brazil and Lusophone (Portuguese-speaking) Africa. We’re formally joining CLEAR later this year and are planning our inauguration for October 2015. I was also one of the organizers involved in the formation of the Brazilian M&E Network – Rede Brasileira de Monitoramento e Avaliação – which has become a very active association.

We’re based in Brazil, at the Sao Paulo School of Economics at Fundação Getulio Vargas, and work jointly with the school’s Center for Applied Microeconomics. Through CLEAR we’re looking forward to expanding into new areas and building bridges with the M&E communities in Brazil and elsewhere. In particular, we’ll be working to advance evaluation capacity development services and products in Portuguese for use in Lusophone (Portuguese-speaking) countries, all to foster evidence-based policy making in these countries.

Historically, our team in Brazil has had a lot of experience in carrying out impact evaluations in all sectors. Though we specialize in impact evaluation, we have experience in and appreciate the broader range of M&E approaches, and think that an integrated approach will make our work better. In this post, I have put together a few tips about impact evaluation that you would not learn in conventional econometrics books. This is advice I’d give to impact evaluators.

Lessons Learned: Know well the theory of change of your intervention! If you don’t know the theory of change well, you might not fully understand the causality channels and might leave important impact indicators out of the analysis. Get your hands dirty! Go to the field, talk to project managers, talk to beneficiaries, and make sure you fully understand the intervention you are trying to evaluate. Also, be careful with the quality of your data. Make sure you spend some resources on hiring and training qualified staff to supervise data collection. Good-quality data are crucial for your study.

Lessons Learned: Even if you are an empiricist and believe mostly in quantitative methods, do not underestimate the value of mixed methods. In particular, qualitative approaches will help you understand “why and how” things happened. Importantly, get to know M&E “foundational” literature from Patton, Scriven, Bamberger, and others.

Rad Resources: While M&E materials available in Portuguese are generally limited in number, there is a very useful impact evaluation book that I co-authored with other Brazilian experts. The book may be obtained free at

http://www.fundacaoitausocial.org.br/_arquivosestaticos/FIS/pdf/livro_aval_econ.pdf

We look forward to contributing to the M&E literature base in Portuguese, so please check back with us on this.


I’m Urmy Shukla, Capacity Building Manager at the CLEAR South Asia Regional Center, hosted by J-PAL South Asia at the Institute for Financial Management and Research. Since our 2011 start with CLEAR, we’ve developed a wide range of activities aimed at improving monitoring and evaluation (M&E) capacity throughout the region, including 90 trainings for partners such as the Indian civil services, state governments, NGOs, donor agencies, and academic institutions. Each training requires a significant amount of planning and preparation, including a needs assessment of skills and the partner’s role in evaluation, the development of customized content, and delivery of the course itself. As such, we want to ensure that our trainings are meeting their objectives.

How do we know if our trainings are ‘working’?

As evaluators, we know that there are several steps to plan for, and later assess, effectiveness of our activities. Most importantly, we need to:

  • define a theory of change and/or results framework for program activities, focusing on desired outcomes
  • measure/assess the desired outcomes

For evaluation capacity development, these aren’t always easy to design and implement. But we’re taking several steps to assess the effectiveness of our trainings, including developing an organization-specific results framework and tracer surveys to track past training participants. We’re testing our approach as we go, and below we share some practical and strategic tips.

Hot Tips for training tracer studies:

  • Clearly define training objectives from the outset. These objectives should go beyond skills gained to include what you hope participants will do after the training, within what is reasonably feasible in that timeframe.
  • Develop a way to systematically organize your multiple objectives. This will make it easier for you to design future tracer surveys and needs assessments. We categorize our objectives by (a) partner type (those who either do evaluations, use evaluations for decision-making, fund evaluations, and/or commission evaluations) and (b) knowledge, attitude, or behavior (KAB). From this, we have developed a database of tracer survey questions, which can be easily filtered for each type of training.
  • Get partner buy-in early. Getting people to participate in a tracer study a year or two after the training can be hard, so give advance notice at the training that a tracer study will occur. Then have some contact with trainees – through newsletters, announcements, listservs – after the training to keep contact info current and so they remain familiar with you.
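The two-dimensional categorization above (partner type × knowledge/attitude/behavior) can be sketched as a small, filterable question bank. This is an illustrative sketch only: the partner-type labels, KAB values, and sample questions are invented for the example, not taken from the CLEAR South Asia database.

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical categories mirroring the post's two dimensions.
PARTNER_TYPES = {"does", "uses", "funds", "commissions"}
KAB_VALUES = {"knowledge", "attitude", "behavior"}

@dataclass(frozen=True)
class TracerQuestion:
    text: str
    partner_type: str  # which kind of training partner the question targets
    kab: str           # knowledge, attitude, or behavior

# A tiny stand-in for the database of tracer survey questions.
QUESTION_BANK = [
    TracerQuestion("Have you designed an evaluation since the training?", "does", "behavior"),
    TracerQuestion("Can you describe a results framework?", "does", "knowledge"),
    TracerQuestion("Have you cited evaluation findings in a funding decision?", "funds", "behavior"),
    TracerQuestion("Do you see evaluation as useful for decision-making?", "uses", "attitude"),
]

def questions_for(partner_type: str, kab: Optional[str] = None) -> List[TracerQuestion]:
    """Filter the bank to build a tracer survey for one type of training."""
    return [
        q for q in QUESTION_BANK
        if q.partner_type == partner_type and (kab is None or q.kab == kab)
    ]
```

Filtering by partner type alone pulls every relevant question; adding a KAB filter narrows the survey to, say, behavior-change items only.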


We are Claudia Maldonado Trujillo and Oliver Manuel Peña, from CLEAR for Spanish-speaking Latin America. We’re located in Mexico City at CIDE (Center for Research and Teaching in Economics), a leading institution in social sciences. We’re sharing our work in advancing knowledge in evaluating climate change and how we’re addressing it through the upcoming International Seminar on Climate Change and Development in Latin America.

Lesson Learned: Context and main issues. If you’ve followed monitoring and evaluation (M&E) initiatives over the last 10 to 20 years, you’ll have seen that many advances have occurred in Latin America – such as the creation of exemplary M&E systems at national and subnational levels, innovative approaches to evaluate social programs, and so on. Yet, climate change – one of our most challenging public problems – seems to have gotten considerably less attention from evaluators and policymakers. Why is this?

We think that evaluation of climate change policy faces three main types of challenges: methodological, political, and network-related.

Methodologically, M&E approaches for climate change adaptation and mitigation policies have obvious complexities: measurement, attribution and accurate verification, among others. These challenges require more than program-based evaluation models; interdisciplinary innovations are needed to assess how to tackle climate change effectively.

Politically, climate change isn’t often “center stage” in national policymaking. Despite international commitments and assumed national responsibilities, most policymakers focus on problems that seem more immediate to them or to their constituencies.

Network-related challenges follow political challenges, in that most policymakers do not convene around this topic unless they are working specifically on climate change and environment issues.

Knowing this, we’re using our platform as a regional center – along with the Inter-American Development Bank’s Office of Evaluation and Oversight (OVE) and the Swiss Agency for Development and Cooperation (SDC) – to convene diverse, yet complementary, environmental specialists with policymakers and stakeholders who don’t normally focus on climate change. Our goal is to raise awareness and advance the adoption of sound strategies – with reliable M&E instruments as a backbone – at the International Seminar on Climate Change and Development in Latin America.

Lessons Learned: Institutional coordination with the IDB, SDC and other stakeholders on the agenda was key. It captured our complementary expertise, interests and concerns to shape an attractive and relevant agenda, drawing high-level participants with decision-making power.



We are Viengsavanh Sanaphane and Katie Moore from Catholic Relief Services.  We are working on an Inclusive Education (IE) project in Southeast Asia.

Currently, IE efforts are expanding. Experienced partners that have worked on IE for years in the local context were planning to share qualitative data about their project with stakeholders new to the initiative; qualitative data is defined here as previous project stakeholders’ perceptions of the project’s successes, its challenges, and how they overcame those challenges during implementation.

We asked: how might experienced IE stakeholders share their project’s qualitative data with incoming stakeholders in a dynamic and engaging way? We conceptualized having experienced stakeholders share their qualitative data through a play instead of a PowerPoint, and having new stakeholders record qualitative data in reflection journals instead of through traditional surveys.

The qualitative data for the data-sharing event was presented by experienced stakeholders in the form of a play that they wrote and rehearsed for the explicit purpose of presenting data dynamically. This was followed by a Q&A session and one-on-one interviews between former and current project implementers. Stakeholders new to IE efforts used their journals to record the qualitative data from the play and discussions that they judged relevant to implementing IE in their own contexts. New stakeholders requested to keep their journals as an ongoing tool to refer to while strategically planning how they would implement an IE project.

While not yet trialed, the approach could be extended: the qualitative data that new stakeholders recorded as relevant could inform their own projects, be revisited by rereading the journals during implementation, and be analyzed at the project’s end to see the impacts of data-sharing events between two similar contexts implementing the same type of project. Further, stakeholders may refer back to their own work, lending itself to an empowerment and utilization approach whereby stakeholders use locally produced data for local decision-making processes and other personal and professional needs.

Rad Resource: Use free Microsoft Office 2013 templates to create booklets/journals that let stakeholders record qualitative data presented in a refreshing way, lending itself to an empowerment M&E approach.

Lesson Learned: For an empowerment approach, have stakeholders design the questions to be included in the journals so that they are the “owners” of their own processes and work. Facilitate a practice run of the booklet with a small group to identify aspects that need to be modified prior to using it with a large group.

Hot Tip: Given stakeholders’ informed consent, data collectors or project implementers may photocopy the journals, or photograph them with smartphones, tablets, and/or cameras, as a way to record the qualitative data for later analysis.


I am Frank Meintjies. I have worked in the public sector (the South African government) and the private sector (Deloitte Consulting), either managing and implementing programmes or undertaking organisational development. Throughout this period, I have also conducted numerous evaluations, including process evaluations and impact evaluations. Although most of my work, taken as a whole, has related to poverty reduction initiatives, my largest evaluation assignments have been on multi-year HIV and AIDS programmes.

In recent years, I have begun to focus on formal aspects of monitoring and evaluation, including the design of frameworks and their institutionalisation within development organisations. Through my work, I am always on the lookout for tactics for quickly embedding M&E within organisations that are embracing it for the first time.

Lesson learned: For an organisation addressing the need for a properly institutionalised evaluation plan for the first time, I have found that it can help immensely to include a brief training session as part of the process. The training introduces key internal role players to basic concepts and is part of building what change guru John Kotter would term a “powerful guiding coalition” behind the introduction of M&E into the organisation.

Lesson learned: In small organisations, I include all professional staff in the training. In large organisations I include all managers in the training. (When the training is with general managers and executives it is of shorter duration). Including a broad range of people in a training/orientation session creates greater possibilities when it comes to assigning roles and responsibilities. In many instances, staff members have informed me that had they not had a thorough orientation to M&E, they would not have stepped up for or accepted roles, for example, as members of the M&E Steering Committee.


Hello, my name is Scott Chaplowe, and I am a Senior Monitoring and Evaluation (M&E) Officer with the International Federation of Red Cross and Red Crescent Societies (IFRC). The IFRC has a lot of stakeholders – communities, 186 National Societies, local governments, partners, donors, etc. One part of my job is to lead M&E trainings that empower our local stakeholders to better understand and participate in the M&E process. There are two big challenges I encounter in building people’s understanding and practice of M&E:

  1. M&E is not the most exciting (“sexy”) subject, and people do not naturally gravitate towards it. Resistance can be heightened for the very reason stakeholders need M&E training: they do not understand and value M&E, and may feel threatened by it, fearing it will burden them.
  2. M&E systems can be a straitjacket, imposing outside, “technocentric” methods that alienate rather than foster local participation in project design, monitoring, and evaluation.

Hot Tip: I like to address both of these challenges through fun, participatory methods that demystify M&E, so people better understand, participate in, and own the M&E process. For example, one way I introduce the key concepts of a logframe is with an activity I call the Logical Bridge. Training participants construct a bridge using straws, tape, scissors and string. The bridge is then used as a simple metaphor to discuss project design for a real bridge – inputs, activities, outputs (the bridge itself), outcomes (e.g., increased trade between two towns), and the ultimate goal (e.g., improved livelihoods). Everyone can relate to a bridge, and I have found this activity to be a fun, useful springboard into the logical hierarchy of results (whatever terminology is used for each level of the logframe). It also has the added benefit of team building.

Hot Tip & Rad Resource: Consider using illustrations or cartoons to convey key M&E messages – and not just in publications, but also in presentations. Show a cartoon during a training and ask participants what it means to them, whether they can relate (or not), and what we might be able to learn from it. Check out the cartoons in our new IFRC Project and Program M&E Guide!

Rad Resource: Come check out my “Fun and Games with Logframes” professional development workshop at the upcoming annual AEA conference in Anaheim to experience more fun, innovative ways to reinforce the understanding and use of logframes. Wednesday, November 2, 12:00 PM to 3:00 PM. Registration is required – more information online here.

Rad Resource: The guide “100 Ways to Energise Groups: Games to Use in Workshops, Meetings and the Community” may not be specifically about M&E, but it is useful for sparking ideas about how fun and games can be infused into M&E training and other activities.


Hi. My name is Chris Camillo, and I am an auditor and consultant on international child labor and education issues. As part of my auditing work, I visit rural development projects in Africa and Latin America to assess the quality of their GPRA performance data, their compliance with program requirements and their learning environments for beneficiaries.

My Hot Tips are recommendations for improving monitoring systems from an auditor’s perspective.

Hot Tip 1: When designing a project for a rural environment, thoroughly assess potential barriers to efficient monitoring. In many countries that I’ve visited, heavy seasonal rains, rugged terrain, unpaved roads, strikes and inadequate transportation result in significant delays in data collection and reporting from target communities. A monitoring plan that relies on volunteer data collectors making frequent visits on foot to sites that are located many miles apart would be too challenging to implement under these circumstances.

Hot Tip 2: Make certain that the monitoring system is robust by requiring thorough documentation of all data collected and by requiring periodic data audits to validate the accuracy and reliability of performance numbers against the source documentation. Use automated controls whenever possible to help prevent errors in data collection, data entry, and reporting.
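Automated controls of the kind described above can be as simple as per-field validation rules applied at data entry or during a data audit. A minimal sketch follows; the field names, valid ranges, and rules are invented for illustration and are not drawn from any particular monitoring system.

```python
# Each rule maps a required field to a check that its value must pass.
# These example fields and ranges are hypothetical.
RULES = {
    "children_enrolled": lambda v: isinstance(v, int) and 0 <= v <= 500,
    "attendance_rate": lambda v: isinstance(v, (int, float)) and 0.0 <= v <= 1.0,
    "site_id": lambda v: isinstance(v, str) and v.startswith("SITE-"),
}

def validate_record(record: dict) -> list:
    """Return a list of problems found in one reported record.

    An empty list means the record passed all automated checks;
    anything else should be flagged for follow-up against the
    source documentation.
    """
    problems = []
    for field, check in RULES.items():
        if field not in record:
            problems.append(f"missing field: {field}")
        elif not check(record[field]):
            problems.append(f"invalid value: {field}={record[field]!r}")
    return problems
```

Running every incoming record through checks like these catches missing fields, impossible values, and malformed identifiers before they reach performance reports, which is exactly where a later data audit would otherwise find them.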

Hot Tip 3: In addition to training, consider providing performance-based compensation or incentives to employees and volunteers to ensure the accuracy and timeliness of data collection, transmission and reporting.


My name is Amir Fallah and I will be sharing resources for those practicing M&E (Monitoring and Evaluation) in the international sector.

Rad Resource – My M&E (Monitoring and Evaluation)
My M&E is an online, collaboratively developed home for M&E practitioners around the world. The list of partners includes the International Organization for Cooperation in Evaluation (IOCE), UNICEF, and over 10 others. The resources include videos, training, links to job boards, a wiki, and more. http://www.mymande.org/

Rad Resource – Monitoring and Evaluation NEWS
M&E News is a service focusing on developments in monitoring and evaluation methods relevant to development programs with social development objectives. It has been managed by Rick Davies since 1997 and regularly includes details and reviews of major reports as well as updates on training, and major international M&E news items. http://mande.co.uk/

Hot Tip – IOCE Association Listings
Where do you call home? The International Organization for Cooperation in Evaluation (IOCE) maintains a comprehensive list of evaluation associations around the world at http://www.ioce.net/members/members.shtml. Find an evaluation association in your region to tap into a professional support network.

Rad Resource – From the Archives: International M&E Training & Capacity Building Modules
In April of 2010, back when we had about 450 subscribers rather than today’s 2200, Scott Chaplowe wrote an aea365 post focusing on a series of training modules packed with information on everything from hiring M&E staff to effective report writing. You can learn more about these here: http://aea365.org/blog/?p=425

Are you practicing internationally? What resources have you found invaluable? Please share via the comments.


Hello, my name is Scott Chaplowe, and I am a Senior M&E Officer with the International Federation of Red Cross and Red Crescent Societies (IFRC). I have been working in international monitoring and evaluation for about a decade now, and some of my earliest and most formative learning experiences in evaluation were with the AEA at its annual conference. Thus, it is great to see AEA utilize the internet through aea365 and other initiatives for knowledge sharing.

Rad Resources: Field-friendly M&E training and capacity-building modules (Ed. Guy Sharrock). This is a series of nine modules on key aspects of monitoring and evaluation (M&E) for international humanitarian and socioeconomic development programs. They are “field-friendly” in that the module topics were selected to respond to field-identified needs for specific guidance and tools. The intended audience includes managers as well as M&E specialists, and the series can also be used for M&E training and capacity building. The American Red Cross and Catholic Relief Services (CRS) produced the series under their respective USAID/Food for Peace Institutional Capacity Building Grants.

Right now, it seems that the website for Catholic Relief Services is the best location to access the complete series of modules, http://www.crsprogramquality.org/publications/2011/1/17/me-training-and-capacity-building-modules.html, and individual module titles include:

  • Capacity-Building Guidance
  • Monitoring and Evaluation Planning
  • Indicator Performance Tracking Tables
  • Hiring M&E Staff
  • Preparing for an Evaluation
  • Managing and Implementing an Evaluation
  • Communicating and Reporting on an Evaluation
  • Effective Report Writing
  • Success and Learning Stories

In addition to the full modules, there are also very handy “Short Cuts” versions of the field-friendly M&E training and capacity-building modules (Ed. Guy Sharrock). The Short Cuts provide a ready reference tool for people already familiar with the full modules, or for those who want to fast-track particular skills. They are also available from Catholic Relief Services, http://www.crsprogramquality.org/publications/2011/1/14/me-short-cuts.html, and individual titles include:

  • Capacity-Building Guidance
  • Monitoring and Evaluation Planning
  • Using Indicator Performance Tracking Tables
  • Hiring M&E Staff
  • Preparing for an Evaluation
  • Managing and Implementing an Evaluation
  • Communicating and Reporting on an Evaluation
  • Writing Human Interest Stories
  • M&E and Ethics

I admit that I am a little biased towards the series as I was a contributing author while working as an M&E Advisor with the American Red Cross’ Tsunami Recovery Program. I wrote the module on Monitoring and Evaluation Planning. The other day a colleague sent me a link for an additional website to directly access this particular module:

http://www.stoptb.org/assets/documents/countries/acsm/ME_Planning_CRS.pdf
