AEA365 | A Tip-a-Day by and for Evaluators


Hello! I am Alessandra Galiè, a PhD Candidate at Wageningen University in the Netherlands. From 2006 to 2011 I collaborated with a Participatory Plant Breeding programme coordinated at the International Centre for Agricultural Research in the Dry Areas (ICARDA) to assess the impact of the programme on the empowerment of the newly involved women farmers in Syria. The findings helped show how empowerment can take place as a process, and were useful in making the programme’s strategies more gender-sensitive. I chose to work with a small number (Small-N) of respondents (12 women) and a mixture of qualitative methods to provide an in-depth understanding of changes in empowerment as perceived by the women themselves and their community.

Lessons Learned

  • Small-N research is valuable. Small-N in-depth research is often criticised for its limited external validity. However, it proved an extremely valuable methodology for exploring a relatively new field of research, with the aim of understanding complex social processes, formulating new questions, and identifying new issues for further exploration.
  • Systematic evaluation should include empowerment. Empowerment is an often-cited impact of development projects but rarely the focus of systematic evaluation. Assessing changes in empowerment required an approach that was specific to the context and intervention under analysis and relevant to the respondents and their specific circumstances. This revealed the different positionalities of women in the empowerment process and the inappropriateness of blueprint solutions to the ‘empowerment of women’.
  • Measure gender-based implications. An analysis of the impact of a breeding programme on the empowerment of women showed that ‘technical interventions’ have gender-based implications for both the effectiveness of the technology and the equity of development outcomes.


The American Evaluation Association is celebrating the Mixed Methods Evaluation and Feminist Issues TIGs (FIE/MME) Week. The contributions all week come from FIE/MME members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· ·

My name is Pablo Rodríguez-Bilella, and I am a social researcher at the Argentine Research Council and a professor of Social Anthropology at the UNSJ (Argentina). I have been involved for the past few years in different uses of Web 2.0, and below I will share with you some lessons from my blogging experience at Al Borde del Caos.

Rad Resources: Initially, I began to follow several twitterers who also blog about evaluation and development (see EvalCentral or my trending tweets). Very soon I realized that I wasn't satisfied with what I had been finding in Spanish about these issues (with the exception of Evaluateca!). Being involved in networks (mainly ReLAC and IOCE), conferences, and research around evaluation was extra motivation to begin blogging in Spanish. The idea of increasing the visibility and role of evaluation was pretty clear, so in September 2011 I began this adventure of blogging at Al Borde del Caos!

Hot Tips – most popular posts:

I inserted a Translate to English button in my blog that, although far from perfect, can help readers get a fair idea of what I'm posting. So, these were the four most popular posts:

Lessons Learned:

  • Having a “bank of topics” is a great idea. Every time something exciting appears somewhere, I send it to Evernote, where it will be ready when I'm looking for fresh ideas to blog.
  • On the other hand, several posts were written in response to invitations, challenges, or an interest in publicizing something.
  • Blogging can be a time-consuming activity, but it is time well spent! The possibilities of networking, of finding similar (and different) people in the evaluation field, of learning by sharing, etc. are great returns on the investment.
  • Publishing a post is just half of the work. The other half involves engaging with people who make comments or inquiries, and letting the world know about the new post.
  • Many times this engagement with commenters doesn't happen on the blog itself, but in discussion groups on LinkedIn (take a look at the several ones I am part of), the Facebook page of the blog, or Google Plus. If people are having fun in a particular bar, don’t push them to another one (unless everybody wants to move!)

As I like to say when I publish a new post: You are invited to visit Al Borde del Caos and polish your Spanish (or use the translate button!)

This year, we’re running a series highlighting evaluators who blog. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · · ·

Hi. My name is Chris Camillo, and I am an auditor and consultant on international child labor and education issues. As part of my auditing work, I visit rural development projects in Africa and Latin America to assess the quality of their GPRA performance data, their compliance with program requirements and their learning environments for beneficiaries.

My Hot Tips are recommendations for improving monitoring systems from an auditor’s perspective.

Hot Tip 1: When designing a project for a rural environment, thoroughly assess potential barriers to efficient monitoring. In many countries that I’ve visited, heavy seasonal rains, rugged terrain, unpaved roads, strikes and inadequate transportation result in significant delays in data collection and reporting from target communities. A monitoring plan that relies on volunteer data collectors making frequent visits on foot to sites that are located many miles apart would be too challenging to implement under these circumstances.

Hot Tip 2: Make certain that the monitoring system is robust by requiring thorough documentation of all data collected and by requiring periodic data audits to validate the accuracy and reliability of performance numbers against the source documentation. Use automated controls whenever possible to help prevent errors in data collection, data entry, and reporting.
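
For teams whose performance data lives in spreadsheets or CSV exports, even a small script can serve as one of these automated controls. Below is a minimal sketch in Python of a data audit that flags sites whose reported figures diverge from the source documentation; the file names and column names are hypothetical, and a real monitoring system would layer further checks (valid ranges, duplicates, dates) on top.

import csv

def load_counts(path, key_col, value_col):
    # Read a CSV file and return a {site_id: count} mapping.
    counts = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row[key_col]] = int(row[value_col])
    return counts

def audit(reported_path, source_path, tolerance=0):
    # Compare reported performance numbers against totals from source
    # documentation; return a list of (site, finding) discrepancies.
    reported = load_counts(reported_path, "site_id", "beneficiaries_reported")
    source = load_counts(source_path, "site_id", "beneficiaries_documented")
    findings = []
    for site, value in sorted(reported.items()):
        if site not in source:
            findings.append((site, "no source documentation on file"))
        elif abs(value - source[site]) > tolerance:
            findings.append((site, f"reported {value}, documented {source[site]}"))
    return findings

# Example usage with hypothetical files:
for site, issue in audit("quarterly_report.csv", "site_registers.csv"):
    print(f"{site}: {issue}")

The value of even a toy check like this is that validation against source records becomes scripted and repeatable rather than done by eye.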

Hot Tip 3: In addition to training, consider providing performance-based compensation or incentives to employees and volunteers to ensure the accuracy and timeliness of data collection, transmission and reporting.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit a Tip? Send it to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · ·

Hi. I’m Victor Kuo, one of AEA’s Board Directors and liaison to the International Listening Project (ILP). AEA has launched the ILP to consider strategies that will guide AEA into enhanced engagement with the international communities of evaluators, evaluation organizations and clients of evaluation in the coming years.

The Board sees this initiative as aligned with AEA’s mission and values, specifically reflecting AEA’s commitment to valuing a global and international evaluation community and understanding of evaluation practices. AEA is now positioned to engage more actively in the international arena, but we want our activities to be guided by our membership and forged in collaboration with our international partners. The Listening Project, coordinated by Jim Rugh, is intended to help us listen well to the ideas for international evaluation initiatives of our members as well as our colleagues around the globe.

Hot Tip – Make your voice heard: You may have already received an invitation to participate in the ILP, but if you have not, and you are interested in AEA’s presence in the international arena, I would like to invite you to contribute in any of the following three ways:

  1. The primary instrument for soliciting individual input is an online questionnaire accessible at http://www.surveygizmo.com/s3/574552/AEA-International-Listening-Survey. We invite you to go there now and submit your suggestions. We estimate that it should not take more than 15 minutes of your time.
  2. Another modality for sharing is an interactive blog where multiple persons can offer their suggestions and comment on the suggestions of others. After you respond to the SurveyGizmo questionnaire, we invite you to go to http://aea-internationallistening.wikispaces.com/Invitation+to+participate where you can join the discussions on that blog.
  3. Another mode would be to send your personal comments directly to Jim via e-mail: JimRugh@mindspring.com.

Note that you need not be an AEA member to participate, and we hope to have this first round of data collection completed by August 1.  The synthesis process will be shared on the Wikispaces blog.

A report summarizing the findings of the ILP will be circulated by mid-October. Also, there will be a Think Tank Session in Anaheim on Wednesday, November 2, where participants will be invited to contribute to discussions on ways to follow up those findings.

Thank you in advance for your ideas and participation!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

·

My name is Amir Fallah and I will be sharing resources for those practicing M&E (Monitoring and Evaluation) in the international sector.

Rad Resource – My M&E (Monitoring and Evaluation)
My M&E is an online, collaboratively developed home for M&E practitioners around the world. The list of partners includes the International Organization for Cooperation in Evaluation (IOCE), UNICEF, and over 10 others. The resources include videos, training, links to job boards, a wiki, and more. http://www.mymande.org/

Rad Resource – Monitoring and Evaluation NEWS
M&E News is a service focusing on developments in monitoring and evaluation methods relevant to development programs with social development objectives. It has been managed by Rick Davies since 1997 and regularly includes details and reviews of major reports, as well as updates on training and major international M&E news items. http://mande.co.uk/

Hot Tip – IOCE Association Listings
Where do you call home? The International Organization for Cooperation in Evaluation (IOCE) maintains a comprehensive list of evaluation associations around the world at http://www.ioce.net/members/members.shtml. Find an evaluation association in your region to tap into a professional support network.

Rad Resource – From the Archives: International M&E Training & Capacity Building Modules
In April of 2010, back when we had about 450 subscribers rather than today’s 2200, Scott Chaplowe wrote an aea365 post focusing on a series of training modules packed with information on everything from hiring M&E staff to effective report writing. You can learn more here: http://aea365.org/blog/?p=425

Are you practicing internationally? What resources have you found invaluable? Please share via the comments.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · ·

Hello colleagues! My name is Aubrey W. Perry, and I am a graduate student in the Community Psychology doctoral program at Portland State University. I have had the privilege of working on two evaluation practicum projects before completing my master’s requirements. Through these projects, I have learned several actions that can be taken to ensure a positive experience.

Hot Tip: When entering into a practical experience relationship as a student or agency representative, it is helpful to remember the following guidelines:

1. Define Your Roles and Tasks: Every practicum, internship, or externship is different based on the students, agencies, and projects involved. It is important to consider what role each of these will play throughout the course of the experience.

a. Who will manage the student and their activities? In most cases, the student is under the supervision of their academic advisor and a point person at the agency. The student should work to make sure all parties know the feedback and supervisory structure at the outset of a project.

b. Is the project meeting an established need for the organization? A project that the agency is highly interested in will make it easier to find time to supervise the student and project while motivating the student to stay involved, as they will feel their work is valued.


2. Discuss Potentially Touchy Subjects at the Beginning of the Practicum: Publication and presentation authorship is often a goal for evaluation professionals. Students and agency representatives should consider this at the outset by asking the following questions:

a. What is the end product for this project expected to be? Is there going to be a research report, presentation, or peer-reviewed article that may require authorship guidelines?

b. What are the authorship policies of the agency or organization providing the practical experience? Are there any discrepancies between the student’s wishes and the agency’s policies?


3. Keep the Channels of Communication Open: Both the student and the supervisor should take it upon themselves to make sure both parties stay in constant contact throughout the project. Examples of ways to keep communication at the forefront are listed below:

a. Begin the project with a contract. Before the project even starts, the student, academic supervisor, and agency supervisor should draft a document detailing the length, tasks, and structure of the project.

b. Stay involved through email, phone conversations, or meetings. Discuss how often the student and the agency should communicate by creating a meeting schedule that best meets the demands of the project and the parties involved.

Rad Resource: Many career services offices and academic departments at universities maintain a repository of sample contracts, project outlines, and products that you may find useful.

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org. Want to learn more from Aubrey? He’ll be presenting as part of the Evaluation 2010 Conference Program, November 10-13 in San Antonio, Texas.

· · · ·

My name is Elizabeth Hutchinson and I am a Monitoring and Evaluation Specialist with Land O’Lakes International Development. My work focuses on evaluating United States Government-funded agriculture and food system development programs. Most people in the U.S. know of Land O’Lakes for its branded butter, but the division I work for has 30 years’ experience implementing international development programs that generate economic growth, improve health and nutrition, and alleviate poverty through market-driven business solutions.

Evaluation activities are integrated throughout the lifecycles of our programs – this is critical for our funders, and ongoing feedback enables us to quickly make programmatic course corrections. We often work with external host-country consultants who bring valuable localized knowledge and evaluation expertise to support our programs. I’d like to share a few lessons learned to improve the practice of working with local evaluators in international settings.

Lesson Learned – The capacity of local consultants varies greatly: Solicit proposals using RFAs from a variety of sources, including universities, private organizations, and individuals. Keep in mind that some evaluation skills and expertise may overlap with other fields (e.g., economics, statistics, sociology, public health), which may be beneficial depending on the evaluation question(s), subject area, and selected methodologies. Reviewing samples of past work (e.g., reports, protocols, data sets) and checking references are both good ways to understand the skill sets and capacity of local consultants.

Lesson Learned – International settings can offer unique evaluation challenges: The political/social context in which the evaluation is being conducted may have implications for the quality and timeline. Use knowledgeable local informants to ensure your methodology, questions, and timeline are appropriate to the local context. For example, you may not want to conduct household surveys in the month before a significant national election. Allowing a cushion is also critical, particularly if you have strict funder-mandated deadlines. We have found staggering deliverables (drafts, etc.) throughout the project timeline to be helpful. There should be ample time built in to make significant changes before a hard/final deadline.

Lesson Learned – Open and ongoing communication is critical: Ensure that expectations around communications are clear and agreed upon upfront. Be explicit about the language in which you would like deliverables submitted (e.g., most U.S. organizations likely expect reports in English). The submission/delivery mode should also be determined, since sharing documents internationally can be difficult and can have resource implications – identify the most appropriate way to share large electronic files, photos, hardcopy reports, etc. For example, e-mail submission may be more cost- and time-effective than requiring a printed/bound hard copy shipped from the field.


Would you like to discuss issues of capacity building in international development with Elizabeth? She’ll be presenting at Evaluation 2010, the annual conference of the American Evaluation Association this November. Search the online program to find Elizabeth’s roundtable or information about over 500 other sessions.

*AEA members receive 20% off of all books from SAGE when ordered directly from the publisher. If you are a member, sign on to the AEA website and select “Publications Discount Codes” from the “Members Only” menu to access the codes and process.

· ·

My name is Ann Zukoski and I am a Senior Research Associate at Rainbow Research, Inc. in Minneapolis, Minnesota.

Founded as a nonprofit organization in 1974, Rainbow Research’s mission is to improve the effectiveness of socially concerned organizations through capacity building, research, and evaluation. Projects range in scale from one-time program evaluations to multi-year, multi-site research studies, with designs that explicitly include participatory approaches intended to lead to program improvement.

Through my work, I am always looking for creative ways to capture evaluation data. Here is one rad resource and a hot tip on a participatory tool to add to your toolbox.

Rad Resource: Participatory evaluation approaches are used extensively by international development organizations. This web page is a great resource for exploring different rapid appraisal methods that can be adapted to the US context.

Eldis http://www.eldis.org/go/topics/resource-guides/participation/participation-in-development – Eldis provides descriptions and links to a variety of information sources on participatory evaluation approaches, including online documents, organizations’ websites, databases, library catalogues, bibliographies, email discussion lists, research project information, and map and newspaper collections. Eldis is hosted by the Institute of Development Studies in Sussex, U.K.

Hot Tip: Evaluators are often asked to identify program impacts and measure key outcomes of community-based projects. Impact and outcome measures are often externally determined by the funder. Many times, however, collaborative projects lead to unanticipated outcomes that program participants see as highly valuable but that are overlooked by formal evaluation designs. One participatory technique, Most Significant Change (MSC), offers an alternative approach to address this issue and can be used to surface promising practices.

Most Significant Change Technique (MSC) – MSC is a participatory qualitative data collection process that uses stories to identify the impact of the program. This approach involves a series of steps where stakeholders search for significant program outcomes and deliberate on the value of these outcomes in a systematic and transparent manner. Stakeholders are asked to write stories of what they see as “significant change” and then dialogue with others to select the stories of greatest importance. The goal of the process is to make explicit what stakeholders (program staff, program beneficiaries, and others) value as significant change. The process allows participants to gain a clearer understanding of what is and what is not being achieved. It can be used for program improvement and for identifying promising practices, as well as to uncover key outcomes by helping evaluators identify areas of change that warrant additional description and measurement.

Where to go for more information:

http://www.mande.co.uk/docs/MSCGuide.pdf

Have you used this tool? Let us all know your thoughts!

This contribution is from the aea365 Daily Tips blog, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org.

· · ·

Hello, my name is Scott Chaplowe, and I am a Senior M&E Officer with the International Federation of Red Cross and Red Crescent Societies (IFRC). I have been working in international monitoring and evaluation for about a decade now, and some of my earliest and most formative learning experiences in evaluation were with the AEA at its annual conference. Thus, it is great to see AEA utilize the internet through aea365 and other initiatives for knowledge sharing.

Rad Resources: Field-friendly M&E training and capacity-building modules (Ed. Guy Sharrock). This is a series of nine modules on key aspects of monitoring and evaluation (M&E) for international humanitarian and socioeconomic development programs. They are “field-friendly” in that the module topics were selected to respond to field-identified needs for specific guidance and tools. The intended audience includes managers as well as M&E specialists, and the series can also be used for M&E training and capacity building. The American Red Cross and Catholic Relief Services (CRS) produced the series under their respective USAID/Food for Peace Institutional Capacity Building Grants.

Right now, the Catholic Relief Services website seems to be the best place to access the complete series of modules, http://www.crsprogramquality.org/publications/2011/1/17/me-training-and-capacity-building-modules.html, and individual module titles include:

  • Capacity-Building Guidance
  • Monitoring and Evaluation Planning
  • Indicator Performance Tracking Tables
  • Hiring M&E Staff
  • Preparing for an Evaluation
  • Managing and Implementing an Evaluation
  • Communicating and Reporting on an Evaluation
  • Effective Report Writing
  • Success and Learning Stories

In addition to the full modules, there are also very handy “Short Cuts” versions of the field-friendly M&E training and capacity-building modules (Ed. Guy Sharrock). The Short Cuts provide a ready reference tool for people already familiar with the full modules, or for those who want to fast-track particular skills. They can also be found at Catholic Relief Services, http://www.crsprogramquality.org/publications/2011/1/14/me-short-cuts.html, and individual titles include:

  • Capacity-Building Guidance
  • Monitoring and Evaluation Planning
  • Using Indicator Performance Tracking Tables
  • Hiring M&E Staff
  • Preparing for an Evaluation
  • Managing and Implementing an Evaluation
  • Communicating and Reporting on an Evaluation
  • Writing Human Interest Stories
  • M&E and Ethics

I admit that I am a little biased towards the series, as I was a contributing author while working as an M&E Advisor with the American Red Cross’ Tsunami Recovery Program; I wrote the module on Monitoring and Evaluation Planning. The other day a colleague sent me a link to an additional website where this particular module can be accessed directly:

http://www.stoptb.org/assets/documents/countries/acsm/ME_Planning_CRS.pdf

This contribution is from the aea365 Daily Tips blog, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org.

· · · · · ·
