AEA365 | A Tip-a-Day by and for Evaluators

Category: International and Cross-cultural Evaluation

This is part of a two-week series honoring our living evaluation pioneers in conjunction with Labor Day in the USA (September 5).

Aloha, we are Sonja Evensen, Evaluation Senior Specialist at Pacific Resources for Education and Learning, and Judith Inazu, Acting Director of the Social Science Research Institute, University of Hawai’i. In 2006, we worked together to establish the Hawai’i-Pacific Evaluation Association (H-PEA), an affiliate of AEA. We wish to nominate Dr. Lois-ellin Datta as our most esteemed evaluator.

Why we chose to honor this evaluator:

Lois-ellin Datta has made tremendous contributions to the field of evaluation at the international, national, and local levels. Many know of her leadership and influence on the national scene, but few know how unselfishly she has supported the development and survival of H-PEA. In those early years, she participated in our conferences as a keynoter, workshop presenter, and panelist; every time we asked, she said yes, and did so without compensation. When we were getting organized, she told us that it was easy to start something (H-PEA) but difficult to sustain it. Fortunately, H-PEA is now celebrating its 10th “birthday” and is still going strong! We are forever grateful for her support, both financial and strategic, in our efforts to grow the Hawai’i-Pacific Evaluation Association.

Contributions to our field:

Dr. Datta spent many years in Washington, D.C., serving as director or head of research at several organizations, including the National Institutes of Health, the U.S. Office of Economic Opportunity, the U.S. Department of Health and Human Services, and the U.S. General Accounting Office, where she received the Distinguished Service Award and the Comptroller General’s Award.

She is a past president of AEA, has served as a board member of AEA and the Evaluation Research Society and as editor-in-chief of New Directions for Evaluation, and now serves on the editorial boards of the American Journal of Evaluation, the International Encyclopedia of Evaluation, and the International Handbook of Education, among others.

Dr. Datta has written over 100 articles and three books, all while providing in-kind assistance to countless local organizations such as the Hau’oli Mau Loa Foundation, the Maori-Hawaiian Evaluation Hui, and the Native Hawaiian Education Council in support of culturally sensitive evaluation approaches.

Lois-ellin is precise and exacting in her work, and simultaneously caring and sensitive to those she works with. She has always been keenly focused on achieving social justice and mindful of the importance of policy. We deeply admire her intellect and wit, but it is her encouragement and boundless enthusiasm that make her such a valued colleague and advisor. Each time her counsel is sought, she thoroughly considers the issue at hand before providing a deliberate, thoughtful response. Over the years, the unique contributions of Lois-ellin Datta to the field of evaluation at the local, national, and international levels have been legendary.

Resources:

Lois-ellin Datta on LinkedIn

The Kohala Center

Books by Lois-ellin Datta

The American Evaluation Association is celebrating Labor Day Week in Evaluation: Honoring Evaluation’s Living Pioneers. The contributions this week are tributes to our living evaluation pioneers who have made important contributions to our field and have even had positive impacts on our careers as evaluators. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

I’m Caitlin Blaser Mapitsa, working with CLEAR Anglophone Africa to coordinate the Twende Mbele programme, a collaborative initiative started by the governments of Uganda, Benin, and South Africa to strengthen national M&E systems in the region. The programme is taking an approach of “peer learning and exchange” to achieve this, in response to an overall weak body of knowledge about methods and approaches that are contextually relevant.

Lessons Learned:

Since 2007, the African evaluation community has been grappling with what tools and approaches are best suited to “context-responsive evaluations” in Africa. Thought leaders have engaged on this through various efforts, including a special edition of the African Evaluation Journal, a Bellagio conference, an AfrEA conference, the Anglophone and Francophone African Dialogues, and, most recently, a stream at the 2015 SAMEA conference.

Throughout these long-standing discussions, practitioners, scholars, civil servants and others have debated the methods and professional expertise that are best placed to respond to the contextual complexities of the region. Themes emerging from the debate include the following:

  • Developmental evaluations are emerging as a relevant tool to help untangle a context marked by decentralized, polycentric power that often reaches beyond traditional public sector institutions.
  • Evaluations can mediate evidence-based decision making among diverse stakeholders, rather than serving an exclusively learning and accountability role, which is more relevant in contexts with a single organizational decision maker.
  • Action research helps in creating a body of knowledge that is grounded in practice.
  • Not all evidence is equal, and having an awareness of the kind of evidence that is valued, produced, and legitimized in the region will help evaluators ensure they are equipped with methods which recognize this.

Peer learning is an often overlooked tool for building evaluation capacity. In Anglophone Africa there is still a dearth of research on evaluation capacity topics. There is too little empirical evidence and consensus among stakeholders about what works to strengthen the role evaluation could play in bringing about better developmental outcomes.

Twende Mbele works to fill this knowledge gap by building on strengths in the region. At Twende Mbele, we completed a 5-month foundation period to prepare for the 3-year programme. We are now formalizing peer learning as an approach that will ensure our systems-strengthening work is appropriate to the regional context, and relevant to the needs of collaborating partners.

Rad Resources:

The American Evaluation Association is celebrating Centers for Learning on Evaluation and Results (CLEAR) week. The contributions all this week to aea365 come from members of CLEAR. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

I’m Maurya West Meiers, Senior Evaluation Officer at the World Bank, and also a member of the Global Hub team for the CLEAR program (Centers for Learning on Evaluation and Results). Since I work in the international development sphere on M&E capacity building, I try to follow news, resources and events related to these topics. I gain a lot of great information from the sources below, which I hope are useful to you.

  • The Center for Global Development positions itself as a “think and do” tank, where independent research is channeled into practical policy proposals, with great events and the occasional heated debate.
  • Devex is a media platform ‘connecting and informing’ the global development community and is known as a key source for development news, jobs, and funding opportunities.
  • GatesNotes is Bill Gates’ blog on what he is learning and it touches on a variety of international topics related to evidence – plus many other matters that have nothing to do with these topics but are simply interesting.
  • For impact evaluation there are a number of sources including 3ie, which is an international grant-making NGO that funds impact evaluations and systematic reviews. JPAL (Poverty Action Lab) is a network of affiliated professors from global universities. Both 3ie and JPAL provide heaps of impact evaluation resources – research, webinars, funding opportunities and so on.
  • The Behavioural Insights Team is a government institution dedicated to the application of behavioral sciences and is also known as the UK Nudge Unit. This is a good site to visit for research and events on this trendy subject matter.
  • OpenGovHub is a network of 35+ organizations promoting transparency, accountability, and civic engagement around the world. The OpenGovHub newsletter consolidates and shares updates on the work and events from these organizations. For even more resources, be sure to look through the list of OpenGovHub’s network members in Washington and other locations.
  • Many readers of AEA365 will be familiar with these well-known groups in our community: BetterEvaluation (a platform for sharing information to improve evaluation), EvalPartners (a global movement to strengthen national evaluation capacities) and, of course, CLEAR (a global team aimed at improving policy through strengthening M&E systems and capacities).

Hot Tip:

  • If you sign up for newsletters from these groups, create an email account to manage your news feeds and avoid cluttering your work or personal email accounts.

The American Evaluation Association is celebrating Centers for Learning on Evaluation and Results (CLEAR) week. The contributions all this week to aea365 come from members of CLEAR. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Greetings, all! We’re Edward Jackson, Anne Bichsel, Geske Dijkstra and Karin ter Horst, evaluation professionals who have recently evaluated Swiss and Dutch aid programs designed to strengthen national and local governance in developing countries.

Governance is everybody’s business. From the run-up to the 2016 US presidential election to the recent launch of the Sustainable Development Goals, it is clear that governance matters.

But how should governance interventions be evaluated? Here are three tips from our work:

Hot Tips:

1. Extend the knowledge base by building and testing new evaluation criteria.

In our evaluation of the governance programming and mainstreaming of the Swiss Agency for Development and Cooperation (SDC), we constructed a matrix of evaluation criteria based on the Paris Declaration on Aid Effectiveness, SDC’s own guidelines, and recent research. International and local consultant teams had to agree on their rankings of individual interventions on this matrix. The new set of criteria enabled us to understand, for example, the importance of the roles of legitimacy and adaptive learning in effective governance programs.

2. Use mixed methods to examine both the “supply” and “demand” sides of governance.

Ideally, institutions supply policies and services, while citizens demand results and accountability. Building the capacities of both sides to work effectively, separately and with each other, is central to programming success. In our study of the Dutch-supported Local Government Scorecard Initiative in Uganda, for example, we employed quantitative and qualitative methods to assess results in local councils (the supply side) and among citizens (the demand side). In Rwanda, we applied the techniques of fuzzy-set Qualitative Comparative Analysis (fsQCA) to examine Dutch-supported governance interventions.

3. Be alert to alternative pathways to good-governance outcomes and to unintended consequences.

The Dutch Rwanda study revealed two pathways to good governance in that context: one driven by strong political will paired with sufficient organizational capacity of implementing partners, and a second forged by interventions that were context-sensitive, took a long-term perspective, and were implemented by partners with sufficient organizational capacity. It was also clear that, regardless of the pathway, monitoring and evaluation must be well integrated into the implementation process in order to optimize learning and results.

[Figure: Source: Ter Horst, 2015]

And be alert to unintended consequences. Case studies in the Swiss evaluation illustrated how central governments can instrumentalize decentralization in order to actually recentralize their power and exert control over decisions at the local level, while ostensibly supporting decentralization. Donor agencies must be able to recognize such dynamics and adjust their programming accordingly.

Rad Resources: On our evaluations, see PPT, Report; On fsQCA, see QCA, PDF

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello! My name is Nils Junge and I’m an independent consultant conducting policy analysis and evaluation on international development issues, including poverty reduction, water sector reforms, and the environment. I often use Poverty and Social Impact Analysis (PSIA), an approach pioneered by the World Bank, to assess the distributional impacts of reforms.

I’d like to share some tips on how to increase your evaluation’s relevance, and even promote policy dialogue, when working abroad.

In my experience, multilateral donors are often happy for evaluators to conduct their work in a way that increases its relevance, and even promotes policy dialogue and uptake of reforms, but international evaluation work comes with some challenges:

  • You may find yourself serving two masters – the client who commissions the evaluation (the donor) and the intended primary user (the government) – who may have different levels of interest in the findings;
  • You must navigate inevitable cultural and linguistic differences, which can have a distancing effect; and
  • You may find it difficult to get counterparts to share data.

Hot Tip #1. Build trust with country counterparts. This is critical to maximizing the impact of your work. Trust comes through frequent meetings, demonstrations of reliability, and openness – for example, sharing your methodology, questionnaires, and preliminary findings. If the policy or program you’re evaluating is controversial, expect there to be adversarial stakeholders. Talk to them, too!

Hot Tip #2. Encourage formation of a working group, composed of government, civil society, and private sector stakeholders. The group would meet several times to oversee the evaluation, receive the results, spread the word to their respective organizations, and (ideally) feel some ownership over it.

Hot Tip #3. Your outsider status can be an advantage. While you may be unaware of cultural nuances, you will be seen as detached from the country’s politics. Your distance from local politics is often valued by counterparts.

Hot Tip #4. Just because you don’t see the impact, it doesn’t mean you haven’t made a difference. Remember that no matter how hard you try, the influence of your evaluation will remain largely outside your control, given the many competing factors that determine policy. A lot of evaluation work involves seeding ideas, many of which only bear fruit in the long term.

Rad Resources: Check out the Guidance Note: Enhancing In-Country Partnerships in PSIA and explore the World Bank’s webpage. Although specific to the World Bank, many of the lessons and discussions have broad applicability.

The American Evaluation Association is celebrating Washington Evaluators (WE) Affiliate Week. The contributions all this week to aea365 come from WE Affiliate members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello, dear colleagues! We’re Alisher Nazirov and Tatiana Tretiakova, representatives of the national evaluation communities of Tajikistan and Kyrgyzstan, and we would like to share our impressions from one of our most memorable exchanges of experience and practice – a study tour to the Washington Evaluators. The tour was supported by the American Evaluation Association, and Donna Mertens helped us overcome all the invisible barriers.

The visit to Washington, DC was quite short but very intense. It was extremely important that we got the opportunity to become acquainted with the work of evaluators at different levels: federal, state, and county. Meetings were held at the Department of State, the Government Accountability Office (GAO), and George Washington University, as well as with private companies and independent consultants. Each meeting was very interesting, carefully prepared, and illuminated a particular side of evaluators’ work. We met not only with evaluators, but also with donor organizations such as USAID and the World Bank.

Lesson Learned: How Evaluations are Commissioned

We learned that the Department of State itself only makes the decision to commission an evaluation of a project. A competition then follows to select a private company or independent evaluator, who conducts the evaluation and writes a report with recommendations for decision-making. Evaluation work is funded from different sources. We also had meetings with foundations and private evaluation organizations, who told us about the peculiarities of their work.

Lesson Learned: Consciously Approaching the Commissioning Process

We found it very important that each organization approached evaluation commissioning, implementation, and use in decision making consciously. Take GAO, for example: it is the audit, evaluation, analytical, and investigative arm of the US Congress. That is, there are clear commissioners of evaluation who are interested in obtaining objective data on the implementation of programs. These commissioners represent a specific constituency, and therefore are interested in receiving vital information about each program’s effectiveness. This example is very instructive for Kyrgyzstan, which is a parliamentary republic.

Lesson Learned: Communication Among Organizations

During the visit we realized that not only the organizations’ activities were important, but also the rules and forms of interaction built among the organizations. The excellent “horizontal” communication methods used by different groups of evaluators, in which we had a chance to participate, showed us the importance of organizing professional communication among evaluators.

We found Washington Evaluators to be a very active and diverse organization, and we are deeply grateful to everyone involved for showing us a complete, well-organized system that helped us gain a new perspective on the development of our respective organizations.

The American Evaluation Association is celebrating Washington Evaluators (WE) Affiliate Week. The contributions all this week to aea365 come from WE Affiliate members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Donna M. Mertens; I am an independent consultant based in Washington, D.C. and a founding member of the Washington Evaluators (WE), an AEA local affiliate. I had the pleasure of working with the Washington Evaluators’ leadership team on two grants, one to improve evaluation in Kyrgyzstan and one to increase WE members’ understanding of evaluation in Central Asia. Both grants were collaborations between WE and evaluation organizations in Kyrgyzstan and Tajikistan.

In the first phase, funded for $10,000 by EvalPartners through its Peer-to-Peer (P2P) support program, I delivered two workshops in Bishkek, Kyrgyzstan. The first was a three-day workshop to help evaluators from four countries (Kazakhstan and Russia, in addition to Kyrgyzstan and Tajikistan) develop their personal capacities in transformative mixed methods for gender- and equity-focused evaluations. I also offered a second, one-day workshop to help build networks among the evaluation associations in those four countries and beyond. The final report on this part of the grant can be viewed by clicking here.

Rad Resources: The National Sustainable Development Plan for the Kyrgyz Republic contains language that highlights the importance of a transparent, evidence-based approach to evaluating their progress. It states: “due to bad governance, corruption and criminalization of certain state institutions during the rule of the first two presidents of the country these undertakings did not give sufficient positive impetus to national development and have largely failed to achieve their objectives.”

Lessons Learned: Any member of a Voluntary Organization for Professional Evaluation (like AEA or WE) can partner with a VOPE from the developing world to apply for a grant to fund activities like this.

Phase two of this collaboration between WE and colleagues in Central Asia was funded by AEA through its International Partnership Protocol Program (IPPP). Tatiana Tretiakova of Kyrgyzstan and Alisher Nazirov of Tajikistan visited Washington, D.C. in March 2015 to make presentations to WE members on the state of evaluation in Central Asia. While in Washington, these leaders also met with WE officials and representatives of other evaluation-related institutions in the Washington, D.C. area to share their experiences, seek advice, and discuss possible future cooperation.

Lessons Learned: AEA’s International Partnership Program is designed to fund short-term projects between Local Affiliates or Topical Interest Groups (TIGs) and groups of evaluators in other countries. The deadline for the next opportunity is November 30. The IPPP provides up to $5,000 to allow an AEA Local Affiliate or TIG to partner with one or more VOPEs in other countries, so that each partner both gives and receives knowledge, building each other’s capacity.

Rad Resource: Information on AEA’s International Partnership Program can be found here.

The American Evaluation Association is celebrating Washington Evaluators (WE) Affiliate Week. The contributions all this week to aea365 come from WE Affiliate members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

We are Claudia Maldonado and Alicia López from the CLEAR Center for Spanish-speaking Latin America, proud members of the CLEAR Initiative.

Would you recognize an “enabling environment” if you “saw” it? The enabling environment concept is very popular among evaluation professionals. But what does it mean? Creating and cultivating this kind of environment requires some out-of-the-box thinking.

Last November the CLEAR Centers for Latin America and Anglophone Africa organized a South-South roundtable in Pretoria, South Africa that tried to do something different. We brought together members of government, parliament, and technical experts from eight developing countries (Benin, Colombia, Ghana, India, Mexico, Peru, South Africa, and Uganda). We had them role-play, discuss, and reflect informally about the role of evidence in development. The idea was to provide a neutral space for people in high-level decision-making positions, with the ability to push legislation, and technical experts to share experiences and knowledge in a very open format. And what did we get out of it? A fascinating working group, specific country-level commitments, and a lot of fun.

Lessons Learned: Plan, plan, plan ahead! The selection of participants and a well-thought-out agenda are essential.

Hot Tip: We held bimonthly meetings between a steering group of senior specialists from Mexico, South Africa, and Ghana and the facilitation team to plan the roundtable’s content and goals. In-depth, participatory work on constructing the agenda and selecting the right strategic participants (combining enough experience, decision-making power, and expertise) was crucial for success.

Lessons Learned: A traditional lecture format was just not going to cut it! We needed participants to get to know each other and be willing to openly share the frustrations and challenges they face. A flexible format and a facilitation process that enabled collaboration and engagement across institutional roles and national boundaries helped.

Hot Tip: Carefully select well-prepared and experienced facilitators. Group dynamics help to create an atmosphere of trust. At first these activities may seem silly or a waste of time. They are not!

Lessons Learned: OK, meeting our colleagues is rewarding, but we came here for results! Aim for written commitments.

Hot Tip: Set aside enough time to establish agreements. The last three sessions of the event were dedicated to drafting country and regional action plans, taking into consideration learning and insights from the interactions.

Country action plans included: designing a legal/administrative framework to promote compliance and learning through evaluation; effectively supplying relevant information; ensuring that recommendations are evidence-based, timely, clear, and politically and economically feasible; and creating a parliamentarian forum on M&E.

Rad Resource: South-South Round Table on the Demand and Use of Evidence in Policy Making and Implementation: http://www.clear-la.cide.edu/SouthSouthRoundtable?language=en

The American Evaluation Association is celebrating Centers for Learning on Evaluation and Results (CLEAR) week. The contributions all this week to aea365 come from members of CLEAR. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

I am Boubacar Aw, Coordinator of the Regional Centers for Learning on Evaluation and Results (CLEAR) for Francophone Africa, hosted at the Centre Africain d’Etudes Superieures en Gestion (CESAG) in Dakar, Senegal. I am writing today to offer practical tips on how to develop teaching materials through a Training of Trainers (ToT) model. These tips are especially helpful when you are trying to develop teaching materials adapted to different contexts.

Lessons Learned Through Experience:

There are numerous teaching materials on M&E in English. The main challenge faced by Francophone Africa is to develop materials in French – there is work to do! It is not just about translation; it is about adapting materials to the Francophone African context with “real example” case studies that make them useful to practitioners in the field. A great way to develop such materials is through a ToT approach.

Before a ToT program begins, teaching materials are prepared by a team of master trainers. During a ToT event, trainers use these materials for the training. At the same time, trainees are asked to divide themselves into groups according to the modules that interest them and to provide feedback on the teaching materials. Moreover, trainees share their own experiences in M&E and provide “real examples.” Such examples are incorporated into the teaching materials as case studies.

During the ToT event, a mock training is organized so that trainees can test the materials and case studies. When trainees go back to their own countries and workplaces, they can further test the materials and suggest any necessary adjustments to the trainers.

Hot Tips:

  • Involving trainees in developing teaching materials turns out to be a very effective way to adapt the materials to the Francophone African context.
  • Organizing a mock training during a ToT event is a good way to make necessary modifications to teaching materials. Trainees also feel more at ease using case studies they themselves suggested during a mock training.
  • It is important to have one trainer responsible for harmonizing and finalizing the teaching materials!

Rad Resources:

The American Evaluation Association is celebrating Centers for Learning on Evaluation and Results (CLEAR) week. The contributions all this week to aea365 come from members of CLEAR. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello – My name is Gemma Stevenson. I am Associate Director for the Center for Economic Research in Pakistan (CERP), where we run rigorous research projects and deliver evaluation trainings as part of CLEAR South Asia.

So what have we learnt over the last three years delivering trainings on M&E to the Pakistani government and NGO community? What are their most pressing constraints to conducting quality evaluations, and what do they need in the way of training?

Cool Trick: Taking the time to conduct a demand assessment is a great way of answering such questions. CERP conducted an assessment at the end of last year through a brief survey and in-depth interviews with our partners. The exercise unearthed a number of interesting findings for the Pakistani context.

Lesson Learnt: First, there remain a number of conceptual hurdles in M&E among many government and NGO partners. A common confusion is mixing up inputs with outputs, and outputs with outcomes. For example, in a project to build a library, the outcome was seen as the completion of the physical building and the purchase of all the books, rather than, say, an improvement in literacy or an increase in IT skills. This is good to know, so we can try to tackle these fundamental issues head-on when engaging with certain partners during our training activities.

Lesson Learnt: Another interesting finding was that our partners in Pakistan are less immediately focused on developing skills for collecting data, and more concerned about up-skilling when it comes to analysing data sets. In fact, our partners expressed an overwhelming level of interest in developing their skills with statistical software such as STATA.

But here is something really telling: when asked about the most significant challenge in conducting more frequent monitoring and evaluation activities, it was not a lack of infrastructure, nor a shortage of personnel, that posed the biggest challenge, but the lack of specific technical capacity among existing personnel. So CLEAR still has a very important role to play in Pakistan! We’ll continue to roll out further training and other capacity-building initiatives to try to meet this demand.

Rad Resources: Did you know that if you are teaching a short course using STATA, you can contact STATA Corporation to arrange a free temporary license for you and your students to load on their laptops? It’s not advertised, so call their Texas offices.

The American Evaluation Association is celebrating Centers for Learning on Evaluation and Results (CLEAR) week. The contributions all this week to aea365 come from members of CLEAR. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
