AEA365 | A Tip-a-Day by and for Evaluators

TAG | Monitoring and Evaluation

Hello – My name is Gemma Stevenson. I am Associate Director for the Center for Economic Research in Pakistan (CERP) where we run rigorous research projects and deliver evaluation trainings as part of CLEAR South Asia.

So what have we learnt over the last three years delivering trainings on M&E to the Pakistani government and NGO community? What are their most pressing constraints to conducting quality evaluations, and what do they need in the way of training?

Cool Trick: Taking the time to conduct a demand assessment is a great way of answering such questions. CERP conducted an assessment at the end of last year through a brief survey and in-depth interviews with our partners. The exercise unearthed a number of interesting findings for the Pakistani context.

Lesson Learnt: First, there remain a number of conceptual hurdles in M&E among many government and NGO partners. A common confusion is mixing up inputs with outputs, and outputs with outcomes. For example, in a project to build a library, the outcome was seen as the completion of the physical building and the purchase of all the books rather than, say, an improvement in literacy or an increase in IT skills. Good to know, so we can tackle these fundamental issues head-on when engaging with certain partners during our training activities.

Lesson Learnt: Another interesting finding was that our partners in Pakistan are less focused on developing skills for collecting data than on up-skilling in analysing data sets. In fact, our partners expressed an overwhelming level of interest in developing their skills with statistical software such as Stata.

But here is something really telling: when asked about the most significant challenge to conducting more frequent monitoring and evaluation activities, partners cited neither a lack of infrastructure nor a shortage of qualified personnel, but gaps in their personnel's specific technical capacity. So CLEAR still has a very important role to play in Pakistan! We’ll continue to roll out further training and other capacity building initiatives to try to meet this demand.

Rad Resources: Did you know that if you are teaching a short course using Stata, you can contact StataCorp to arrange a free temporary license for you and your students to load on their laptops? It’s not advertised, so call their Texas offices.

Clipped from http://www.clearsouthasia.org/

The American Evaluation Association is celebrating Centers for Learning on Evaluation and Results (CLEAR) week. The contributions all this week to aea365 come from members of CLEAR. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Tim Clynick, Acting Director of the Centre for Learning on Evaluation and Results in Anglophone Africa, (University of Witwatersrand in South Africa, Ghana Institute of Public Administration and the Kenya School of Government)

Greetings from CLEAR Anglophone Africa!

We are often asked where our programme – now in its third year – is having the greatest impact.

Lesson Learned: Growing the Supply. Since 2012, we have trained nearly 1,150 M&E practitioners and public servants, some in advanced methods and the majority in M&E and Results-based Management. But our training programmes also need to align better to national systems and standards. Supply constraints hamper efforts to deepen and improve national evaluation system building, and significantly more effort is required to grow more skilled practitioners.

Responding to Demand. Our 11 country M&E studies have allowed us to identify opportunities to play a convening role so that national stakeholders can be mobilized around a common diagnostic such as need for a national evaluation policy or framework. CLEAR’s growing body of knowledge of national M&E systems has also been important in enabling mobilization of resources to meet specific local gaps, e.g. technical advisory services to sectoral departments to manage or conduct impact evaluations, or to support policy making and guidelines for ministries responsible for coordinating government programmes.

Receptive Environment. There is no lack of appreciation amongst African governments or stakeholders of the need for evidence-based learning and decision making. But we now understand the political economy of this demand and where it is real and meaningful – as opposed to symbolic or merely rhetorical – and act or respond accordingly.

Beyond Rhetoric: A successful partnership in South Africa. CLEAR recently participated in a session with the South African Presidency reflecting on successes in consolidating the National Evaluation System. The system is coalescing around the national norms, standards, and guidelines. Since 2011, thirty-eight national evaluations have been completed, are underway, or have been procured. A further 60 national evaluations, and between 50 and 100 provincial and departmental ones, are planned by 2016. M&E officers and programme managers are now demanding support to deepen their own professional evaluative practice. That public service managers are responding in this way can be considered a huge success for the evaluation movement in South Africa, and the implications are galvanizing government and service providers across the country. The high point reached in South Africa is, however, part of a larger groundswell across the African continent, in Kenya, Ghana, Zambia, Uganda, and elsewhere, where we can look for similar results.

Rad Resources: In addition to following CLEAR Anglophone Africa, keep on top of what’s happening in African evaluation through the African Evaluation Association-AfrEA.

Clipped from http://www.clear-aa.co.za/


We are Diva Dhar (Program Director) and Urmy Shukla (Capacity Building Manager) at the CLEAR South Asia Regional Center, based at J-PAL South Asia at the Institute for Financial Management and Research.

We work to strengthen M&E skills in South Asia, focusing on India, Pakistan, Bangladesh, Sri Lanka, and Nepal. As part of our work, we conduct custom workshops and M&E technical advisory services for government, NGOs, donor organizations, and associations of professional evaluators. Recent examples include capacity building workshops with Population Council (Bangladesh), the Sri Lankan Evaluation Association, USAID/India, and the Indian Administrative Services.

In this post, we put together a few helpful lessons for designing customized M&E courses and technical advisory services.

Lessons learned: Know your audience! Many different organizations seek out M&E capacity building services, each with different needs, skills, and interests. Needs assessments are a useful tool to better understand your partners’ M&E background and goals. The results of these needs assessments will help you design customized courses and services that target your partners’ specific M&E needs and interests. Organizations also appreciate the effort in getting to know them better.

Needs assessments can be conducted in a variety of ways:

  • Online forms and surveys
  • Structured or semi-structured interviews
  • Diagnostic tools to review organizational systems and processes

Hot Tip: While online surveys are faster and easier, they often produce inaccurate results. Respondents tend to under- or over-estimate their M&E skills and knowledge, and we often do not get enough information on specific challenges in implementing good M&E practices. Interviews are time-consuming, but helpful in getting a better understanding of M&E practices and abilities.

Lessons Learned: Needs assessment interviews need to be framed correctly to be useful. Organizations should be informed that these exercises are for learning and training purposes, and are not meant to be an appraisal of their M&E capabilities. This also helps in getting buy-in for the needs assessment exercise, as well as ensuring that employees are available for interviews.

Cool Trick: When planning workshops, use the needs assessments to divide participants into more uniform groups for break-out sessions or facilitator-led group exercises. This can be done based on their M&E skills, interests, or focus areas. This ensures that each group can be taught at their level and with relevant examples. For example, participants working on health projects with a basic level of M&E understanding can be grouped together. Similarly, facilitators or trainers can also be assigned to groups based on their levels and interests.
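As a rough sketch of the grouping step (the participant data and field names here are hypothetical, not from an actual CLEAR needs assessment), a short script could bucket participants by focus area and skill level:

```python
from collections import defaultdict

def group_participants(participants):
    """Group workshop participants by (focus_area, skill_level) so each
    break-out group is roughly uniform in background and interests."""
    groups = defaultdict(list)
    for p in participants:
        groups[(p["focus_area"], p["skill_level"])].append(p["name"])
    return dict(groups)

# Hypothetical needs-assessment results
participants = [
    {"name": "Asha",  "focus_area": "health",    "skill_level": "basic"},
    {"name": "Bilal", "focus_area": "health",    "skill_level": "basic"},
    {"name": "Chen",  "focus_area": "education", "skill_level": "advanced"},
]

for group, names in group_participants(participants).items():
    print(group, names)
```

The same keys can be used to assign facilitators, so each group is taught at its level and with relevant examples.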

Rad Resource: Check out CLEAR South Asia’s Interactive Course Guide – a quick and easy-to-read manual on conducting effective and interactive training events.


Hello! My name is Rhonda Schlangen and I’m an evaluation consultant specializing in advocacy and development.

By sharing struggles and strategies, evaluators and human rights organizations can help break down the conceptual, capacity and cultural barriers to using monitoring and evaluation (M&E) to support human rights work. In this spirit, three human rights organizations candidly profiled their efforts in a set of case studies recently published by the Center for Evaluation Innovation.

Lessons learned:

  • Logic models may be from Mars: Evaluation can be perceived as at cross-purposes to human rights efforts. The moral imperative of human rights work means that “results” may be potentially unattainable. Planning for a specific result at a point in time risks driving work toward the achievable and countable. Learning-focused evaluation can be a useful entry point, emphasizing evaluative processes like critical reflections and one-day ‘good enough’ evaluations.
  • Rewrite perceptions of evaluation orthodoxy: There’s a sense in the human rights groups reviewed for this project that credible evaluation follows narrow and rigid conventions and must produce irrefutable proof of impact. Evaluators can help recalibrate perceptions by focusing on a broader suite of approaches appropriate to complex change scenarios (such as outcome mapping or harvesting).
  • Methods are secondary: The confidence and capacity of staff and managers in using tools and methods is as important as, if not more critical than, the tools and methods themselves. Investing in training and support is important. Prioritizing self-directed, low-resource internal learning as an integrated part of program work also helps cultivate a culture of evaluation. (See this presentation on organizational learning for an overview, and stay tuned for an upcoming paper from the Center for Evaluation Innovation on the topic.)

Rad Resources: Evidence of change journals: Excel workbooks populated with outcome categories, these journals are shared platforms where human rights and other campaigners can log signs of progress and change. The tool facilitates real time tracking and analysis of developments related to a human rights issue and advocacy efforts.
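As a minimal sketch of how such a journal might be kept and tallied outside of a spreadsheet (the outcome categories and entries below are invented for illustration), a shared log can be a simple list of dated records:

```python
from collections import Counter
from datetime import date

# Hypothetical evidence-of-change journal: each entry logs one observed
# sign of progress under a predefined outcome category.
journal = [
    {"date": date(2014, 3, 1), "category": "media coverage",
     "note": "Op-ed on detention conditions in national paper"},
    {"date": date(2014, 3, 9), "category": "policy signal",
     "note": "Minister commits to reviewing the detention bill"},
    {"date": date(2014, 4, 2), "category": "media coverage",
     "note": "TV segment cites campaign report"},
]

def tally_by_category(entries):
    """Count logged signs of change per outcome category, supporting
    real-time tracking of where progress is (and is not) emerging."""
    return Counter(e["category"] for e in entries)

print(tally_by_category(journal))
```

Because each entry is dated and categorized, campaigners can filter the log by period or issue when preparing an intense-period debrief.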

Intense period debriefs: Fitting into the slipstream of advocacy and campaigns, these are a systematic and simple way to review what worked, and what didn’t, after particularly intense or critical advocacy moments. The tool responds to the inclination of advocates to keep moving forward but creates space for collective reflection.

People-centered change models: A Dimensions of Change model, such as this one developed by the International Secretariat of Amnesty International, can serve as a shared lens for work that spans different types of human rights and different levels, from global to community.

Get involved: Evaluators can contribute to the discussion with the human rights defenders through online forums like the one facilitated by New Tactics in Human Rights.

Clipped from http://www.evaluationinnovation.org/

The American Evaluation Association is celebrating APC TIG Week with our colleagues in the Advocacy and Policy Change Topical Interest Group. The contributions all this week to aea365 come from our AP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


We are Viengsavanh Sanaphane and Katie Moore from Catholic Relief Services.  We are working on an Inclusive Education (IE) project in Southeast Asia.

Currently, IE efforts are expanding. Experienced partners that have worked on IE for years in the local context were planning to share qualitative data about their project with stakeholders new to the initiative; qualitative data is defined here as previous project stakeholders’ perceptions of the project’s successes and challenges, and of how they overcame those challenges during implementation.

Our question: how might experienced IE stakeholders share their project’s qualitative data with incoming stakeholders in a dynamic and engaging way? Rather than using PowerPoint to share data and traditional surveys to collect it, we conceived of having experienced stakeholders present their qualitative data through a play, with new stakeholders recording qualitative data in reflection journals.

The qualitative data for the data sharing event was presented by experienced stakeholders in the form of a play that they wrote and practiced for the explicit purpose of presenting data dynamically.  This was followed by a Q&A session and one-on-one interviews between former and current project implementers.  Stakeholders new to IE efforts were able to use their journals to record what they determined was relevant qualitative data from the play and discussions needed for implementing IE in their own contexts.  New stakeholders requested keeping their journals as an ongoing tool to refer to while strategically planning how they would implement an IE project.

While not yet trialed, the qualitative data that new project stakeholders recorded as relevant could be applied in their own projects, revisited by rereading the journals, and analyzed at the project’s end to see the impact of data-sharing events between two similar contexts implementing the same type of project. Further, stakeholders may refer back to their own work, which lends itself to an empowerment and utilization approach whereby stakeholders use locally produced data for local decision-making processes and other personal and professional needs.

Rad Resource: Use free 2013 templates from Microsoft to create booklets/journals that allow stakeholders to record qualitative data presented in a refreshing way that lends itself to an empowerment M&E approach.

Lesson Learned: For an empowerment approach, have stakeholders design the questions to be included in the journals so that they are the “owners” of their own processes and work. Facilitate a practice run of the booklet with a small group to identify aspects that need to be modified before using it with a larger group.

Hot Tip: With stakeholders’ informed consent, data collectors or project implementers may photocopy the journals or photograph them with smartphones, tablets, or cameras as a way to record the qualitative data for later analysis.



Greetings from Catholic Relief Services (CRS)! We, Suzanne Andrews and Shaun Ferris, from the Baltimore-based Agriculture and Livelihoods Program, presented at the American Evaluation Association’s Annual Conference in Washington DC on Farmbook, a suite of online/offline tools that helps us better build capacity, gather data, and develop business plans with smallholder farmers.

Photo by Suzanne Andrews (Catholic Relief Services)

Lesson Learned: A Challenge: One of the key problems we face in working with smallholder farmers is understanding who our clients are, where they live, their cropping systems, their costs of production and the market opportunities near their communities. There are very few tools to help field agents gather these types of monitoring and evaluation data in a systematic way and few means of aggregating and sharing this information.

A Product: CRS has been working to develop tools that help field agents develop farmer group business plans, gather data on production and profitability levels, and share this information with farmers, local project managers, and, through a digital data platform, globally.

Rad Resources: We manage, analyze, and share our data through cloud-based data management systems that allow global users from CRS and other organizations to view our data and create customized reports. We are also working with NetHope’s cloud services, creating webinars to share ideas, get feedback, and link with potential users. We have held several webinars about our e-learning platforms and the business planner/profitability tool. We also share the information through the ICT4D conferences that we hold every year in Africa.

Lessons Learned: Field agents who tested the Farmbook business planner and profitability calculator performed much better when they had first enrolled in the e-learning course in marketing and gross margin analysis. We have developed comprehensive training curricula for smallholder capacity building to support the farm business plan development and data gathering process.

Developing the Farmbook suite required a team of people with diverse expertise, ranging from agriculture advisors, software architects, programmers, instructional designers, subject matter specialists, editors, and artists, to innovative field managers and field agents to design, develop and test the beta versions of Farmbook.  Holding that team together in the build, test and deploy phases has been critical to getting to the starting point. We are still working on the business models!

Get Involved: If you would like to test drive our learning tools, the Farmbook business planner, or the Map and Track service delivery audit, let us know!  Contact Suzanne.Andrews@crs.org to request a training version of the software, allowing you to assess the profitability of your farm and your farmers!

The American Evaluation Association is celebrating Information and Communication Technology for Development (ICT4D) for Monitoring, Evaluation, Accountability and Learning (MEAL) week. The contributions all this week to aea365 come from members who work in ICT4D for MEAL. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Hi, my name is Marianna Hensley, Program Quality Manager for Health with Catholic Relief Services (CRS) in India. I currently support the Reducing Maternal and Newborn Deaths (ReMiND) project that CRS implements in partnership with Dimagi, Inc. and Vatsalya.

The ReMiND project works with government community health workers (CHW) to improve the frequency and quality of their home visits to women and children. CHWs use basic mobile phones operating Dimagi’s open-source CommCare software, which equips them with job aids to support client assessment, counseling, and early identification, treatment and/or rapid referral of complications. With the project’s use of CommCare as a case management tool and job aid for CHWs, leveraging information and communication technologies (ICT) for project monitoring and evaluation (M&E) with the same software platform was an obvious choice for ReMiND. All routine project monitoring is done through CommCare operated on basic mobile phones while data collection for the project’s baseline household survey was done using CommCare on tablets.

Lessons Learned: For all data nerds out there, imagine the excitement of realizing that ICT-enabled M&E means you get all those numbers now! Beware the lure of real-time data with ICT for M&E.

Photo by Marianna Hensley (Catholic Relief Services)

With the use of ICT for data collection in either routine monitoring or evaluation comes the strong temptation to ask every question you can think of—just because it’s so easy to capture responses with fewer worries about the delays or errors typically associated with manual data entry following paper-based collection. The risks are multiple: 1) you find yourself left with more data than you can or feasibly will analyze and use; and 2) you hazard user (data collector) and respondent fatigue from a questionnaire that delves too deeply into non-essential information.

Faced with the lure of real-time data from ICT, M&E practitioners must remember more than ever to focus on the need-to-know information that supports project or evaluation decision-making and objectives.

 

Hot Tips:

  • Make sure to choose an ICT device that fits your needs in terms of screen size and resolution. Long questions or long lists of select options are easier to deal with on a larger screen than on a smaller device that requires scrolling.
  • Don’t forget to assess the battery life of your device as part of field testing an ICT tool. And have a plan that includes resources such as solar or car chargers to ensure devices are adequately charged throughout data collection or monitoring.

Rad Resources: The ReMiND project’s monitoring tool application and baseline survey application are available for free download on CommCare Exchange.

ReMiND is featured as a case study and as an example of M&E in mobile health programming in the Global Health eLearning Center’s new mHealth Basics course.



Hi, my name is Shenkut Ayele, Early Warning Assessment and Response Manager with Catholic Relief Services (CRS) for the Joint Emergency Operation (JEOP) in Ethiopia. JEOP is a USAID-funded emergency food assistance program that is providing food aid to almost 1 million people over two years. The program operates across Ethiopia as a partnership among many agencies and the government.

Until last August, I faced serious challenges: data were slow to arrive and often of poor quality. As a result, reports were delayed and decision-making was hampered, with serious consequences for JEOP’s ability to respond effectively. Since August 2012, however, JEOP has been using an innovative solution that is strengthening our ‘Participatory Early Warning and Response System’. We are using DataWinners, an SMS-based solution implemented in partnership with Human Network International. Registered individuals across 79 districts collect and upload data via SMS each week to a web-based database. I am able to use these data in real time to inform decision makers. Here are two graphics showing how the system works and how data collection and information flow.

[Two graphics by Shenkut Ayele: how the system works, and the data collection and information flow]

Lessons Learned: After implementing our system for one year, we have learned that:

  • Vulnerable communities should be viewed as both sources and recipients of early warning information.
  • Adoption of our new SMS-based system has empowered local officials who are now using the reports to undertake better estimates of the number of individuals who might be affected by a disaster.
  • Local officials are better able to represent the needs of vulnerable communities in discussions at higher levels of government.
  • Local officials and others in JEOP have found that the better-quality data have improved their ability to target the most vulnerable communities.
  • The system has the potential to accommodate other innovative uses, and government officials have expressed their interest to adopt the SMS system more widely.

Hot Tip: An effective SMS-based system provides a strong basis for a participatory early warning and response system because it enhances the likelihood that any data generated will be used to support better decision-making among different users.
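To make the data flow concrete, here is a generic sketch of how a structured weekly SMS report might be parsed and aggregated on the server side. The message format and field codes below are hypothetical illustrations, not DataWinners’ actual protocol:

```python
def parse_sms_report(sms):
    """Parse a structured early-warning SMS of the (hypothetical) form
    'EW <district_code> <households_affected> <rainfall_mm>'."""
    parts = sms.strip().split()
    if len(parts) != 4 or parts[0] != "EW":
        raise ValueError(f"malformed report: {sms!r}")
    return {
        "district": parts[1],
        "households_affected": int(parts[2]),
        "rainfall_mm": int(parts[3]),
    }

# Hypothetical weekly reports from two district monitors
reports = ["EW D017 120 4", "EW D042 35 18"]
records = [parse_sms_report(r) for r in reports]

# Aggregate in real time for decision makers
total_affected = sum(r["households_affected"] for r in records)
print(total_affected)
```

Rejecting malformed messages at the parsing stage is one simple way an SMS system can protect the data quality that the lessons above depend on.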


 


Hi, my name is Or Dashevsky, Chief Solution Architect for Catholic Relief Services, based in Baltimore. I’m responsible for providing technical leadership to the team that develops CRS’s enterprise architecture.

Malaria is endemic in Sierra Leone, with stable and perennial transmission in all parts of the country. As such, the entire population is at risk of developing the disease. Malaria accounts for about 50% of outpatient morbidity and is presently the leading cause of morbidity and mortality among children under five years of age, with malaria-attributed mortality estimated at 38% in this age group and 25% across all ages (Outpatient morbidity statistics, MoHS, 2009; MIS 2010).

Catholic Relief Services (CRS) and the Ministry of Health and Sanitation (MoHS) of Sierra Leone are co-implementing a Global Fund project to fight AIDS, Tuberculosis, and Malaria (Global Fund Round 10). The overall goal of the Global Fund Round 10 Malaria project is to achieve the malaria-related Millennium Development Goals (MDGs) by 2015, not only nationally, but also among the poorest groups across Sierra Leone.

In order to track progress and impact, CRS led the implementation of a Malaria Indicator Survey (MIS) from 31 January to 8 March 2013, covering 6,720 households throughout the country. Despite the great surge in mobile technologies to accelerate data collection, all prior surveys in Sierra Leone had used paper-based systems. The 2013 MIS used Apple iPhone 3GS devices to collect data via the iFormBuilder platform, a Software-as-a-Service application allowing for timely data collection, monitoring, and analysis.

Lessons Learned:

  • Allow enough time to digitize paper questionnaires: It took approximately 9 weeks of intense programming and testing over a 10-month period to program the MIS questionnaire into iFormBuilder.
  • Allow enough time to pre-test: The tool was pre-tested in 100 households in both rural and urban areas three months prior to the start of MIS data collection.
  • Spend enough time on training enumerators prior to data collection: Data collection training for 28 teams lasted three weeks, which was necessary to ensure that all individuals collecting MIS data fully understood the questions, the functioning of the iPhones, and the sequencing and logic of the questionnaires.
  • Provide central technical support throughout data collection effort:  Throughout data collection, a CRS Freetown-based team was available 16 hours a day to respond to phone calls from the field teams, especially during the first 10 days of fieldwork. This allowed for real-time review of data and timely corrections.
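The “sequencing and logic” that enumerators had to master is what digital forms enforce automatically via skip logic. As a generic illustration (this structure and these questions are invented for the example, not iFormBuilder’s actual format), skip logic can be represented as a condition attached to each question:

```python
# Hypothetical skip-logic representation: each question may depend on a
# condition over an earlier answer; it is asked only when that holds.
questionnaire = [
    {"id": "net_owned", "text": "Does the household own a bed net?",
     "skip_unless": None},
    {"id": "net_used", "text": "Did anyone sleep under it last night?",
     "skip_unless": ("net_owned", "yes")},
]

def questions_to_ask(questionnaire, answers):
    """Return the question ids applicable given the answers so far,
    enforcing skip logic the way a digital form does."""
    ask = []
    for q in questionnaire:
        cond = q["skip_unless"]
        if cond is None or answers.get(cond[0]) == cond[1]:
            ask.append(q["id"])
    return ask

print(questions_to_ask(questionnaire, {"net_owned": "no"}))
print(questions_to_ask(questionnaire, {"net_owned": "yes"}))
```

Encoding the logic once in the form, rather than relying on each enumerator’s memory, is a large part of why digitizing and pre-testing the questionnaire took the time described above.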

Hot Tip: Digital data collection will improve the timeliness and accuracy of your data. It may look more expensive than a traditional paper-based system, but in reality the cost of digital data collection can be lower in the long run.
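The long-run cost argument is simple break-even arithmetic. The dollar figures below are hypothetical, purely to illustrate the calculation:

```python
def breakeven_surveys(paper_cost_per_survey, digital_cost_per_survey,
                      digital_fixed_cost):
    """Number of surveys after which digital collection becomes cheaper
    than paper, given a fixed up-front cost for devices and programming."""
    saving_per_survey = paper_cost_per_survey - digital_cost_per_survey
    if saving_per_survey <= 0:
        return None  # digital never breaks even on per-survey cost alone
    return digital_fixed_cost / saving_per_survey

# Hypothetical figures: $6 per paper survey (printing, transport, manual
# data entry, cleaning) vs $2 per digital survey, with $8,000 up front
# for devices and questionnaire programming.
n = breakeven_surveys(6.0, 2.0, 8000.0)
print(n)  # 2000.0
```

With these illustrative numbers, digital collection pays for itself after 2,000 surveys; a 6,720-household survey like the MIS would clear that threshold within a single round.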



Hi! My name is Mike Matarasso and I’m responsible for leading the design, testing and global roll out of a Monitoring and Evaluation / Information Communication Technology (M&E/ICT) platform for Catholic Relief Services (CRS). The platform will help us gather timely and high quality data to track performance across the agency, to inform change at project level and to report to donors and other stakeholders. The platform includes:

  1. Recommended mobile devices for simple and detailed data collection, with options for solar charging and offline collection/syncing,
  2. A standard form library and form-building interface, where projects can select and use existing forms, adapt existing forms, or create customized forms,
  3. A database with an interface for data management, cleansing, and advanced analysis,
  4. A Geographical Information System (GIS) interface for mapping service delivery,
  5. A real-time web reporting and dashboard interface with a standard library,
  6. A complete training curriculum for users and support staff,
  7. A help desk with tiered service support.
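As a rough sketch of item 2 above (the form names, field types, and helper below are hypothetical, not CRS’s actual schema), a standard form library lets projects reuse a master form or adapt a copy without altering the original:

```python
import copy

# Hypothetical standard form library: one master definition per form.
FORM_LIBRARY = {
    "wash_household_visit": {
        "title": "WASH household visit",
        "fields": [
            {"id": "hh_id", "type": "text"},
            {"id": "latrine_functional", "type": "yes_no"},
        ],
    },
}

def adapt_form(library, form_id, extra_fields):
    """Return a project-specific copy of a standard form with extra
    fields appended, leaving the library's master copy unchanged."""
    form = copy.deepcopy(library[form_id])
    form["fields"].extend(extra_fields)
    return form

custom = adapt_form(FORM_LIBRARY, "wash_household_visit",
                    [{"id": "water_source", "type": "choice"}])
print(len(custom["fields"]))                                # 3
print(len(FORM_LIBRARY["wash_household_visit"]["fields"]))  # 2
```

Deep-copying before adapting is the key design choice: it keeps the standard library stable while still letting every project customize, which is exactly the select/adapt/create pattern the platform describes.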

We’ve built and piloted the platform in one food security project in Ethiopia with outcomes for Water, Sanitation and Hygiene (WASH), food security, mother and child health, public works and microfinance. Following the initial design, validation was done with five additional projects across four countries.

Experiences from this pilot will be combined with a cost-benefit analysis and field assessment carried out with Accenture and a global CRS project level assessment of requirements to determine the architecture and next steps for scaling globally.

Lessons Learned: If you were to design a similar system, here are some suggestions:

  • A mandate and support from leadership are essential
  • Sufficient budget should be planned and in place
  • The right number of qualified staff should be available to work on building the system and for piloting in the field. Everyone should believe in the system and be excited about it!
  • Requirements should be documented and confirmed by all stakeholders before starting work
  • Testing and adaptation are imperative and should be done in one project until all the kinks are worked out. And did I say testing?
  • A training curriculum should only be developed after the initial system design is complete.  Otherwise the training materials will constantly change and be outdated as will the knowledge and skills of the trainees. Intensive mentoring is required.
  • A cost benefit analysis is integral to make a business case for the platform and to improve adoption
  • An Information Technology (IT) help desk and skilled support network need to be in place.
  • Focus on small, realistic releases and timelines and get something done initially to demonstrate success to others. Work in phases rather than expecting to deliver everything at once.


 
