AEA365 | A Tip-a-Day by and for Evaluators


Hi, my name is Or Dashevsky, Chief Solution Architect for Catholic Relief Services, based in Baltimore. I’m responsible for providing technical leadership to the team that develops CRS’ enterprise architecture.

Malaria is endemic in Sierra Leone, with stable and perennial transmission in all parts of the country; the entire population is therefore at risk of developing the disease. Malaria accounts for about 50% of outpatient morbidity and is presently the leading cause of morbidity and mortality among children under five years of age, with malaria-attributed mortality estimated at 38% for this age group and 25% for all ages (Outpatient morbidity statistics, MoHS, 2009; MIS 2010).

Catholic Relief Services (CRS) and the Ministry of Health and Sanitation (MoHS) of Sierra Leone are co-implementing a Global Fund project to fight AIDS, Tuberculosis, and Malaria (Global Fund Round 10). The overall goal of the Global Fund Round 10 Malaria project is to achieve the malaria-related Millennium Development Goals (MDGs) by 2015, not only nationally, but also among the poorest groups across Sierra Leone.

In order to track progress and impact, CRS led the implementation of a Malaria Indicator Survey (MIS) from 31 January to 8 March 2013, covering 6,720 households throughout the country. Despite the great surge in mobile technologies to accelerate data collection, all prior surveys in Sierra Leone had used paper-based systems. The 2013 MIS used Apple iPhone 3GS handsets to collect data via iFormBuilder, a Software as a Service platform that allowed for timely data collection, monitoring, and analysis.

Lessons Learned:

  • Allow enough time to digitize paper questionnaires: It took approximately 9 weeks of intense programming and testing over a 10-month period to program the MIS questionnaire into iFormBuilder.
  • Allow enough time to pre-test: The tool was pre-tested in 100 households in both rural and urban areas three months prior to the start of MIS data collection.
  • Spend enough time training enumerators prior to data collection: Data collection training for 28 teams lasted three weeks, which was necessary to ensure that all individuals collecting MIS data fully understood the questions, the functioning of the iPhones, and the sequencing and logic of the questionnaires.
  • Provide central technical support throughout data collection effort:  Throughout data collection, a CRS Freetown-based team was available 16 hours a day to respond to phone calls from the field teams, especially during the first 10 days of fieldwork. This allowed for real-time review of data and timely corrections.

Hot Tip: Digital data collection improves the timeliness and accuracy of your data. It may look more expensive than traditional paper-based systems, but in reality, digital data collection can cost less in the long run.

The American Evaluation Association is celebrating Information and Communication Technology for Development (ICT4D) for Monitoring, Evaluation, Accountability and Learning (MEAL) week. The contributions all this week to aea365 come from members who work in ICT4D for MEAL. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· ·

Hi! My name is Mike Matarasso and I’m responsible for leading the design, testing, and global rollout of a Monitoring and Evaluation / Information Communication Technology (M&E/ICT) platform for Catholic Relief Services (CRS). The platform will help us gather timely, high-quality data to track performance across the agency, to inform change at the project level, and to report to donors and other stakeholders. The platform includes:

  1. Recommended mobile devices for simple and detailed data collection, with options for solar charging and offline collection/syncing;
  2. A standard form library and form-building interface where projects can select and use existing forms, adapt them, or create customized forms;
  3. A database with an interface for data management, cleansing, and advanced analysis;
  4. A Geographical Information System (GIS) interface for mapping service delivery;
  5. A real-time web reporting and dashboard interface with a standard library;
  6. A complete training curriculum for users and support staff; and
  7. A help desk with tiered service support.
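To make the second component more concrete, a standard form library entry might look like the sketch below. The form name, fields, and validation rules are purely hypothetical illustrations, not taken from the actual CRS platform.

```python
# Hypothetical sketch of a reusable form definition plus a validation pass,
# in the spirit of a standard form library. All names are illustrative.

FORM_LIBRARY = {
    "household_wash_survey": {
        "title": "Household WASH Survey",
        "fields": [
            {"name": "hh_id", "type": "text", "required": True},
            {"name": "water_source", "type": "choice",
             "options": ["piped", "well", "surface", "other"]},
            {"name": "members_under_5", "type": "int", "min": 0},
        ],
    },
}

def validate(form_name, record):
    """Check one collected record against its form definition; return a list of errors."""
    errors = []
    for field in FORM_LIBRARY[form_name]["fields"]:
        value = record.get(field["name"])
        if value in (None, ""):
            if field.get("required"):
                errors.append(f"{field['name']}: required")
            continue
        if field["type"] == "choice" and value not in field["options"]:
            errors.append(f"{field['name']}: not a valid option")
        if field["type"] == "int" and (not isinstance(value, int) or value < field.get("min", 0)):
            errors.append(f"{field['name']}: not a valid integer")
    return errors
```

A form-building interface would then be a front end for creating and adapting entries like this one, so projects reuse definitions instead of rebuilding them.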

We’ve built and piloted the platform in one food security project in Ethiopia with outcomes for Water, Sanitation and Hygiene (WASH), food security, mother and child health, public works and microfinance. Following the initial design, validation was done with five additional projects across four countries.

Experiences from this pilot will be combined with a cost-benefit analysis and field assessment carried out with Accenture and a global CRS project level assessment of requirements to determine the architecture and next steps for scaling globally.

Lessons Learned: If you were to design a similar system, here are some suggestions:

  • A mandate and support from leadership are essential
  • Sufficient budget should be planned and in place
  • The right number of qualified staff should be available to work on building the system and for piloting in the field. Everyone should believe in the system and be excited about it!
  • Requirements should be documented and confirmed by all stakeholders before starting work
  • Testing and adaptation are imperative; pilot in one project and work out all the kinks before scaling. And did I say testing?
  • A training curriculum should be developed only after the initial system design is complete. Otherwise the training materials will constantly change and become outdated, as will the knowledge and skills of the trainees. Intensive mentoring is required.
  • A cost-benefit analysis is integral to making a business case for the platform and to improving adoption.
  • An Information Technology (IT) help desk and skilled support network need to be in place.
  • Focus on small, realistic releases and timelines and get something done initially to demonstrate success to others. Work in phases rather than expecting to deliver everything at once.


 

· ·

Hi, my name is Guy Sharrock. I work in a small team of advisors at Catholic Relief Services (CRS) responsible for improving the quality of the agency’s Monitoring, Evaluation, Accountability and Learning (MEAL) activities. CRS is working to improve the lives of individuals in over 90 countries, and it views the rapid expansion of cellular technology as hugely transformative even for the most vulnerable communities: in 2013, mobile-cellular penetration in developing countries stood at 89 percent. For the last five years, a cross-departmental Community of Practice (CoP) has sought to understand and promote the use of Information and Communication Technology for Development (ICT4D), inter alia, to improve the efficiency and effectiveness of the agency’s MEAL operations.

The theme of this week is ICT4D for better MEAL. Our journey to date has been both exciting and challenging: over the last few years, the agency’s use of ICT4D has become widespread with over 100 support requests logged in the last year. We have benefited greatly from the work undertaken by many other agencies that have embarked on a similar course; in return, we would like here to share some of our learning!

Hot Tips:

  • ICT4D supports better communication – data captured electronically is available in real-time, leading to prompt corrections and more accurate data, enabling timely feedback to stakeholders, and improved decision making.
  • ICT4D generates greater impact – technology provides greater access to information to help empower local communities, and provides a medium for voices to be heard.
  • ICT4D enables programs to reach scale more quickly – permitting greater coverage and reduced intervention costs per person.
  • ICT4D improves cost efficiencies – the technology pays for itself by eliminating expenses associated with conventional data collection and reporting approaches, and by allowing re-usability of its components.

Lessons Learned:

  • Understand context – how the solution will be used; its impact on the day-to-day practices of different stakeholders; and environmental (e.g., connectivity and power), political (e.g., government regulations), and cultural (e.g., language, gender, and education levels) factors.
  • Identify solutions that do not require heavy investment in infrastructure; software services that are easily configured and maintained; solutions that are appropriate to users’ data needs and that work in occasionally connected environments with intermittent access to power and work across a range of user devices.
  • Articulate the true cost of the solution including procurement, deployment, maintenance and support over the life of the solution, not merely over the project cycle.
  • Compare the costs of working with and without the support of an ICT4D solution.
  • Implement a change management program to facilitate the behavior changes required to enjoy the benefits of ICT4D for enhanced MEAL.
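The cost comparison suggested in the lessons above can be sketched as a simple break-even calculation. The figures below are purely hypothetical placeholders, not CRS numbers; the point is the shape of the comparison: digital typically carries a higher one-time setup cost but a lower cost per survey round.

```python
# Hypothetical break-even sketch comparing paper-based and digital data
# collection across survey rounds. All dollar figures are illustrative.

def cost_paper(rounds, cost_per_round=5_000):
    """Paper: roughly constant cost per round (printing, transport, data entry)."""
    return rounds * cost_per_round

def cost_digital(rounds, setup=12_000, cost_per_round=1_500):
    """Digital: one-time setup (devices, form programming), low marginal cost."""
    return setup + rounds * cost_per_round

# First survey round at which digital becomes the cheaper option.
break_even = next(r for r in range(1, 100) if cost_digital(r) < cost_paper(r))
print(f"Digital is cheaper from round {break_even} onward")
```

With these placeholder figures digital overtakes paper at the fourth round; articulating the true costs over the life of the solution, as advised above, is what makes this comparison honest.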

Rad Resources: Check out the learning taking place around CRS’s annual conferences at http://www.crsprogramquality.org/ICT4D/.


 

·

My name is Teri Garstka and I am currently a Research Associate in the Institute for Educational Research and Public Service at the University of Kansas. We have a wide portfolio of research and evaluation projects in early childhood, child welfare and youth programs, and family services.

We work with lots of state, local, and community-based agencies. This means we also must work with the many types of Management Information Systems (MIS) used in those agencies to help us evaluate their programs and the families they serve. As we all know, data systems never “talk” to one another and true cross-systems data collection, analysis, and reporting can be quite challenging.

We needed a wonder tool that would:

  • Integrate Data!
    • Have the ability to bring together existing data from disparate agency management information systems into one handy place
    • Link client-level data across systems and data collection methodologies
    • Build the data system to include everything we need and nothing we don’t
  • Secure Data!
    • Keep data secure during collection, without leaving it stored on a mobile device or PC
    • Reside on a HIPAA Compliant server to transfer Protected Health Information (PHI) data from agencies
  • Collect Data!
    • Need the ability to conduct field interviews in natural environments, such as in the home or community, to build rapport and get rid of paper-and-pencil surveys
    • Need to interface with emerging mobile technologies such as iPads and PC Tablets
    • Need web-access surveys and multiple response formats
  • Report Data!
    • Need to export any of the data in multiple formats for analysis and reporting
    • Calendar multiple data collection timepoints
    • De-identify information as needed
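Two of the requirements above, linking client-level data across systems and de-identifying it, are often handled together by replacing raw client IDs with salted hashes. The sketch below shows the idea; the records, field names, and salt are all hypothetical.

```python
# Sketch of de-identified cross-system linkage via salted hashing.
# All records, field names, and the salt value are hypothetical examples.
import hashlib

def pseudonymize(client_id, salt):
    """Replace a raw client ID with a salted SHA-256 digest so records from
    different systems can still be linked without exposing the identifier."""
    return hashlib.sha256((salt + client_id).encode("utf-8")).hexdigest()

def deidentify(record, salt):
    out = dict(record)  # leave the source record untouched
    out["client_id"] = pseudonymize(out["client_id"], salt)
    return out

# Hypothetical records from two separate agency systems sharing a client ID.
SALT = "project-specific-secret"  # store outside source control in practice
system_a = [{"client_id": "C-1001", "service": "home_visit"}]
system_b = [{"client_id": "C-1001", "outcome": "enrolled"}]

# Link de-identified records across both systems by their pseudonymous key.
linked = {}
for rec in [deidentify(r, SALT) for r in system_a + system_b]:
    linked.setdefault(rec["client_id"], {}).update(rec)
```

Real PHI workflows require much more than this (key management, HIPAA review, data use agreements), but the linkage mechanics are the same.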

Rad Resource: We found a great tool to help us integrate existing data from an agency’s information system with real-time data collection from families in the field. Created by Vanderbilt University with NIH funding, Research Electronic Data Capture (REDCap) is a web-based application that non-computer programmers can learn and use freely if their institution is a consortium member.  After a few tutorials, you can learn to build a REDCap database for your project and customize it to fit the unique data system needs of almost any project. We like REDCap because it is so versatile and easy to use – it’s perfect for social scientists and evaluators looking for a more robust data collection system without having to hire a computer programmer for every project.
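Beyond the point-and-click interface, REDCap also exposes an API that accepts simple form-encoded POST requests, so routine exports can be scripted without hiring a programmer. The sketch below builds a record-export request using only the standard library; the host URL and token are placeholders for your institution’s values.

```python
# Sketch of a REDCap record export over the API. The URL and token are
# placeholders; your REDCap administrator issues a token per project.
import json
from urllib import parse, request

API_URL = "https://redcap.example.edu/api/"  # placeholder: your REDCap host
API_TOKEN = "YOUR_PROJECT_API_TOKEN"         # placeholder: per-project token

def build_export_payload(fields=None):
    """Assemble the form-encoded parameters REDCap expects for a record export."""
    payload = {
        "token": API_TOKEN,
        "content": "record",  # export records (vs. metadata, files, etc.)
        "format": "json",
        "type": "flat",       # one row per record/event
    }
    if fields:
        payload["fields"] = ",".join(fields)
    return payload

def export_records(fields=None):
    """POST the export request and return the decoded records (makes a network call)."""
    data = parse.urlencode(build_export_payload(fields)).encode("utf-8")
    with request.urlopen(API_URL, data=data) as resp:
        return json.load(resp)
```

From there the JSON can be loaded into whatever analysis tool the project already uses, which fits the “export any of the data in multiple formats” requirement above.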

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org. Want to learn more from Teri? Teri and her colleagues will be presenting as part of the Evaluation 2011 Conference Program, November 2-5 in Anaheim, California.

· ·

Greetings from beautiful Boise! We are Rakesh Mohan and Bryon Welch from the Idaho legislature’s Office of Performance Evaluations.

Last February, the Idaho legislature asked us to evaluate the state’s new system for processing Medicaid claims. Legislators had received many constituent complaints that claims from Medicaid providers were being denied, delayed, or inaccurately processed. Legislators were beginning to question whether the new system would ever perform as intended.

The $106 million system went live in July 2010 and immediately began experiencing problems. At the time of our review, over 23,000 providers were enrolled in the system, which was processing about 150,000 claims each week.

Lessons Learned: Our review found that problems with processing provider claims were the result of unclear contract requirements, a lack of system readiness, and most importantly, the absence of adequate end user participation. Less than one percent of total providers were selected for a pilot test, but neither the state administrators nor the contractor knew how many claims were actually pilot tested. Further, only about 50 percent of the providers were enrolled when the system went live.

Hot Tip: If you are ever asked to evaluate the implementation of a large IT system that is experiencing problems, make sure you examine the end user involvement in the system’s design and implementation. Too often end user feedback is underappreciated, not used, or completely ignored.

Lessons Not Learned: Nearly ten years ago, Idaho attempted to implement a similar IT system to track student information for K-12 public schools. After spending about $24 million, the project was terminated due to undelivered promises and a lack of buy in from end users. Unfortunately, lessons identified in our evaluation of the failed student information systems were apparently not learned by people responsible for this new Medicaid claims processing system.

Hot Tip: Because the success of an IT system depends on end user buy-in, ask the following questions when evaluating the implementation of large IT systems:

  1. Are end users clearly identified?
  2. Are end user needs identified and incorporated into system objectives?
  3. Do vendors clearly specify how their solutions/products will address system objectives and end user needs?
  4. Is there a clear method for two-way communication between system managers and end users with technical expertise?
  5. Is there a clear method for regularly updating end users on changes and progress?

The American Evaluation Association is celebrating GOV TIG Week with our colleagues in the Government Evaluation AEA Topical Interest Group. The contributions all this week to aea365 come from our GOV TIG members and you can learn more about their work via the Government TIG sessions at AEA’s annual conference. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· ·

Hello! We are Xin Wang, Neeley Current, and Gary Westergren. We work at the Information Experience Laboratory (IE Lab) of the School of Information Science & Learning Technologies at the University of Missouri. The IE Lab is a usability laboratory that conducts research and evaluates technology. What is usability? According to Jakob Nielsen’s definition, usability assesses how easy user interfaces are to use. With the advancement of Web technology over the past eight years, our lab has successfully applied a dozen usability methods to the evaluation of educational and commercial Web applications. The methods we use most frequently include heuristic evaluation, think-aloud interviews, focus-group interviews, task analysis, and Web analytics. Selecting appropriate usability methods is vital and should be based on a project’s development life cycle; otherwise, the evaluation results will not be truly useful or informative for the Web development team. In this post, we focus on some fundamental concepts of one of the most commonly adopted usability evaluation methods: the think-aloud protocol.

Hot Tip: Use think-aloud interviewing! Think-aloud interviewing engages participants in activities and asks them to verbalize their thoughts as they perform the tasks. This method is usually applied during the middle or final stage of Website or system design.

Hot Tips: The following procedure is ideal:

  1. Recruit real or representative users in order to comply with the User-Centric Design principles
  2. Select tasks based on frequency of use, criticality, new features, user complaints, etc.
  3. Schedule users for a specific time and location
  4. Have users operate a computer accompanied by the interviewer
  5. Ask users to give a running commentary (e.g., what they are clicking on, what kind of difficulty they encounter to complete the task)
  6. Have interviewer probe the user about the task s/he is asked to perform.

Pros:

  1. When users verbalize their thoughts, evaluators can identify many important design issues that cause user difficulties, such as poor navigation design, ambiguous terminology, and unfriendly visual presentation.
  2. Evaluators obtain users’ concurrent thoughts rather than just retrospective ones, which helps avoid situations where users cannot recall their experiences.
  3. The think-aloud protocol allows evaluators a glimpse into the affective side (e.g., excitement, frustration, disappointment) of users’ information-seeking process.

Cons:

  1. Some users may not be used to verbalizing their thoughts while performing a task.
  2. If the information is non-verbal or complicated to express, the protocol may be interrupted.
  3. Some users may not be able to verbalize their thoughts fully, often because verbalization cannot keep pace with their cognitive processes; this makes it difficult for evaluators to understand what users really meant.

· ·
