AEA365 | A Tip-a-Day by and for Evaluators

CAT | Research, Technology and Development Evaluation

We are Shannon L. Griswold, Ph.D., a scientific research evaluator and member of AEA’s Research, Technology and Development TIG; Alexandra Medina-Borja, Ph.D., Associate Professor of Industrial Engineering at the University of Puerto Rico-Mayaguez; and Kostas Triantis, Ph.D., Professor of Systems Engineering at Virginia Tech. We are thinking about new ways to envision and evaluate impacts from discovery-based scientific research. Tracing dollars spent on funding research in universities to societal impacts is very difficult due to the long time lag between experimentation and commercialization and the serendipitous nature of discovery.

Lesson Learned: Even though we can’t predict every outcome of scientific research, we can apply a general framework that allows us to envision the complex system of scientific discovery and identify areas of inquiry that could lead to major breakthroughs.

Hot Tip: Gather your research community and ask them to think backwards from societal needs (e.g., in transportation research this might be a solution for traffic congestion). This can be HARD for fundamental researchers; they are accustomed to letting curiosity drive their research questions. From societal needs, ask them to map several enabling technologies that could meet that need. Enabling technologies should be things that could solve that need but that don’t exist yet (e.g., teleportation). Finally, from enabling technologies, ask your research community to map out knowledge gaps. These are the things that we don’t know yet, which prevent us from developing enabling technologies (e.g., how do you convert all the mass in a human body into energy without blowing things up? How do you reassemble that energy at the destination into a human body?). It can be helpful to frame knowledge gaps as questions.

Hot Tip: Use societal needs, enabling technologies, and knowledge gaps to perform a content analysis of your research portfolio. How many of the topics are already funded? How many topics are not yet represented in the portfolio? This analysis should be performed in the context of a portfolio framework, which may help you envision the scope of your funding program’s discipline and relation to other funding streams.
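A first pass at this content analysis can be as simple as keyword matching over funded project abstracts to see which mapped topics are already represented. The topics and abstracts below are invented for illustration; a real analysis would use the terms your research community generated and likely richer text matching:

```python
# Illustrative sketch: count how many funded projects mention each mapped topic.
# Topic lists and abstracts are hypothetical, not real portfolio data.

def topic_coverage(topics, abstracts):
    """Return {topic: number of abstracts mentioning it} via simple substring match."""
    return {
        topic: sum(topic.lower() in abstract.lower() for abstract in abstracts)
        for topic in topics
    }

knowledge_gaps = ["energy conversion", "signal reconstruction", "matter transport"]
portfolio = [
    "Efficient energy conversion in biological systems",
    "Algorithms for signal reconstruction from sparse data",
    "Energy conversion at the nanoscale",
]

coverage = topic_coverage(knowledge_gaps, portfolio)
funded = [t for t, n in coverage.items() if n > 0]
gaps_unfunded = [t for t, n in coverage.items() if n == 0]
# coverage -> {'energy conversion': 2, 'signal reconstruction': 1, 'matter transport': 0}
# gaps_unfunded -> ['matter transport']: a topic not yet represented in the portfolio
```

The unfunded topics are exactly the candidates to consider against your portfolio framework and adjacent funding streams.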

Rad Resource: When mapping societal needs, enabling technologies, and knowledge gaps, it can be helpful to place them in a hierarchical framework to track their relationships. In this diagram, dotted lines show the direction in which the logic framework is generated, working backwards from societal needs. Solid arrows show the flow of scientific knowledge, from discoveries (knowledge gaps) to technologies that meet societal needs.

[Diagram: generic logic tree linking knowledge gaps, enabling technologies, and societal needs]

Rad Resource: The flow of knowledge and information in the scientific process is rarely linear. It is probably more accurately represented as a “ripple effect”. We can predict some discoveries and technologies (darker polygons), but others are emergent, and knowledge flows in all directions.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Greetings from Catholic Relief Services (CRS)! We, Suzanne Andrews and Shaun Ferris, from the Baltimore-based Agriculture and Livelihoods Program, presented at the American Evaluation Association’s Annual Conference in Washington DC on Farmbook, a suite of online/offline tools that helps us better build capacity, gather data, and develop business plans with smallholder farmers.

Andrews

Photo by Suzanne Andrews (Catholic Relief Services)

Lesson Learned: A Challenge: One of the key problems we face in working with smallholder farmers is understanding who our clients are, where they live, their cropping systems, their costs of production and the market opportunities near their communities. There are very few tools to help field agents gather these types of monitoring and evaluation data in a systematic way and few means of aggregating and sharing this information.

A Product: CRS has been working to develop tools that help field agents develop farmer group business plans and gather data on production and profitability levels, sharing this information with farmers, with local project managers, and globally through a digital data platform.

Rad Resources: We manage, analyze, and share our data through cloud-based data management systems that allow global users from CRS and other organizations to view our data and create customized reports. We are also working with NetHope’s cloud services, creating webinars to share ideas, get feedback, and link with potential users. We have held several webinars about our e-learning platforms and the business planner/profitability tool. We also share the information through the ICT4D conferences that we hold every year in Africa.

Lessons Learned: Field agents who tested the Farmbook business planner and profitability calculator performed much better when they had first enrolled in the e-learning course in marketing and gross margin analysis. We have developed comprehensive training curricula for smallholder capacity building to support the farm business plan development and data gathering process.

Developing the Farmbook suite required a team of people with diverse expertise: agriculture advisors, software architects, programmers, instructional designers, subject matter specialists, editors, artists, and innovative field managers and field agents who designed, developed, and tested the beta versions of Farmbook. Holding that team together through the build, test, and deploy phases has been critical to reaching this starting point. We are still working on the business models!

Get Involved: If you would like to test drive our learning tools, the Farmbook business planner, or the Map and Track service delivery audit, let us know!  Contact Suzanne.Andrews@crs.org to request a training version of the software, allowing you to assess the profitability of your farm and your farmers!

The American Evaluation Association is celebrating Information and Communication Technology for Development (ICT4D) for Monitoring, Evaluation, Accountability and Learning (MEAL) week. The contributions all this week to aea365 come from members who work in ICT4D for MEAL. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Hi, my name is Marianna Hensley, Program Quality Manager for Health with Catholic Relief Services (CRS) in India. I currently support the Reducing Maternal and Newborn Deaths (ReMiND) project that CRS implements in partnership with Dimagi, Inc. and Vatsalya.

The ReMiND project works with government community health workers (CHW) to improve the frequency and quality of their home visits to women and children. CHWs use basic mobile phones operating Dimagi’s open-source CommCare software, which equips them with job aids to support client assessment, counseling, and early identification, treatment and/or rapid referral of complications. With the project’s use of CommCare as a case management tool and job aid for CHWs, leveraging information and communication technologies (ICT) for project monitoring and evaluation (M&E) with the same software platform was an obvious choice for ReMiND. All routine project monitoring is done through CommCare operated on basic mobile phones while data collection for the project’s baseline household survey was done using CommCare on tablets.

Lessons Learned: For all data nerds out there, imagine the excitement of realizing that ICT-enabled M&E means you get all those numbers now! Beware the lure of real-time data with ICT for M&E.

Hensley

Photo by Marianna Hensley (Catholic Relief Services)

With the use of ICT for data collection in either routine monitoring or evaluation comes the strong temptation to ask every question you can think of—just because it’s so easy to capture responses with fewer worries about the delays or errors typically associated with manual data entry following paper-based collection. The risks are multiple: 1) you find yourself left with more data than you can or feasibly will analyze and use; and 2) you hazard user (data collector) and respondent fatigue from a questionnaire that delves too deeply into non-essential information.

Faced with the lure of real-time data from ICT, M&E practitioners must remember more than ever to focus on the need-to-know information that supports project or evaluation decision-making and objectives.

 

Hot Tips:

  • Make sure to choose an ICT device that fits your needs in terms of screen size and resolution. Long questions or long lists of select options are easier to deal with on a larger screen than on a smaller-screened device that requires scrolling.
  • Don’t forget to assess the battery life of your device as part of field testing an ICT tool. And have a plan that includes resources such as solar or car chargers to ensure devices are adequately charged throughout data collection or monitoring.

Rad Resources: The ReMiND project’s monitoring tool application and baseline survey application are available for free download on CommCare Exchange.

ReMiND is featured as a case study and an example of M&E in mobile health programming in the Global Health e-Learning Center’s new mHealth Basics course.



Hi, my name is Shenkut Ayele, Early Warning Assessment and Response Manager with Catholic Relief Services (CRS) for the Joint Emergency Operation (JEOP) in Ethiopia. JEOP is a USAID-funded emergency food assistance program that over two years is providing food aid to almost 1 million people. The program operates across Ethiopia and is a partnership between many agencies and government.

Until last August, I faced serious challenges: data were slow to arrive and often of poor quality. As a result, reports were delayed and decision-making was hampered, with serious consequences for JEOP’s ability to respond effectively. However, since August 2012, JEOP has been using an innovative solution that is strengthening our ‘Participatory Early Warning and Response System’. We are using DataWinners, an SMS-based solution implemented in partnership with Human Network International. Registered individuals across 79 districts collect and upload data via SMS each week onto a web-based database. I am able to use these data in real time to inform decision makers. Here are two graphics showing how the system works and how data and information flow.

Ayele 1

Ayele
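To give a flavor of the data-collection step: weekly SMS reports in systems like this are short structured messages that the platform parses into database records before anyone can chart them. DataWinners defines its own questionnaire formats, so the message layout and field names below are purely hypothetical:

```python
import re

# Hypothetical message format: "<district code> W<week> KEY=value ..."
# e.g. "DIST042 W34 HH=120 ALERTS=2" -- NOT the actual DataWinners format.
MSG = re.compile(r"^(?P<district>\w+) W(?P<week>\d+)(?P<pairs>( \w+=\d+)+)$")

def parse_report(sms):
    """Turn one incoming SMS report into a structured record, or None if malformed."""
    m = MSG.match(sms.strip())
    if m is None:
        return None  # malformed report: queue for manual follow-up with the sender
    record = {"district": m.group("district"), "week": int(m.group("week"))}
    for pair in m.group("pairs").split():
        key, value = pair.split("=")
        record[key.lower()] = int(value)
    return record

parse_report("DIST042 W34 HH=120 ALERTS=2")
# -> {'district': 'DIST042', 'week': 34, 'hh': 120, 'alerts': 2}
```

Rejecting malformed messages explicitly, rather than guessing at their contents, is part of what keeps the weekly data quality high enough for real-time decision-making.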

Lessons Learned: After implementing our system for one year, we have learned that:

  • Vulnerable communities should be viewed as both sources and recipients of early warning information.
  • Adoption of our new SMS-based system has empowered local officials who are now using the reports to undertake better estimates of the number of individuals who might be affected by a disaster.
  • Local officials are better able to represent the needs of vulnerable communities in discussions at higher levels of government.
  • Local officials and others in JEOP have found that the better-quality data have improved their ability to target the most vulnerable communities.
  • The system has the potential to accommodate other innovative uses, and government officials have expressed interest in adopting the SMS system more widely.

Hot Tip: An effective SMS-based system provides a strong basis for a participatory early warning and response system because it enhances the likelihood that any data generated will be used to support better decision-making among different users.


 


Hi, my name is Or Dashevsky, Chief Solution Architect for Catholic Relief Services based in Baltimore. I’m responsible for providing technical leadership to a team who develop CRS’ enterprise architecture.

Malaria is endemic in Sierra Leone, with stable and perennial transmission in all parts of the country. As such, the entire population is at risk of developing the disease. Malaria accounts for about 50% of outpatient morbidity and is presently the leading cause of morbidity and mortality among children under five years of age, with mortality attributed to malaria estimated at 38% among this age group and 25% for all ages (Outpatient morbidity statistics, MoHS, 2009; MIS 2010).

Dashevsky

Catholic Relief Services (CRS) and the Ministry of Health and Sanitation (MoHS) of Sierra Leone are co-implementing a Global Fund project to fight AIDS, Tuberculosis, and Malaria (Global Fund Round 10). The overall goal of the Global Fund Round 10 Malaria project is to achieve the malaria-related Millennium Development Goals (MDGs) by 2015, not only nationally, but also among the poorest groups across Sierra Leone.

In order to track progress and impact, CRS led the implementation of a Malaria Indicator Survey (MIS) from 31 January to 8 March 2013, covering 6,720 households throughout the country. Despite the surge in mobile technologies for accelerating data collection, all prior surveys in Sierra Leone had used paper-based systems. The 2013 MIS used Apple 3GS iPhones to collect data via the iFormBuilder platform, a Software-as-a-Service application that allows for timely data collection, monitoring, and analysis.

Lessons Learned:

  • Allow enough time to digitize paper questionnaires: It took approximately 9 weeks of intense programming and testing over a 10-month period to program the MIS questionnaire into iFormBuilder.
  • Allow enough time to pre-test: The tool was pre-tested in 100 households in both rural and urban areas three months prior to the start of MIS data collection.
  • Spend enough time training enumerators prior to data collection: Data collection training for the 28 teams lasted three weeks, which was necessary to ensure that all individuals collecting MIS data fully understood the questions, the functioning of the iPhones, and the sequencing and logic of the questionnaires.
  • Provide central technical support throughout the data collection effort: Throughout data collection, a CRS Freetown-based team was available 16 hours a day to respond to phone calls from the field teams, especially during the first 10 days of fieldwork. This allowed for real-time review of data and timely corrections.

Hot Tip: Digital data collection will improve the timeliness and accuracy of your data. It may look more expensive than traditional paper-based systems, but in reality the cost of digital data collection can be lower in the long run.



Hi! My name is Mike Matarasso and I’m responsible for leading the design, testing and global roll out of a Monitoring and Evaluation / Information Communication Technology (M&E/ICT) platform for Catholic Relief Services (CRS). The platform will help us gather timely and high quality data to track performance across the agency, to inform change at project level and to report to donors and other stakeholders. The platform includes:

  1. Recommended mobile devices for simple and detailed data collection, with options for solar charging and offline collection/syncing,
  2. A standard form library and form-building interface where projects can select and use existing forms, adapt existing forms, or create customized forms,
  3. A database with an interface for data management, cleansing, and advanced analysis,
  4. A Geographical Information System (GIS) interface for mapping service delivery,
  5. A real-time web reporting and dashboard interface with a standard library,
  6. A complete training curriculum for users and support staff, and
  7. A help desk with tiered service support.

We’ve built and piloted the platform in one food security project in Ethiopia with outcomes for Water, Sanitation and Hygiene (WASH), food security, mother and child health, public works and microfinance. Following the initial design, validation was done with five additional projects across four countries.

Experiences from this pilot will be combined with a cost-benefit analysis and field assessment carried out with Accenture and a global CRS project level assessment of requirements to determine the architecture and next steps for scaling globally.

Lessons Learned: If you were to design a similar system, here are some suggestions:

  • A mandate and support from leadership are essential
  • Sufficient budget should be planned and in place
  • The right number of qualified staff should be available to work on building the system and for piloting in the field. Everyone should believe in the system and be excited about it!
  • Requirements should be documented and confirmed by all stakeholders before starting work
  • Testing and adaptation are imperative and should be done in one project until all the kinks are worked out. And did I say testing?
  • A training curriculum should only be developed after the initial system design is complete.  Otherwise the training materials will constantly change and be outdated as will the knowledge and skills of the trainees. Intensive mentoring is required.
  • A cost benefit analysis is integral to make a business case for the platform and to improve adoption
  • An Information Technology (IT) help desk and skilled support network need to be in place.
  • Focus on small, realistic releases and timelines and get something done initially to demonstrate success to others. Work in phases rather than expecting to deliver everything at once.


 


Hi, my name is Guy Sharrock. I work in a small team of advisors working for Catholic Relief Services (CRS) responsible for improving the quality of the agency’s Monitoring, Evaluation, Accountability and Learning (MEAL) activities. CRS is working to improve the lives of individuals in over 90 countries. CRS views the rapid expansion of cellular technology as hugely transformative for even the most vulnerable communities: in 2013, the rate of mobile-cellular penetration in developing countries stands at 89 per cent. For the last five years a cross-departmental Community of Practice (CoP) has sought to understand and promote the use of Information and Communication Technology for Development (ICT4D), inter alia, to improve the efficiency and effectiveness of the agency’s MEAL operations.

The theme of this week is ICT4D for better MEAL. Our journey to date has been both exciting and challenging: over the last few years, the agency’s use of ICT4D has become widespread with over 100 support requests logged in the last year. We have benefited greatly from the work undertaken by many other agencies that have embarked on a similar course; in return, we would like here to share some of our learning!

Hot Tips:

  • ICT4D supports better communication – data captured electronically is available in real-time, leading to prompt corrections and more accurate data, enabling timely feedback to stakeholders, and improved decision making.
  • ICT4D generates greater impact – technology provides greater access to information to help empower local communities, and provides a medium for voices to be heard.
  • ICT4D enables programs to reach scale more quickly – permitting greater coverage and reduced intervention costs per person.
  • ICT4D improves cost efficiencies – the technology pays for itself by eliminating expenses associated with conventional data collection and reporting approaches, and by allowing re-usability of its components.

Lessons Learned:

  • Understand context – how the solution will be used; the impact on day-to-day practices of different stakeholders; and environmental (e.g., connectivity and power), political (e.g., government regulations), and cultural (e.g., language, gender, and education levels) factors.
  • Identify solutions that do not require heavy investment in infrastructure; software services that are easily configured and maintained; solutions that are appropriate to users’ data needs and that work in occasionally connected environments with intermittent access to power and work across a range of user devices.
  • Articulate the true cost of the solution including procurement, deployment, maintenance and support over the life of the solution, not merely over the project cycle.
  • Compare the costs of working with and without the support of an ICT4D solution.
  • Implement a change management program to facilitate the behavior changes required to enjoy the benefits of ICT4D for enhanced MEAL.

Rad Resources: Check out the learning taking place around CRS’s annual conferences.

Clipped from http://www.crsprogramquality.org/ICT4D/


 


My name is Teri Garstka and I am currently a Research Associate in the Institute for Educational Research and Public Service at the University of Kansas. We have a wide portfolio of research and evaluation projects in early childhood, child welfare and youth programs, and family services.

We work with lots of state, local, and community-based agencies. This means we also must work with the many types of Management Information Systems (MIS) used in those agencies to help us evaluate their programs and the families they serve. As we all know, data systems never “talk” to one another and true cross-systems data collection, analysis, and reporting can be quite challenging.

We needed a wonder tool that would:

  • Integrate Data!
    • Have the ability to bring together existing data from disparate agency management information systems into one handy place
    • Link client-level data across systems and data collection methodologies
    • Build the data system to include everything we need and nothing we don’t
  • Secure Data!
    • Keep data secure when collected and not stored on a mobile device or PC
    • Reside on a HIPAA Compliant server to transfer Protected Health Information (PHI) data from agencies
  • Collect Data!
    • Need the ability to conduct field interviews in natural environments such as the home or community, to build rapport and get rid of paper-and-pencil surveys
    • Need to interface with emerging mobile technologies such as iPads and PC Tablets
    • Need web-access surveys and multiple response formats
  • Report Data!
    • Need to export any of the data in multiple formats for analysis and reporting
    • Calendar multiple data collection timepoints
    • De-identify information as needed
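One common pattern behind the “link across systems” and “de-identify” requirements above (a general technique, not specific to REDCap or any particular MIS) is to replace the raw client identifier with a keyed hash before extracts leave each system, so records can still be joined without exposing Protected Health Information. A minimal sketch, with an invented secret key:

```python
import hashlib
import hmac

# The key would be generated and managed securely by the agency;
# this value is purely illustrative.
SECRET_KEY = b"agency-managed-secret"

def pseudonymize(client_id):
    """Deterministic keyed hash: the same client yields the same token everywhere."""
    return hmac.new(SECRET_KEY, client_id.encode(), hashlib.sha256).hexdigest()[:16]

# Two extracts from different systems referring to the same (hypothetical) client.
mis_record = {"client_id": "C-1001", "service": "home visit"}
survey_record = {"client_id": "C-1001", "score": 42}

# Replace the raw ID with the token in both extracts, then join on the token.
token_a = pseudonymize(mis_record.pop("client_id"))
token_b = pseudonymize(survey_record.pop("client_id"))
assert token_a == token_b  # records still link, without the raw identifier
```

Because the hash is keyed, someone holding only the extracts cannot reverse the tokens back to client IDs, while the evaluation team can still match clients across systems.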

Rad Resource: We found a great tool to help us integrate existing data from an agency’s information system with real-time data collection from families in the field. Created by Vanderbilt University with NIH funding, Research Electronic Data Capture (REDCap) is a web-based application that non-computer programmers can learn and use freely if their institution is a consortium member.  After a few tutorials, you can learn to build a REDCap database for your project and customize it to fit the unique data system needs of almost any project. We like REDCap because it is so versatile and easy to use – it’s perfect for social scientists and evaluators looking for a more robust data collection system without having to hire a computer programmer for every project.

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org. Want to learn more from Teri? Teri and her colleagues will be presenting as part of the Evaluation 2011 Conference Program, November 2-5 in Anaheim, California.


Greetings from beautiful Boise! We are Rakesh Mohan and Bryon Welch from the Idaho legislature’s Office of Performance Evaluations.

Last February, the Idaho legislature asked us to evaluate the state’s new system for processing Medicaid claims. Legislators had received many constituent complaints that claims from Medicaid providers were being denied, delayed, or inaccurately processed. Legislators were beginning to question whether the new system would ever perform as intended.

The $106 million system went live in July 2010 and immediately began experiencing problems. At the time of our review, over 23,000 providers were enrolled in the system, which was processing about 150,000 claims each week.

Lessons Learned: Our review found that problems with processing provider claims were the result of unclear contract requirements, a lack of system readiness, and most importantly, the absence of adequate end user participation. Less than one percent of total providers were selected for a pilot test, but neither the state administrators nor the contractor knew how many claims were actually pilot tested. Further, only about 50 percent of the providers were enrolled when the system went live.

Hot Tip: If you are ever asked to evaluate the implementation of a large IT system that is experiencing problems, make sure you examine the end user involvement in the system’s design and implementation. Too often end user feedback is underappreciated, not used, or completely ignored.

Lessons Not Learned: Nearly ten years ago, Idaho attempted to implement a similar IT system to track student information for K-12 public schools. After spending about $24 million, the project was terminated due to undelivered promises and a lack of buy in from end users. Unfortunately, lessons identified in our evaluation of the failed student information systems were apparently not learned by people responsible for this new Medicaid claims processing system.

Hot Tip: Because the success of an IT system depends on end user buy-in, ask the following questions when evaluating the implementation of large IT systems:

  1. Are end users clearly identified?
  2. Are end user needs identified and incorporated into system objectives?
  3. Do vendors clearly specify how their solutions/products will address system objectives and end user needs?
  4. Is there a clear method for two-way communication between system managers and end users with technical expertise?
  5. Is there a clear method for regularly updating end users on changes and progress?

The American Evaluation Association is celebrating GOV TIG Week with our colleagues in the Government Evaluation AEA Topical Interest Group. The contributions all this week to aea365 come from our GOV TIG members and you can learn more about their work via the Government TIG sessions at AEA’s annual conference. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello! We are Xin Wang, Neeley Current, and Gary Westergren. We work at the Information Experience Laboratory (IE Lab) of the School of Information Science & Learning Technologies at the University of Missouri. The IE Lab is a usability laboratory that conducts research and evaluates technology. What is usability? According to Jakob Nielsen’s definition, usability assesses how easy user interfaces are to use. With the advancement of Web technology, over the past eight years our lab has successfully applied a dozen usability methods to the evaluation of educational and commercial Web applications. The methods we use most frequently include heuristic evaluation, think-aloud interviews, focus-group interviews, task analysis, and Web analytics. Selecting appropriate usability methods is vital and should be based on the development life cycle of a project; otherwise, the evaluation results will not be truly useful and informative for the Web development team. In this post, we focus on some fundamental concepts of one of the most commonly adopted usability evaluation methods: the think-aloud protocol.

Hot Tip: Use think-aloud interviewing! Think-aloud interviewing engages participants in activities and asks them to verbalize their thoughts as they perform the tasks. This method is usually applied during the middle or final stage of website or system design.

Hot Tips: Ideally, employ the following procedures:

  1. Recruit real or representative users in order to comply with the User-Centric Design principles
  2. Select tasks based on frequency of use, criticality, new features, user complaints, etc.
  3. Schedule users for a specific time and location
  4. Have users operate a computer accompanied by the interviewer
  5. Ask users to give a running commentary (e.g., what they are clicking on, what kind of difficulty they encounter to complete the task)
  6. Have interviewer probe the user about the task s/he is asked to perform.

Pros:

  1. When users verbalize their thoughts, evaluators may identify many important design issues that caused user difficulties, such as poor navigation design, ambiguous terminology, and unfriendly visual presentation.
  2. Evaluators can obtain users’ concurrent thoughts rather than just retrospective ones, avoiding situations where users cannot recall their experiences.
  3. The think-aloud protocol allows evaluators a glimpse into the affective nature (e.g., excitement, frustration, disappointment) of the users’ information-seeking process.

Cons:

  1. Some users may not be used to verbalizing their thoughts when they perform a task.
  2. If the information is non-verbal and complicated to express, the protocol may be interrupted.
  3. Some users may not be able to verbalize their entire thoughts, often because verbalization cannot keep pace with their cognitive processes, making it difficult for evaluators to understand what the users really meant.

