Lauren Spigel, Korinne Chiu, and Emma Sunnassee on Using Automated Data for Program Monitoring

Hi, our names are Lauren Spigel, Korinne Chiu, and Emma Sunnassee. We are from VaxTrac, a global health nonprofit that builds, implements, and evaluates a mobile vaccine registry system for developing health systems. Our collective backgrounds are in evaluation, global health, and mobile health (mHealth). mHealth is an emerging field that takes advantage of high mobile phone penetration rates to increase access to health information and services in some of the most remote regions of the world.

Those who work in international development know that the remoteness of project sites can be a major hurdle for monitoring and evaluating programs, as regular site visits can be both time-intensive and costly. However, at VaxTrac, we’ve found that mHealth interventions have an important advantage over traditional interventions: every time a health worker interacts with our project’s mobile phone or tablet, we can see automated data on the back-end.

For example, when health workers use our tablet in the clinic, back-end data can tell us how long it takes a health worker to complete a new patient registration, search for a record, or even how many vaccine doses they recorded in a given timeframe. The automated system allows us to track key indicators without spending the resources to do in-person clinic observations.
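As a hypothetical illustration of this idea (the event names, log format, and field names below are our own invention for this post, not VaxTrac’s actual schema), computing such indicators from back-end event logs might look like:

```python
from datetime import datetime

# Hypothetical back-end event log: each entry records an action a health
# worker performed on the tablet, with a timestamp.
events = [
    {"action": "registration_start",    "patient": "p1", "time": "2015-03-02 09:00:10"},
    {"action": "registration_complete", "patient": "p1", "time": "2015-03-02 09:03:40"},
    {"action": "dose_recorded",         "patient": "p1", "time": "2015-03-02 09:05:00"},
    {"action": "dose_recorded",         "patient": "p2", "time": "2015-03-02 10:15:00"},
]

def parse(ts):
    return datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")

def registration_durations(events):
    """Seconds from registration start to completion, per patient."""
    starts, durations = {}, {}
    for e in events:
        if e["action"] == "registration_start":
            starts[e["patient"]] = parse(e["time"])
        elif e["action"] == "registration_complete" and e["patient"] in starts:
            delta = parse(e["time"]) - starts[e["patient"]]
            durations[e["patient"]] = delta.total_seconds()
    return durations

def doses_in_window(events, start, end):
    """Count vaccine doses recorded within a given timeframe."""
    return sum(1 for e in events
               if e["action"] == "dose_recorded" and start <= parse(e["time"]) < end)

durations = registration_durations(events)   # e.g. {"p1": 210.0}
doses = doses_in_window(events, datetime(2015, 3, 2), datetime(2015, 3, 3))  # e.g. 2
```

The same handful of lines replaces what would otherwise require an in-person clinic observation: timing registrations with a stopwatch and tallying doses by hand.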

Our blog post is going to home in on some rad resources, hot tips, lessons learned, and cool tricks based on our experiences using automated data for program monitoring.

Rad Resources: Thinking about starting a digital (mobile phone- or tablet-based) project? There are a number of open-source digital health tools that can help get you started. See K4Health’s comprehensive list of digital health platforms.

Hot Tip: Collecting real-time, automated data is a useful tool, but can’t always tell you the full story. We believe in the importance of cross-referencing back-end data with focus groups, interviews and other mixed methods to increase the validity and credibility of the results.

Lesson Learned: Take the time to get to know your automated data and how it can be exported. Oftentimes extraneous information is collected, so it will save you time and energy if you’re deliberate about the data you analyze. Ask yourself: “What is the purpose of collecting this information?” “How often would you like to see it?” “How would you like it to be represented?” “How will you use it?”
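To make this concrete with a hypothetical example (the column names below are invented, not an actual VaxTrac export), being deliberate can be as simple as naming the fields that answer a stated monitoring question and dropping the rest before analysis:

```python
import csv
import io

# A hypothetical raw export: the back-end dumps many columns, but only a
# few of them map to a stated monitoring purpose.
raw_export = io.StringIO(
    "patient_id,device_id,os_version,battery_pct,doses_recorded,search_time_s\n"
    "p1,d7,4.4,88,3,12\n"
    "p2,d7,4.4,86,1,9\n"
)

# Fields we decided in advance that we will actually use and report on.
keep = ["patient_id", "doses_recorded", "search_time_s"]

rows = [{k: r[k] for k in keep} for r in csv.DictReader(raw_export)]
```

Deciding on the `keep` list up front, before the data arrives, is the point of the exercise: the purpose drives the export, not the other way around.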

Cool Trick: Data dashboards can be a useful way to present and monitor key indicators that are important to your stakeholders. However, building stakeholder capacity to use the dashboard can make the difference between success and failure, use and non-use.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
