
Paul Collier on practicing great data and evaluation habits – Part 2


Hello again! Paul Collier here. Yesterday, I shared several resources that helped me create repeatable data and evaluation processes at SFCAPC. Today I’d like to share with you six specific habits I followed to manage our “data” function, develop our Efforts to Outcomes database, and provide consistent value to our staff.

Lessons Learned:

  1. Keep a reporting calendar: Organizations are often required to submit detailed program participant and activity uploads for certain government contracts. I created a comprehensive calendar of when each upload was due and who needed to review the data before it was submitted.
  2. Define data integrity controls: Data integrity controls minimize the risk that information in a database is incorrect. For example, we scrubbed our database of test data monthly and audited a sample of new families each quarter to verify the accuracy of data entry. We summarized these controls in a spreadsheet outlining each procedure, its information source, performer, reviewer, and results (a sketch of two such controls appears after this list).
  3. Review dashboards: During the first week of each month, I sent a performance dashboard to each program manager. Managers discussed these metrics with their teams, then shared explanations for variances and any action items they planned to take at the managers’ meeting the following week (a sketch of a simple dashboard pull also follows this list).
  4. Schedule time for troubleshooting and report development: To build staff buy-in, I needed to be responsive to database troubleshooting and report development needs. I tracked time for these tasks and blocked out time for them weekly. In an average week, I spent 2-10 hours troubleshooting and training, and 5-15 hours developing self-service reports staff could use to access program data themselves.
  5. Automate annual development data pulls: A significant “data” responsibility was pulling data for our development team, including demographics and unduplicated client counts. Working with the development team, I built a self-service report designed to answer 80% of their “stats” requests, saving everyone time (an unduplicated-count sketch appears after this list).
  6. Have a data analysis process: Stakeholders across our organization came to me with many good questions to explore in our data that I just didn’t have time to answer. I created a master tracker of these questions and set aside several weeks each year to explore the most critical ones. This “Annual Data Analysis” process set expectations and focused our limited analysis time.
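
To make habit 2 concrete, here is a minimal sketch of what two of our data integrity controls could look like if automated in Python with pandas. The column names (family_id aside, first_name and entry_date) and the convention that test records contain the word “test” are my illustrative assumptions, not SFCAPC’s actual schema or rules.

```python
# Minimal sketch of two data integrity controls (habit 2), assuming a
# pandas DataFrame exported from the case management database.
# Column names and the "test record" convention are hypothetical.
import pandas as pd

def scrub_test_records(families: pd.DataFrame) -> pd.DataFrame:
    """Monthly control: drop obvious test records before reporting."""
    is_test = families["first_name"].str.contains("test", case=False, na=False)
    return families[~is_test]

def draw_audit_sample(families: pd.DataFrame, quarter_start: str, n: int = 20) -> pd.DataFrame:
    """Quarterly control: sample newly entered families for a manual
    accuracy check against source documents."""
    new = families[families["entry_date"] >= quarter_start]
    return new.sample(n=min(n, len(new)), random_state=0)
```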
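For habit 3, the monthly dashboard pull might reduce to something like the sketch below: per-program counts for one month. Again, the services DataFrame and its columns (program, client_id, service_date) are assumptions for illustration; a real dashboard would add targets and variance notes.

```python
# Minimal sketch of a monthly dashboard pull (habit 3). The input
# DataFrame and its columns are hypothetical.
import pandas as pd

def monthly_dashboard(services: pd.DataFrame, month: str) -> pd.DataFrame:
    """Per-program metrics for one month, e.g. month="2024-05"."""
    in_month = services[services["service_date"].dt.strftime("%Y-%m") == month]
    return in_month.groupby("program").agg(
        clients_served=("client_id", "nunique"),
        service_contacts=("client_id", "size"),
    )
```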
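Finally, for habit 5, “unduplicated” simply means each client counts once no matter how many services they received. A minimal sketch, again with hypothetical column names:

```python
# Minimal sketch of an unduplicated client count (habit 5). Columns
# client_id, program, and fiscal_year are hypothetical.
import pandas as pd

def unduplicated_counts(services: pd.DataFrame, fiscal_year: int) -> pd.Series:
    """Unduplicated clients per program, plus an agency-wide total."""
    year = services[services["fiscal_year"] == fiscal_year]
    by_program = year.groupby("program")["client_id"].nunique()
    by_program["Agency total (unduplicated)"] = year["client_id"].nunique()
    return by_program
```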

These habits helped us save time, set expectations, and create lasting systems. They’re far from complete, but they allow SFCAPC to manage our internal data & evaluation work.

What are the data & evaluation habits you’ve found valuable at your organization?


