Hello! I am Scott Swagerty, PhD, President-Elect of the Arizona Evaluation Network and the Methodologist for the Office of the Arizona Auditor General. I work primarily in the Performance Audit Division, which conducts audits to assess the effectiveness and efficiency of Arizona state agencies and programs. I am responsible for organizing the Arizona Evaluation Network’s annual conference this year and wanted to share some perspective on why I chose the theme “Refocusing on the Fundamentals.”
In my experience, evaluation is as much about education as it is about providing assessments of programmatic results. What can we do as evaluators to broaden these educational efforts for our clients and improve our impact, regardless of the areas in which we work? For me, revisiting the essential processes of thinking about data collection, measurement, and outcomes—and focusing on ways of educating our clients in these areas—would be a great start.
- Make a pitch for good data collection practices—It can be hard to convince clients who are already short on resources and time to commit to developing and implementing the data collection processes and systems that make evaluation more impactful. Sell clients on the value of effective data collection by stressing how good data can show what is working and what is not, which investments are paying off in terms of outcomes, and where resources could be used more effectively. Bad data makes this kind of assessment impossible.
- Remind the client to focus on outcomes—A common problem in my evaluative work is that clients assess their performance in terms of program outputs: measuring progress by the number of free books given out, for example, rather than by a metric tied to the program’s actual goal of improving literacy. Thinking about outcomes is hard because outcomes can be time-distant, difficult to measure, or difficult to track. Educate your clients to help them focus on what matters and to develop program outcomes that let them demonstrate their programs’ impact.
- Many books have been written on data collection, but I love how this resource from the Right To Education Initiative distills its principles: Get the right data, get the data right, get the data right away, get the data the right way, and get the right data management—always.
- Having trouble getting your client to understand the difference between outputs and outcomes? Take this line from Deborah Mills-Scofield’s excellent article in the Harvard Business Review: “Outcomes are the difference made by the outputs” or “…without outcomes, there is no need for outputs.”
- Keep the conversation going about the importance of evaluation fundamentals at the Arizona Evaluation Network’s 2019 conference: Refocusing on the Fundamentals.
The American Evaluation Association is celebrating Arizona Evaluation Network (AZENet) Affiliate Week. The contributions all this week to aea365 come from our AZENet members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to firstname.lastname@example.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.