AZENet Week: Educating for Empowerment & Impact by Scott Swagerty


Hello! I am Scott Swagerty, PhD, President-Elect of the Arizona Evaluation Network and Methodologist for the Office of the Arizona Auditor General. I work primarily in the Performance Audit Division, which conducts audits to assess the effectiveness and efficiency of Arizona state agencies and programs. I am responsible for organizing the Arizona Evaluation Network’s annual conference this year and wanted to share some perspective on why I chose the theme “Refocusing on the Fundamentals.”

In my experience, evaluation is as much about education as it is about providing assessments of programmatic results. What can we do as evaluators to broaden these educational efforts for our clients and improve our impact, regardless of the areas in which we work? For me, revisiting the essential processes of thinking about data collection, measurement, and outcomes—and focusing on ways of educating our clients in these areas—would be a great start.

Hot Tips:

  • Make a pitch for good data collection practices—It can be hard to convince clients who are already short on resources and time to commit to developing and implementing the good data collection processes and systems that make evaluation more impactful. Sell clients on the value of effective data collection by stressing how good data can be used to determine what is working and what is not, which investments are paying off in terms of outcomes, and where resources might be used more effectively. Bad data makes this kind of assessment impossible.
  • Remind the client to focus on outcomes—A common problem in the evaluative work I do is that clients assess their performance in terms of program outputs: for example, measuring progress by how many free books were given out rather than by a metric tied to the program’s actual goal of improving literacy. Thinking about outcomes is hard because outcomes are sometimes time-distant, difficult to measure, or difficult to track. Educate your client to help them focus on what matters and develop program outcomes that will allow them to demonstrate the impact of their programs.

Rad Resources:

  • Many books have been written about it, but I love how distilled the principles of data collection are in this resource from the Right To Education Initiative: Get the right data, get the data right, get the data right away, get the data the right way, and get the right data management—always.
  • Having trouble getting your client to understand the difference between outputs and outcomes? Take this line from Deborah Mills-Scofield’s excellent article in the Harvard Business Review: “Outcomes are the difference made by the outputs” or “…without outcomes, there is no need for outputs.”
  • Keep the conversation going about the importance of evaluation fundamentals at the Arizona Evaluation Network’s 2019 conference: Refocusing on the Fundamentals.

The American Evaluation Association is celebrating Arizona Evaluation Network (AZENet) Affiliate Week. The contributions all this week to aea365 come from our AZENet members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

3 thoughts on “AZENet Week: Educating for Empowerment & Impact by Scott Swagerty”

  1. Hi Dr. Swagerty,

    I am a current Professional Master of Education student from Queen’s University, Canada. I am currently enrolled in a Program Inquiry and Evaluation course and appreciate the suggestions and resources you have provided.

    Your ‘Hot Tips’ really hit home with the topics I’ve been learning about and the Program Evaluation Design (PED) I am creating. Focusing on the quality of the data over the quantity is the theme of your first tip. By investing resources (time or money) in valuable data collection processes, the client will be able to determine the most valuable outcomes of their program. Your second tip is to focus clients on the outcomes rather than the program outputs. After reading this and reviewing my PED, I realized it was much easier to focus on those outputs and how to measure them than on the outcomes. However, as you stated, it is important to invest the energy and focus on what really matters, which is the impact of the program.
    I hope to get the right data, right away, in the right way!

    Thanks for sharing your expertise.

    Taylor Richmond

  2. Sherice Gayden

    I really love the idea where you explain how program evaluation is not just looking at what’s wrong and right, but also educating clients so they too can see the difference. The lasting results are more profound with constructive feedback that helps clients grow from the evaluation.

  3. Hello Scott. My name is Crystal and I am a student at Texas A&M University in Killeen, Texas. Fundamentals are always important for any aspect of learning and creating. As you know, part of being an evaluator is creativity and thinking outside the box. For an evaluator, it is imperative to always stay abreast of the fundamentals; so many different details rely on an evaluator’s sense of resourcefulness. Your hot tip on reminding the client to focus on outcomes is a great idea. I like to use the term “playing the long game.” It’s important to reiterate to your client that the long run is hard to see, but the overall effect on the program is the whole point. Thank you for sharing.
