Hello! I am Scott Swagerty, PhD, President-Elect for the Arizona Evaluation Network and the Methodologist for the Office of the Arizona Auditor General. I work primarily in the Performance Audit Division, which conducts audits to assess the effectiveness and efficiency of Arizona state agencies and programs. I am responsible for organizing the Arizona Evaluation Network’s annual conference this year and wanted to share some perspective on why I chose the theme of “Refocusing on the Fundamentals.”
In my experience, evaluation is as much about education as it is about providing assessments of programmatic results. What can we do as evaluators to broaden these educational efforts for our clients and improve our impact, regardless of the areas in which we work? For me, revisiting the essential processes of thinking about data collection, measurement, and outcomes—and focusing on ways of educating our clients in these areas—would be a great start.
Hot Tips:
- Make a pitch for good data collection practices—It can be hard to convince clients who are already short of resources and time to commit to developing and implementing good data collection processes and systems that will make evaluation more impactful. Sell clients on the value of effective data collection by stressing how good data can be used to determine what is working and what is not, what investments are paying off in terms of outcomes, and where resources may be more effectively utilized. Bad data makes this kind of assessment impossible.
- Remind the client to focus on outcomes—A common problem in the evaluative work I do is that clients assess their performance in terms of program outputs: for example, measuring progress by how many free books were given out rather than by a metric focused on the program’s actual goal of improving literacy. Thinking about outcomes is hard because outcomes are sometimes time-distant, difficult to measure, or difficult to track. Educate your client to help them focus on what matters and develop program outcomes that will allow them to demonstrate the impact of their programs.
Rad Resources:
- Many books have been written on data collection, but I love how this resource from the Right To Education Initiative distills its principles: Get the right data, get the data right, get the data right away, get the data the right way, and get the right data management—always.
- Having trouble getting your client to understand the difference between outputs and outcomes? Take this line from Deborah Mills-Scofield’s excellent article in the Harvard Business Review: “Outcomes are the difference made by the outputs” or “…without outcomes, there is no need for outputs.”
- Keep the conversation going about the importance of evaluation fundamentals at the Arizona Evaluation Network’s 2019 conference: Refocusing on the Fundamentals.
The American Evaluation Association is celebrating Arizona Evaluation Network (AZENet) Affiliate Week. The contributions all this week to aea365 come from our AZENet members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Hello Scott,
I really enjoyed reading your perspective on program evaluation. I’m relatively new to the field, and I’m currently taking a course in a graduate-level education program.
What stuck out for me was your desire to rethink what good evaluation means. In short, it’s less about the numbers and more about the long-term vision or outcomes. What is the company’s/program’s ethos? What is our “why,” so to speak? Are we working toward that, effectively and efficiently? Programs that are truly successful AND healthy (meaning ones that care about the wellbeing of all stakeholders) are programs that do not lose sight of those overarching intentions. We do that by, as you say, educating people to uphold the values and intentions of programs/agencies. As an educator myself, I appreciate what you do. Keep up the thoughtful work!
Thank you for your time,
David
Dear Dr. Swagerty,
My name is Urooj. I am a student in the Professional Master of Education program at Queen’s University, and I am replying to your article. Your article pointed me to websites that gave me a deeper understanding of evaluation. Your points are very interesting. I like what you said about revisiting the process, as I am also a reflective learner. I agree with you that evaluation is rooted in education. I like the idea of clients getting informed about the data; it seems like it can be a collective process between people. Clients would have a better understanding of how data is efficiently collected, and they need to be able to distinguish correct data from inaccurate data. Data can truly have a profound impact on programming. I believe clients’ input can also be important. The tips you offer are useful, and working with clients shows you care about your work. Thank you for broadening my understanding of how the evaluation and assessment process works.
– Urooj Ahmed
Hi Dr. Swagerty,
My name is Dema and I’m currently a student in the Professional Master of Education program at Queen’s University. I’m taking a course on Program Inquiry and Evaluation.
I’m really interested in the concept you bring up of “Refocusing on the Fundamentals” in evaluation. It seems necessary to periodically reflect on the intention behind an evaluation and the core values it should implement.
I like your point on the importance of convincing clients of the value of strong, reliable data collection. This appears to be a process of explaining to clients the impact reliable data can have on influencing change or gaining valuable knowledge. This makes me think of the Program Evaluation Design (PED) I’ve created and the importance of collecting and analyzing reliable, quality data throughout the process. It also makes me think of how this would look in the application stage of my evaluation, where I would need to ensure clients are on board with collecting reliable data that aligns with my evaluation theory.
You also point out the importance of focusing on outcomes rather than outputs, even if it does take more time to see progress. I think this is a really important point in reminding clients that redirecting the focus back to the initial intention is how you actually influence the meaningful change you intended.
Thanks for sharing your expertise!
– Dema
Hi Dr. Swagerty,
I am a current Professional Master of Education student from Queen’s University, Canada. I am currently enrolled in a Program Inquiry and Evaluation course and appreciate the suggestions and resources you have provided.
Your ‘Hot Tips’ really hit home with the topics I’ve been learning about and the Program Evaluation Design (PED) I am creating. Focusing on the quality of the data over the quantity is the theme of your first tip: by investing resources (time or money) in valuable data collection processes, the client will be able to determine the most valuable outcomes of their program. Your second tip is to focus clients on outcomes rather than program outputs. After reading this and reviewing my PED, I realize it was much easier to focus on the outputs and how to measure them than on the outcomes. However, as you stated, it is important to invest the energy and focus on what really matters, which is the impact of the program.
I hope to get the right data, right away, in the right way!
Thanks for sharing your expertise.
Taylor Richmond
Hello,
I really love how you explain that program evaluation is not just about looking at what’s wrong and right, but also about educating clients so they too can see the difference. The lasting results are more profound with constructive feedback that helps clients grow from the evaluation.
Hello Scott. My name is Crystal and I am a student at Texas A&M University in Killeen, Texas. Fundamentals are always important for any aspect of learning and creating. As you know, part of being an evaluator is creativity and thinking outside the box. For an evaluator, it is imperative to always stay abreast of the fundamentals. There are so many different details that rely on an evaluator’s sense of resourcefulness. Your hot tip on reminding the client to focus on outcomes is a great idea. I like to use the term “playing the long game.” It’s important to reiterate to your client that the long run is hard to see, but the overall effect on the program is the whole point. Thank you for sharing.