My name is Kelly Robertson, and I work at The Evaluation Center at Western Michigan University and EvaluATE, the National Science Foundation–funded evaluation hub for Advanced Technological Education.
I’m a huge fan of quick reference guides. Quick reference guides are brief summaries of important content that can be used to improve practice in real time. They’re also commonly referred to as job aids or cheat sheets.
I found quick reference guides to be especially helpful when I was just learning about evaluation. For example, Thomas Guskey’s Five Critical Levels of Professional Development Evaluation helped me learn about different levels of outcomes (e.g., reaction, learning, organizational support, application of skills, and target population outcomes).
Even with 10-plus years of experience, I still turn to quick reference guides every now and then. Here are a few of my personal favorites:
- Robertson (yes, that’s me!) and Wingate’s Checklist for Program Evaluation Report Content: Even though I coauthored this checklist, I continue to use it when I write reports to make sure I haven’t missed any important content.
- Vagias’s Likert-Type Scale Response Anchors: I typically scan this guide when I’m creating survey questions to make sure the response scales I use are appropriate.
- Leeper’s Choosing the Correct Statistical Test in SAS, STATA, SPSS, and R: During my grad school days and even now, I turn to statistical test cheat sheets like this one to help me decide which tests to use or to help me understand why someone else chose the test they did.
My colleague Lyssa Becho is also a huge fan of quick reference guides, and together we compiled a list of over 50 evaluation-related quick reference guides. The list draws on the results from a survey we conducted as part of our work at EvaluATE. It includes quick reference guides that 45 survey respondents rated as most useful for each stage of the evaluation process.
Here are some popular quick reference guides from the list:
- Evaluation Planning: Patton’s Evaluation Flash Cards introduce core evaluation concepts such as evaluation questions, standards, and reporting in an easily accessible format.
- Evaluation Design: Wingate’s Evaluation Data Matrix Template helps evaluators organize information about evaluation indicators, data collection sources, analysis, and interpretation.
- Data Collection: Wingate and Schroeter’s Evaluation Questions Checklist for Program Evaluation provides criteria to help evaluators understand what constitutes high-quality evaluation questions.
- Data Analysis: Hutchinson’s You’re Invited to a Data Party! explains how to engage stakeholders in collective data analysis.
- Evaluation Reporting: Evergreen and Emery’s Data Visualization Checklist is a guide for the development of high-impact data visualizations. Topics covered include text, arrangement, color, and lines.
If you find that any helpful evaluation-related quick reference guides are missing from the full collection, please contact kelly.robertson@wmich.edu.
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Hi Kelly,
I was very excited to read your article, as I am currently taking a course on program evaluation design for my Professional Master of Education. In this course, we are learning about evaluation planning, design, data collection, analysis, and reporting through the lens of a beginner new to the field of evaluation. In the current module, we are focusing on data collection and data analysis, so I look forward to using these resources to help guide my design as I complete the module!
All of these resources are very helpful to me as I learn more about evaluation and the evaluation design process. I specifically wanted to comment on the “You’re Invited to a Data Party!” resource. Participatory data analysis is something I am interested in learning more about, and I love the idea of posing reflective questions to increase engagement with stakeholders as well as their understanding of the results. I believe this would help stakeholders make sense of the data, as it can be all too easy to become overwhelmed by the sheer amount of data or by trying to interpret it.
One of the educator hats I wear is that of a secondary math teacher (which probably contributes to my interest in participatory analysis mentioned above). I love the data visualization checklist resource, whose last slide is an anatomy chart providing clear examples of the terminology used in the guide. How to make effective data visuals — charts, graphs, and the like — is something I have to review with my students every year, and I am excited to reference this resource and share it with them!
Thank you for writing this informative article and sharing so many wonderful resources. It will be helpful to anyone in the evaluation field, whether a student like myself or an evaluation veteran! Have you considered including other mixed-media resources (such as YouTube videos, infographics, podcasts, etc.)? I would be interested in reading a future quick reference post with additional resources of various types! Thanks again!
This is a FANTASTIC post, Kelly. Thank you for all of the wonderful links to resources!
Sondra LoRe, Ph.D.
Manager | National Institute for STEM Evaluation and Research (NISER)
Adjunct Professor | Evaluation, Statistics, and Measurement Program, Department of Educational Psychology & Counseling
The University of Tennessee, Knoxville
Office of Research & Engagement
114 Philander P. Claxton Education Building
PH: 865-974-4962
slore@utk.edu
Great information. Thank you.