
Introducing the Checklist of Program Evaluation Report Content by Kelly Robertson and Lori Wingate

We are Kelly Robertson and Lori Wingate, and we work at The Evaluation Center at Western Michigan University and EvaluATE, the National Science Foundation-funded evaluation resource center for Advanced Technological Education (ATE).

Rad Resource:

We’re excited to announce our new rad resource, the “Checklist of Program Evaluation Report Content.” We created this checklist to address a need for practical guidance about what should go in a traditional evaluation report—the most common means of communicating evaluation results. The checklist is strictly focused on the content of long-form technical evaluation reports (hence, the name). We see the checklist as complementary to the exciting work being done by others to promote the use of evaluation through alternative ways of organizing, formatting, and presenting data in evaluation reports. If you want guidance on how to make your great content look good, check out the new Evaluation Report Guidance by the Ewing Marion Kauffman Foundation and Evergreen Data.

How is our checklist on reporting different from others you may have come across?

  • It not only lists the key elements of an evaluation report, but also defines each element and explains why it belongs in a report.
  • Its focus is not on judging the quality of a report. Rather, our checklist is intended to support practitioners in making informed decisions about what to include in an evaluation report.
  • It’s not tailored to a specific type of program or evaluand and is presented as a flexible guide rather than a set of rigid specifications.

We hope multiple audiences find the checklist useful. For example, new evaluators may use it to guide them through the report writing process. More experienced evaluators may reference it to verify they did not overlook important content. Evaluators and their clients could use it to frame conversations about what should be included in a report.

Lesson Learned:

It takes a village to raise a great checklist. We received feedback from five evaluation experts, 13 of our peers at Western Michigan University, and 23 practitioners (all experts in their own right!). Their review and field testing were invaluable, and we are so grateful to everyone who provided input—and they’re all credited in the checklist.

Like checklists? See the WMU Evaluation Center’s Evaluation Checklists Project for more.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
