My name is Stella SiWan Zimmerman and I am the President of ACET, Inc., a research and evaluation consulting firm based in Minneapolis, MN. We specialize in the evaluation of education, public health, and community-based programs in order to improve organizational effectiveness and build evaluation capacity.
Over the years, I have produced and reviewed many evaluation reports. In reviewing such documents, I have noticed certain key pieces of information that are occasionally overlooked during the report writing process. This does not necessarily mean that the person is a poor evaluator or the results were of poor quality; it may simply be that the writer, excited about sharing the evaluation results, was trying to be succinct. Unfortunately, omitting these key details means that pertinent information gets lost along the way.
Hot Tip: Regardless of what you’re writing (e.g., PowerPoint presentation, Executive Summary, newsletter), don’t forget those small – but key – details. Never assume that the reader knows as much as you know. To help ensure that your report is complete and to avoid the most common reporting mistakes, make sure you address the following questions:
– Did you provide a description of the program? The program (intervention) should be described in each freestanding document (e.g., Executive Summary, Report of Findings). By describing the program, you give the reader context for interpreting the evaluation findings.
– Did you mention your target population? Reports should always clearly state the target population of the program (intervention) before listing the results. For example, a report could state that the program is “intended to serve 500 inner-city youth.”
– Did you include your sample size? Although your program serves 500 youth, you may have collected data on only 35 of those youth. Because the findings presented in a report are based on the experiences of those 35 youth, it’s essential to list that number so readers understand the reach of your study.
– Have you clarified which instruments you used to gather data? Make sure to clearly state your data source(s) (e.g., surveys, interviews, focus groups). Providing your data source(s) leaves no mystery as to how the results were gathered.
– Is there a clear link from findings to impact? Occasionally a report will list pages of findings, but the findings won’t link to any specific impact. It is essential that the report clearly states the intended outcomes so readers understand the link between what was found and the impact on participants.
The list above may seem like common sense, but these details are often overlooked in the quest to be concise.
The American Evaluation Association is celebrating Minnesota Evaluation Association (MNEA) Affiliate Week with our colleagues in the MNEA AEA Affiliate. The contributions all this week to aea365 come from our MNEA members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Melissa: I tried to post this yesterday but it doesn’t appear to have arrived . . . First, thanks for this posting! As the Chair of the Research Forum for the Joint Committee on Standards for Educational Evaluation (JCSEE), I want to encourage you and anyone else who is using the standards in their practice to ‘tell your stories’ and submit these to the Forum for review for possible publication on our website. The standards become a more powerful tool as we share the different ways in which they can be used. For more information on how to publish your work, see the JCSEE website http://www.jcsee.org/, or contact me directly, Lyn.Shulha@queensu.ca
Great post! It is precisely these basics that are often overlooked. I would add that it’s also critical to embed the actual instrument within the survey report, so that it isn’t lost over time. As someone who has referenced reports from studies my organization conducted years before I joined the staff, I have sometimes been unable to find the instrument used in a particular evaluation. This is frustrating, because it means that I can’t replicate a question or approach that proved successful in the past, and I can’t easily compare my results to those that my organization has already gathered.