My name is Amy A. Germuth, President of EvalWorks, LLC (http://EvalWorks.com) and owner/blogger at EvalThoughts.com. Over the last year I've worked on improving my evaluation reports to better meet my clients' needs, and I have a few great resources to help you do the same.
Rad Resource: “Unlearning Some of our Social Scientist Habits” by Jane Davidson (independent consultant and evaluator extraordinaire, as well as AEA member and TIG leader). http://davidsonconsulting.co.nz/index_files/pubs.htm She recently added some additional thoughts to this work and presented them at AEA’s 2009 annual conference in Orlando. Her PowerPoint slides for this presentation can be found at: http://bit.ly/7RcDso.
Frankly, I think this great article has been overlooked for its valuable contributions. Among other great advice for evaluators (including cautions about using models or theories without applying them evaluatively and about leaping to measurement too quickly), she addresses these common pitfalls in reporting evaluation findings: (1) not answering (and in some cases not even identifying!) the evaluation questions that guided the methodology, (2) reporting results separately by data type or source, and (3) ordering evaluation report sections like a Master's thesis. This entertaining article and the additional PowerPoint slides really make a case for letting the questions that guide the evaluation guide the report as well.
Rad Resource: The “Evaluation Report Checklist” by Gary Miron (professor at Western Michigan University and former Chief of Staff at The Evaluation Center at WMU) provides a great outline of the eight main sections in an evaluation report (Title Page, Executive Summary, Table of Contents, Introduction and Background, Methodology, Results, Summary and Conclusion, References) and the various things that should be included in each. http://www.wmich.edu/evalctr/checklists/checklistmenu.htm
The author notes that this checklist can be used as a “tool to guide a discussion between evaluators and their clients regarding the preferred contents of evaluation reports and a tool to provide formative feedback to report writers,” and that it can help writers identify the strengths and weaknesses of their reports. However, as Gary notes, evaluation reports differ greatly in purpose, budget, expectations, and client needs; thus, when reviewing one's own writing (or someone else's), one may need to weight the checkpoints within sections, as well as the relative importance and value of each section.
Using the Evaluation Report Checklist in conjunction with some of Dr. Davidson's suggestions has increased the quality and utility of my evaluation reports, and it should do the same for yours.