Hello, AEA365 community! Liz DiLuzio here, Lead Curator of the blog. This week is Individuals Week, which means we take a break from our themed weeks and spotlight the Hot Tips, Cool Tricks, Rad Resources and Lessons Learned from any evaluator interested in sharing. Would you like to contribute to future Individuals Weeks? Email me at AEA365@eval.org with an idea or a draft and we will make it happen.
Janelle, JoAnna, Mary, and Christiana here, evaluators from Emory Centers for Public Health Training and Technical Assistance. We had the opportunity to present a session entitled No More Crappy Surveys at last year's AEA Summer Evaluation Institute. We are on a mission to rid the world of crappy surveys and are here to share some of our Hot Tips and Rad Resources to help you do the same.
If you haven't already, check out the first and second blog posts in this series, No More Crappy Surveys – Best Practices in Survey Design for Evaluations (you can check it out here) and No More Crappy Survey Analysis – Best Practices in Survey Analysis for Evaluations (which you can read here). Today, we'll follow up with tips on reporting your survey findings to different audiences and engaging partners throughout the survey process.
Hot Tips
- Engage relevant partners at each step of the survey process. Ensuring that your partners are satisfied with an evaluation report begins at the start of your evaluation process. Engaging partners during the design of evaluation questions will help you develop and deliver a utility-focused evaluation, guiding you to answer key questions that are truly meaningful to your partners. Ways to keep partners engaged at every step include sending them your survey draft or pilot test, discussing preliminary results at a partner meeting, and confirming reporting expectations.
- Tailor your report format to your audience. Before beginning the report (whether it is a presentation, infographic, or written report), it is imperative to understand your audience. If you are developing the report for partners or clients, how do they plan to use it? Will it be an internal resource for leadership? Or a tool for communicating with the public? Understanding the plan for dissemination will help you decide what type of report to use and the level of detail your audience needs.
- Break up blocks of text with alternative content styles. Generally, aim to make your report as user-friendly and easy to read as possible. Some tips are to break up big blocks of text with bullets, a list of key points, illustrative quotes, tables, visualizations, or figures to highlight findings. Be mindful of the formatting and styling you use. Select section headers that provide visual contrast, and ensure you leave enough white space on the page. To mitigate information overload and keep readers intrigued, leverage different content styles while keeping your messaging direct and digestible throughout.
- Make your data visualizations work for you. Data visualization is a powerful way to communicate findings, but it can be challenging to do well. To keep findings clear, avoid over-complicating the data with complex visualizations when a simple one will suffice, and use colors and labels to highlight key points and takeaways (see the sketch after this list for one minimal example).
- Keep your message clear and concise. Use plain language and avoid jargon and acronyms where possible. Aim for a reading level at or below that of your report's intended audience, using short sentences to communicate your ideas. Keep critical information in the main body, and include supporting details in appendices.
- Use reports for action. After you have made every effort to finalize a high-quality report, you may be asking yourself, "Now what?" If possible, present findings and recommendations to key partners, leaving time for partner discussion and next steps. When disseminating findings in any modality, take advantage of opportunities to encourage decision-making and action planning. Action doesn't have to mean programmatic change; it could mean deciding that interviews or focus groups are needed to understand the complex dynamics you picked up on in the survey results.
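For readers who build their charts in code, here is a minimal sketch of the "simple visualization with color and labels" idea from the data visualization tip above. It uses Python with matplotlib, and the survey numbers, category names, and sample size are entirely hypothetical, included only to illustrate the technique of highlighting one key finding and labeling values directly.

```python
# A minimal sketch, assuming matplotlib and hypothetical survey data:
# highlight the key finding with one accent color and label bars directly.
import matplotlib.pyplot as plt

# Hypothetical survey results: percent of respondents selecting each barrier
categories = ["Cost", "Transportation", "Scheduling", "Awareness"]
percents = [62, 18, 12, 8]

fig, ax = plt.subplots(figsize=(6, 3))

# Gray for context bars, one accent color to draw the eye to the key finding
colors = ["#d95f02" if c == "Cost" else "#cccccc" for c in categories]
bars = ax.barh(categories, percents, color=colors)

# Direct data labels replace gridlines and a legend
for bar, pct in zip(bars, percents):
    ax.text(bar.get_width() + 1, bar.get_y() + bar.get_height() / 2,
            f"{pct}%", va="center")

# Strip chart clutter so the takeaway stands out
ax.set_title("Cost is the most commonly reported barrier (n = 214, hypothetical)",
             loc="left")
ax.invert_yaxis()   # largest value on top
ax.set_xticks([])   # the labels carry the values, so drop the axis ticks
for spine in ("top", "right", "bottom"):
    ax.spines[spine].set_visible(False)

plt.tight_layout()
plt.show()
```

The same idea applies in Excel, Tableau, or any other tool: mute everything that is context, reserve color for the takeaway, and put the numbers where readers' eyes already are.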
Rad Resources
- Stephanie Evergreen is a wonderful resource for all things reporting. We particularly like her blog with data visualization resources and her 1-3-25 reporting model.
- This resource on data visualization from Better Evaluation includes a chart suggestion tool that helps guide you to an effective chart type based on the type of data you are visualizing.
- This article by Pieta Blakely and Eli Holder examines some of the potential impacts of visualizing differences between groups and emphasizes the importance and power of equitable data visualization.
- Checklists for reports can be incredibly helpful tools. There are several checklists out there, or you can make your own! Look through existing checklists for inspiration, but tailor your checklist to your clients/partners, reporting needs, and other contextual factors. Here are a couple to get you started: University of Michigan's Checklist for Program Evaluation Report Content and Stephanie Evergreen's Evaluation Report Layout Checklist.
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.