
No More Crappy Survey Analysis – Best Practices in Survey Analysis for Evaluations by Janelle Gowgiel, JoAnna Hillman, Mary Davis, and Christiana Reene

Hello, AEA365 community, and happy Conference Week from Indianapolis! The AEA staff have been working overtime to prepare for our biggest event of the year. Whether you will be joining us for the conference or not, you can keep up with our happenings via the AEA365 blog. See you around!

-Liz DiLuzio, Lead Curator


Janelle, JoAnna, Mary, and Christiana here, evaluators from Emory Centers for Public Health Training and Technical Assistance. We had the opportunity to present a session entitled No More Crappy Surveys at last year’s AEA Summer Evaluation Institute. We are on a mission to rid the world of crappy surveys, and are here to share some of our Hot Tips and Rad Resources to do so.

If you haven’t already, check out the first blog post in this series, No More Crappy Surveys – Best Practices in Survey Design for Evaluations. Today, we’ll be following up with some tips on how to analyze your surveys (which, of course, you’ve made sure are not crappy!). Stay tuned for our final post of this series, on how to report your findings to different audiences. And please stop by and say hello to us in Indianapolis—we’re always happy to talk about survey design!

Hot Tips

  • Make a plan. Before beginning any analysis, establish a data analysis plan. Return to the evaluation questions that guided your survey, and ground your analysis in answering those questions directly. As you build the plan, think through the types of analyses you will need (quantitative and/or qualitative), the anticipated timeline, and how tasks will be divided among your team. This will help you set realistic expectations with partners and funders for when a report or presentation will be ready.
  • Clean data → smooth analysis. Before doing any analysis, you must check and clean your data—it will save you lots of time in the end! Here are some things to look for (check out the Rad Resources below for more information on each response type): responses from beta-testing, duplicate responses, blank responses, incomplete responses, unrealistic, inconsistent, or nonsensical responses, straight-lining or “Christmas tree” responses, and overly fast responses. Each of these is a signal to look at the response more closely and decide whether it should be removed from the dataset (for one way to flag several of these in code, see the cleaning sketch after this list).
  • Proceed with caution with automatic reports. Many survey platforms (Qualtrics, SurveyMonkey, Google Forms, REDCap, etc.) have features that will automatically create reports, complete with graphs and tables. These features are great tools for quickly visualizing and summarizing your data, but review them with a critical eye: categories may appear in an order that doesn’t make sense, the chart type may not be the most useful one, or the graph may outright misrepresent the data. By developing your own graphics and tables, you can ensure that data is accurately reported and visualized (see the charting sketch below).
  • You can still get valuable findings from small sample sizes. If you’ve gone through the survey process and have fewer responses than you anticipated, your options for quantitative techniques may be more limited. Many comparative analyses (such as t-tests, chi-square tests, and ANOVAs) assume a minimum sample size that small evaluations may not meet. However, you can still get programmatically useful results even from a small sample! When your sample size is small, rely on simple descriptive statistics as well as the qualitative responses to your open-ended questions (see the descriptives sketch below).
  • Remember you can perform qualitative analysis on qualitative survey data. While open-ended survey questions may not yield as much qualitative data as interviews or focus groups, you can still perform qualitative analyses on them, following similar procedures. Think about using simple thematic analyses with frequency counts, coding responses, word clouds, or representative quotes to analyze qualitative survey data (see the theme-count sketch below). Often, it helps to use a mix of qualitative methods when reporting.
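
For readers who analyze in Python, here is a minimal cleaning sketch using pandas. The file name, the column names (q1–q5 Likert items, duration_seconds), and the 60-second speed cutoff are all hypothetical; swap in whatever your survey platform actually exports.

```python
import pandas as pd

# Hypothetical export; the file and column names are stand-ins
# for whatever your platform produces.
df = pd.read_csv("survey_export.csv")
likert_items = ["q1", "q2", "q3", "q4", "q5"]

# Drop exact duplicate submissions and fully blank responses.
df = df.drop_duplicates()
df = df.dropna(subset=likert_items, how="all")

# Flag straight-lining: the same answer chosen for every Likert item.
df["straight_lined"] = df[likert_items].nunique(axis=1) == 1

# Flag overly fast responses; the 60-second cutoff is a judgment call
# that depends on how long your survey realistically takes.
df["too_fast"] = df["duration_seconds"] < 60

# Review flagged rows by hand before deciding what to remove.
flagged = df[df["straight_lined"] | df["too_fast"]]
print(f"{len(flagged)} responses flagged for closer review")
```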
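
If an auto-generated chart puts categories in a strange order, a few lines of your own will often fix it. This charting sketch uses matplotlib with invented counts; the point is simply that you, not the platform, control the category order and labels.

```python
import matplotlib.pyplot as plt

# Hypothetical Likert results; list the categories in their natural
# order, not whatever order the survey platform happens to use.
categories = ["Strongly disagree", "Disagree", "Neutral",
              "Agree", "Strongly agree"]
counts = [2, 5, 8, 14, 6]

fig, ax = plt.subplots(figsize=(6, 3))
ax.barh(categories, counts)
ax.set_xlabel("Number of respondents")
ax.set_title("Q1: The training met my needs")
plt.tight_layout()
plt.show()
```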
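
Here is what the descriptives sketch might look like for a single item when the sample is small; the toy data below stand in for a cleaned dataset of a dozen responses.

```python
import pandas as pd

# Toy data standing in for a small cleaned survey (n = 12),
# with one item scored on a 1-5 scale.
df = pd.DataFrame({"q1": [4, 5, 3, 4, 4, 2, 5, 4, 3, 5, 4, 4]})

# Frequency and percentage tables are often more honest than
# significance tests when the sample is this small.
freq = df["q1"].value_counts().sort_index()
pct = (df["q1"].value_counts(normalize=True).sort_index() * 100).round(1)
print(pd.DataFrame({"n": freq, "percent": pct}))

# Basic descriptives for the numerically scored item.
print(df["q1"].describe())
```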
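
And a theme-count sketch: assuming you have already hand-coded each open-ended response with one or more themes (the codes below are invented for illustration), tallying frequencies takes only a few lines.

```python
from collections import Counter

# Hypothetical codes applied to open-ended responses during analysis;
# a single response may carry more than one code.
coded_responses = [
    ["scheduling", "content"],
    ["content"],
    ["facilitation", "content"],
    ["scheduling"],
]

# Tally how often each theme appears across all responses.
code_counts = Counter(code for codes in coded_responses for code in codes)
for code, n in code_counts.most_common():
    print(f"{code}: {n}")
```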

Rad Resources


