Hello, AEA365 community! Liz DiLuzio here, Lead Curator of the blog. This week is Individuals Week, which means we take a break from our themed weeks and spotlight the Hot Tips, Cool Tricks, Rad Resources and Lessons Learned from any evaluator interested in sharing. Would you like to contribute to future Individuals Weeks? Email me at AEA365@eval.org with an idea or a draft and we will make it happen.
Janelle, JoAnna, and Christiana here, evaluators from Emory Centers for Public Health Training and Technical Assistance. We had the opportunity to present a session entitled No More Crappy Surveys at the AEA Summer Evaluation Institute, teaching fellow evaluators how best to plan, design, execute, analyze, and report on surveys. We are on a mission to rid the world of crappy surveys, and are here to share some of our Hot Tips and Rad Resources to do so.
- Use Evaluation Questions to guide your survey development. Pick a few (typically 3-5) evaluation questions to guide your survey, and make sure that every survey question is relevant to answering those overarching evaluation questions. Don’t ask questions if you are not planning to use the information, and be mindful of the length of your survey. It is critical to engage partners from the outset (and throughout!) to ensure that the evaluation questions will be useful and relevant, and to set expectations about scope and anticipated results.
- Order your survey questions mindfully. Avoid opening your survey with open-ended or otherwise difficult-to-answer questions, and place your most important questions early, in case any respondents drop out partway through the survey.
- Think about the best question type for your purpose. There are some question types we recommend avoiding altogether (hello, Ranking Questions!), and others we suggest you use sparingly (matrices and fill-in-the-blank). Some of our favorites are validated questions like Net Promoter Scores, multiple choice questions, Likert scale questions, select-all-that-apply questions, and the occasional open text box question.
- Be attentive to wording in survey questions. The way you phrase a question can have a big impact on the data you get back, so be mindful of how your wording may affect respondents. Watch out for leading questions, double-barreled questions, jargon, ambiguity, and extremes. See The Qualtrics Handbook of Question Design for more information on these wording mishaps and why they matter.
- Pilot, Pilot, Pilot! Always allow some time to test your survey before sending it out to the intended audience. Sometimes a small group of internal pilot testers will suffice, but it is ideal to pilot the survey with a small subset of your population as well. If your survey has any logic (branching or skip), make sure that you test every possible path that a respondent could take. While this step can be time-intensive, you’ll be rewarded with higher quality data, and the time spent upfront will save you time and difficulty when you start to analyze.
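The question-type tip above mentions Net Promoter Scores. For readers who handle their own analysis, here is a minimal sketch of the standard NPS calculation (percentage of promoters, scoring 9-10, minus percentage of detractors, scoring 0-6); the function name and data are illustrative, not from any particular survey tool.

```python
def net_promoter_score(ratings):
    """Compute NPS from 0-10 'How likely are you to recommend...?' ratings.

    Promoters score 9-10, detractors score 0-6 (7-8 are passives).
    NPS = % promoters - % detractors, ranging from -100 to +100.
    """
    if not ratings:
        raise ValueError("need at least one rating")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings), 1)

# Example: three promoters, one passive, one detractor out of five responses
print(net_promoter_score([10, 10, 9, 7, 2]))  # → 40.0
```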
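The piloting tip recommends testing every branching/skip path. One way to be sure you haven’t missed a path is to write out your skip logic as a simple map and enumerate the routes through it; this sketch assumes a hypothetical dict-based representation (question names and answer choices are made up for illustration).

```python
# Hypothetical skip-logic map: each question maps answer choices to the
# next question; None marks the end of the survey.
SKIP_LOGIC = {
    "Q1": {"Yes": "Q2", "No": "Q3"},
    "Q2": {"Often": "Q4", "Rarely": "Q4"},
    "Q3": {"Agree": "Q4", "Disagree": None},
    "Q4": {"Done": None},
}

def all_paths(logic, start="Q1"):
    """Enumerate every route a respondent could take, via depth-first search."""
    paths = []

    def walk(question, trail):
        if question is None:          # reached the end of the survey
            paths.append(trail)
            return
        for answer, nxt in logic[question].items():
            walk(nxt, trail + [(question, answer)])

    walk(start, [])
    return paths

for path in all_paths(SKIP_LOGIC):
    print(" -> ".join(f"{q}:{a}" for q, a in path))
```

Each printed line is one path to walk through during your pilot; if the count of paths surprises you, your branching is probably more complicated than you intended.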
- The Qualtrics Handbook of Question Design is a great resource that is applicable to all survey software and modalities (not just Qualtrics) and is available as a free online download.
- USAID’s Checklist for Defining Evaluation Questions and this Evaluation Question Checklist for Program Evaluation are helpful tools for defining and deciding on the evaluation questions that will guide your survey.
- Qualtrics’ Survey Methodology & Compliance provides tips for making surveys accessible to all audiences and notes which question types tend to be less accessible.
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to firstname.lastname@example.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.