We are Hilary Loeb and Kelly Bay of the Research and Evaluation Department at the College Success Foundation. Many of our scholarship and support programs host events at which we collect data from students and educators. As internal evaluators, we often rely on colleagues to collect and enter survey data from these groups. The results are used internally for staff learning and externally for reporting. To help evaluators increase survey relevance, decrease demands on respondents’ time, and ultimately boost data quality and response rates, below are tips on instrument design and data collection.
Lessons Learned:
Look for ways to make surveys easier for staff to administer up front and more useful to stakeholders at the back end. The key is keeping the main focus on your programs while building support for data collection and analysis efforts.
Hot Tips:
Survey Design:
- Ensure that survey content is relevant: Meet with the entire program team and start with the question, “What do we want to learn about our program?” before discussing what’s needed for grant-reporting requirements.
- Draft a survey using previously tested questions: You don’t have to reinvent the wheel. By using previously tested survey questions from existing “banks” of items, you can save time and often improve the quality of the data collected (see Rad Resources).
- Pilot test surveys with your program team and other stakeholders. This exercise never fails to elicit important feedback and takes only a modest amount of time. It’s amazing what fresh eyes can find! Where possible, use trainings and even Board meetings as opportunities to pilot and discuss surveys.
Survey Data Collection:
- Be strategic about paper versus online surveys: When event participants can’t readily access computers, paper surveys may help increase response rates. Online surveys are more appropriate when participants are able and willing to access technology.
- Designate sufficient time and staff to collect survey data: Ensure that there is a specific time slot dedicated to survey completion, scheduled near, but not at, the very end of the event. We suggest providing a script to help staff describe the survey’s purpose and value.
- Consider using scanning software for paper surveys: Scanning software automates data entry by reading the optical marks on paper survey forms, which can reduce errors and save time. Before purchasing, it’s best to test. We piloted a free demo of Remark Office OMR to confirm that it was the right software for our organization.
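If your scanning software exports results to a spreadsheet or CSV, a quick automated check can catch missed or double-marked bubbles before analysis. Here is a minimal, hypothetical Python sketch; the column names, the 1–5 response scale, and the use of "*" or a blank for an unreadable mark are assumptions for illustration, not Remark Office OMR’s actual export format:

```python
import csv
import io
from collections import Counter

# Hypothetical OMR export: one row per respondent, Likert items scored 1-5.
# A blank or "*" stands in for a missed or double-marked bubble.
SAMPLE_EXPORT = """respondent_id,q1,q2,q3
1001,4,5,3
1002,5,*,4
1003,2,4,
"""

def check_export(raw, scale=range(1, 6)):
    """Return (frequency counts per item, list of cells to review by hand)."""
    counts = {}
    flagged = []
    for row in csv.DictReader(io.StringIO(raw)):
        for item, value in row.items():
            if item == "respondent_id":
                continue
            try:
                score = int(value)
            except (TypeError, ValueError):
                # Non-numeric cell: unreadable or missing mark.
                flagged.append((row["respondent_id"], item, value))
                continue
            if score not in scale:
                # Numeric but outside the expected response scale.
                flagged.append((row["respondent_id"], item, value))
                continue
            counts.setdefault(item, Counter())[score] += 1
    return counts, flagged

counts, flagged = check_export(SAMPLE_EXPORT)
print(flagged)  # cells needing a second look before analysis
```

A review pass like this lets staff correct flagged forms against the paper originals instead of discovering gaps after the analysis is done.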
Rad Resources:
A Bing search of survey item banks yields over 60 million results. Our favorites in the education and youth development field include the Ansell-Casey Life Skills Assessments, the Youth Risk Behavior Surveillance System, and National Center for Education Statistics resources.
The American Evaluation Association is celebrating Internal Evaluators TIG Week. The contributions all week come from IE members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Two quick tips:
- Drafting a data analysis plan will help determine what data elements are needed (and eliminate those that are not).
- The Roper Center and ICPSR also maintain data archives useful for generating items.