Hello everyone! This is Chelsey Leruth, an internal evaluator at Access Community Health Network in Chicago. I am currently wrapping up the analysis of a phone survey of participants in one of our maternal and child health programs, Westside Healthy Start. Today, I would like to share some tips and lessons learned during survey planning and data collection.
Hot Tip: Surveys are an effective way to confirm preliminary findings or qualitative data. Based on program records, informal discussions with participants at program events, and a previous photovoice project, we had a good sense of which services participants really valued and how they benefited from the program. However, we wanted to validate this information with “hard data.” Through the survey, we were able to quantify participants’ knowledge of, use of, and attitudes toward specific services as well as the overall program.
Cool Trick: Ask staff to give participants a flyer or card to “tease” the survey a few weeks before you begin making calls. Our marketing department designed and printed a 4” x 6” card with information about the purpose of the survey and the importance of participants’ feedback. The card also provided talking points (and a reminder!) for staff to introduce the survey to clients and encourage participation. Our response rate was much improved at the sites where we used the teaser card.
Rad Resource: This interviewer training guide (Injury Related Morbidity and Mortality Following Tornadoes in Alabama on April 27, 2011: Survey Instrument Training) developed by the Centers for Disease Control and Prevention (CDC) for a project with the Alabama Department of Public Health has several slides in plain language that are helpful for putting together training materials for your interviewers. Skip to page eight for instructions on interviewer roles, minimizing interviewer effects, and following various question formats.
Lesson Learned: We found that monitoring and supporting interviewers throughout the data collection period was just as important as initial training. Our interviewers encountered a variety of “tricky” scenarios along the way (such as disgruntled participants, significant others “leading” a respondent, and questions about program services during the survey), and routine check-ins provided a venue to discuss these issues, reinforce training, and adjust the protocol as needed.
The American Evaluation Association is celebrating Chicagoland Evaluation Association (CEA) Affiliate Week. The contributions all this week to aea365 come from CEA members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.