AEA365 | A Tip-a-Day by and for Evaluators

TAG | response rate

Greetings from beautiful Boise! We are Rakesh Mohan and Amy Lorenzo of Idaho’s legislative Office of Performance Evaluations. For the past six years, our office has conducted surveys almost entirely in an electronic environment; the approach has worked so well that we have all but abandoned paper surveys. When we looked at costs, in both postage and staff time, paper surveys just didn’t make sense anymore. That is, until this year.

As part of our current study of barriers to postsecondary education, we sought input from members of the Idaho School Counselor Association. Initially, we thought of contacting them via e-mail and asking them to complete a web-based survey. However, once we learned that they would be attending a statewide conference in our town, we decided instead to brief them about our study at the conference, ask them to complete a paper survey, and have them turn in their completed surveys at the registration table. The response was phenomenal!

Over the next 30 minutes, attendees answered our survey questions, chatted with us about our study, and offered to meet with us again in the future. By the end of the conference, we had received over 70 surveys, a response rate of nearly 50 percent. In a couple of hours, we had accomplished what normally takes us several weeks to achieve. We were able to close out the survey and begin our analysis the same day we distributed it.

Hot Tips: Why did this approach work so well? In the spirit of full disclosure, we should note that attendees were offered an extra raffle ticket for turning in a survey. However, that was really just one small piece of the equation. The success of this process came down to two things:

  1. Conference attendees were a captive audience. By asking them to complete the survey while at the conference, we were no longer competing with their other professional and personal demands. Our e-mail wasn’t lost in their in-boxes or pushed to the back burner until they had free time. Attendees had the time, in that moment, to collect their thoughts and answer our questions.
  2. We made a personal connection with our target audience. By explaining to them how important their input was to our study, distributing our survey in person, and making ourselves available for questions, we overcame the often impersonal nature of a web-based survey.

We recognize that this approach is more the exception than the rule. With decreased budgets and increased workloads, paper surveys rarely make sense. But in certain circumstances, they can be a powerful tool for involving stakeholders.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· ·

Hello, I’m Jessica Foster, a Research and Evaluation Analyst at Franklin County Children Services. Our agency provides child protective services for the Central Ohio area. Over the course of my career, I have administered many surveys to a variety of stakeholders and clients. I’ve learned from my own work and from the experiences my colleagues have shared that achieving a desirable survey response rate is almost always a challenge. Here I will share some strategies that have helped increase survey participation.

Hot Tip: Make it Matter

Communicate the relevance and value of your survey to your participants.  In your invitation, let participants know what the survey’s purpose is, how the results will benefit them, and how much their opinions matter!

Hot Tip: Communication is Key

Make contact with your participants before, during, and after the survey is administered. When possible, I try to meet with potential respondents before sending out the survey invitation, to give an overview of the survey in person and allow for questions. It’s easier to dismiss a survey when it comes from someone you’ve never seen or talked to.

Cool Trick: Personalizing your survey invitations can help respondents feel more important and more accountable.  SurveyMonkey and Zoomerang have features that allow you to enter personalized information (e.g., name, e-mail address, custom fields) when sending invitations via the site.  You can also do a mail merge in Word or other programs to accomplish the same result.
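If you manage your own invitation list, the same personalization can be scripted. Below is a minimal sketch in Python, assuming a CSV file with name and email columns and an SMTP server you are permitted to use; the file layout, sender address, and survey link are hypothetical placeholders, not part of any vendor’s tooling.

```python
import csv
import smtplib
from email.message import EmailMessage
from string import Template

# Hypothetical invitation text; ${name} and ${link} are filled in per recipient.
INVITATION = Template(
    "Dear ${name},\n\n"
    "Your opinion matters to us. Please take a few minutes to complete "
    "our survey:\n${link}\n\n"
    "Thank you!\n"
)

def send_invitations(csv_path, survey_link, smtp_host="localhost"):
    """Send one personalized invitation per row of a CSV with 'name' and 'email' columns."""
    with open(csv_path, newline="") as f, smtplib.SMTP(smtp_host) as smtp:
        for row in csv.DictReader(f):
            msg = EmailMessage()
            msg["Subject"] = "We'd value your input on our survey"
            msg["From"] = "evaluator@example.org"  # placeholder sender address
            msg["To"] = row["email"]
            msg.set_content(INVITATION.substitute(name=row["name"], link=survey_link))
            smtp.send_message(msg)
```

Keeping the template separate from the send loop also makes it easy to reuse the same script for reminder messages to non-respondents.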

Follow up with non-respondents at least once during the survey period.

After the survey is finished and the data have been analyzed, share a summary of your results with participants, if possible.

Hot Tip: Keep it Easy

Keep your survey as short and simple as possible. Minimize the amount of material that participants have to read at the beginning of the survey – some of this information can go in the invitation instead.

When possible, try to utilize existing organizational structures, such as staff meetings, to administer the survey.  That way, participants don’t have to make any additional effort to complete the survey.

Hot Tip: Incentivize

Several survey respondents have told me how much they appreciated a small incentive for participation. If your budget allows for it, you may find it helpful to offer an individual incentive (e.g., a $5 gift card to a coffee shop) or a group incentive (e.g., the region with the highest response rate gets a pizza party). Another strategy is to enter participants into a raffle for a larger prize.

Cool Trick: I recently discovered that SurveyMonkey now allows you to build incentives right into your surveys.  You can either enter respondents into a drawing for a gift card, or award prize coupons to all of your participants.


· · · ·

My name is Susan Kistler and I am AEA’s Executive Director. I contribute each Saturday’s aea365 post. Many years ago I taught survey and instrument design at the University of Minnesota, and survey methodology is an issue near and dear to my heart. Thus, this week, I want to share a few resources from the American Association of Public Opinion Research (AAPOR), an organization dedicated to public opinion and survey research. Their website is a good place to start if you have a question about developing and deploying an evaluation survey.

Rad Resource: AAPOR’s Standard Definitions report details how to calculate response rates for random-digit-dial telephone surveys, in-person household surveys, mail surveys, and internet surveys. It is essential reading for those conducting rigorous surveys.

Lessons Learned: AAPOR’s Standard Definitions provides a common language for talking about survey response rate, including differentiating among the following:

  • Response rates – The number of complete interviews with reporting units divided by the number of eligible reporting units in the sample. The report provides six definitions of response rates, ranging from the definition that yields the lowest rate to the one that yields the highest, depending on how partial interviews are counted and how cases of unknown eligibility are handled. (A computational sketch of the two simplest definitions follows this list.)
  • Cooperation rates – The proportion of all cases interviewed of all eligible units ever contacted. The report provides four definitions of cooperation rates, ranging from a minimum (lowest) rate to a maximum (highest) rate.
  • Refusal rates – The proportion of all cases in which a housing unit or respondent refuses to be interviewed, or breaks off an interview, of all potentially eligible cases. The report provides three definitions of refusal rates, which differ in how they treat dispositions of cases of unknown eligibility.
  • Contact rates – The proportion of all cases in which some responsible housing unit member was reached. The report provides three definitions of contact rates.
    (pages 4-5)
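To make that arithmetic concrete, here is a minimal sketch in Python of the two simplest response rate definitions from the report, RR1 and RR2, which differ only in whether partial interviews count in the numerator. The disposition counts in the example are invented; the report itself covers the remaining four definitions and the estimated-eligibility adjustments.

```python
def response_rates(I, P, R, NC, O, UH, UO):
    """AAPOR response rates RR1 and RR2, using the report's disposition codes:

    I = complete interviews      R  = refusals and break-offs
    P = partial interviews       NC = non-contacts
    O = other non-interviews     UH, UO = cases of unknown eligibility
    """
    denominator = (I + P) + (R + NC + O) + (UH + UO)
    rr1 = I / denominator        # RR1: completes only in the numerator
    rr2 = (I + P) / denominator  # RR2: partials also count as respondents
    return rr1, rr2

# Invented example: 70 completes, 5 partials, 20 refusals,
# 40 non-contacts, 5 other, 10 cases of unknown eligibility.
rr1, rr2 = response_rates(70, 5, 20, 40, 5, 10, 0)
print(f"RR1 = {rr1:.1%}, RR2 = {rr2:.1%}")  # RR1 = 46.7%, RR2 = 50.0%
```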

Rad Resource: AAPOR’s Response Rate Calculator is an Excel spreadsheet with the categories pre-identified from the Standard Definitions and with the needed formulas entered. It simplifies the job of translating the Standard Definitions from paper to practice.

Rad Resource: AAPOR’s Survey Disclosure Checklist details the minimum disclosure requirements when reporting survey results to the public.

Rad Resource: AAPOR recommends the Survey Random Sample Size Calculator from custominsight.com, and its What is a Random Sample page includes details on good samples versus bad samples, margin of sampling error, and weighting.
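As a companion to that page, here is a minimal sketch of the textbook sample-size formula for estimating a proportion from a simple random sample, with a finite population correction. This is the standard calculation such calculators typically implement, not a documented description of custominsight.com’s tool, and the example numbers are illustrative only.

```python
import math

def sample_size(population, margin=0.05, z=1.96, p=0.5):
    """Sample size needed to estimate a proportion within +/- margin at the
    confidence level implied by z (1.96 ~ 95%), with a finite population
    correction. p = 0.5 is the most conservative assumption."""
    n0 = (z ** 2) * p * (1 - p) / margin ** 2   # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))

print(sample_size(1000))  # about 278 completed responses for +/-5% at 95% confidence
```

Note that this is the number of completed surveys you need, not the number of invitations: with an expected response rate of 50 percent, you would invite roughly twice this many people.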

Thank you to AAPOR for so generously sharing these – and so many other – resources with the public.

The above represents my own opinions and not necessarily those of AEA.

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org.

· · ·

My name is Shobha Mittal and I am an independent consultant. Achieving a high response rate on surveys has always been tricky in the field of evaluation.

Lessons Learned: Although many good references are available to help evaluators improve response rates, I have found that thinking like a respondent before seeking a response improves the likelihood of success. From a respondent’s perspective, there are two major concerns (besides, of course, having the inclination and time to respond). The first is technological comfort: responding to a questionnaire electronically versus on paper. The second is fear of the consequences that may follow an honest response.

Response seekers must address respondents’ needs before expecting respondents to accommodate theirs.

Hot Tip: A mixed-mode survey strategy, offering a paper survey alongside the option of an electronic response, may address both of these concerns. Some respondents are not reassured by the confidentiality of the survey process and fear being identified through their e-mail addresses; the paper option encourages them to respond rather than skip the survey entirely. Conversely, some respondents find paper and pencil cumbersome and prefer the electronic version. Offering both modes makes respondents more likely to take the survey than to ignore it. Mixed-mode methodology does require extra time, effort, and resources, but those costs may well be outweighed by the benefits.

Shobha will be presenting her work on mixed-methods survey research as part of the poster exhibition at AEA’s annual conference this November in San Antonio. Join us at Evaluation 2010 to connect further with Shobha and over 1,000 other presenters!

· ·
