AEA365 | A Tip-a-Day by and for Evaluators

TAG | surveymonkey

Hello! We are Laura Beals and Barbara Perry, internal evaluators at Jewish Family & Children’s Service, a large social-service agency located outside of Boston, MA. We have been gathering feedback from clients and other constituents via surveys, often online, for many years, but we recently completed a needs assessment project in which we also wanted to gather information from potential clients, their families, and professionals. One way we attempted to reach these new populations was through social media (via the agency’s Facebook page) and a Massachusetts non-profit email listserv.

In our invitation to participate in the survey—the Facebook post or the email invitation to the listserv—we clearly but briefly outlined who we were looking for (the target population), what we were asking them to do, what the incentive was, and how to access the link. In the introduction to the survey itself, we more thoroughly outlined participation expectations and indicated that we would (snail) mail the first 100 participants who were interested a $5 gift card to Dunkin Donuts. We had decided to use a physical gift card, instead of an electronic gift card, and we are so glad that we did…because about two weeks after our first survey completions from the Facebook and listserv posts, we started to get back envelopes of gift cards with bright yellow “return to sender” stickers.

This prompted us to immediately take a closer look at those responses, and we found a clear pattern: email addresses that didn’t match the names provided in the mailing addresses (not that they have to, but it was suspicious); email addresses ending in strings of random numbers; mailing addresses that were all out of state; whitepages.com searches suggesting these were likely fake people; responses that all came through the same survey collector; and response content out of alignment with what we knew from real people (e.g., they completed the closed-ended questions but none of the open-ended ones). Luckily, this pattern let us identify the fraudulent responses and eliminate them from our analysis. We had planned to check our responses for duplicates (email addresses, IP addresses if known, etc.), but had not thought about reviewing for fraud!
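Screening like this can be automated as a few simple heuristic checks run before analysis. The sketch below is a minimal illustration, not the authors' actual procedure; the field names, the valid-state list, and the flag rules are all assumptions for the example.

```python
import re

def looks_suspicious(response, valid_states=("MA",)):
    """Flag a survey response matching patterns seen in fraudulent entries."""
    flags = []
    email = response.get("email", "")
    # Fake accounts often ended in a run of random digits (e.g., jane84721@...)
    if re.search(r"\d{3,}@", email):
        flags.append("digits_in_email")
    # All the fraudulent responses came from out-of-state mailing addresses
    if response.get("state") not in valid_states:
        flags.append("out_of_state")
    # Fraudsters answered closed-ended items but skipped every open-ended one
    open_ended = response.get("open_ended_answers", [])
    if not any(a.strip() for a in open_ended):
        flags.append("no_open_ended")
    return flags

# Hypothetical responses for illustration
responses = [
    {"email": "jane84721@example.com", "state": "NY",
     "open_ended_answers": ["", ""]},
    {"email": "bob.smith@example.com", "state": "MA",
     "open_ended_answers": ["The program helped my family."]},
]
screened = [(r["email"], looks_suspicious(r)) for r in responses]
```

A response that trips several flags at once is worth a manual look before it enters the analysis.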

Lessons Learned:

As more and more evaluators turn to social media to gather survey responses, especially from youth, we wanted to share three lessons learned from our experiences:

  1. Use a unique collector for each distribution type. We use SurveyMonkey as our surveying tool, and it gives us the ability to create distinct links for each type of distribution (email to clients, post to Facebook, email to listserv, etc.). This allowed us to pinpoint which collector was the culprit.
  2. Read “Detecting, Preventing, and Responding to ‘Fraudsters’ in Internet Research: Ethics and Tradeoffs.” The authors provide a table of “Methods of Detecting and Preventing Internet Study Duplication and Fraud and Their Implications.”
  3. Include the step of examining your responses for signs of fraud before beginning analysis.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


My name is Kate Rohrbaugh and I am Co-Chair of the Business, Leadership, and Performance TIG along with Michelle Baron.  I’m a Research Team Leader at a consulting firm in Virginia leading a group studying capital project organizations and teams in the process industries.  Today I’d like to talk about the renaming of our TIG and the tools we used to conduct this work.

When I accepted my current position five years ago, I had to rethink my AEA TIG membership because I had been a faithful member of education-related TIGs, which were no longer relevant to my work.  The number of TIGs at AEA can be overwhelming at times, but it also offers a wide variety of “homes” to evaluators regardless of content area.  In my new position I turned to the Business and Industry TIG, where I found a small but dedicated group of professionals.  I “lurked” with this group for a year, and within a short time (since it was a smaller group), I was able to take an active role in the leadership of this TIG.

In discussions with the leadership of the TIG and of AEA, we determined that the name of the TIG was unnecessarily limiting both presenters and audience – evaluation issues in for-profit organizations are relevant to a wide variety of evaluation professionals in both the private and public sectors.  For this reason, we canvassed the membership and, working closely with the AEA staff and board, identified a new name for our TIG.

Rad Resources

  • AEA maintains a list of members in each TIG and faithfully protects members from unnecessary contact, but this list was a great resource for contacting our membership about the proposed name change and soliciting ideas for renaming the TIG.
  • To canvass our membership, we turned to old faithful SurveyMonkey, which met our simple needs for collection and analysis.
  • To discuss the results with the TIG leadership located across and outside the United States, we turned to FreeConferenceCall.com, which is exactly what you think it is.

We are excited about AEA 2012 in Minneapolis and hope to see lots of new faces at our presentations and business meeting!

The American Evaluation Association is celebrating Business, Leadership, and Performance TIG (BLP) Week. The contributions all week come from BLP members.


Hello, I am Ian Shadrick, program coordinator of the graduate programs in Blindness and Low Vision and Orientation & Mobility at Missouri State University. As a follow-up to some of the great tips provided in Creating Potent Presentations for All, I’d like to talk to you today about some additional methods to ensure the full inclusion of persons with blindness or low vision in your presentations and related evaluations. Without careful consideration, it is quite common for persons with blindness or low vision to be unable to fully participate in these environments. Additionally, many individuals with low vision, or those in the early stages of vision loss, may not be easily identifiable, or may choose not to disclose their low vision. Given this, it is important to consider ways to improve access to information in both presentations and related evaluations.

Lessons Learned:

  • The World Health Organization reports 161 million people have a visual impairment, among them 124 million are persons with low vision, and 37 million are persons with blindness.
  • Most persons with blindness have some amount of vision, which can vary in functionality based on environment.
  • Unlike persons with blindness, many persons with low vision may show no visible signs of vision loss, i.e., they do not use a cane or dog guide and may not use any other assistive optical devices.
  • Many factors can impact the ability of a person with low vision to fully participate in presentations and evaluations (e.g., lighting, glare, contrast, screen location/angles).

Tips for Presentations:

  • Whenever possible, provide materials ahead of time, such as on a website provided by the organization/conference.
  • For electronic documents, provide either Microsoft Word or Rich Text Format files, as these are most accessible to screen-reading and magnification software.
  • Don’t be afraid to use tables or charts, but ensure descriptions are provided verbally and in text documents.
  • Describe graphics or photos whenever they are present.
  • Consider lighting when presenting; if possible, leave some lights on so that attendees have the opportunity to use the light as needed.
  • If using a video, consider obtaining a described version, or, if one is not available, be prepared to provide a description yourself.

Tips for Evaluations Related to Presentations:

  • Whenever possible, provide an opportunity for completion through an accessible online survey tool, such as SurveyMonkey.
  • If possible provide alternative formats (large print and Braille) for hardcopy forms.
  • Consider having someone present to assist in reading and recording responses if the two previous suggestions are not an option.



Hello, I’m Jessica Foster, a Research and Evaluation Analyst at Franklin County Children Services.  Our agency provides child protective services for the Central Ohio area.  During the course of my career, I have administered many surveys to a variety of stakeholders and clients.  I’ve learned from my work and from the experiences shared by my colleagues that achieving a desirable survey response rate is almost always a challenge.  Here I will share some strategies that have helped increase survey participation.

Hot Tip: Make it Matter

Communicate the relevance and value of your survey to your participants.  In your invitation, let participants know what the survey’s purpose is, how the results will benefit them, and how much their opinions matter!

Hot Tip: Communication is Key

Make contact with your participants before, during, and after the survey is administered.  When possible I try to meet with potential respondents before I send out the survey invitation to give an overview of the survey in person and allow for questions.  It’s easier to dismiss a survey when it comes from someone you’ve never seen or talked to.

Cool Trick: Personalizing your survey invitations can help respondents feel more important and more accountable.  SurveyMonkey and Zoomerang have features that allow you to enter personalized information (e.g., name, e-mail address, custom fields) when sending invitations via the site.  You can also do a mail merge in Word or other programs to accomplish the same result.
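When your survey tool's custom fields aren't available, the same mail-merge idea is easy to script. A minimal sketch, assuming a hypothetical recipient list; the names, roles, and survey links are made up for illustration:

```python
from string import Template

# Invitation template with placeholders for the personalized fields
invite = Template(
    "Dear $name,\n\n"
    "As a $role at our agency, your feedback matters. "
    "Please take our 5-minute survey: $link\n"
)

# Hypothetical recipients; in practice this might come from a CSV export
recipients = [
    {"name": "Alice", "role": "case manager", "link": "https://example.com/s/abc"},
    {"name": "Bob", "role": "supervisor", "link": "https://example.com/s/def"},
]

# One personalized message per recipient, ready to send
messages = [invite.substitute(r) for r in recipients]
```

Each recipient then gets a message addressed to them personally, with their own survey link, rather than a generic blast.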

Follow up with non-respondents at least once during the survey period.

After the survey is finished and the data have been analyzed, share a summary of your results with participants, if possible.

Hot Tip: Keep it Easy

Keep your survey as short and simple as possible.  Minimize the amount of material that participants have to read at the beginning of the survey – some of this information can go into the invitation, instead.

When possible, try to utilize existing organizational structures, such as staff meetings, to administer the survey.  That way, participants don’t have to make any additional effort to complete the survey.

Hot Tip: Incentivize

Several survey respondents have told me how much they appreciated a small incentive for participation.  If your budget allows for it, you may find it helpful to offer an individual incentive (e.g. $5 gift card for a coffee shop) or a group incentive (e.g. the region with the highest response rate gets a pizza party).  Another strategy is to enter participants into a raffle for a larger prize.
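If you go the raffle route, the drawing itself can be done transparently in a few lines of code. This is just an illustrative sketch; the respondent IDs are made up, and seeding the random generator simply makes the drawing reproducible for auditing:

```python
import random

def draw_winners(respondents, n_prizes, seed=2024):
    """Draw n_prizes distinct winners from the respondent list, reproducibly."""
    rng = random.Random(seed)  # fixed seed so the drawing can be re-verified
    return rng.sample(respondents, k=min(n_prizes, len(respondents)))

# Hypothetical respondent IDs
respondents = ["r01", "r02", "r03", "r04", "r05"]
winners = draw_winners(respondents, n_prizes=2)
```

Recording the seed alongside the respondent list means anyone can re-run the drawing and confirm the same winners.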

Cool Trick: I recently discovered that SurveyMonkey now allows you to build incentives right into your surveys.  You can either enter respondents into a drawing for a gift card, or award prize coupons to all of your participants.



My name is Susan Kistler and I am AEA’s Executive Director. I contribute each Saturday’s post to the aea365 blog. This week, Lois Ritter and Tessa Robinette gave a great (free!) webinar for AEA comparing and contrasting SurveyMonkey and Zoomerang for potential new users. Building on their presentation, and on my own experience working with both of these programs, as well as two others, over the past two years, I wanted to share a tip and a lesson learned in using online survey software.

Hot Tip: Take Advantage of the Free Trial – The majority of survey platforms offer a free trial, usually limited to a few questions and respondents. Create and try out a sample survey from beginning to end: send invitations, collect data, and complete the analysis, usually by exporting the data into other software. Walk through the entire process with data parallel to what you anticipate for your actual survey.
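The final step of that walkthrough, exporting and analyzing the trial data, can end with a quick sanity check on the export itself. A minimal sketch, assuming a hypothetical CSV export; the column names will differ by platform:

```python
import csv
import io

# Stand-in for a platform's CSV export (here held in memory for illustration)
exported = io.StringIO(
    "respondent_id,q1,q2_open\n"
    "1,Agree,Very helpful\n"
    "2,Disagree,\n"
    "3,Agree,Loved it\n"
)

rows = list(csv.DictReader(exported))
n = len(rows)
# Share of respondents who answered the open-ended item
q2_completion = sum(1 for r in rows if r["q2_open"].strip()) / n
```

If the row count or item-completion rates from the export don't match what you saw in the platform's dashboard, you've found a problem while it's still a free-trial problem.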

Lessons Learned – Permissions: Does your survey platform help you comply with IRB expectations? With the CAN-SPAM Act? While using the built-in invitation functions of many online survey platforms can help you with sending, tracking, and compliance, it can also distort your sample and limit your access to potentially viable respondents, because opt-out treatment varies from platform to platform. Using the two platforms discussed in this week’s webinar as examples: when survey recipients opt out of a SurveyMonkey survey invitation by clicking the opt-out button at the bottom, they are directed to http://www.surveymonkey.com/optout.aspx and opted out of ALL surveys sent via SurveyMonkey, not just a particular survey or those from a particular sender. Alternatively, when users opt out of a survey from Zoomerang, the default is that they opt out of surveys only from that particular sender (see http://zoomerang.custhelp.com/app/answers/detail/a_id/308). Notably, though, it is much easier to opt back in to SurveyMonkey than to Zoomerang. Lesson learned? Research how your platform treats opt-outs and determine how this is likely to impact your respondent pool.

Hot Tip: If you are an AEA member, review the Ritter and Robinette Webinar Recording in the AEA webinars archive at http://ow.ly/1XqAh to gain a better understanding of other considerations when choosing a survey platform.

This contribution is from the aea365 Daily Tips blog, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org. The above comments reflect my own opinion and not necessarily that of the American Evaluation Association.

