AEA365 | A Tip-a-Day by and for Evaluators

TAG | survey

Hi, I am Rick Davies, an evaluation consultant based in Cambridge, UK. The Basic Necessities Survey (BNS) is a method of measuring poverty that is:

  • Simple to design and implement. The results are easy to analyse and to communicate to others
  • Democratic in the way that it identifies what constitutes poverty and who is poor
  • Rights-based, in its emphasis on entitlement

The BNS builds on and adapts earlier methods of measuring poverty through deprivation, methods that emphasise the “consensual” definition of poverty. However, the BNS is innovative in the way in which (a) individual poverty scores and (b) a poverty line are generated from respondents’ survey responses.

The BNS examines two aspects of people’s lives: (a) their material conditions, (b) their perceptions of these material conditions. Both have consequences for the quality of people’s lives.

Basic necessities are democratically defined as those items listed in a “menu” that 50% or more of respondents agree “are basic necessities that everyone should be able to have and nobody should have to go without”. Items are weighted for importance according to the percentage of respondents who say an item is a basic necessity (between 50% and 100%). Respondents’ poverty (BNS) scores are based on the sum of the weightings of the basic necessities they have, as a percentage of the total they could have if they had all basic necessities. Items for inclusion in a BNS menu are identified through prior stakeholder consultations and include things, activities and services that can be reliably observed.
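To make the scoring arithmetic concrete, here is a minimal sketch in Python. The items and responses are invented for illustration, and real BNS analyses are normally done in a spreadsheet or statistics package rather than in code.

```python
# A minimal sketch of the BNS scoring arithmetic (invented data).
# Each respondent answers two questions per item:
#   is_necessity: "is this a basic necessity everyone should have?"
#   has_item:     "does your household have it?"
respondents = [
    {"is_necessity": {"bed": True, "radio": True,  "bicycle": False},
     "has_item":     {"bed": True, "radio": False, "bicycle": True}},
    {"is_necessity": {"bed": True, "radio": False, "bicycle": True},
     "has_item":     {"bed": True, "radio": True,  "bicycle": False}},
    {"is_necessity": {"bed": True, "radio": True,  "bicycle": True},
     "has_item":     {"bed": False, "radio": True, "bicycle": True}},
]
items = ["bed", "radio", "bicycle"]
n = len(respondents)

# An item counts as a basic necessity only if 50% or more of
# respondents say it is; its weight is that percentage (50-100).
weights = {}
for item in items:
    pct = 100 * sum(r["is_necessity"][item] for r in respondents) / n
    if pct >= 50:
        weights[item] = pct

max_score = sum(weights.values())

# A respondent's BNS score is the summed weight of the necessities
# they have, as a percentage of the total weight of all necessities.
for i, r in enumerate(respondents, start=1):
    owned = sum(w for item, w in weights.items() if r["has_item"][item])
    print(f"Respondent {i}: BNS score = {100 * owned / max_score:.0f}")
```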

The BNS has been widely used and adapted, particularly by international conservation NGOs that want to monitor and evaluate the impact of their work, most notably the Wildlife Conservation Society, with support from USAID.

Possible concerns:

  • Do different sections of the community have different views about what things are basic necessities? For example, might women have different views from men? Surveys in the UK, Vietnam, and South Africa show high levels of agreement across genders.
  • Do people’s views of what are basic necessities reflect their bounded realities? If they don’t know how others live, how can they have such expectations? In the UK and South Africa, the expectations of the poorest respondents were not significantly different from those of richer respondents.
  • Do people limit their view of what are necessities when they can’t achieve them? In South Africa “…there is very little evidence of people reporting that they had chosen not to possess any of the socially perceived necessities”

Hot Tip:

  • Where to go to first, to learn more: The Basic Necessities Survey webpage on MandE NEWS website provides detailed information on survey design and analyses, and extensive references to its use and related work.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

·

Greetings AEA colleagues. We are Carla Hillerns and Pei-Pei Lei – survey enthusiasts in the Office of Survey Research at the University of Massachusetts Medical School. In 2014, we shared a post about effective email subject lines for internet survey invitations. Today we’d like to focus on the body of the email. Here are strategies for writing email invitations that motivate recipients to participate in your survey.

Hot Tips:

  • Personalize the salutation. Whenever possible, begin the invitation with the recipient’s name, such as “Dear Carla Hillerns” or “Dear Ms. Lei.” Personalization helps people know that they’re the intended recipient of the invitation.
  • Do not bury the lead. Use the first line or two of the email to invite the recipient to take the survey. Some people might open your email on mobile devices, which have significantly smaller screen sizes than most computers.
  • Include the essentials. A survey invitation should accomplish the following:
    • Explain why the individual was chosen for the survey
    • Request participation in the survey
    • Explain why participation is important
    • Provide clear instructions for accessing the survey
    • Address key concerns, such as confidentiality, and provide a way for recipients to ask questions about the survey, such as a telephone number and email address
    • Express appreciation
    • Include sender information that conveys the survey’s legitimacy and significance
  • Less is more. The most frequent problem we’ve seen is an overly wordy invitation. Follow the modified KISS principle – Keep It Short and Simple. Common issues that complicate invitations are:
    • Overlong sentences
    • Redundant points
    • Extra background details
    • Cryptic wording, such as acronyms and technical jargon
    • Intricate instructions for accessing and/or completing the survey

Cool Trick:

  • Pre-notify, if appropriate. Examples of pre-notifications include an advance letter from a key sponsor or an announcement at a meeting. Pre-notification can be a great way to relay compelling information about the survey so that the email invitation can focus on its purpose.

Rad Resources:

  • Emily Lauer and Courtney Dutra’s AEA365 post on using Plain Language offers useful tips that can be applied to all aspects of survey design and implementation, including the initial invitation email, any reminder emails, and the survey itself.
  • Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method, 4th Edition by Don A. Dillman, Jolene D. Smyth, and Leah Melani Christian provides lots of helpful guidance for crafting invitations and implementing internet surveys.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

·

Hello! My name is Valerie Futch Ehrlich and I am the Evaluation and Research Lead for the Societal Advancement group at the Center for Creative Leadership. My team focuses on supporting our K-12, higher education, non-profit, and public health sector initiatives through evaluation and research. I want to share with you our recent experience using pulse surveys to collect feedback from school-wide faculty on a professional development initiative.

“Pulse surveys” are short, specific, and actionable surveys intended to collect rapid feedback that is used immediately to inform the direction of a program, activity, or culture. Through our partnership with Ravenscroft School, we used a pulse survey midway through a (mandated) year-long professional development experience and timed it so that the pulse feedback would inform the next phase of programming.

We used Waggl, a tool designed for pulse surveys, which has a simple interface offering yes/no questions, agreement scales, or one open-ended question. A neat feature of Waggl is that it allows voting for as long as the pulse is open, encouraging participants to read the open-ended responses of their peers and vote on them. This way, the most actionable requests filter up to the top based on voting, which can help drive decisions.

In our case, the Waggl responses directly informed the design of the second phase of training. We also repeated the Waggl toward the end of the school year to quickly see if our program had its intended impact, to provide ideas for a more comprehensive evaluation survey, and to inform the next year of work with the school.

Hot Tips:

  • Keep your pulse survey short! This helps ensure participation. It should be no more than 5-10 questions and take less than a minute or two.
  • Pulse survey results are quick fodder for infographics! Waggl has this functionality built in, but with a little tweaking you could get similar information from a Google Form or other tools.
  • Consider demographic categories that might provide useful ways to cut the data. We looked at differences across school levels and how different cohort groups were responding, which helped our program designers further tailor the training.
  • Pulse surveys build engagement and buy-in…when you use them! Faculty reported feeling very validated by our use of their feedback in the program design. The transparency and openness to feedback by our design team likely increased faculty buy-in for the entire program.

Lesson Learned:

Think outside the box for pulse surveys. Although they are popular with companies for exploring employee engagement, imagine using them with parents at a school, mentors at an after-school program, or even students in a classroom giving feedback to their instructor. There are many possibilities! Any place you want quick, useful feedback would be a great place to add them. In our next phase of work, we are considering training school leaders to send out their own pulse surveys and incorporate the feedback into their practices. Stay tuned!

The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PreK-12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

·

I am Holly Kipp, Researcher, from The Oregon Community Foundation (OCF). Today’s post shares some of what we’re learning through our efforts to measure social-emotional learning (SEL) in youth in the context of our K-12 Student Success Initiative.

The Initiative, funded in partnership with The Ford Family Foundation, aims to help close the achievement gap among students in Oregon by supporting expansion and improvement of out-of-school time programs for middle school students.

Through our evaluation of the Initiative, we are collecting information about program design and improvement, students and their participation, and student and parent perspectives. One of our key data sources is a survey of students about their social-emotional learning.

Rad Resources: There are a number of places where you can learn more about SEL and its measurement. Some key resources include:

  • The Collaborative for Academic, Social, and Emotional Learning, or CASEL
  • The University of Chicago Consortium on School Research, in particular their Students & Learning page

In selecting a survey tool, we wanted to ensure the information collected would be useful both for our evaluation and for our grantees. By engaging grantee staff in our process of tool selection, we gave them a direct stake in the process and, we hoped, buy-in to using the tool we chose – not only for our evaluation efforts but for their ongoing program improvement processes.

Hot Tip: Engage grantee staff directly in vetting and adapting a tool.

We first mined grantee logic models for their outcomes of interest, reviewed survey tools already in use by grantees, and talked with grantees about what they wanted and needed to learn. We then talked with grantees about the frameworks and tools we were exploring in order to get their feedback.

We ultimately selected and adapted The Youth Skills and Beliefs Survey developed by the Youth Development Executives of King County (YDEKC) with support from the American Institutes for Research.

Rad Resource: YDEKC has made available lots of information about their survey, the constructs it measures, and how they developed the tool.

Rad Resource: There are several other well-established tools worth exploring, such as the DESSA (or DESSA-mini) and the DAP and related surveys, especially if cost is not a critical factor.

Hot Tip: Student surveys aren’t the only way to measure SEL! Consider more qualitative and participatory approaches to understanding student social-emotional learning.

Student surveys are only one approach to measuring SEL. We are also working with our grantees to engage students in photovoice projects that explore concepts of identity and belonging – elements that are more challenging to measure well with a survey.

Rad Resource: AEA’s Youth Focused TIG is a great resource for youth focused and participatory methods.

The American Evaluation Association is celebrating Oregon Community Foundation (OCF) week. The contributions all this week to aea365 come from OCF team members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

·

We are Caitlin Ruffenach, Researcher, and Kim Leonard, Senior Evaluation Officer, from The Oregon Community Foundation (OCF). Among other things, we are working on an evaluation of the Studio to School Initiative at OCF, which focuses on the development of sustainable arts education programs through partnerships between arts organizations and schools.

This past summer, in collaboration with the Oregon Arts Commission, we conducted a survey of arts organizations in Oregon in an effort to learn about the arts education programming they provide, often in concert with what is available more directly through the school system.

The purpose of this survey was to help the Foundation understand how the grantees of its Studio to School Initiative fit into the broader arts education landscape in Oregon. We hope the survey results will also serve as a resource for grantees, funders, and other stakeholders to understand and identify programs delivering arts education throughout the state.

Lesson Learned: To ensure we would have the most useful information possible, our survey design process included several noteworthy steps:

  1. We started with existing data: by gathering information about organizations that had received arts education funding in Oregon in the past, we were able to target our efforts to recruit respondents;
  2. We consulted with others who had done similar surveys to learn from their successes and challenges;
  3. We paid close attention to survey question wording, to focus as tightly as possible on what a survey can measure; and
  4. We vetted our early findings with arts education stakeholders.

Hot Tip: A collaborative, inclusive survey design process can result in better survey tools. We used a small, informal advisory group throughout the process that included members who had conducted similar surveys and representatives of our target respondent group. They helped with question wording, as well as with identifying participants for a small pilot of the survey.

Hot Tip: Vetting preliminary findings with stakeholders is fun and helps support evaluation use. We took advantage of an existing gathering of arts stakeholders in Oregon to share and workshop our initial findings. We used a data placemat, complete with reusable stickers, to slowly reveal the findings. We then engaged the attendees in discussions about how the findings did or didn’t resonate with their experiences. What we learned during this gathering is reflected in our final report.

Rad Resources: We are not the first to try a more inclusive process both in developing our survey tool and in vetting/interpreting the results! Check out the previous aea365 post about participatory data analysis. And check out the Innovation Network’s slide deck on Data Placemats for more information about that particular tool.

The American Evaluation Association is celebrating Oregon Community Foundation (OCF) week. The contributions all this week to aea365 come from OCF team members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

·

Hi, I’m Ama Nyame-Mensah. I am a doctoral student at the University of Pennsylvania’s School of Social Policy & Practice. In this post, I will share with you some lessons learned about incorporating demographic variables into surveys or questionnaires.

For many, the most important part of a survey or questionnaire is the demographics section. Not only can demographic data help you describe your target audience, but it can also reveal patterns in the data across certain groups of individuals (e.g., by gender or income level). So asking the right demographic questions is crucial.

Lesson Learned #1: Plan ahead

In the survey/questionnaire design phase, consider how you will analyze your data by identifying relevant groups of respondents. This will ensure that you collect the demographic information you need. (Remember: you cannot analyze data you do not have!)
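As a concrete (and entirely hypothetical) illustration in Python with pandas: if you plan to report, say, satisfaction by gender, the gender question has to be on the survey, because the cut below cannot be produced otherwise. The variable names and data are invented.

```python
import pandas as pd

# Invented illustrative data; in practice this comes from your survey export.
df = pd.DataFrame({
    "gender":       ["Woman", "Man", "Woman", "Prefer not to answer", "Man"],
    "satisfaction": ["High", "Low", "High", "High", "High"],
})

# The planned cut: satisfaction by gender, as row percentages.
# If gender had never been collected, no analysis could recover this table.
# Note that "Prefer not to answer" (see Lesson Learned #3) stays visible
# as its own category rather than silently disappearing.
print(pd.crosstab(df["gender"], df["satisfaction"], normalize="index"))
```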

Lesson Learned #2: See what others have done

If you are unsure of what items to include in your demographics section, try searching through AEA’s Publications or Google Scholar for research or evaluations done in a similar area. Using those sources, you can locate specific tools or survey instruments with demographic questions that you would like to incorporate into your own work.

Lesson Learned #3: Let respondents opt out

Allow respondents the option of opting out of the demographics section in its entirety, or, at the very least, make sure to add a “prefer not to answer” option to all demographic questions. In general, it is good practice to include a “prefer not to answer” choice when asking sensitive questions because it may make the difference between a respondent skipping a single question and discontinuing the survey altogether.

Lesson Learned #4: Make it concise, but complete

I learned one of the best lessons in survey/questionnaire design at my old job. We were in the process of revamping our annual surveys, and a steering committee member suggested that we put all of our demographic questions on one page. Placing all of your demographic questions on one page will not only make your survey “feel” shorter and flow better, but it will also push you to think about which demographic questions are most relevant to your work.

Collecting the right demographic data in the right way can help you uncover meaningful and actionable insights.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

·

Hi, I’m Wendy DuBow, a senior research scientist and director of evaluation at the National Center for Women & Information Technology (NCWIT). Our mission is to increase the meaningful participation of women in technology fields. We focus on sharing theory- and evidence-based practices with stakeholders in education and industry to support them as they recruit, retain, and promote girls and women in tech. In my position, I see a lot of K-20 interventions aimed at increasing women in tech and, alongside them, a wide variety of measurement instruments.

Lesson Learned: Using Social Cognitive Career Theory. Most of the evaluations I see don’t take advantage of theory or past empirical evidence to ground their assessments. It would be great to share more theory- or evidence-based evaluation approaches. The social cognitive career theory (SCCT) model has been widely used to explain people’s educational and career interests in STEM. We wanted to assess students in computer science-related programs specifically, so we developed an instrument that uses SCCT to assess five constructs: interest, self-efficacy, outcome expectations, perceived social supports and barriers, and intent to persist in computing. Our survey has been used in a number of different educational settings, with middle and high school students and with college students and above.

Of course, there are many other valid and reliable instruments available to evaluators of STEM education programs, but they can be hard to find when you’re pressed for time in the proposal-writing or instrument-development stages. For expediency, and for the larger good of sharing data and measuring interventions systematically, I would very much like to see STEM education evaluators and researchers have a shared repository of instruments. To this end, I’m holding two sessions at the AEA meeting in Chicago to discuss the idea (see “Get involved” below).

Hot Tip: Our SCCT survey instrument is publicly available upon request.

Cool Trick: We currently use SurveyMonkey for online surveys, and also have access to Qualtrics, so if you use either of these tools, we can share our SCCT survey directly with your pro account, already formatted, though you can customize it as you see fit! We just ask that you acknowledge NCWIT in any presentations or write-ups of the data.

Rad Resource: A variety of STEM assessment tools have already been collected in the engineering field.

Lessons Learned: Be sure that all of the SCCT survey constructs match the intended outcomes of the program, and tailor the wording of the parenthetical explanations of each item to the program being evaluated.

Get involved: Please come to the AEA 2015 Think Tank “Improving the Quality and Effectiveness of Computer Science Education Evaluation Through Sharing Survey Instruments” and the multi-paper session “Four Approaches to Measuring STEM Education Innovations: Moving Toward Standardization and Large Data Sets.”

The American Evaluation Association is celebrating STEM Education and Training TIG Week with our colleagues in the STEM Education and Training Topical Interest Group. The contributions all this week to aea365 come from our STEM TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi, I’m Paul Bakker. I run an independent evaluation consultancy, Social Impact Squared. As an independent consultant, I don’t have a call centre to conduct population-based surveys. However, a colleague (Brian Cugleman of Alterspark) introduced me to Google Consumer Surveys. I would like to share what I have learned from researching and trialling how Google Consumer Surveys can be used for evaluations.

Rad Resource: Google Consumer Surveys embeds your survey questions into a network of websites, and people answer your questions in exchange for access to the websites’ content. You can target specific populations by:

  • The location of respondents’ IP addresses,
  • Demographics such as age or gender,
  • A custom screening question.

Google Consumer Surveys allows you to ask up to 10 questions of any one group of respondents for a relatively low cost: 10¢ per complete for one-question surveys, and $1.10 to $3.50 per complete for surveys of 2 to 10 questions.

Lesson Learned: Google Consumer Surveys’ accuracy and reliability are comparable to those of other surveying methods. You can view their paper on how Google Consumer Surveys compares to other internet surveys here, and you can see how well their predictions for the 2012 U.S. Presidential race compared to other internet and phone surveys here.

Hot Tip: You can use Google Consumer Surveys to help answer evaluation questions around the relevance and effectiveness of population level programs and policies.

As a trial, I ran a couple of one-question surveys:

  • The first found that only about 29% of Canadians remember reading a Material Safety Data Sheet outside of a training session. That metric could be used as a possible performance measure for Canada’s Workplace Hazardous Materials Information System.
  • The second found that 70% of recipients of Canada’s Universal Child Care Benefit (UCCB) felt that their child’s care would stay the same (or that their child would continue to receive no child care) if they did not receive the benefit. While there may have been some issues with the survey’s response options, the results could inform an evaluation of the UCCB’s performance.

You can view the overall results and break them down by demographics by clicking on the links above.


We’re celebrating 2-for-1 Week here at aea365. With tremendous interest in the blog lately, we’ve had many authors eager to share their evaluation wisdom, so for one special week, readers will be treated to two blog posts per day! Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

·

I’m Ann Martin, an evaluator working with a team of science educators and outreach professionals in Hampton, VA. I frequently use Google Forms for evaluation and other projects. Forms are free, simple, intuitive for users, and get the job done.

In the past, though, Google Forms had several notable limitations. If you found that Forms didn’t meet your needs in the past, you might not be aware of great new features that represent significant improvements. It’s worth taking another look at this free resource!

Hot Tip: Customize the visual look, feel, and branding of your survey! In September 2014, new Forms functionality allowed survey designers to add background and header images and to customize fonts and other display options. Previously, theme options were limited. You can use this functionality to make your survey more readable and inviting. A custom header image with a logo may make your users feel more comfortable responding, or can make your survey a seamless part of a website in which you embed it. You can also embed images and videos within the body of the survey itself, which is handy for quizzes or assessments.

Figure 1 – Customization options include a header image, page and form backgrounds, and fonts.


Cool Trick: Google Forms now support more complex survey design and administration options, including progress bars, data validation, logic/path branching, and randomizing the order of options in multiple choice questions. It’s also easier now to set up your survey’s questions. For instance, if you have a long list of options to include in a question, you can now copy-and-paste in a list from a word processor or spreadsheet table and automatically populate. (I wish that option had existed a few years back, when I created a drop-down with 200 alphabetized options!)

Cool Trick: New Add-ons enable even more behind-the-scenes functionality. The latest Add-ons include nifty widgets like Form Notifications, which will send automatic emails to your survey respondents, and Form Publisher, which will use survey responses to fill in a new document from a template.

Figure 2. Example Add-ons for Google Forms (screen capture from the Google Drive Add-ons Store).


Rad Resource: The Google Drive blog shares updates to Forms functionality so that you can always be aware of new features. I’m also more than happy to share tips if you contact me.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

·

My name is Lisa R. Holliday, and I am an Evaluation Associate with The Evaluation Group in Columbia, SC. I was recently cleaning survey data and preparing it to be uploaded into a database. The survey had been offered multiple times, and I wanted to track the responses of participants who had completed the survey each time it was offered. To start with, I needed to create a list of non-duplicated names. I would then be able to use this list to determine which participants had taken the survey each time it was offered.

Hot Tip: Power Query can remove duplicates from a list. Power Query is a handy add-in for Microsoft Excel that allowed me to generate my list quickly and easily. Power Query is a business intelligence tool that works with Excel 2010 and 2013. It is available for free download at the Microsoft website: http://www.microsoft.com/en-us/download/details.aspx?id=39379

Cool Tricks:

Step 1: I created a unique field in my survey data that would allow me to identify each person. Just in case I had two people from the same site with the same name, I concatenated their name with their location and job title. The final cell looked like this:

[Screenshot: the concatenated identifier cell in Excel]

I then created a master list of names (with duplicates) using cut and paste. Once this was done, I was ready to load my data into Power Query.

Step 2: I loaded my data via “From Table” under the Power Query tab.

[Screenshot: the “From Table” option on the Power Query tab]

Step 3: Within the Power Query window, I selected the concatenated column, and then “Remove Duplicates.”

[Screenshot: the “Remove Duplicates” option in the Power Query window]

After my query ran, I selected “Close and Load.” Excel created a new table containing only unique values.

[Screenshot: the resulting table of unique values]

Hot Tip: Why not use “Remove Duplicates” from the Data Tab?

Any data you load into Power Query can be refreshed, and the query will automatically be re-executed. This feature is valuable if you plan to add more data to your original data set. In contrast, “Remove Duplicates” under the Data Tab does not have this option.

Hot Tip: Other Functionality. Power Query has a lot of other useful functionality and is worth exploring. It can easily import data from a variety of sources (including websites), un-pivot data, and split columns (such as First and Last name).
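If you work in Python rather than Excel, the same de-duplication workflow can be sketched with pandas. This is an equivalent technique, not Power Query itself, and the column names and data below are invented for illustration.

```python
import pandas as pd

# Invented example rows standing in for repeated survey administrations.
responses = pd.DataFrame({
    "name":     ["Lisa Holliday", "Lisa Holliday", "Lisa Holliday"],
    "location": ["Columbia",      "Columbia",      "Charlotte"],
    "title":    ["Evaluator",     "Evaluator",     "Evaluator"],
})

# Step 1: concatenate name, location, and job title into one key,
# so two people who share a name are still distinguishable.
responses["key"] = (responses["name"] + " "
                    + responses["location"] + " " + responses["title"])

# Steps 2-3: drop duplicate keys to get the non-duplicated master list.
master = responses.drop_duplicates(subset="key")
print(master)

# Like a refreshed Power Query, rerunning this script after new survey
# rounds are appended regenerates the unique list automatically.
```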

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

·
