DEOET TIG Week: Kim Manturuk on Increasing Survey Response Rates in Open Online Classes

Hello, my name is Kim Manturuk and I’m the Program Evaluator at Duke University’s Center for Instructional Technology. I get to evaluate a lot of interesting projects related to teaching and technology, but one of my biggest jobs is to evaluate Duke’s Massive Open Online Classes (MOOCs).

MOOCs are free, non-credit classes offered by universities and other institutions to anyone in the world. Since 2012, Duke has developed over 30 classes that have amassed over 2 million registrations. It’s my job to evaluate how well students in these classes are learning what the instructors want them to learn.

To accomplish this, I send out a lot of surveys – over 2 million surveys (and counting) in less than 3 years! When I started this project, I would be lucky to get 5% of people who registered for a class to fill out the pre-course survey, and the post-course survey response rates often hovered around 1%. It’s practically impossible to say anything evaluative with a 1% response rate, so I tried a lot of different things to get more people to fill out surveys. Some worked and some didn’t, but I learned several good lessons along the way.

Lesson Learned: When sending a survey to an online class, sign the email invitation with the class instructor’s name (with permission, of course). People are more likely to respond when the invitation comes from someone they know and respect.

Lesson Learned: Avoid sending surveys on Mondays, when they are more likely to be ignored or accidentally deleted during a start-of-week inbox clean-out. It is better to send surveys from mid-morning on Tuesday through Friday.
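If you automate survey invitations, that timing rule is easy to encode. Here is a minimal Python sketch (a hypothetical helper, not part of any particular survey platform) that finds the next Tuesday-through-Friday, mid-morning send slot:

```python
from datetime import datetime, timedelta

def next_send_time(now):
    """Return the next Tue-Fri 10:00 AM send slot at or after `now`.

    Python weekday numbers run Monday=0 through Sunday=6,
    so the valid send days Tuesday-Friday are 1 through 4.
    The 10:00 AM "mid-morning" target is an assumption.
    """
    candidate = now.replace(hour=10, minute=0, second=0, microsecond=0)
    if now > candidate:  # today's 10 AM slot has already passed
        candidate += timedelta(days=1)
    while candidate.weekday() not in (1, 2, 3, 4):  # skip Sat/Sun/Mon
        candidate += timedelta(days=1)
    return candidate
```

For example, an invitation queued on a Monday morning would be held until 10:00 AM on Tuesday, and one queued on a Friday afternoon would wait until the following Tuesday.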

Cool Trick: Tell people in the survey invitation how many questions they will be asked and how long it will take. I set it up so that students are asked just ten questions and then they are automatically brought back to their class.

Hot Tip: Be thoughtful about what demographic questions you ask in online classes. In some cultures, questions about race or gender are considered confusing, intrusive, or even offensive.

Rad Resource: Would you like to take a free class from Duke University? Visit our list of classes. And if you do register for a class, be sure to fill out the course surveys!

The American Evaluation Association is celebrating DEOET TIG Week with our colleagues in the Distance Education and Other Educational Technologies Topical Interest Group. The contributions all this week to aea365 come from our DEOET TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

4 thoughts on “DEOET TIG Week: Kim Manturuk on Increasing Survey Response Rates in Open Online Classes”

  1. Hi Kim!

    My name is Casey Shelley. I am a student in the Professional Master of Education Program at Queen’s University. Currently, I am enrolled in a course called “Program Inquiry and Evaluation.” This is my first time studying in this area, and I have found it to be quite interesting.

    I came across this article and it coincided so well with a project that I am currently working on, that I thought I would reach out to you with gratitude. This project has required me to design my first program evaluation. In doing this, it is important to consider the perspectives/opinions of stakeholders. I had decided that presenting a survey to these individuals would be the best way to go about this, but also realized that there are issues with this (problems which myself along with classmates have been unsure how to solve). Your article has provided me with great insight: how to gain attention/interest from those who are presented with my survey as well as questions that should be avoided. Thank you for sharing your advice, as it will help me to formulate my survey for the final version of my project!

    I have a question for you: I had decided that for my project I would do a written survey (which would be filled out by relevant stakeholders, including program staff and clients to name a few). A classmate had suggested that literacy ability may affect the accuracy/possibility of this. Do you think that an online survey may be more effective? Perhaps if there were an option to add an audio component to questions?

    A follow-up question: is there a format of survey that leads to the smallest level of possible stakeholder bias? (oral, written, etc…).

    As a final note, I wanted to mention that as an individual who has taken quite a few online courses throughout my educational journey, it was interesting to hear your perspective. I have personally been invited to participate in surveys related to my online classes, but did not realize the significance of this participation until now. In the future, I will absolutely take part in such surveys that are strategically designed by individuals like you to better programs for the future.

    Thank you for taking the time to read my message!

  2. Hi Kim,
I would like to thank you for sharing those tips. However, I have three questions for you.

    Question 1:

    Do you think learners were not responding to surveys because classes were non-credit and free?

    Question 2:

Have you seen any increase in the percentage of people responding after trying the tips that you have stated?

To what extent were those tips beneficial?

    Question 3:

Surveys were the only method used to collect information on how well those students were learning.

To what extent do you think the evaluation would be valuable and measurable in such a situation?
    Thank you

  3. Kim,

Thank you for such practical tips about soliciting feedback for evaluation. It can be so difficult to get authentic feedback when, as you mentioned, only a small percentage of your stakeholders respond. As I think ahead to my future as an administrator, your practical tips are, indeed, very practical! I especially appreciate your tip about not sending your surveys at the beginning of the week. Monday morning unread emails can be daunting, so your suggestion of sending them mid-morning Tuesday through Friday is brilliant. Thanks for sharing.


Your tips on surveys are great! Sounds like you have captured much in the way of useful feedback, especially about whether, and what, MOOCs might be teaching effectively. Any "topline" findings you can share?
