Greetings from Columbia, SC! My name is Heather Bennett, MSW, and I have experience working in the field of evaluation for both the public and private sector. Currently, I work as a Research Associate in the Office of Program Evaluation (OPE) at the University of South Carolina where I have the opportunity to lead and work collaboratively on state and federally funded education initiatives in South Carolina. One of my primary responsibilities is to lead our qualitative data analysis efforts, including the analysis of video or audio recordings of cognitive labs, focus groups, interviews, and responses to open-ended survey items.
Lesson Learned: For my tip-a-day for aea365 I am going to focus on one vital and fundamental lesson I’ve learned through the analysis of responses to open-ended survey items — the quality of the question asked has the greatest impact on the data analysis process.
As evaluators, I’m sure we have all inherited some projects with the corresponding data collection instruments and noticed some issues with the construction of items…or worse, we have looked back on the open-ended items we’ve developed and asked ourselves: “What was I thinking?” Upon inheriting the evaluation of a program, I was soon reminded of the impact item writing can have on data management. Issues of data utility arose as my team and I reviewed the structure of qualitative items and worked to develop clear coding structures for corresponding data.
Hot Tip: Poorly written items do not always follow the “garbage in, garbage out” scenario. However, it takes more time to take out the trash and get to meaningful data (data cleaning, coding, analysis) when you start with bad items. Below are a few things to keep in mind when developing open-ended items that will support your analysis and coding efforts once the data are collected.
First, you must have a clear understanding of what it is you want to learn about the project before you do anything else. What information do you really hope to gain? What is its utility for the program? This process should be guided by the project scope and involve project stakeholders to ensure the usefulness of the data collected.
Now that you have focused your data collection efforts, use these tips when developing your open-ended item(s):
- Ask one question at a time.
- Avoid leading questions.
- Avoid including personal biases in questions.
- Be specific about the topic.
- DO NOT ask questions that can be answered with yes/no.
- Indicate the number of responses requested from the participant.
- Ask clear and concise questions to avoid participant fatigue.
Following these tips will improve your efforts to collect focused and clear information from program participants.
This aea365 Tip-a-Day contribution comes from the American Evaluation Association. If you want to learn more from Heather, join us in San Antonio this November for Evaluation 2010 and check out her session on the Conference Program. If you would like to contribute an aea365 Tip, please send an email to aea365@eval.org.
Excellent points, Heather! One point I would add is that it also helps to double-check against your overall evaluation questions/criteria once you have your interview/survey questions written. I’ve seen it happen: when we are writing up the data, we go back to the questions/criteria and realise, “Oh, we forgot to ask about that….”
Thanks Heather. Here are a couple of additional tips to get “thicker” responses in qualitative interviews:
1) Rather than asking “why do you think X, do X, etc.,” interviewees may feel less defensive if you ask “how” questions. For example, “can you elaborate on how you think about X, do X, etc.” (Obviously, as you mentioned, only one topic at a time.)
2) Ask for specifics — generalizations don’t lead to very useful analysis. For example, “Please describe a specific experience you had….”
Heather – Excellent! This is such an important point. I think one of the challenges is recognizing that questions that work in interviews or focus groups may not necessarily work in surveys. And with survey software that now allows for branching and piping, the lines may be blurring, tempting us to be more casual with our open-ended questions. Thank you for your succinct guidelines.