Hi folks! I’m Jill Scheibler, a community psychologist and Senior Research Analyst at Carson Research Consulting, a women-led firm whose mission is to help clients thrive by using data to measure impact, communicate, and fundraise. We’re passionate about storytelling with data to make a difference.
At CRC I’m the “word nerd,” implementing our qualitative projects. Like many evaluators, I’ve had to translate academically honed skills to the often faster-paced world of evaluation. A recent project for a county health department’s substance abuse initiative provides an example of how I tailor qualitative methods to meet clients’ needs.
Hot Tips
Allot ample time for clarifying goals. As with all good research, methods choices flow from the question at hand. In this case, our client wanted to understand the impact of substance abuse on their county and to identify new resources that could be tapped. Like many clients, they lacked research savvy and thought they needed services that exceeded their budget and available time. We gradually learned they had access to lots of quantitative data, plus support from the state to help interpret it. What they were missing was community stakeholder feedback. So, we provided a qualitative needs assessment component.
Build in more meetings than you think you’ll need, and bring checklists. Be prepared to leave meetings thinking you have all the answers you need, only to learn afterwards that you’ve been (well-meaningly) misinformed! (Quantitative sidebar example: after building a data dashboard for another client in Excel 2013, based on their word, we learned they had Excel 2007. A costly reminder to always ask more questions!)
Choose tool(s) carefully to maximize usefulness. I generally opt for interviews, where probes can offset “one-shot” data collection situations. Here, I instead designed a qualitative survey, using mostly open-ended questions, to gather perspectives efficiently. The client collected the surveys themselves, disseminating hard copies and a SurveyMonkey.com link, and reached a targeted sample from within a community coalition.
Familiar guidelines for interview and survey design apply to qualitative surveys, but I advise keeping questions tightly focused and surveys as short as possible to mitigate the higher skip rates that open-ended questions tend to produce.
Cool Trick
You may think your reporting options are limited compared to those for quantitative results. Not so! Instead of writing text-heavy reports that eat up valuable time and that folks are disinclined to read (#TLDR), consider telling “data stories” using bullet points and visualizations. This client received a two-pager for internal, local stakeholder, and state use. I’ll also provide an in-depth explanation of the results and action steps in a webinar.
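To make the “data story” idea concrete, here is a minimal Python sketch of one achievable visual: tallying hand-coded themes from open-ended responses into a simple bar chart. The theme labels and counts below are invented placeholders, not data from this project, and the qualitative coding itself is assumed to have happened beforehand.

```python
# A minimal sketch of one "data story" visual: counts of hand-coded themes
# from open-ended survey responses, shown as a horizontal bar chart.
# Theme labels and counts are hypothetical placeholders.
import matplotlib.pyplot as plt

# Hypothetical tallies from a coded qualitative survey
themes = {
    "Need for treatment beds": 18,
    "Transportation barriers": 12,
    "Stigma in the community": 9,
    "Gaps in youth prevention": 7,
}

labels = list(themes.keys())
counts = list(themes.values())

fig, ax = plt.subplots(figsize=(6, 3))
ax.barh(labels, counts, color="#4c72b0")
ax.invert_yaxis()  # most frequently mentioned theme on top
ax.set_xlabel("Number of respondents mentioning theme")
ax.set_title("Top community-identified needs (hypothetical data)")
fig.tight_layout()
fig.savefig("theme_counts.png", dpi=150)
```

A chart like this, paired with a few bullet points and an illustrative quote per theme, can carry a two-pager on its own.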
Rad Resources
Jansen’s “The Logic of Qualitative Survey Research and its Position in the Field of Social Research Methods.”
Great tips on qualitative surveys from the Nielsen Norman Group.
Awesome tips from CRC colleagues for larger community surveys.
Achievable qual visualization ideas from Ann Emery.
Some tools for qual analysis and visualization from Tech for Change.
I genuinely enjoy working creatively with clients, because it makes evident how well suited qualitative methods are to linking research to action. I’d love to hear how others do this work, so please get in touch!
The American Evaluation Association is celebrating Community Psychology TIG Week with our colleagues in the CP AEA Topical Interest Group. The contributions all this week to aea365 come from our CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
First, thank you for the post. I found it very informative. I am currently designing an evaluation, and I am wondering about your thoughts on collecting qualitative data with students who are relatively young. I am designing an evaluation of literacy interventions for grade 2 students and want to use feedback from the students themselves regarding their efficacy in reading and the program. However, their responses would be limited due to age and writing ability. Would it be appropriate to question them orally and record their responses? Do you have any suggestions on how I might approach this?
Any feedback would be greatly appreciated.
Thanks,
Adam.
Thanks for the shout-out, Jill! Great article, as usual.