Hi, my name is Ayana Perkins, programming Co-Chair of the Atlanta-area Evaluation Association and Senior Research Analyst and Evaluator at Infinite Services and Solutions. I am a qualitative enthusiast and often train other evaluators and researchers in these methods. What I have noticed is that participants are more likely to feel valued and engaged when sharing data through qualitative methods. In fact, interviews and focus groups are wonderful ways to encourage satisfaction, and when done right, the evaluator can walk away with credible findings while the participant leaves renewed and excited about taking part in the data collection event.
- Practice, practice, practice. With each new project, every member of the evaluation team should have a firm understanding of what to expect in the data collection event. What contingency plans exist when the recorder doesn’t work or too many people show up? Investing time in learning how to respond to these worst-case scenarios produces an investigator or evaluation team that is well poised to resolve unexpected issues.
- Plan for informal conversation. Before any interview or focus group, allow 5 to 15 minutes for informal conversation, built into the scheduled length of the session. This time lets the evaluator shed the role of expert and signals that no greater effort than conversation is required. It also increases an individual’s willingness to participate more fully in any icebreaker activity.
- Create opportunities for success. Previous experience and personality differences can partially influence how a person will respond to the open-ended format. Even with these influences, there are strategies to help participants feel that their contribution was a successful effort:
- Emphasize that there is no right answer, which helps reduce social desirability in responses
- Acknowledge that no response is a response, and ask whether the question was meaningful to them or whether more time is needed before responding
- Connect similarities in responses to enhance group dynamics
Focus groups and interviews do require a lot of preparation, but that effort pays off in rich findings and satisfied participants.
Want to learn more about qualitative methods? Visit this website to identify ways to strengthen your project: http://www.qualres.org/. Although the site is not specific to evaluation, most of the recommendations also apply to qualitative evaluation projects.
Have any of these strategies worked for you? Please share your experiences in the comments.
The American Evaluation Association is celebrating Best of aea365, an occasional series. The contributions for Best of aea365 are reposts of great blog articles from our earlier years. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association and/or any contributors to this site.
3 thoughts on “Best of AEA365: Setting Yourself Up For Success in Qualitative Data Collection by Ayana Perkins”
Thank you for your helpful post! I am currently working on my Master’s program and am at the tail end of a program evaluation course. I am interested in the use of qualitative data as a measure of the effectiveness of a social program. I am a Social Emotional lead teacher at my school and have been involved in a district-wide initiative to support teachers in implementing social emotional learning strategies in the classroom. While we understand the benefits of supporting students’ social emotional learning, many a meeting centers on what “measures” we can use to validate the program. The premise holds that students who demonstrate self-awareness will be able to manage emotions and thus manage their learning outcomes to a higher degree. The data collected is therefore both quantitative (increased academic outcomes) and qualitative (how students feel). While the former can be measured through student achievement on a proficiency scale, for example, the latter, qualitative data is more meaningful and supports the initiative of SEL. The Kellogg Foundation states, “When programs operate in real communities where influences and forces are beyond your control, evaluation is generally more about documenting a program’s contribution rather than proving something” (p. 54).
Thank you for the provided reference. As I investigated this site, I learned how complex qualitative research can be. As cited by the Robert Wood Johnson Foundation, “Qualitative research is multimethod in focus, involving an interpretive, naturalistic approach to its subject matter. This means that qualitative researchers study things in their natural settings, attempting to make sense of, or interpret phenomena in terms of the meanings people bring to them. Qualitative research involves the studied use and collection of a variety of empirical materials – case study, personal experience, introspective, life story, interview, observational, historical, interactional and visual texts – that describe routine and problematic moments and meanings in individuals’ lives.” (Denzin, N. K., & Lincoln, Y. S., 1994, p. 2). Despite these nuances and variations, I am more convinced that data collected from personal experiences and narratives (positive and negative) can give insight into the areas of strength and weakness of a social program.
Your article speaks to participant engagement, where participants feel valued for sharing their insights. In my classroom, I glean more insight into how students are progressing in their capacity as social learners through formal and informal conversations. As per Herman (1987), evaluations that include qualitative data can provide “detailed description and on in-depth understanding as it emerges from direct contact and experience with the program and its participants. Using more naturalistic methods of gathering data, qualitative techniques rely on observations, interviews, case studies, and other means of fieldwork” (p. 23). Further to your hot tips, I couldn’t agree more that conversing with participants takes not only time but a practiced skill in asking questions that draw out relevant information. Strategies such as a soft warm-up and explaining that no answer is right or wrong do indeed help conversations flow more easily. Open conversations should also allow for contingencies when unexpected information is uncovered; thus, in a program evaluation, evaluators need to consider and identify possible unintended outcomes that could arise from planning an informal conversation. There are lots of thought points here. I appreciate your post.
Data Collection Methods. (2016, April 2). [Video]. YouTube. Retrieved June 7, 2022, from https://www.youtube.com/watch?v=NJ-gW6adQTc
Herman, J. L. (1987). Evaluator’s handbook. Accessible via Queen’s Library.
W. K. Kellogg Foundation. (2004). Developing a basic logic model for your program. https://cyfar.org/sites/default/files/W.%20K.%20Kellogg%20Foundation,%202004.pdf
Denzin, N. K., & Lincoln, Y. S. (1994). Introduction: Entering the field of qualitative research. In N. K. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (pp. 1–18). Thousand Oaks, CA: Sage. As cited by the Robert Wood Johnson Foundation.
My name is Wendy Struck Fox, and I am enrolled in the same Program Inquiry & Evaluation course as Anisha. The course has been challenging for a beginner to the topic, but I have gradually learned more, working toward a complete Program Evaluation Design project. Your blog caught my attention because I am interested in learning more specific information about data collection. While quantitative data seems easier to understand, qualitative data has been less clear to me.
I was surprised but happy to read that “interviews and focus groups are wonderful ways to encourage satisfaction, and when done right, the evaluator can walk away with credible findings and the participant can leave renewed and excited,” because I had been of the mind that qualitative data is often unclear, incomplete, and lacking specificity.
Your tip to “practice, practice, practice” resonates with me, and I am encouraged to hear that a preliminary informal conversation is such an important and effective strategy that it should be built into the timeline. Advice like “there is no right answer” has helped me in the past as well. It seems vital to communicate to participants that their opinions and ideas are meaningful and valued.
Thank you for sharing this information; it has helped me better understand how to conduct qualitative data collection. I appreciate your time,
Hello Ayana. My name is Anisha, and I would like to start by thanking you for sharing your expertise and useful tips on qualitative methods. I have recently become a graduate student at Queen’s University (Master of Education), and this is my first course and experience with program evaluations. To be honest, I am finding it difficult to absorb all of the information relating to program evaluations, and to fully grasp how they apply to my day-to-day profession teaching at both the primary and post-secondary levels.
I am truly excited by your article because, for the first time, I am immersed in the parallels between this aspect of evaluation (collecting data) and my own teaching practices. For example, the ‘Hot Tips’ in your article remind me of my preparation for the first weeks of school, and I would love to share these parallels with you.
Similar to your first tip, “Practice, practice, practice,” I relate this to a well-planned lesson plan that includes flexibility and contingency plans in the event of worst-case scenarios. As you mentioned, practicing through an organized plan allows you to anticipate those worst-case scenarios.
Your second tip, “Plan for informal conversation,” relates to how I establish a welcoming and inclusive learning environment. In a classroom setting, the first weeks of school are dedicated to giving students and myself blocks of time for conversations that don’t necessarily relate to the course material but are necessary to build the relationships and trust that, as you said, make participants more willing to participate fully. There is also opportunity to collect useful and genuine (informal) data during these times.
Your last tip, “Create opportunities for success,” relates to how educators invite students to share their thoughts and learning through various methods. These include, but are not limited to, building on what a participant is saying by making connections to the material and to the thoughts of other participants, and using participation not only to answer questions but also to seek help and clarification. As you said, such strategies allow participants to feel empowered by their contributions.
Being able to draw these parallels allows me to better acquaint myself with this specific method of collecting information. Thanks to you, I am better able to refine the evaluation plan I have been working on by ensuring the evaluation reflects the integrity of the participants, thus providing the insights that are required for change.
Should you have any further tips and/or advice, I would love to hear from you.