Analyzing Interview Data by Beverly Peters

Greetings! I am Beverly Peters, assistant professor of Measurement and Evaluation at American University. This is the final article in a 5-part series on Using Interviews for Monitoring and Evaluation. In previous articles, we discussed conducting good interviews, including crafting good questions and interview guides, and learning the skills of a good interviewer.  

Before you start analyzing interview data, consider how you will record and manage it. Evaluators often use audio or video recordings to preserve data for transcription and analysis, as this is the most reliable way to capture what respondents say. Some respondents, however, are uncomfortable being recorded and may not be forthcoming with answers if they are; in these cases, recording could jeopardize data quality. As a result, evaluators often take written notes as the interview takes place. Although notes are not as complete as a recording, they may be the best way to capture valid, reliable data, particularly on sensitive topics.

After you complete an interview, aim to transcribe the recording (if you recorded) or write up your notes (if you did not) as soon as possible. Doing so helps you recall the information shared during the interview; if you wait too long, you may not be able to recall the particulars of the conversation.

A few days after the interview, I return to the transcription or notes and write a short analytical piece about the interview. This short piece summarizes my respondent’s thoughts, pinpoints themes, and captures areas of agreement and disagreement with other respondents. It also includes my own analysis of the interview and highlights where I think additional data collection is needed.

It may sound simple, but the first step in analyzing your data is to become very familiar with it. Review your interview transcripts or notes and your analytical pieces. If you have collected data through other qualitative methods, include that data in your analysis as well.

As Gibbs (2010) tells us, consider coding your data, particularly if you have conducted multiple interviews. Coding is how we define what is important about our data, and it helps us organize our thinking. It involves identifying major themes and assigning codes to them so that you can easily organize, retrieve, and examine the data. Your codes could be descriptive, analytical, or theoretical in nature. You might code by hand, or you might use a computer program such as NVivo.
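If you code by hand with a spreadsheet or a short script rather than in NVivo, the sketch below shows one way to organize excerpts under descriptive codes so they are easy to retrieve and examine later. The codes, keywords, and excerpts are hypothetical illustrations, and keyword matching is only a starting point; you would still review each excerpt yourself.

```python
# A minimal sketch of organizing hand-coded interview excerpts.
# Codes, keywords, and excerpts below are hypothetical, not from the post.

from collections import defaultdict

# Descriptive codes and the keywords we (hypothetically) associate with them.
CODE_KEYWORDS = {
    "access_to_services": ["clinic", "distance", "transport"],
    "program_satisfaction": ["helpful", "satisfied", "useful"],
}

def code_excerpt(excerpt: str) -> list[str]:
    """Return every code whose keywords appear in the excerpt."""
    text = excerpt.lower()
    return [
        code
        for code, keywords in CODE_KEYWORDS.items()
        if any(keyword in text for keyword in keywords)
    ]

# Hypothetical transcript excerpts, one per respondent turn.
excerpts = [
    "The clinic is an hour away and transport is expensive.",
    "Staff were helpful, and I was satisfied with the training.",
]

# Group excerpts by code so they can be retrieved and examined together.
coded = defaultdict(list)
for excerpt in excerpts:
    for code in code_excerpt(excerpt):
        coded[code].append(excerpt)

for code, items in coded.items():
    print(f"{code}: {len(items)} excerpt(s)")
```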

You should also create definitions for your codes, so that you know what belongs in a particular category. Definitions need to be specific enough to allow meaningful comparisons, but not so specific that very few cases fall into a given coded category. At the same time, make sure a definition is not so encompassing that nearly every case in your research fits the code, stretching it to the point where comparison or analysis is no longer meaningful.
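To make this concrete, here is a small, hypothetical sketch of a codebook with explicit definitions, plus a quick check that flags codes covering nearly every excerpt (possibly too broad) or almost none (possibly too narrow). The code names, counts, and thresholds are illustrative assumptions, not rules.

```python
# Hypothetical codebook with explicit definitions for each code.
codebook = {
    "access_to_services": "Mentions of physical, financial, or time barriers "
                          "to reaching program services.",
    "program_satisfaction": "Explicit judgments about whether the program met "
                            "the respondent's expectations.",
}

# Suppose we have already coded 40 excerpts and tallied them (hypothetical).
total_excerpts = 40
code_counts = {"access_to_services": 12, "program_satisfaction": 37}

for code, count in code_counts.items():
    share = count / total_excerpts
    if share > 0.9:
        print(f"'{code}' covers {share:.0%} of excerpts; "
              "the definition may be too broad for meaningful comparison.")
    elif count < 3:
        print(f"'{code}' has only {count} excerpt(s); "
              "the definition may be too narrow.")
```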

The goal of your research should be to collect data in a particular coded category until the point of saturation, and then compare, or triangulate, that data to draw findings. The evaluator reviews the data, comparing and contrasting, asking where the data converge and diverge, and investigating why. Usually, an evaluator will have data from multiple methods, collected from multiple sources on multiple occasions over time, to support triangulation.
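If you keep your coded data in a simple structure, some of this triangulation bookkeeping can be done programmatically. The sketch below, using hypothetical records, tallies which methods and sources contribute to each code and flags codes that rest on a single method or source, so you know where to investigate divergence or collect more data.

```python
# A minimal triangulation-bookkeeping sketch; the coded records are hypothetical.
from collections import defaultdict

coded_records = [
    {"code": "access_to_services", "method": "interview", "source": "beneficiary"},
    {"code": "access_to_services", "method": "focus_group", "source": "staff"},
    {"code": "program_satisfaction", "method": "interview", "source": "beneficiary"},
]

# Track which methods and sources contribute evidence to each code.
methods_by_code = defaultdict(set)
sources_by_code = defaultdict(set)
for record in coded_records:
    methods_by_code[record["code"]].add(record["method"])
    sources_by_code[record["code"]].add(record["source"])

for code in methods_by_code:
    if len(methods_by_code[code]) < 2 or len(sources_by_code[code]) < 2:
        print(f"'{code}': supported by a single method or source; "
              "seek additional data before drawing findings.")
    else:
        print(f"'{code}': multiple methods and sources converge.")
```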

Rad Resources:

Throughout this series, I’ve cited several resources, including:

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
