AEA365 | A Tip-a-Day by and for Evaluators

TAG | interviewing

Hi, I’m Siri Scott, and I work as a Graduate Assistant with the University of Minnesota Extension.

Conducting interviews with youth is one way to gather information for program improvement, especially when you want to bring youth voice into your improvement process. While more time-intensive than a survey, interviews will provide you with rich contextual information. Below is a brief overview of the planning process, as well as tips for conducting interviews.

Hot Tips: Planning Phase

One main decision is whether you will need IRB approval to conduct interviews. Even when interviews are done for program improvement purposes, it is a good idea to comply with IRB regulations for data practices and the protection of youth. Other major decision points in the planning process include how many individuals you will interview, how you will choose your participants, and how you will collect and analyze your data. In addition, you must decide on the purpose of the interview and the type of interview you want to conduct, and then create an interview protocol (if appropriate).
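
As a small illustration of the participant-selection decision, here is a minimal sketch (in Python) of drawing a simple random sample of interviewees from a program roster. The file name and column layout are hypothetical, and random sampling is only one of several defensible strategies (purposive sampling is often used instead):

    # Minimal sketch: drawing a simple random sample of interviewees from
    # a hypothetical roster file (CSV with a "name" column). The file name
    # and structure are assumptions for illustration only.
    import csv
    import random

    SAMPLE_SIZE = 10  # how many youth you plan to interview

    with open("program_roster.csv", newline="") as f:
        roster = [row["name"] for row in csv.DictReader(f)]

    random.seed(42)  # fixed seed so the draw can be documented and reproduced
    participants = random.sample(roster, min(SAMPLE_SIZE, len(roster)))

    for name in participants:
        print(name)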

Hot Tips: Conducting Interviews

  • Practice: Test the protocol with a colleague (or a young person you know well) and ask for feedback on the questions themselves and how they fit together.
  • Space: Find a quiet, secluded space for the interview in a public setting (or in a private home if the young person’s guardian can be present). You don’t want other people overhearing your conversation or your participant being distracted by anything.
  • Warm up: Start the interview with some informal chitchat. This will build your rapport with the participant and ease the participant’s (and your) nerves.
  • Probe: If you are not doing a structured interview, make sure to ask participants to clarify or elaborate on their responses. This will provide you with much better data.
  • Notes: If you are using an audio recorder, don’t trust it (fully). Jot down some quick notes during the interview. You can elaborate on these later if the audio recorder malfunctions.
  • Relax! If you mess up, that’s okay. Also, if you’re nervous and tense, the participant will sense that. Do whatever you can to put the participant (and yourself) at ease.
  • Learn More: A good resource for learning how to conduct interviews is this newly released, comprehensive overview of the interviewing process: InterViews: Learning the Craft of Qualitative Research Interviewing (3rd edition) by Brinkmann and Kvale (2014).

The American Evaluation Association is celebrating Extension Education Evaluation (EEE) TIG Week with our colleagues in the EEE Topical Interest Group. The contributions all this week to aea365 come from our EEE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· ·

Hello! We are Xin Wang, Neeley Current, and Gary Westergren. We work at the Information Experience Laboratory (IE Lab) of the School of Information Science & Learning Technologies at the University of Missouri. The IE Lab is a usability laboratory that conducts research and evaluates technology. What is usability? According to Jakob Nielsen’s definition, usability assesses how easy user interfaces are to use. Over the past eight years, as Web technology has advanced, our lab has successfully applied a dozen usability methods to the evaluation of educational and commercial Web applications. The methods we use most frequently include heuristic evaluation, think-aloud interviews, focus-group interviews, task analysis, and Web analytics. Selecting appropriate usability methods is vital and should be based on the development life cycle of a project; otherwise, the evaluation results will not be truly useful and informative for the Web development team. In this post, we focus on some fundamental concepts behind one of the most commonly adopted usability evaluation methods: the think-aloud protocol.

Hot Tip: Use think-aloud interviewing! Think-aloud interviewing engages participants in activities and asks them to verbalize their thoughts as they perform the tasks. This method is usually applied during the middle or final stage of Website or system design.

Hot Tips: Employing the following procedures is ideal:

  1. Recruit real or representative users, in keeping with user-centered design principles
  2. Select tasks based on frequency of use, criticality, new features, user complaints, etc.
  3. Schedule users for a specific time and location
  4. Have users operate a computer accompanied by the interviewer
  5. Ask users to give a running commentary (e.g., what they are clicking on, what difficulties they encounter in completing the task); a logging sketch follows this list
  6. Have the interviewer probe the user about the task he or she is asked to perform
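
To make steps 5 and 6 concrete, here is a minimal sketch (in Python) of how a session observer might capture a timestamped running commentary for later analysis. The file name, field names, and example entries are illustrative assumptions, not a standard usability-lab format.

    # Minimal sketch: timestamped logging of think-aloud events so the
    # running commentary (step 5) and probes (step 6) can be reviewed later.
    # The file name, fields, and task names are illustrative assumptions.
    import csv
    from datetime import datetime

    def log_event(writer, task, kind, note):
        """Record one observation: what the user said or did, and when."""
        writer.writerow({
            "timestamp": datetime.now().isoformat(timespec="seconds"),
            "task": task,
            "kind": kind,  # e.g., "comment", "difficulty", "probe"
            "note": note,
        })

    with open("session_log.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["timestamp", "task", "kind", "note"])
        writer.writeheader()
        # Example entries an observer might record during a session:
        log_event(writer, "find course catalog", "comment", "clicked top nav, unsure where to go")
        log_event(writer, "find course catalog", "difficulty", "'Academics' label was ambiguous")
        log_event(writer, "find course catalog", "probe", "asked what the user expected under 'Academics'")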

Pros:

  1. When users verbalize their thoughts, evaluators can identify important design issues that cause user difficulties, such as poor navigation design, ambiguous terminology, and unfriendly visual presentation.
  2. Evaluators obtain users’ concurrent thoughts rather than just retrospective ones, which avoids situations where users cannot accurately recall their experiences.
  3. The think-aloud protocol gives evaluators a glimpse into the affective side (e.g., excitement, frustration, disappointment) of users’ information-seeking process.

Cons:

  1. Some users may not be used to verbalizing their thoughts when they perform a task.
  2. If the information is non-verbal and complicated to express, the protocol may break down.
  3. Some users may not be able to verbalize their thoughts completely, often because verbalization cannot keep pace with their cognitive processes; this makes it difficult for evaluators to understand what the users really meant.

· ·

My name is Kirsten Ellenbogen. I’m Director of Research & Evaluation at the Science Museum of Minnesota and President of the Visitor Studies Association.  I hope you’re enjoying VSA Week on AEA365.

Rad Resource: Development of individual identity has long been considered an outcome of informal learning environment experiences. But identity has recently become more central in the field with the 2009 report by the National Academy of Sciences (NAS), “Learning Science in Informal Environments.” The report identifies and provides evidence for six “strands of learning” that occur in informal learning environments. What’s so rad about this? NAS reports are based on systematic reviews of the literature that use strict criteria for what counts as good evidence. This report is unique in the strength and systematic nature of the evidence it provides for learning in informal environments. You can read (and search) the entire book online or purchase a copy: http://www.nap.edu/catalog.php?record_id=12190

Cool Trick: Two evaluation approaches that are particularly useful for gathering data about identity development in informal learning environments are embedded evaluation and reflective interviews. Embedded evaluation integrates “invisible” evaluation tools into existing program activities. For example, in a youth program focused on interactive media, the projects produced by youth are posted to an online environment: http://info.scratch.mit.edu/ScratchR:_The_Online_Community The projects can be downloaded by others, modified, and reposted. All activity in the online community can be tracked, and the ongoing development of the youth’s projects can be analyzed in more detail.
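
As a sketch of what that analysis might look like, suppose the online community can export its activity log as a CSV file (a hypothetical format, not ScratchR’s actual export). One could then summarize each youth’s project revisions over time:

    # Minimal sketch: summarizing tracked online-community activity for
    # embedded evaluation. Assumes a hypothetical CSV export with columns
    # user, project, action, date -- not any platform's actual format.
    import csv
    from collections import Counter

    revisions = Counter()          # (user, project) -> number of revisions
    first_seen, last_seen = {}, {}

    with open("activity_log.csv", newline="") as f:
        for row in csv.DictReader(f):
            key = (row["user"], row["project"])
            if row["action"] in ("upload", "modify", "repost"):
                revisions[key] += 1
            first_seen.setdefault(key, row["date"])
            last_seen[key] = row["date"]  # rows assumed chronological

    for (user, project), n in revisions.most_common():
        print(f"{user} / {project}: {n} revisions "
              f"({first_seen[(user, project)]} to {last_seen[(user, project)]})")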

Cool Trick: Another evaluation approach useful for gathering data on identity development in informal learning environments is the video-based reflective interview. For example, videotape a museum visitor using an exhibition (with IRB-approved informed consent as appropriate). In the post-visit interview, after the initial set of questions, show the visitor a video segment of his or her interactions with the exhibition, taped just moments before. Use a semi-structured interview approach and ask the visitor to narrate the video and tell you more about what they were doing. This approach can be partially automated using technologies like Video Traces: http://depts.washington.edu/pettt/projects/videotraces.html

Hot Tip: There’s an app for that. Reflective tools exist that support annotation of images, audio or video diaries, and other approaches useful for evaluating identity development. Take a look at Everyday Lives or Storyrobe as a starting point. These apps are useful for you as the evaluator, or they can be added to the participant’s phone, iPod, iPad, or other device. Adding a tool like this to a device that a participant regularly carries allows ongoing data collection that is reflective and, in some instances, embedded. This makes these apps ideal tools for monitoring identity development.

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. We are pleased to welcome colleagues from the Visitor Studies Association – many of whom are also members of AEA – as guest contributors this week. Look for contributions preceded by “VSA Week” here and on AEA’s weekly headlines and resources list.

· ·

Hello, my name is Nicole Jackson. I am an adjunct faculty member in the Human Resource Management Certificate program at U.C. Berkeley Extension and a doctoral candidate in Policy, Organization, Measurement, and Evaluation at U.C. Berkeley’s Graduate School of Education. From my previous and current work, I have found that interviewing is both an art and a science, especially when it is used in more formative evaluations. Although considered important, interviews are prone to researcher bias that can affect data collection and reporting. Below I offer some tips to help mitigate researcher bias during interviews.

Hot Tip #1: Understand how different interview formats may alter findings. The two general distinctions among interview formats are individual versus panel interviews and unstructured versus structured interview scripts. Individual (one-on-one) interviews and unstructured, open-ended scripts are the most prone to researcher bias: both formats can easily drift off course as different personality types affect what information is collected. Where possible, use multiple interviewers or a small panel with a structured interview script to help mitigate bias and triangulate real-time interview data. Structured interview scripts should always focus on the critical research questions of the evaluation project.

Hot Tip #2: Tailor question types to personality type and experience level. A variety of question types exist to help evaluators navigate difficult and shy personalities, as well as participants with more or less knowledge and experience. Where possible, use open-ended, situational questions with follow-up probes for shyer personalities and for participants with more knowledge and experience. For more difficult personalities, begin with close-ended (e.g., yes/no) questions and then transition to open-ended prompts in order to maintain control and focus during the interview.

Hot Tip #3: Never underestimate the role of the interview environment. Few things are as frustrating as a distracting interview environment. Always conduct interviews in a quiet, private location with good lighting, an appropriate room temperature, and minimal distractions. Have water ready to put participants at ease. When using recording technology, remember Murphy’s Law and keep extra notepads and recorders on hand. Test all recording equipment during the first two minutes of the interview as a safeguard.

Hot Tip #4: Be mindful of both verbal and non-verbal language. Experts on interviewing hold that non-verbal communication is just as important as verbal behavior in evaluating the trustworthiness of data. Be aware of how your own body language, and that of your participants, can alter data collection and assessment. Never use closed poses, such as crossed arms, while interviewing; they signal defensiveness. Also, be mindful that non-verbal behavior is culturally influenced.

Nicole will be conducting a roundtable at Evaluation 2010 on improving methods of inquiry to incorporate diverse views and perspectives. Join Nicole and over 2,500 colleagues at AEA’s annual conference this November in San Antonio.

· ·

Hello, I am Gordon Bonham, owner of Bonham Research, Cranberry Township, Pennsylvania. Measuring the quality of life of people with intellectual and other developmental disabilities (ID/DD) presents a number of challenges.

Lessons Learned: Although it is generally assumed that self-response is better than proxy response when people can respond for themselves, how many people with ID/DD can respond for themselves, and who decides? The Maryland Ask Me! Survey has eight years of experience collecting quality-of-life information from about 1,300 individuals each year, sampled from the rolls of the state disability administration. The survey follows the principles of participatory action research further than other published studies on outcomes for this vulnerable population by employing only interviewers who are themselves part of the population with ID/DD.

About 30 interviewers with ID/DD are employed each year to interview their peers, and they have worked for an average of 3.2 years. The peer interviewers are more likely to be young and female and to have less severe intellectual disabilities and communication impairments, but more likely to have cerebral palsy, than the peers they interview. However, they tend to have the same experiences with habilitation, employment, residential, and other support services as those they interview. They work in teams of two, which allows non-readers and people unable to record answers to serve as interviewers. The teams determine whether a person can respond for him- or herself, and they find that three-fourths can, including one-fifth of those classified with profound intellectual disabilities. The peer interviewers also interview the proxies of those unable to respond for themselves, generally impressing families and staff with their abilities.

The 90 peer interviewers who have worked over the eight years of the statewide evaluation have contributed greatly to the quality of data used to guide state policy, enhance agency services, and inform consumers choosing where to get services. In addition, they have personally benefited from participating in the evaluation. A survey shows that their employment in research increased peer interviewers’ self-confidence, improved their communication skills, and created openness to new opportunities. Employment as interviewers provided one-fourth of them with their first paid community job experience and helped one-fifth subsequently step into a better job or pursue further education or training. Research employment also helped one-fourth to move into more independent living, expand friendships, increase participation in clubs and groups, and increase advocacy.

Rad Resource: ASK ME! Survey http://www.thearcmd.org/programs/ask_me.html

The American Evaluation Association is celebrating Disabilities and Other Vulnerable Populations (DOVP) Week with our colleagues in the DOVP AEA Topical Interest Group. The contributions all this week to aea365 come from our DOVP members and you may wish to consider subscribing to our weekly headlines and resources list where we’ll be highlighting DOVP resources. You can also learn more from the DOVP TIG via their many sessions at Evaluation 2010 this November in San Antonio.

· ·

Hello, my name is Deborah Grodzicki and I just received my master’s in Organizational Behavior and Evaluation from Claremont Graduate University. I plan to pursue a PhD in Evaluation at UCLA in the fall. Prior to attending Claremont Graduate University, I investigated complaints against New York City police officers. During my time as an investigator, I gained experience questioning civilian complainants and police officers about extremely sensitive issues. Drawing on this experience, I offer some tips on how to obtain essential information without compromising the evaluator-stakeholder relationship.

Hot Tip: Do not be a prisoner of your question list. At their most basic, interviews and focus groups consist of the evaluator asking stakeholders a list of questions. To make these qualitative methods most effective, however, it is critical to maintain flexibility in your questioning and establish a conversational atmosphere. Do not use the questions as a crutch, but rather as a directional tool for the conversation. Otherwise, you risk casting yourself as an interrogator, which could lead the individual to withhold vital information.

Hot Tip: Check your biases at the door. It is natural to come into a situation with personal biases that may affect how you approach an interview or focus group. It is important to be mindful of these inevitable biases and to make a conscious effort to prevent them from affecting how your questions are phrased and delivered. Faced with a biased or leading question, a stakeholder is likely to provide restricted answers that mirror the bias and unduly skew the results.

Hot Tip: Withhold judgment. When conducting interviews or focus groups, never give someone the impression that you disapprove of their thoughts, feelings, or actions. It is up to you as the evaluator to create a safe, comfortable, and, above all, accepting atmosphere. Only then will stakeholders freely share their impressions of the evaluand.

Hot Tip: Look them in the eye. During my time as an investigator, I was taken aback by how many of my colleagues broke eye contact when a complainant spoke about a sensitive issue. Though seemingly insignificant, this small action can have substantial consequences. Failing to maintain eye contact at the stakeholder’s most vulnerable moment gives the impression that you are uncomfortable hearing what they have to say.  Sensing this can lead the stakeholder to feel self-conscious and promptly shut down.

