AEA365 | A Tip-a-Day by and for Evaluators

TAG | universal design for evaluation

I am David J. Bernstein, and I am a Senior Study Director with Westat, an employee-owned research and evaluation company in Rockville, Maryland. I was an inaugural member of AEA, and was the founder and first Chair of the Government Evaluation Topical Interest Group.

Westat was hired by the U.S. Department of Education’s Rehabilitation Services Administration (RSA) to conduct an evaluation of the Helen Keller National Center for Deaf-Blind Youths and Adults (HKNC). HKNC, founded by an Act of Congress in 1967, is a national rehabilitation program serving youth and adults who are deaf-blind; it operates under a grant from RSA, which is HKNC’s largest funding source.

The Westat evaluation was the first evaluation of HKNC in over 20 years, although HKNC submits performance measures and annual reports to RSA. RSA wanted to make sure that the evaluation included interviews with Deaf-Blind individuals who had taken vocational rehabilitation and independent living courses on the HKNC campus in Sands Point, New York. After meeting with HKNC management and teaching staff, it became clear that communication would be a challenge given the myriad ways that Deaf-Blind individuals communicate. Westat and RSA agreed that in-person interviews would help keep the interviews simple and intuitive and ensure that this critical stakeholder group was comfortable and willing to participate.

Hot Tips:

  • Make use of gatekeepers and experts-in-residence. Principle Three encourages simple and intuitive design of materials to address users’ level of experience and language skills. For the HKNC evaluation, interview guides went through multiple reviews, including review by experts in Deaf-Blind communication not associated with HKNC. Ultimately, it was HKNC staff who provided a critical final review to simplify the instruments, since they were familiar with the wide variety of communication skills of their former students.
  • Plan ahead with regard to location and communication. Principle Seven calls for appropriate space to make anyone involved in data collection comfortable, including transportation accessibility and provision of interpreters, if needed. For the HKNC evaluation, interview participants were randomly selected from among those within a reasonable distance of HKNC regional offices. Westat worked with HKNC partners and HKNC regional representatives with whom interviewees were familiar. In the Los Angeles area, we brought the interviews to the interviewees, selecting locations that were as close as possible to where former HKNC students lived. Most importantly, Westat worked with HKNC to identify the Deaf-Blind individuals’ communication abilities and preferences, and had two interpreters on site for interviews. In one case we used a participant’s iPad with large print enabled to communicate interview questions.

Resource:

The American Evaluation Association is celebrating the Disabilities and Other Vulnerable Populations TIG (DOVP) Week. The contributions all week come from DOVP members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · · · ·

Hello! We are Linda Cabral, Laura Sefton and Kathy Muhr from the Center for Health Policy and Research at the University of Massachusetts Medical School. We recently completed an evaluation project that involved recruiting people with mental health conditions to participate in individual interviews, focus groups, and surveys regarding their experiences with a mental health peer specialist training program. In 2010, Woodall and colleagues reported that many barriers exist to participating in mental health research, including:

  • fear
  • suspicion and/or distrust of researchers
  • concerns about confidentiality
  • transportation difficulties
  • severity of illness
  • inconvenience
  • fear of relapse as a result of participation
  • the stigma of mental illness

We wanted to share some tips and lessons learned to address some of these barriers.

Hot Tip: Get approval. Before starting data collection, consider applying for Institutional Review Board (IRB) approval. While many evaluations for program improvement purposes do not require IRB approval, if you wish to disseminate your findings to a broad audience, this approval may be necessary to ensure that recruitment efforts take into consideration an IRB’s requirements for working with vulnerable populations.

Hot Tip: Establish trust. To establish trust, the evaluation team members visited the training program and were introduced as the people who would be in contact after the training was completed to get participants’ feedback on it. This informal introduction by a trusted source paved the way for outreach later on.

Lesson Learned: Use a script. Having a telephone script was a good tool for initiating a conversation or leaving a message with the intended participant. It helped us to remember to cover key points with potential participants. It also reinforced our concern for their confidentiality, as we avoided sharing information with others when leaving a message.

Lesson Learned: Be transparent. Once we contacted the participant, we were transparent about the purpose of the evaluation, who was funding it, and how their information would be used.

Lesson Learned: Provide multiple access points. To increase survey response rates, we brought copies of the survey to all in-person interviews, allowing time after the interview for participants to complete it. Because we were present, we could assist participants in real time if needed.

Lesson Learned: Be flexible. To increase our recruitment rate, we were flexible in our interview and survey administration formats. When possible, our first preference was to conduct in-person interviews at a time and place of the person’s choice. When this was not feasible, or could have led to a decision not to participate, we offered to conduct the interviews and survey data collection over the phone.

