AEA365 | A Tip-a-Day by and for Evaluators

TAG | recruitment

I am Nicole MartinRogers, Ph.D., a senior research manager at Wilder Research. Our mission is to improve the lives of individuals, families, and communities through research and evaluation.

Most of the time when we are asked to conduct an evaluation study, the participants can be found through the program being evaluated (obviously!). But in some cases, an organization may want to do a study to learn about people who have not participated in its programs.

For example, Wilder Research has worked with the Minnesota Historical Society to conduct studies with people who have never visited its sites and who are from under-served cultural communities. To complete these studies, we identified members of under-represented cultural groups and had them participate in a program or exhibit, followed by a focus group to learn more about their experience.

Lesson Learned: We often do not have direct relationships with the under-served communities that our client organizations are interested in learning about. In those cases, we have contracted with community-based organizations that serve the target community to help us recruit participants and host studies. These organizations have trusted relationships with community members and are best positioned to help us identify participants.

Cool Trick: Community-based organizations often have locations that are familiar and accessible to members of the target community, so consider whether it would be better to have participants meet at that community location, with group transportation provided to your site.

Cool Trick: We often ask a leader from the community-based organization who recruited the participants to welcome them into the study and to introduce us (the researchers) to build trust. In many cultural communities, trust is gained through relationships, so demonstrating that you have the buy-in of a community leader can go a long way in terms of recruiting study participants from under-served communities and helping them to understand why it is important to participate in your study.

Lesson Learned: Offer incentives to participate. Consider what type of incentives might work best for your target group. It should be something that is useful to them.

Cool Trick: Perhaps the organization that is sponsoring the study can offer free memberships or passes to participants, or some merchandise from their gift store. Providing a cash incentive, transportation assistance (as needed), child care assistance (as appropriate), plus institution-specific incentives can help to encourage participation among a target population that has either low awareness of or interest in engaging with that program or place.

Rad Resources:

Wilder Research report for Minnesota Historical Society on Focus Groups with Potential Visitors from Latino and African American Communities

Speaking for Ourselves: A Study with Immigrant and Refugee Communities in the Twin Cities is a Wilder Research study that includes a report on civic participation and social engagement with a section about mainstream cultural amenities; the main study page has links to reports on the experiences of immigrants and refugees in many sectors.

More on community-engaged research from the U.S. Centers for Disease Control and Prevention

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Jennifer Sullivan Sulewski, Research Associate at the Institute for Community Inclusion, University of Massachusetts Boston. Most of my research and evaluation work has focused on improving employment and postsecondary education outcomes for people with disabilities. I am co-chair of AEA’s Disabilities and Other Vulnerable Populations (DOVP) TIG and co-author of the Universal Design for Evaluation Checklist. I have also had the pleasure of serving as curator for the DOVP Week.

There are two major Universal Design frameworks: one applies broadly, and the other applies specifically to curriculum and learning. Both can help inform evaluators and ensure accessibility for all evaluation participants. Each of this week’s posts focuses on the concepts of Universal Design (UD) or Universal Design for Learning (UDL).

Hot Tip:

The seven principles of Universal Design are:

  1. Equitable Use
  2. Flexibility in Use
  3. Simple and Intuitive Use
  4. Perceptible Information
  5. Tolerance for Error
  6. Low Physical Effort
  7. Size and Space for Approach and Use

The three principles of Universal Design for Learning are:

  1. Multiple Means of Representation
  2. Multiple Means of Action and Expression
  3. Multiple Means of Engagement

Lesson Learned:

  • Evaluation recruitment materials and informed consent must be accessible for authentic and ethical participation. In our Universal Design for Evaluation checklist, we demonstrate the importance of Principle 1: equitable use, particularly as it applies to the informed consent process. In my work with people with intellectual/developmental disabilities, I’ve learned how important it is to create recruitment and informed consent materials that are designed to be used and understood by all. For example, in an early project I had separate consent forms for different aspects of the project, and the need to sign multiple forms was confusing for participants. I learned to explain all the expectations and rights of participants in one simple form instead.  
  • Here is an example of such a consent form:

Universally Designed Informed Consent

  • It is also essential when working with this population to understand whether the participants are under someone else’s guardianship; if they are, consent must be obtained from the guardian and assent from the individual.

In the posts to follow, John Kramer discusses how to apply UD Principle 3 to increase access and stakeholder participation. David Bernstein describes applying Principles 3 and 7 to an evaluation involving Deaf-Blind program participants. June Gothberg demonstrates Principles 2 and 4 on flexible and perceptible information. Bob Hughes provides tips on evaluating UDL projects and Don Glass explores the use of UDL to guide the design and evaluation of curriculum, programs, and materials.

Rad Resource:

  • Looking for ideas on how to make your project more accessible to people of all backgrounds and abilities? The Universal Design for Evaluation Checklist is a resource for applying the seven principles of Universal Design to evaluation.

The American Evaluation Association is celebrating the Disabilities and Other Vulnerable Populations TIG (DOVP) Week. The contributions all week come from DOVP members.


Hello! I’m Judy Savageau, faculty, researcher, and IRB representative for the University of Massachusetts Center for Health Policy and Research. I want to bring attention to the need for us to be mindful that our stakeholder groups are often ‘subjects’ in our evaluation research.

Why are there ethical concerns? Like research, evaluation involves human subjects. Study participants, who are vital to our understanding and to advancing knowledge about particular issues and processes, may experience risks and inconveniences with no direct benefit. Most investigators conducting research (whether clinical, population-based, evaluation, educational, policy, or basic science) must have their studies approved by their Institutional Review Board (IRB). The IRB defines ‘research’ as ‘a systematic investigation designed to develop or contribute to generalizable knowledge’. While much of our evaluation work may not need IRB approval, there are many instances where we need oversight.

Many stakeholder groups have their own internal review processes, whether a state agency, a clinical practice, a local school district, or a cultural group. Multiple approvals may be needed if working with many different stakeholder groups. IRBs are particularly cautious, yet can be very helpful, when we include vulnerable populations: children and teens, elders, pregnant women, inmates, and persons with cognitive impairments, mental illness, or other disabling conditions. Human subject protection involves capacity to consent, freedom from coercion, and comprehension of possible risks and benefits. Challenges arise when subjects aren’t aware of potential risks, or don’t understand that their participation is voluntary and that they have the right to withdraw at any time.

The fundamental requirements for the ethical conduct of human subject research, as articulated in the Belmont Report, include:

  • Respect for persons – recognizing and protecting autonomy of individuals through the informed consent process;
  • Beneficence – protecting persons from harm by maximizing benefits and minimizing risks; and
  • Justice – requiring that benefits and burdens of research are distributed fairly.

Hot Tips:

  • Be mindful of recruitment incentives, whether cash, gift cards, free services, or raffle prizes.
  • Consider whether paid participants are recruited fairly, informed adequately, and paid appropriately.
  • Take into consideration subjects’ medical, employment, and educational status, and their financial, emotional, and community resources.
  • Consider whether incentives constitute undue inducement or coercion. We want to acknowledge a person’s time, travel costs, and other expenses, but we must ensure participation is truly voluntary.



My name is Elizabeth Autio and I am an associate at Education Northwest, a non-profit organization serving educators through research, evaluation, and technical assistance.

This is a companion piece to my earlier post, Recruiting Participants for Your Study: Practical Strategies and Advice; this one offers tips specifically on recruiting teachers into your study.  I found it was important to do recruitment sessions in person so that teachers could hear my message and make informed decisions about their participation.

  1. Make it personal.  A potential participant’s decision is influenced by their personal rapport with you. Take the time to establish relationships.  These can be initiated via email or telephone, but are best solidified by an in-person visit.  If they will be interfacing with a study team, showing pictures of the other team members communicates more than sharing their vitae.
  2. Flexibility and firmness.  Scheduling your visit requires a balance of these qualities.  Some dates might be on your calendar weeks in advance, while others might come through the day prior.  At the same time, be clear about how much time you need with the teachers to adequately deliver your information and answer their questions.
  3. Add extra travel time.  Teachers are on tight schedules.  Being on time respects that; moreover, if you are late, those are lost minutes for your recruitment session that you will not get back.  Confirm addresses and directions, as schools are routinely rebuilt, closed, or temporarily housed, and such changes are not always reflected in Google or Mapquest.  These sites also often underestimate driving times; I add 50 percent, then an extra 10 minutes to get myself out the door.
  4. Take snacks.  Putting out snacks to share with the group is a small thing that goes a long way in showing your appreciation.
  5. Pool teachers across a district.  If possible, ask teachers to come together to a central site across a district.  This saves time and money.
  6. Be clear.  Explain who you are, who you work for, and who is funding the study.  Communicate essential points of the study methodology in layperson terms.
  7. Be aware of their community.  The places that you visit might be different than your own: politically, demographically, culturally, and/or religiously.  Learn about community norms in advance, if possible, or observe them while you are there.  This might affect the assumptions you make, your dress, your jokes, or even the way that you address people.
  8. Put a signature on it!  Give participants something to sign – such as a memorandum of understanding (see example here) – that outlines the details of their participation.  Leave them with ready-to-sign copies and postage paid return envelopes.
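The travel-buffer rule of thumb in tip 3 (inflate the mapped estimate by 50 percent, then add 10 minutes) can be written as a tiny helper. This is only an illustrative sketch; the function name is my own, not something from the post.

```python
def padded_travel_minutes(estimate_min: float) -> float:
    """Pad a mapped driving estimate: add 50 percent, then 10 extra minutes."""
    # Hypothetical helper illustrating the author's rule of thumb.
    return estimate_min * 1.5 + 10
```

Under this rule, a 40-minute mapped estimate becomes a 70-minute planning window.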



Hello! We are Linda Cabral, Laura Sefton and Kathy Muhr from the Center for Health Policy and Research at the University of Massachusetts Medical School. We recently completed an evaluation project that involved recruiting people with mental health conditions to participate in individual interviews, focus groups, and surveys regarding their experiences with a mental health peer specialist training program. In 2010, Woodall and colleagues reported that many barriers exist to participating in mental health research, including:

  • fear
  • suspicion and/or distrust of researchers
  • concerns about confidentiality
  • transportation difficulties
  • severity of illness
  • inconvenience
  • fear of relapse as a result of participation
  • the stigma of mental illness

We wanted to share some tips and lessons learned to address some of these barriers.

Hot Tip: Get approval. Before starting data collection, consider applying for Institutional Review Board (IRB) approval. While many evaluations for program improvement purposes do not require IRB approval, if you wish to disseminate your findings to a broad audience, this approval may be necessary to ensure that recruitment efforts take into consideration an IRB’s requirements for working with vulnerable populations.

Hot Tip: Establish trust. To establish trust, the evaluation team members visited the training program and were introduced as people who would be in contact after the training was completed to get participants’ feedback on it. This informal introduction by a trusted source paved the way for outreach later on.

Lesson Learned: Use a script. Having a telephone script was a good tool for initiating a conversation or leaving a message with the intended participant. It helped us to remember to cover key points with potential participants.  It also reinforced our concern for their confidentiality as we avoided sharing information with others when leaving a message.

Lesson Learned: Be transparent. Once we contacted the participant, we were transparent about the purpose of the evaluation, who was funding it, and how their information would be used.

Lesson Learned: Provide multiple access points. To increase survey response rates, we brought copies of the survey to all in-person interviews, allowing time afterward for participants to complete it. If participants required assistance, we were present and able to provide it in real time.

Lesson Learned: Be flexible. To increase our recruitment rate, we were flexible in our interview and survey administration formats. When possible, our first preference was to conduct in-person interviews at a time and place of the person’s choice. When this was not feasible, or could have led to a decision not to participate, we offered to conduct the interviews and surveys over the phone.



My name is Elizabeth Autio and I am an associate at Education Northwest, a non-profit organization serving educators through research, evaluation, and technical assistance.

Lesson Learned for Recruiting Study Participants: Recruiting study participants is challenging, and success has ramifications for continued funding, buy-in, and attrition. Based on my experience recruiting for a federally funded randomized controlled trial, here are some lessons learned:

1. Develop a system: This lays the groundwork for your recruitment activities. It includes a strategy for who you want to approach, through what mechanisms, and what resources you can leverage to do so. Materials include presentations, a website, and handouts with a consistent look and content. A contacts and communications tracking mechanism, such as a database, will help you stay organized and report progress to your funders.

2. Make participation attractive: Ideally, the program that you are studying is something potential participants need or want. Offer incentives that thank participants for the time it takes to engage in study activities, provided upon successful completion. Investigate things that will not cost additional money; for example, in our education setting, we facilitated course credit for teachers who attended study-related professional development.

3. Don’t think like a researcher: Put yourself in the shoes of potential participants. Imagine yourself as a teacher, public health worker, administrator, or service provider. What would you want to know about the study? Most likely, you would want to know practical things, such as: What will I receive? What do I have to do? Why would I want to do that? Is there any risk? How much time will it take? Does it mesh with what I am already doing?

4. Allocate adequate time and money: Successful recruitment takes an upfront investment of time and money. We found that it took several times what we originally budgeted. When planning, think realistically about the sample size that you need, the eligible pool, their likelihood of participation, and how many potential participants you will therefore need to approach. Are you leveraging existing contacts or building new ones? Can you budget for in-person visits? Are you traveling, and, if so, how far? Can you bring participants together to a central location to hear your recruitment message, or do you have to travel to each site separately? Are there multiple levels of hierarchy that you need to work with, or just one? How will you secure participation, and who will collect and track this? These are examples of some of the questions you might ask yourself as you begin budgeting.
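The contacts and communications tracker mentioned in tip 1 can be as light as a small record type plus a status summary. The sketch below is a minimal illustration; every class, field, and status name is an assumption of mine, not something from the post.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RecruitmentContact:
    """One row in a hypothetical recruitment-tracking database."""
    site: str
    contact_name: str
    status: str = "not_contacted"  # e.g. not_contacted, contacted, agreed, declined
    touchpoints: list = field(default_factory=list)

    def log(self, day: date, note: str) -> None:
        # Record each call, email, or visit so progress can be reported to funders.
        self.touchpoints.append((day, note))

def progress_report(contacts) -> dict:
    """Summarize the roster by status -- the kind of count a funder might ask for."""
    counts: dict = {}
    for contact in contacts:
        counts[contact.status] = counts.get(contact.status, 0) + 1
    return counts
```

For example, a roster with one site that has agreed and one not yet contacted would summarize as `{'agreed': 1, 'not_contacted': 1}`, giving a quick progress figure for funder reports.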

Rad Resource: Learn more about recruitment through downloading our Evaluation 2011 session handouts from the AEA Public eLibrary.


