AEA365 | A Tip-a-Day by and for Evaluators

TAG | IRB

Hello! I’m Siobhan Cooney, Principal Consultant of Cooney Collaborative.

Over the past eight years, I’ve had the good fortune of working with more than 100 school and district entities to gain approval for data collection activities – such as surveys, assessments, and focus groups – involving students whose teachers have followed non-traditional paths to certification or have participated in professional development (PD) programs from third-party providers. Before they can be implemented, data collection activities with students must be approved by school and district administrators. I’ve found that these approval processes can pose significant barriers to research and evaluation, particularly when districts and schools are not explicit partners in the programming. In this post, I provide tips for navigating them.

Lessons Learned: Depending on the district or school, approval by an external Institutional Review Board (IRB) may also be required. While IRBs focus consistently on the ethics of the research and whether participants’ rights are protected, school and district administrators have a broader set of concerns, including whether the data collection is a good use of time for students and staff; what information might be published about the school or district; and whether the timing of data collection interferes with priorities such as statewide testing.

Hot Tip: Build in a long timeline for gaining approval. Some districts have approval processes lasting six months or more.

Hot Tip: For research and evaluation designs that include a baseline measure at the start of the school year, plan to get approvals in the prior school year. Do not expect that school and district staff will work on approval processes in the summer months. For instance, if you are holding a summer PD workshop, you will need to know well before the event who will be attending, and work with their administrators on approvals as quickly as possible.

Hot Tip: Be generous in budgeting hours for approval processes. Navigating these processes, particularly with multiple schools and districts at the same time, can be time-intensive. With a tight budget, you may need to forgo data collection in schools and districts with more burdensome processes and/or where approval seems less likely.

Hot Tip: Do not assume that because your research is ethical, you will gain approval from all districts and schools. If you need a particular sample size for your study, consider oversampling, recognizing that some requests will be rejected or still unresolved when data collection begins. (A quick way to size the oversample is sketched below.)
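To put a rough number on that oversampling, here is a minimal back-of-the-envelope sketch. It is not from the original post, and the approval rate is a hypothetical figure you would replace with your own experience:

```python
import math

def sites_to_approach(target_sites: int, expected_approval_rate: float) -> int:
    """Estimate how many districts/schools to approach so that, after
    rejections and still-unresolved requests, roughly `target_sites`
    approvals remain when data collection begins."""
    return math.ceil(target_sites / expected_approval_rate)

# Hypothetical example: you need 30 participating districts, and past
# experience suggests about half of requests are approved in time.
print(sites_to_approach(30, 0.5))  # -> 60
```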

Hot Tip: When possible, offer the school or district something in return, such as a school-level analysis of outcomes.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello! We are Dana Gonzales and Lonnie Wederski, institutional review board (IRB) members at Solutions IRB, specialists in the review of evaluation research.

Why talk about IRB review for evaluations of science, technology, engineering, and math (STEM) education projects? Most simply, federally funded projects may require it. You may also ask, “Why aren’t all of these evaluations exempt?” IRB reviewers apply the Code of Federal Regulations (CFR) in their decisions, and many STEM evaluations include children. Under CFR rules, only a narrow range of research involving children is exempt from review, such as research using educational tests, or observation of public behavior in which the investigator does not participate. Interviews and focus groups with minors are unlikely to qualify as exempt, since they are seldom part of the normal educational curriculum; randomization to a control group would not meet the exempt category requirements for the same reason. Both would, however, qualify for expedited review if there is no more than minimal risk to participants.

So, do you need to use an IRB? Ask these questions:

  • Is IRB review required by the grant or foundation funding the project?
  • Does the school district require IRB review?
  • Do you intend to disseminate findings in a publication requiring IRB review?

If the answer to any of those questions is “yes,” you need an IRB—at which point uncertainty strikes! Maybe this is the first time you’ll use an IRB (you are not alone), or maybe you remember unpleasant experiences with an academic IRB. Fear not, evaluators! Many IRB reviewers understand the differences between clinical studies and evaluations. Some IRBs specialize in evaluations, employing reviewers with expertise in the methods evaluators use, who recognize that phenomenology, grounded theory, ethnography, and autoethnography are valid study approaches. After all, who wants to educate an IRB when you are paying them?

Hot Tips:

  • Have questions regarding the ethics of recruitment or consent? Some independent IRBs will brainstorm with you and answer “what if” questions. Ask for a complementary consultation with a reviewer.
  • Ready to submit your evaluation for review? Ask the IRB if free pre-review of study documents is provided, to save time prior to formal review. Ask for a list of the documents required by the IRB.
  • Most important, know the review timeframe in advance! Timeframes vary widely: some IRBs routinely review exempt and expedited studies in 24–48 hours, while others require two weeks or more, so plan your submission accordingly (see the sketch below).
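To make “plan accordingly” concrete, here is a minimal sketch of working backward from a planned data collection start date. It is illustrative only, and the one-week buffer for revisions is an assumption to adjust for your own IRB:

```python
from datetime import date, timedelta

def latest_submission_date(data_collection_start: date,
                           review_days: int,
                           buffer_days: int = 7) -> date:
    """Work backward from the planned start of data collection to find
    the latest safe date to submit an IRB application. The buffer for
    revisions is an assumption; adjust it to your IRB's practices."""
    return data_collection_start - timedelta(days=review_days + buffer_days)

# Hypothetical example: data collection starts September 2 and the IRB
# quotes a 14-day review turnaround.
print(latest_submission_date(date(2025, 9, 2), review_days=14))  # 2025-08-12
```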

We hope you found the information provided helpful.

The American Evaluation Association is celebrating Research vs Evaluation week. The contributions all this week to aea365 come from members whose work requires them to reconcile distinctions between research and evaluation, situated in the context of STEM teaching and learning innovations.


This is Frances Lawrenz and Amy Grack Nelson from the University of Minnesota; Amy also works at the Science Museum of Minnesota. We’re part of the Complex Adaptive Systems as a Model for Network Evaluations (CASNET) research project, which has involved observing, interviewing, audio-recording, and collecting artifacts from adult participants in the NSF-funded Nanoscale Informal Science Education Network (NISE Net).

Because CASNET is centered at the University of Minnesota, we first requested IRB approval there. That IRB determined that consent was unnecessary, reasoning that the research was evaluation-oriented and that subjects would be talking more about NISE Net than providing personal information. This raised a dilemma for us. Some team members were also participants in NISE Net and wanted to honor existing relationships and build trust by asking for permission, especially because data would be gathered during meetings. Therefore, we turned to the IRB at the Museum of Science, Boston, which, unlike the University’s IRB, determined that consent was indeed necessary.

Ultimately, we decided to gather consent from all research participants. We discovered that when offered the choice to opt out of certain aspects of the study, people sometimes did so, which would not have been possible had we followed the University’s IRB ruling.

Lessons Learned Engaging with Two IRBs:

  • IRBs have different perceptions of risk. The two IRBs interpreted risk and the need for consent differently. Because obtaining more consent is usually unproblematic, evaluation researchers should carefully consider research subjects’ needs and research team relationships when making consent decisions. Researchers often have a deeper understanding of the people they are researching than a formal IRB application makes clear.
  • Consent is a process. Some people who originally restricted consent later changed their minds and allowed collection of their data. Consent should be an ongoing process, especially in long projects. (Ours lasted three years.)
  • Individuals have diverse opinions about what data should be allowable. While some people agreed to data collection, others did not, demonstrating different preferred levels of personal control. Accommodation and acceptance of differing needs are necessary.
  • Obtaining permission can be time-consuming. We developed highly confidential mechanisms for obtaining consent because research team members were also part of data collection. Additionally, someone external to the network managed the consent forms, which required a substantial amount of time. Taking the time was worth it, because the research team’s careful attention to ethical behavior ultimately built a high level of trust among the subjects.
  • Trust is essential. Human subjects are understandably concerned about who is going to learn their personal information and how that information will be used. Recognizing this, and allowing people to meet their own needs by opting in or out of a study, develops trust, which can facilitate the research and improve the quality of its data.

The American Evaluation Association is celebrating Complex Adaptive Systems as a Model for Network Evaluations (CASNET) week. The contributions all this week to aea365 come from members of the CASNET research team.

Hi!  I’m Heather King, an Associate Project Director at Outlier Research & Evaluation at the University of Chicago. I’d like to share some tips for applying to conduct research in school districts.

Research review boards (RRBs) and institutional review boards (IRBs) are tasked with ensuring that research and evaluation projects meet the requirements for protecting human subjects. If you are collecting interview, questionnaire, focus group, or any other data directly from human subjects, you will generally need IRB/RRB approval. You’ll need to apply separately for approval from your own institution’s IRB and from each school district in which you’ll collect data.

I’ve completed successful IRB/RRB applications for some of the largest school districts in the United States and I’d like to share some tips for success.

Lessons Learned:

Start early. Earning district IRB approval is a prerequisite for each of our research and evaluation projects, so we do everything we can to ensure that our IRB applications are well received. The first step is beginning applications early, at least two months before the deadline. This gives you time to collect or create the necessary documents, such as instruments and consent forms, and to ensure that your own institution’s IRB approval is in place.

Know the deadlines. Many districts meet only a few times a year to read and approve IRB applications, so meeting the deadline is critical. You might not have another chance to submit your application for another 6 months! Knowing the deadlines can help you plan your evaluation too. For example, if your project begins after a district IRB application deadline has already passed, you can plan in advance to begin data collection around the next IRB deadline.

Read everything. After you’ve done a few IRB applications, it can start to feel like they’re all the same, and generally they are. But each district has its own nuances; don’t wait until you get a rejection letter to learn that! In particular, read the details about consent, compensation/incentives, and data collection timing, because policies vary widely from district to district. For example, the Chicago Public Schools RRB requires that your instruments be physically stamped with approval from your home institution’s IRB.

Make some friends in the IRB office. Navigating the IRB process for each district takes a lot of time, and you’ll undoubtedly have questions. It helps to have a contact in the IRB office who can explain the process and answer any questions you might have. In my experience, IRB offices appreciate being asked detailed questions, because they receive so many applications that have not been carefully prepared.


I am Dana Harley, an Assistant Professor at Northern Kentucky University. I specialize in child and adolescent mental health and developmental issues, with a focus on participatory action research methods such as “photovoice.”

Photovoice is a cutting-edge research method aimed at uncovering the issues, concerns, constructs, or real-life experiences of those who have historically been marginalized or oppressed. Participants are given cameras and asked to photograph images that represent the particular issue of interest. This method is very appropriate for use with children and adolescents; however, special precautions and considerations must be managed to successfully acquire Institutional Review Board (IRB) approval. Special issues of concern include safety, confidentiality, and consenting. Below, I provide several tips that may assist you in addressing these unique challenges.

Hot Tips:

  • Safety First. Always consider safety first. The IRB is concerned about children’s safety related to taking photographs. I conducted a photovoice study with adolescents in a low-income, high-crime, and violent neighborhood. To address potential safety hazards, I discussed photovoice “safety” with the research participants, including information about avoiding taking pictures of illegal activities, crimes being committed, and other potentially dangerous scenarios. You should compose a script that outlines exactly what you will say to participants when addressing such issues.
  • Confidentiality. Due to the visual nature of photographs, confidentiality is a concern of the IRB. For example, I received numerous photographs from research participants that included images of people (teachers, parents, siblings, etc.). It is conceivable that such images could be linked back to particular individuals participating in the study. Although this issue is almost unavoidable in some photovoice projects, it is important not to publish photographs of the research participants themselves. You MUST explicitly indicate to the IRB that you will not publish images of actual research participants.
  • Consenting. Once your research participants have their cameras in hand, it’s important that they obtain consent to photograph other individuals. IRBs are especially critical of this process, since minors are attempting to acquire consent from adults and potentially from other minors. Having research participants obtain verbal consent to photograph other individuals is the best way to manage this issue. It is important to provide a script that outlines exactly what the research participants will say to obtain verbal consent.


We are Debi Lang and Kathy Muhr, members of the Research and Evaluation Unit at the University of Massachusetts Medical School Center for Health Policy and Research.

Populations considered hidden or hard to reach for participation in qualitative evaluation studies may be small in size, difficult to locate, or hard to distinguish from the general population. In their article, Salganik and Heckathorn note that such groups have historically included subjects in HIV/AIDS research, but can also include undocumented immigrants or the homeless.

Evaluations that rely on data from hidden or hard-to-reach populations present challenges when names and contact information do not exist, are not accessible, or are generated in a way that may bias the results. In two recent projects, we used different approaches to identify 1) family members of Hospice patients who had died; and 2) adults with mental health conditions who are deaf/hard of hearing (D/HH) or Latino.

Hot Tip: Avoid Bias

  • For the Hospice project, we used claims and enrollment data to identify family members of Hospice decedents, rather than requesting the information from Hospice providers. This approach avoided a potentially biased sample made up predominantly of family members who were satisfied with their services.

Hot Tip: Hire Cultural Brokers

  • To identify D/HH or Latino adults with a mental health condition, we hired cultural brokers with the experience and language of the groups we wished to contact. As peers and integral members of our evaluation team, the cultural brokers helped to identify group members and create a viable sample of potential participants.
  • To recruit cultural brokers, we made announcements at various stakeholder and committee meetings, brought copies of the job description, and brainstormed with attendees to identify likely candidates.

Hot Tip: Maintain Confidentiality

  • Whether gathering names and contact information of potential study participants from a database or by word of mouth, use compliance procedures to maintain the confidentiality of their personal information and to protect their rights. Both projects required approval from either an Institutional Review Board (IRB) or a Compliance Unit to identify and recruit participants.

Lessons Learned: Budget Wisely

  • When budgeting a project that identifies hidden populations, consider the time needed to generate the study sample, including IRB and data access approval.
  • Consider costs for hiring cultural brokers and/or translators, as well as for participant incentives, travel, and costs associated with rescheduling meetings. These expenses support successful recruitment and data collection activities.

Rad Resources: The Salganik and Heckathorn article cited above discusses sampling designs used to identify hidden or hard-to-reach populations.


Hello! I’m Judy Savageau, faculty, researcher, and IRB representative for the University of Massachusetts Center for Health Policy and Research. I want to bring attention to the need for us to be mindful that our stakeholder groups are often ‘subjects’ in our evaluation research.

Why are there ethical concerns? Like research, evaluation involves human subjects. Study participants, vital to our understanding and to advancing knowledge related to particular issues and processes, may experience risks and inconveniences with no direct benefit. Most investigators conducting research (whether clinical, population-based, evaluation, educational, policy, or basic science) must have their studies approved by their Institutional Review Board (IRB). Federal regulations define ‘research’ as ‘a systematic investigation designed to develop or contribute to generalizable knowledge’. While much of our evaluation work may not need IRB approval, there are many instances where we need oversight.

Many stakeholder groups have their own internal review processes, whether as a state agency, a clinical practice, a local school district, or a cultural group. Multiple approvals may be needed if you are working with many different stakeholder groups. IRBs are particularly cautious, yet can be very helpful, when we include vulnerable populations: children and teens, elders, pregnant women, inmates, and persons with cognitive impairments, mental illness, or other disabling conditions. Human subject protection involves capacity to consent, freedom from coercion, and comprehension of possible risks and benefits. Challenges arise when subjects aren’t aware of potential risks, or don’t understand that their participation is voluntary and that they have the right to withdraw at any time.

The fundamental requirements for the ethical conduct of human subject research include:

  • Respect for persons – recognizing and protecting autonomy of individuals through the informed consent process;
  • Beneficence – protecting persons from harm by maximizing benefits and minimizing risks; and
  • Justice – requiring that benefits and burdens of research are distributed fairly.

Hot Tips:

  • Be mindful of recruitment incentives, whether cash, gift cards, free services, raffle prizes, or something else.
  • Consider whether paid participants are recruited fairly, informed adequately, and paid appropriately.
  • Take into consideration the subjects’ medical, employment, and educational status, and their financial, emotional, and community resources.
  • Consider whether incentives constitute undue inducements or coercion. We want to acknowledge a person’s time, travel costs, and other expenses, but we must ensure participation is truly voluntary.


Hello! We are Linda Cabral, Laura Sefton and Kathy Muhr from the Center for Health Policy and Research at the University of Massachusetts Medical School. We recently completed an evaluation project that involved recruiting people with mental health conditions to participate in individual interviews, focus groups, and surveys regarding their experiences with a mental health peer specialist training program. In 2010, Woodall and colleagues reported that many barriers exist to participating in mental health research, including:

  • fear
  • suspicion and/or distrust of researchers
  • concerns about confidentiality
  • transportation difficulties
  • severity of illness
  • inconvenience
  • fear of relapse as a result of participation
  • the stigma of mental illness

We wanted to share some tips and lessons learned to address some of these barriers.

Hot Tip: Get approval. Before starting data collection, consider applying for Institutional Review Board (IRB) approval. While many evaluations conducted for program improvement purposes do not require IRB approval, if you wish to disseminate your findings to a broad audience, this approval may be necessary to ensure that recruitment efforts take into consideration an IRB’s requirements for working with vulnerable populations.

Hot Tip: Establish trust. To establish trust, evaluation team members visited the training program and were introduced as the people who would be in contact after the training was completed to get participants’ feedback on the training. This informal introduction by a trusted source paved the way for outreach later on.

Lesson Learned: Use a script. Having a telephone script was a good tool for initiating a conversation or leaving a message with the intended participant. It helped us remember to cover key points with potential participants. It also reinforced our concern for their confidentiality, as we avoided sharing information with others when leaving a message.

Lesson Learned: Be transparent. Once we contacted participants, we were transparent about the purpose of the evaluation, who was funding it, and how their information would be used.

Lesson Learned: Provide multiple access points. To increase survey response rates, we brought copies of the survey to all in-person interviews, allowing time after the interview for participants to complete it. If they required assistance, we were present and able to provide it in real time.

Lesson Learned: Be flexible. To increase our recruitment rate, we were flexible in our interview and survey administration formats. Our first preference was to conduct in-person interviews at a time and place of the person’s choice. When this was not feasible, or could have led to a decision not to participate, we offered to conduct the interview and survey over the phone.


My name is Laurel Lamb and I’ve been a practicing evaluator (although sometimes under the guise of organizational development) for over twenty years. When I sat down to write, I wanted to contemplate what I would share with someone new coming into the field. What have I learned that you couldn’t find in a textbook or look up online?

Lesson Learned: The Golden Rule must apply to every aspect of my evaluation practice: Do unto others as you would have them do unto you. What does this mean for the evaluator?

  • Do your background research: Today, it is easier than ever to learn the basics about many programs online or from their literature. Take the time to learn everything you can before meeting with program staff and then verify that which needs to be verified. You’ll demonstrate that you value their time and care about their program, and you’ll have the basic understanding that you need to ensure that you can be productive during your time together.
  • Show up with a smile and positive attitude: Your attitude and demeanor will set the tone for the evaluation and for the myriad interactions needed to make it happen. Is there someone in your life – not someone you love (for when in love we don’t always see straight), just someone you know – who brightens your day each time you see him or her? You can be that person. Each new client, each meeting, offers an opportunity for you to bring forth the very best of your authentic self and to be a positive and valuable contributor to the work at hand.
  • Demonstrate respect for your human subjects: Respect must go far beyond what we learned in college about full disclosure and allowing for opt-out (which, by the way, I would argue has become so arduous that some human-subjects-approved surveys now include long, linguistically challenging preambles that are disrespectful of the very people they intend to protect). Respect must include meeting your subjects in their ‘space,’ and on their terms.
  • Don’t collect data without having a plan for using it: When you ask questions to satisfy little more than a client’s curiosity, you are wasting everyone’s time. This must be distinguished from collecting data to follow an emergent line of understanding or collecting data in an open-ended way in order to ensure that you are not unduly narrowing possible responses – both of which are valid and essential forms of inquiry.
  • Say ‘Thank you’: Saying thank you demonstrates that you value the investment that participants and clients have made in the evaluation, and it can show that you’ve listened and learned. It exemplifies basic human kindness. Say thank you in words – in person, via email, through a newsletter. Or say thank you with a small gift – a poem, a perfect piece of fruit (‘orange you happy it’s Friday! Thanks for all you’ve done this week’), or a book from your bookshelf passed on to someone who’d value its insights.


I’m Jim Hammerman, and I Co-Direct the Evaluation Group at TERC, a non-profit, primarily grant-funded math, science, and technology education research, development, and evaluation company in Cambridge, MA. I’m also a current member and former Chair of TERC’s Institutional Review Board (IRB).

Several of us recently noted on AEA’s LinkedIn group that there are at least three ways to address the issue of Institutional Review for nonprofit organizations:

  1. Partner with a university and use their IRB process;
  2. Contract with a commercial IRB;
  3. Develop the capacity in-house, establishing your own IRB.

Over the course of three days on the aea365 blog, we’ve been sharing a few lessons learned about pursuing these approaches. I’m going to talk about option 3.

Lessons Learned: As a mid-sized non-profit of about 120 employees, several years ago we found it both necessary and advantageous to establish our own IRB. It took a bit of effort to complete the paperwork and establish the policies needed to obtain and maintain a Federalwide Assurance (FWA) number through the Office for Human Research Protections (OHRP) at the U.S. Department of Health and Human Services.

We started with documents borrowed from collaborators and with advice from some of our own staff who had served on university IRBs. Initially we had paper-based record-keeping, but setting up electronic systems for archiving and tracking IRB review status has been very helpful.
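As a purely illustrative sketch of what such an electronic tracking system might record – the fields below are assumptions based on the review requirements mentioned in the Resources, not a description of TERC’s actual system:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ProtocolRecord:
    """One entry in a hypothetical IRB tracking system."""
    protocol_id: str
    title: str
    review_type: str             # e.g., "exempt", "expedited", "full board"
    approval_date: date
    continuing_review_due: date  # deadline for continuing review

    def review_overdue(self, today: date) -> bool:
        """Flag protocols whose continuing review deadline has passed."""
        return today > self.continuing_review_due

# Hypothetical usage:
record = ProtocolRecord("2024-017", "Online learning pilot study", "expedited",
                        date(2024, 3, 1), date(2025, 3, 1))
print(record.review_overdue(date(2025, 4, 15)))  # -> True
```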

Having our own IRB allows us to review and oversee projects for which we’re prime, or for which our evaluation subcontract is the only real research (e.g., the prime is doing development). But arrangements can be flexible. Sometimes we obtain review for our portion of a project through a university or other institution with their own IRB when they’re the prime; sometimes our IRB oversees research of our partners if they don’t have their own IRB. For liability reasons, we don’t conduct reviews for folks with no affiliation with TERC.

Having our own IRB allows staff to work with peers who know and understand the nature of the work when developing procedures and informed consent documents. Rapid modification of procedures or consent forms is also easier with an in-house IRB – e.g., when we encounter unanticipated issues in some of our cutting-edge work in online learning environments or with out-of-school library or community action groups. Serving as a member of the IRB can also help junior researchers become familiar with a range of study types and issues, building internal capacity. In short, having your own IRB can provide a number of benefits for your organization.

Resources: IRB guidelines and requirements regarding membership, meetings, initial and continuing review procedures, record-keeping, and the like can be found in the Code of Federal Regulations, 45 CFR 690.101–124, available through the NSF (http://www.nsf.gov/bfa/dias/policy/human.jsp) or the HHS Office for Human Research Protections (http://www.hhs.gov/ohrp/).

This contribution is from the aea365 Daily Tips blog, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org.
