AEA365 | A Tip-a-Day by and for Evaluators

TAG | youth

Hi! I am Cathy Lesesne, and I work at ICF International doing public health-related evaluation and research. My passion is doing work that affects the lives of adolescents, particularly those with the most need and the least voice in how to meet those needs. I do a lot of work in and with schools and school districts, focused on optimal sexual health for teens and on ensuring youth have the skills and ability to make healthy choices no matter when they decide to engage in sexual activity.

I often see well-intentioned school or school district staff creating solutions for youth and testing them, rather than involving youth in identifying solutions and evaluating their success. It is clearly easier to retain the power to determine the solutions and then to see, through evaluation, whether they worked in the end. However, in my own work I have seen the power of youth engagement and involvement both in developing programs and services and in helping to evaluate and improve those resources.

Rad Resources: As evaluators, we often have the ability to make recommendations to our clients and partners working with youth AND we have the power to approach our evaluation work with youth in empowering and engaging ways. But we don’t always know how. I highly recommend that you dig into the Youth-Adult Partnerships in Evaluation (Y-AP/E): A Resource Guide for Translating Research into Practice and find your own ways to apply the wide range of ideas, tip sheets, and examples for engaging youth as partners in evaluation. Many of these examples may also help your clients or partners think of ways to better engage youth in the development of programs and services that reflect them and their real interests and needs. If youth are empowered to be partners in developing and testing solutions, they become allies instead of subjects; sources of solutions instead of sources of data.

The American Evaluation Association is celebrating Community Psychology (CP) TIG Week with our colleagues in the CP AEA Topical Interest Group. All contributions to aea365 this week come from our CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi, my name is Kim Sabo Flores, and I am the co-founder of Algorhythm. Over the last 20+ years, I have been working as an evaluator in the field of youth development. Recently I’ve observed an unfortunate trend in the field: A LOT OF TALK about “evidence-based,” “research-based,” and “data-driven” decision-making, and very LITTLE ACTION. This is particularly true for youth practitioners working on the front lines of social change, where data could have the greatest impact. Why, in this era of rich information and technology, is this still a challenge?

Here are a few Hot TIPS:

Bring the power of data to the front lines of social change: Data is power! And for the most part, that power is held by senior-level staff and has been used to leverage resources rather than to drive programmatic decision-making. Evaluation findings are rarely shared and analyzed with front-line staff, and their ability to understand and use data effectively is radically underestimated.

Hot Tip: Support ALL staff to learn from and make meaning of data; be sure they’re included when you share your findings.

Value rather than evaluate: Research and even evaluation reports are written for and consumed by academics and funders. They leave practitioners with limited practical information about how to improve outcomes for ALL youth, especially those who are the most difficult to serve.

Hot Tip: Utilize predictive and prescriptive analytics that focus on what “works” for each and every youth, valuing all the various pathways taken toward success rather than just those taken by the “average” youth.

Measure what matters: Driven by funding demands, program staff spend precious time and resources capturing mandated data such as report cards, test scores, and attendance records, knowing full well that these metrics do not fully tell their story and are not fully attributable to their programs. Front-line workers are tired of gathering meaningless data that doesn’t answer their questions.

Hot Tip: Use research-based social/emotional measures to show proximal gains that contribute to academic achievement, risk reduction, and thriving. These types of outcomes speak directly to the work of youth development and allow front-line staff to see their contribution.

Provide timely insights at a low cost: Take advantage of new technologies that allow programs to gather data, analyze it immediately, and put it to use. Such technologies increase data utilization and ultimately increase the impact on youth. Best of all, they drastically decrease costs, allowing more nonprofits to afford evaluation, and to afford it more often!

Rad Resources:

Foundations for Young Adult Success: A Developmental Framework. University of Chicago Consortium on Chicago School Research (UChicago CCSR), Concept Paper for Research and Practice, June 2015.

FREE webinar: “21st Century Impact Measurement for Youth Serving Organizations,” where you can learn more about a game-changing approach to impact measurement.

The American Evaluation Association is celebrating Youth Focused Evaluation (YFE) TIG Week with our colleagues in the YFE AEA Topical Interest Group. All contributions to aea365 this week come from our YFE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Julie Poncelet and I am co-founder of Action Evaluation Collaborative (AEC), a partnership of independent consultants who use evaluation to strengthen social change. We want to share some of the participatory methods we use to engage youth in evaluations.

By participatory methods, we mean ways of working together to gather insights, such that everyone, including program participants, is involved and plays an active and influential part in decisions that affect their lives. These methods include a variety of tools and exercises to engage youth in meaningful, age-appropriate ways.

Last year, AEC blogged about the use of Participatory Video (PV) with a youth group in Mexico. Although we are not always able to engage youth in PV for evaluation, we do video-document (with permission) all of our participatory engagement work so that the voices and experiences of youth can be brought into the collective analysis and sensemaking process. We also encourage youth to participate directly in those processes, so that insights can be contextualized and the dialogue enriched. However, the power dynamics between youth and adults should be considered, especially when working internationally.

Hot Tip: In our evaluation work, we use a range of visioning tools, adapting activities like force field analysis, fish and boulders (which we called butterflies and stones), or road journey to explore girls’ individual and collective aspirations. These tools help us, as well as the girls and program implementers, to better understand the current status of girls in the community and the conditions or forces that support them in achieving their vision or hold them back. A little bit of paper, some cut-out butterflies, and scented markers can create a fun, energizing space for girls to safely explore and share their insights.

  • One key lesson learned from doing this type of visioning work with girls from marginalized communities is that many have not had the opportunity to explore their aspirations and so it may be challenging for them at first to engage with the activity. We have found that conversation, especially around what girls like (or don’t like), what they are curious about, and who they look up to, can help girls to reflect or think more deeply about their dreams.

Rad Resources: We have used Liberating Structures in some of our evaluations with older youth and have found the following to provide an energizing structure for quickly generating and sharing ideas: 1-2-4-All and Min Specs. We have also used storyboarding (both the Liberating Structures version and the one from Dr. Kim Sabo Flores’ book) and other drawing activities to engage youth in sharing experiences, feelings, and attitudes.


Greetings from Toyin Owolabi at the Women’s Health Action Research Center (WHARC) in Nigeria and Susan Igras at Georgetown University’s Institute for Reproductive Health (IRH). Last year, we joined together on a cross-country project to build capacity in designing and evaluating programs for younger adolescents.

Programming for younger adolescents, and the related program evaluation, is nascent in the international arena. Nigeria is a leader in Africa in adolescent health programming and research but, like many countries, has not yet focused much on the developmental needs and concerns of 10- to 14-year-olds, who are often lumped into all-adolescent program efforts. Younger adolescents’ cognitive skills are still developing, and traditional focus group discussions and interviews do not work well. Games and activity-based data collection techniques work much better for eliciting attitudes, ideas, and opinions.

To go beyond knowledge and assess more intangible program outcomes, such as shifts in gender roles, IRH has been using participatory methodologies drawn from rapid rural appraisal, advertising, and other disciplines, adapting them for evaluation.

Staff from WHARC, a well-respected research and advocacy organization, were oriented to and used many of these methodologies for a first-time-ever needs assessment with younger adolescents in Ibo State. The assessment provided data to advocate for age-segmented program approaches for adolescents and inform program design. Some of the things we learned:

HOT TIPS:

Make data collection periods brief for short attention spans. Build in recess periods (and snacks!) if data collection takes longer than 20-30 minutes.

Challenge your comfort level in survey development. Standard adolescent questions may not apply. Younger adolescents’ sexual and reproductive health issues generally revolve around puberty, self-efficacy, emerging fertility, gender formation, and body image, and NOT pregnancy and HIV prevention.

Youth engagement is important, and older adolescents may contribute better to evaluation design. Having recent recall of the puberty years, they also bring more abstract reasoning skills than younger adolescents.

COOL TRICK:

“Smile like you did when you were 13 years old!” This opened one of our meeting sessions and startled quite a few participants. It is really important to help adults get into the ‘younger adolescent zone’ before beginning to think about evaluation.

RAD RESOURCES:

This article by Rebecka Lundgren and colleagues provides a nicely-described, mixed method evaluation of a gender equity program (2013): Whose turn to do the dishes? Transforming gender attitudes and behaviours among very young adolescents in Nepal.

The Population Council is revising its seminal 2006 publication, Investing when it counts: Generating the evidence base for policies and programmes for very young adolescents. A guide and toolkit. Available in late 2015, the revision will include references to evaluation and research toolkits from various disciplines.


Greetings from Montreal! My name is Mónica Ruiz-Casares, and I am an Assistant Professor in the Division of Social and Transcultural Psychiatry and at the Centre for Research on Children and Families at McGill University, as well as an Evaluation Advisor at the Centre de Santé et des Services Sociaux de la Montagne, a primary health and social services centre in Montreal serving a largely immigrant and refugee population. I want to share a visual method that I have developed and used with colleagues in Canada, Liberia, and Laos, and that can easily be adapted to other cultural contexts.

Lesson Learned:

In order to facilitate discussions with young children, we selected several dozen images online that participants could easily relate to. For example, to study risk and protective factors, images represented common barriers and supports that children encounter in each given setting. This way, images can easily be adapted to the developmental level and socio-cultural context of participants. Here are some key elements for a successful exercise:

  • Select a balanced set of images representing positive and negative elements in natural and social environments that participants can easily relate to. For example, a review of current prevalence rates and/or expert opinions can help identify relevant sources of risk and protection.
  • It is crucial that the same set of images is used for both positive and negative experiences (for example, to explore young people’s perspectives of risk and safety). This will allow opposite views of the same topic or situation to surface.
  • Validate the selection of images with young people of a similar age group, ethnicity, etc., to the population you will be working with. Whenever possible, involve young people in the selection (or generation!) of images too.

Hot Tips:

Make more than one copy of each image so that several young people can select the same one. Even if an image is selected by only one participant, after s/he has explained to the group why s/he selected that image to represent happiness/safety or sadness/risk, it is useful to ask other participants for their views on that image.

Cool Trick: Online image libraries

Search one of the open-access repositories for context-appropriate images. For example, the Centers for Disease Control and Prevention (CDC) has a searchable online image library. Using open-access images will not only be cheaper but will also facilitate the dissemination of information.

Rad Resources:

These two articles outline the materials, sequence of activities, and surprising results from 7- to 11-year-olds in Liberia and Laos. I will be happy to communicate with anyone who is considering adapting and using this method.


Hi, I am Lan To, Director of Postsecondary Initiatives at Good Shepherd Services in New York City. Each year, Good Shepherd hosts a Youth Summit for participants in our middle-school- and high-school-aged programs, where they form teams to work on a research-based service-learning project over a four-month period. Each team is asked to identify a critical issue of concern in its community (programs participate from across the Bronx, Brooklyn, and Manhattan), research it, develop a project to address it, and then create a presentation to share its findings at the Youth Summit.

In alignment with our commitment to youth development and college and career readiness, the Youth Summit inspires the development of youth voice, youth leadership, civic engagement, networking, teamwork, and research and presentation skills. It is a youth-led event run by a council of youth representatives from each participating program, with the support of staff mentors and coaches. We value this experience because it highlights our young participants’ ideas and values, helps them develop a skill set for research-based project development, and inspires agents of social change within the various communities in which we work. It is about youth teaching youth (and adults) about issues they deem important, and it features their strategies for how to make a difference.

LESSONS LEARNED:

  • Youth-driven research is most effective when staff are well-versed in the service-learning process, but still flexible enough to allow youth to explore within the process.  We created a mentor professional learning community to help staff mentors in our various programs connect and share best practices.
  • Connecting youth teams to “topic experts” in the community helped enhance their research and overall project by making the research more tangible and the findings applicable to current work occurring in the field.  For example, one team focusing on the issue of “healthy relationships” was connected to a local domestic violence shelter.


Hi, I’m Siri Scott, and I work as a Graduate Assistant with the University of Minnesota Extension.

Conducting interviews with youth is one way to gather information for program improvement, especially when you want to bring youth voice into your improvement process. While more time-intensive than a survey, interviews will provide you with rich contextual information. Below is a brief overview of the planning process, as well as tips for conducting interviews.

Hot Tips: Planning Phase

One main decision is whether or not you will need IRB approval to conduct interviews. Even when interviews are done for program improvement purposes, it is a good idea to comply with IRB regulations on data practices and protections for youth. Other major decision points in the planning process include how many individuals you will interview, how you will choose your participants, and how you will collect and analyze your data. In addition, you must decide on the purpose of the interview and the type of interview you want to conduct, and then create an interview protocol (if appropriate).

Hot Tips: Conducting interviews

Here are some tips for conducting interviews:

  • Practice: Test the use of the protocol with a colleague (or a young person who you know well) and ask for feedback about the questions themselves and how they fit together.
  • Space: Find a quiet, secluded space for the interview in a public setting (or in a private home if the young person’s guardian can be present). You don’t want other people overhearing your conversation or your participant being distracted by anything.
  • Warm up: Start the interview with some informal chit chat. This will build your rapport with the participant and ease the participant’s (and your) nerves.
  • Probe: If you are not doing a structured interview, make sure to ask participants to clarify or elaborate on their responses. This will provide you with much better data.
  • Notes: If you are using an audio recorder, don’t trust it (fully). Jot down some quick notes during the interview. You can elaborate on these later if the audio recorder malfunctions.
  • Relax! If you mess up, that’s okay. Also, if you’re nervous and tense, the participant will sense that. Do whatever you can to put the participant (and yourself) at ease.
  • Learn More: A good resource for learning how to conduct interviews is this newly released, comprehensive overview of the interviewing process: InterViews: Learning the Craft of Qualitative Research Interviewing (3rd Edition) by Brinkmann and Kvale (2014).

The American Evaluation Association is celebrating Extension Education Evaluation (EEE) TIG Week with our colleagues in the EEE Topical Interest Group. All contributions to aea365 this week come from our EEE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings! My name is Suzanne Le Menestrel, and I am a National Program Leader for Youth Development Research at the 4-H National Headquarters, National Institute of Food and Agriculture, U.S. Department of Agriculture. 4-H is a national youth development organization serving six million youth throughout the country. We partner with the nation’s Cooperative Extension system, operated by more than 100 land-grant universities and colleges, and with National 4-H Council, our private, nonprofit partner. Recent trends in funding have elevated the importance of illustrating impact and accountability for nonformal educational programs. We were also interested in building capacity for evaluation through the creation of easy-to-use and accessible tools. We partnered with National 4-H Council, state 4-H program leaders, 4-H specialists, and Extension evaluators from around the country to create a national 4-H common measures system that will also enable us to aggregate data across very diverse 4-H programs.

I have learned a number of lessons through the implementation of this new system.

Lessons Learned:

  • Common measures must be developmentally appropriate. Children and youth who participate in 4-H range in age from 5 to 19. Because of concerns about reading levels and developmental appropriateness, we focused the common measures on ages 9 to 18. We also divided the measures into two levels—one for children and youth in grades 4 through 7 and one for youth in grades 8 through 12.
  • Common measures must have strong psychometric properties. As much as possible, we drew from existing measures, but we have also been conducting analyses with both pilot and preliminary data (for readers who run such analyses themselves, a minimal sketch of one common reliability check follows this list).
  • Measures must be applicable to a broad variety of programs. 4-H looks very different from county to county and state to state. We started with the creation of a national 4-H logic model that represents desired program outcomes.

(Image: the national 4-H logic model, clipped from http://www.4-h.org/about/youth-development-research/)

  • Common measures must be available through a flexible, easy-to-use, and robust on-line platform.  This includes the ability to add custom items.
  • Training and technical assistance are key to the implementation of common measures in a complex, multi-faceted organization such as 4-H.
  • Buy-in and support from stakeholders are critical, as is creating an ongoing system for soliciting stakeholder feedback.
  • Such a system cannot be developed without sufficient funding to support the on-line platform, technical assistance, and on-going formative evaluation.
  • Common measures are a flexible product that needs to grow and change with the outcomes of the organization.
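
For readers who want to try such a reliability check on their own pilot data, here is a minimal sketch of one common statistic, Cronbach’s alpha, in Python. This is an illustration only: the item names and responses are hypothetical, and it is not the 4-H common measures system’s own tooling.

```python
# Minimal sketch: Cronbach's alpha as a quick internal-consistency check
# on pilot data. All item names and responses here are hypothetical.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = items.dropna()                            # listwise deletion, for simplicity
    k = items.shape[1]                                # number of items in the scale
    item_var_sum = items.var(axis=0, ddof=1).sum()    # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)         # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_var_sum / total_var)

# Hypothetical 5-point Likert responses from a small pilot sample:
pilot = pd.DataFrame({
    "item_1": [4, 5, 3, 4, 2, 5],
    "item_2": [4, 4, 3, 5, 2, 4],
    "item_3": [5, 5, 2, 4, 3, 5],
})
print(f"Cronbach's alpha: {cronbach_alpha(pilot):.2f}")  # about 0.86 for this toy data
```

How strong alpha needs to be depends on a scale’s purpose; a common rule of thumb is roughly 0.70 or higher for group-level program evaluation.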

Rad Resource:

Check out this article written by Pam Payne and Dan McDonald on using common evaluation instruments.


I am Alice Hausman, a professor of public health at Temple University. For many years I have been working as a community-based participatory research (CBPR) evaluator of youth violence prevention initiatives in urban environments.

Lesson Learned:

  • Involve the Community in Identifying Measures and Data. As part of the participatory evaluation planning process, I always ask community participants to define their vision of program success. But I take it one step further by looking for data that might actually measure these community-defined outcomes. The process of working with community partners to identify measures and data has been as rewarding as just asking what success would look like.

Hot Tips:

  • Use available data sources in partnership with the community. One community collaborative I worked with identified available data sets and survey opportunities they could use to evaluate their programs.  In another project, a randomized community trial of a multi-level violence prevention program, we found that the standardized psychometric tools being used by the evaluation trial could be used to measure community-defined constructs, such as “showing kids love”, after reconfiguring the items through a participatory review process.
  • Remind yourself of the value of community-evaluator partnerships. In our case, the indicator itself was insightful about the community’s perception of the social and relationship factors related to preventing youth violence. But the actual process of discussing the instruments and constructs was rewarding for all parties: the academic researchers learned more about the lived experience of their community partners, who in turn learned more about measurement development and psychometric research.
  • Don’t hesitate to collaboratively develop new measures. Another important outcome of the process of identifying existing data to measure community ideas was the realization that new measures and data might be needed to accurately capture the constructs defined by the community. While our community partners were initially concerned about the burden of adding new questionnaires, their views shifted somewhat after seeing that the benefit of being able to actually measure community-defined constructs would outweigh the risks of more surveys.

Get Involved: I would love to hear from others who have done work in this area. We can compare notes on indicators and measures and possibly find ways to make measuring community-defined outcomes as routine as measuring outcomes defined by funders.


I am Lisanne Brown, Director of Evaluation and Research at the Louisiana Public Health Institute (LPHI). I lead a staff of 10 evaluators who evaluate LPHI’s programs. Increasingly we are also providing evaluation services to external organizations and programs throughout Louisiana.

As part of an evaluation of a peer-led HIV/AIDS prevention program for vulnerable youth in New Orleans, implemented by a local AIDS services organization, we trained and supported youth to conduct and analyze qualitative interviews with their peers. The interviews sought to understand the behaviors, needs, and preferences of Black youth in New Orleans and to develop programmatic strategies relevant to them. In addition, the training and the interview process aimed to enhance the communication, outreach, and analysis skills of the peer educators.

Hot Tip – Engage youth directly in all aspects of the interview process: All aspects of the interview process, from questionnaire development through programmatic recommendations, were peer-driven with support from the LPHI evaluation team. This allowed youth to be deeply engaged in the interview process and to understand the results well enough to translate them into program recommendations.

Hot Tip – Consider using the third person for interviews: We asked the peer educators to conduct interviews in the third person, so that interviewees discussed their peer group rather than themselves. This helped establish greater comfort and trust between the interviewees and the peer researchers, allowing for more information sharing and deeper insight. Through this strategy, we were able to obtain a wide range of information from a small number of interviews.

Hot Tip – Employ storytelling for deeper understanding: Storytelling was an important aspect of the interviews. In primary interviews, interviewees were encouraged to share stories related to topics on the questionnaire. In recall interviews, peer researchers recounted the interview as a story, and the evaluation staff probed for greater depth, which resulted in more meaningful program recommendations.

Lesson Learned – Creating archetypes can aid in analysis: The peer researchers also guided data analysis. LPHI evaluation staff conducted a preliminary theme analysis, followed by an analysis workshop with the peers. Peer researchers led a discussion of the findings to create an archetype of a peer named Chris. The result of the peer-led analysis was a Peer Archetype Illustration and Narrative that described Chris’ lifestyle and a Suite of Micro-Narratives that described Chris’ social and sexual network.

To apply our findings to the program, peer researchers created a programmatic timeline in which they brought Chris through the steps of the CHAT program. For Chris’ social and sexual network, we examined factors that would hinder or motivate their participation in the CHAT group education session, “Condoms 101.” This helped the peer researchers to develop conclusions and recommendations that are important to them.
