AEA365 | A Tip-a-Day by and for Evaluators


We are Julie Poncelet, Kim Sabo-Flores, and Kai Fierle-Hedrick with Algorhythm, a company that builds solutions to ignite ongoing learning and drive impact in the nonprofit sector. We recently conducted a participatory evaluation with the Nellie Mae Education Foundation that focused on the impact of its Amplifying Student Voice and Leadership (ASVL) grant fund: a fund that supports youth organizing and aims to put youth at the center of conversations about education reform in New England, particularly in the area of student-centered learning.

One of our goals was to build ASVL groups’ internal monitoring and evaluation capacity and we did this by establishing Local Evaluation Teams (LETs): teams of Youth Organizers and Adult Allies who were interested in learning about participatory evaluation, and in gathering key data on their organizing efforts. We wanted to help groups better understand their impact on youth peers, schools, and the community. And we wanted to help them effectively and compellingly share the story of that impact with their supporters.

We launched the LETs with an Orientation Webinar that introduced ASVL groups to the concept, goals, and purpose of an LET. Each group then had the option to participate, and the foundation made it clear that no one would be penalized for opting out. In the end, all but two groups engaged in the LET process, which included personalized support and some rad resources to support their work:

Rad Resources. We provided LETs with two key tools: a LET Handbook and an Event Monitoring Form. The handbook outlined detailed instructions for four easy-to-use, low-tech evaluation activities that could be implemented before, during, or after a community event, workshop, or action. The monitoring form was a simple worksheet that prompted Youth Organizers to track key information about an event/action, its participants, and basic evaluation data.

Hot Tips. Our team also provided LETs with personalized support via two site visits and virtual coaching sessions. We met with LETs prior to events/actions to help them identify goals and determine what they wanted to learn, from whom and when. We helped them choose 1-2 simple evaluation methods from the LET Handbook they could use to gather relevant data. And, after events/actions, LETs shared their completed monitoring forms with us (and in some cases, the raw data). Last but not least, at the end of the process we met with LETs to help them “make meaning” of their data and determine how to use it for ongoing improvement, planning and communication about future organizing efforts.

Overall, feedback from the LETs was incredibly positive. Youth Organizers appreciated the accessible format of the LET Toolkit, as did Adult Allies. And we hope it will be a useful resource in your own youth-led evaluation efforts.

The American Evaluation Association is celebrating Youth Focused Evaluation (YFE) TIG Week with our colleagues in the YFE Topical Interest Group. The contributions all this week to aea365 come from our YFE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org.

We are Sara Plachta Elliott, executive director of the Youth Development Resource Center (YDRC) in Detroit, and Alicia McCormick, youth development director at Urban Neighborhood Initiatives (UNI), a community development organization in Detroit that provides a variety of youth programs. We collaborate to improve youth program quality, making the improvement process more youth-led along the way.

Many youth programs use the Youth Program Quality Assessment (YPQA), or other program quality assessments. YDRC runs the Youth Program Quality Intervention for youth programs in Detroit. We believe the quality improvement process is even more powerful when organizations like UNI engage youth and empower them to lead a quality improvement process for the programs they are involved in.

This summer, UNI deepened youth engagement in the evaluation process by having youth design their own program quality observation tool to complement the organization’s use of the YPQA. UNI hired a team of three high school-aged youth, guided by a young adult intern, through the city’s summer youth employment program. YDRC partnered with UNI to provide a day of training for the team on youth-led evaluation, and the youth team then spent the summer designing their own program observation tool and testing it with UNI’s youth programs. The youth used the results to give recommendations to the organization for future program improvements. The youth are now serving as part of the board’s Youth Development Committee, further deepening their leadership and influence within the organization.

Hot Tips:

  • Prepare the adults. An organization’s leaders need to create structures that allow for deeper youth participation. Inviting youth to join UNI’s Youth Development Committee was a structural change that allowed for more impactful youth participation.
  • Give youth agency. While the adults asked the youth to focus on program quality for one aspect of the project, the youth team also selected their own research focus, “Why Youth?” They chose to produce a video, and this increased their engagement in the evaluation work.
  • Pay youth for their work. In the case of UNI, paying youth through summer youth employment was critical for meaningful work and engagement. Adults get paid for their work on evaluation, and youth should too.

Rad Resources:

  • The Youth Engaged in Leadership and Learning curriculum from the John W. Gardner Center provides a lot of activities and meeting agendas to support youth-led research.
  • The Weikart Center for Youth Program Quality’s Youth Program Quality Assessment is a widely used quality improvement assessment tool. Their aligned Youth Work Methods trainings, such as Youth Voice, help build a foundation of readiness.
  • The Neutral Zone’s Youth Driven Spaces initiative provides a variety of resources for increasing youth-driven work, including a TAC Guidebook, a Youth-Adult Partnership Rubric, and an Agency Self-Readiness and Capacity Assessment.


Hi, I am Cassandra Jessee, and I serve as director of the USAID-funded YouthPower Learning project, a youth-focused knowledge-creating, -curating, and -convening project. Through it, I represent Making Cents International and the International Center for Research on Women in Washington, DC.

 

We defined our core approach, positive youth development (PYD), to ensure relevance for low- and middle-income countries (LMICs): PYD builds and holistically supports the competencies, skills, and abilities of youth so that they are empowered to reach their full potential. PYD differs from many approaches in that it views youth as active partners in development efforts, rather than as problem actors or obstacles to be overcome.

To support youth-focused evaluation and program design in LMICs, we developed the PYD Measurement Toolkit, in which we outline a simple PYD measurement framework with four overlapping domains: assets, agency, contribution, and enabling environment. This framework provides a simple entry point into measurement and aims to make it easier to tell the story of how PYD programs work. We included illustrative PYD measurement indicators and possible tools, organized by these domains, that can be considered and adapted for varying LMIC contexts.

Hot Tips to advance PYD:

  • Engage youth meaningfully throughout all phases of programming and evaluation. Meaningful youth engagement is a key component of effective PYD programs. Training, supporting, and mentoring youth to participate meaningfully has direct skill-building benefits and helps ensure program effectiveness and evaluation relevance.
  • Use the term “Positive Youth Development” when evaluating programs that incorporate two or more domains of PYD. In our recently released systematic review of PYD programs in LMICs, only one program explicitly identified itself as practicing PYD. The more programs and evaluations recognize that they are employing PYD principles and use PYD terminology, the more the global community will understand the importance of PYD approaches, as well as their effectiveness.
  • Ensure consistent measurement of PYD outcomes. Many PYD programs measure sector-specific outcomes, such as knowledge of HIV, contraceptive use, job placement rates, or reduction in conflict. Very few assess PYD outcomes, such as self-efficacy, positive identity, or interpersonal skills. The PYD Measurement Toolkit can help integrate PYD principles into your M&E plans.

Rad Resources: We commissioned a series of videos that provide guidance and best practices on engaging youth. Also check out a recent webinar series on engaging youth that we presented in collaboration with the AEA YFE TIG:

  • Youth Voice in Action: Tips, Strategies, and Advice from Youth Evaluators
  • Engaging Hard-to-Reach Youth in Research and Evaluation
  • Engaging Youth in Research

Get Involved

Join one of our communities of practice to share and learn with others, and check out www.youthpower.org and its repository of more than 2,000 resources and events dedicated to PYD.


My name is Somongkol Teng, Extension Educator for Evaluation at the University of Minnesota Extension Center for Youth Development. Recently, I conducted an evaluation using focus groups with our 4-H Online Adventure Program, a collaborative, project-based learning program for Minnesota 4-Hers, ages 10 to 12. While many evaluators are familiar with focus groups with adults, conducting one with youth requires careful consideration and preparation.

Below are lessons learned and hot tips:

  • Pick the right facilitator. A good facilitator with adults might not be as good with youth. In our case, we had a colleague who had starred in our training videos facilitate the sessions. He was selected because he was not too involved in the program, but was recognizable by the youth.
  • Be attentive to the age range. Keep the age range no more than two years. Different age groups behave differently and require different strategies.
  • Keep the group small. Unlike focus groups with adults, we found conversation was easier and richer with a smaller group of youth, usually around 5-6.
  • Group youth participants thoughtfully. Find out in advance about the youth’s group dynamic and try to separate close friends. This strategy helped ensure a wider range of comments.
  • Start with fun icebreaking activities. Invest 10-15 minutes for some fun ice-breaking topics about celebrities, video games, etc. to get the conversation started.
  • Ask age-appropriate questions. Remember that youth will have fewer life experiences to draw from compared to adults. When developing questions, keep sentence structures simple, avoid yes/no questions, and be aware of questions that could threaten young people’s sense of freedom and independence (e.g., if you are interested in how decisions were made about their 4-H project selection, try not to stress “who” made the decision, since few youth like to admit in front of their peers that their parents decided for them).
  • Use interactive and participatory activities. Including technology or drawing kept the session lively and fun. We embedded live online polling using UMU, a free online platform for engaging learning experiences, into one of our activities.
  • Keep the session short. We found it effective to keep the focus groups to one-hour sessions using a short set of 6 to 8 questions.
  • Provide food. Food is the key to the heart. Find out what the youth like. Do not underestimate the power of food to keep them engaged.
  • Get consent and assent. This is critical! Determine what the appropriate protocol might be for obtaining parental or guardian consent. It is equally important to get the youth’s assent to participate in the focus group. Communicate why their participation matters.
  • Be flexible. Things are bound to not go as planned. Have fun, and go with the flow.


I’m Nicole Clark, social worker and owner of Nicole Clark Consulting (http://www.nicoleclarkconsulting.com), where I partner with community-based groups, local/national organizations, schools and more to design, implement, and evaluate programs and services geared toward women and girls of color.

In 2015, I conducted an impact evaluation of a 6-week intensive summer leadership program geared toward high school young women of color in the New York City (NYC) area. The program provided social justice classes, workshops, field trips, and leadership seminars with accomplished women of color leaders, and it is the flagship program of an NYC-based organization that provides social, political, and economic leadership programming for young women. The organization received funding to implement the program in another NYC borough, and we sought to determine whether it could be implemented there successfully.

We used a mixed methods approach consisting of classroom observation, focus groups with the participants, in-depth interviews with on-site leadership, parents, and staff, and a post-intervention survey.

Lesson Learned #1: Conduct a community asset map to highlight the linkages, relationships, and resources located in a community. Prior to the start of the program, organizational staff conducted a community asset map to determine what resources, services, community organizations, and community members were in the area. We concluded that this program presented a unique opportunity for the organization to bring a social justice leadership curriculum to the community.

Lesson Learned #2: Consider the accessibility of a program’s activities. Commute time to the classroom site was a key factor in whether participants applied. While the program was open to all self-identified high school young women across NYC’s five boroughs, the flagship location is in Manhattan. Offering the program in participants’ home borough eased the concerns of parents and guardians about travel to Manhattan. Participants also shared that they felt more connected to the program and to one another because they were all from the same borough.

Lesson Learned #3: Staff capacity plays a major role in how a program is implemented. While the participants recommended the program be implemented in each borough, program staff identified a lack of staff capacity to do so. Because the organization is small, several on-site leaders and organizational staff frequently traveled between the Manhattan site and the new borough. Staff also felt they had more community relationships in other boroughs than in the borough where the program and evaluation were implemented. This posed a challenge for the staff, but also an opportunity to build community partnerships in a new borough.


Hello! My name is Nick Petten, owner of Petten Consulting in Toronto, Canada. I am a practicing program evaluator focused on child and youth programs and a children’s rights advocate. My work with children and young people over the last decade and a half has impressed upon me that they are fully capable of expressing and communicating their lived realities and that those realities are a valued focus of study.

Lessons Learned

When researching childhood, a critical methodological concern is the power dynamic between adults and children. Unequal power can compromise your data and severely distort the truth. But more importantly, it can cause harm to your participants.

The new sociology of childhood is a scientific discipline that promotes obtaining assent from children to participate in research. So, how do you seek assent from children?

Hot tip #1: Develop an assent and accountability framework that will help you explain the process of research and its findings in a child-friendly manner and systematically ‘check in’ with children about their participation in the research.

Hot tip #2: Develop protocols to use when seeking children’s assent that consider as many factors as possible about why they might answer in a particular way.

Hot tip #3: Dedicate time to building relationships and having conversations with the gatekeepers, and remember that such conversations need to strike a balance between providing critical information and overwhelming them with so much detail that it leads to confusion.

Hot tip #4: Engage in explicit, systematic internal reflection throughout the research process, and think about how the insights you gain may influence how you converse with children and their gatekeepers.

Rad resources:

If you are starting to think about how to genuinely involve children in research, be ready to read. By reading some of the great material on research with children, you can begin to understand children’s position in research – even in our adult-centric world.

Here is some material to get you started on your reading adventure:

For more information, please go to my website: www.pettenconsulting.com


My name is Susan Igras, a Senior Advisor at the Institute for Reproductive Health, Georgetown University, whose work is mainly focused in Africa and Asia. One of our projects is operating in urban-poor areas of Kinshasa, Democratic Republic of Congo. The project is designed to evaluate scalable interventions that address social and normative factors limiting adolescent and youth choices and sexual/reproductive health outcomes, and we are challenged with fitting youth engagement into the research and evaluation process. Everyone likes the idea, but how can you operationalize it without jeopardizing the externally implemented research component?

Staff from participating organizations sat together to brainstorm a way forward; we decided we could focus youth evaluation on questions relating to improving program design, youth engagement, and implementation. We ended up creating tables that allowed us to get practical, with four columns:

  • Evaluation questions
  • Data sources
  • Youth role in designing data collection tools
  • Youth role in collecting data to answer the questions

This seemingly-simple planning exercise was critical to move from a nice idea to an actionable evaluation activity.  We are still working on making all steps as youth-led as possible – stay tuned for a blog from one of our youth evaluators!

Rad Resource – for those of us working in French-language contexts:

http://www.troussemj.ca/content/page-de-renvoi – See the chapter on youth-led evaluation in this practical toolkit for engaging youth in mental health issues.


We are Aspen, Journey, Sira, Sati, and Alexus, members of the Youth Evaluation Team at the Goodman Community Center (GCC) in Madison, Wisconsin. We are high school youth who work with GCC staff and with graduate students and researchers at the University of Wisconsin-Madison to evaluate different programs in our community centers and schools.

Lessons Learned:

Within our evaluation process, many people ask, “Why should there be a youth evaluation team in our community center?” We have learned that the most important reason an evaluation team is beneficial is the opportunity for youth voice.

  • The program gives us a chance to give our input and ideas on problems within our community centers on the north and east sides of Madison.
  • It also gives us a chance to take responsibility for the improvement in the communities, and practice leadership skills around decision making.
  • Another thing that we like about being on an evaluation team is getting to collaborate with staff to organize surveys and collect data. We also get to work as a team with the staff and be treated like one of them, instead of just students giving feedback.
  • The evaluation teams and the process are also a good reference to have for college applications, and they are great for finding different connections throughout the community for more jobs and evaluation opportunities.
  • This program gives us, and the staff that we work with, a chance to improve our skills in working with others and in communicating with people we may not know to make connections for our data collection. All of these activities have given us the opportunity to grow as people and experience what college-level work is like.

Our purpose for having this evaluation team and doing all this work is to better our communities and become role models for younger kids who might want to be a part of an evaluation group to have their voices heard. We also plan on leaving our community center and schools better than they were when we first started.


Hello! We’re Elizabeth DiLuzio and Miranda Yates from the Strategy, Evaluation, and Learning Division at Good Shepherd Services in New York City. As an organization dedicated to youth and family development, we strive to develop and offer programs that integrate the values, insights, and ideas of our participants. And, as an organization dedicated to evidence-based practice, we are continually seeking ways to incorporate innovative and effective methodologies into our work.

Photovoice, a research methodology implemented with youth and other frequently marginalized populations, utilizes the power of photography as a catalyst for self-expression. It invites individuals to capture on film information about their lives and perspectives that might otherwise be difficult to express. A Community Portraits grant from the Human Services Council and Measure of America, with funding provided by The Leona M. and Harry B. Helmsley Charitable Trust, recently enabled us to utilize Photovoice as a tool for including youth perspectives in our strategic planning process.   Our project focused on the East New York neighborhood of Brooklyn where Good Shepherd Services seeks to deepen its work.

Interested in implementing this approach?

Hot Tips:

  1. Prepare Your Prompts Carefully. Craft 3-5 simply worded prompts that capture the questions you seek to address. Write prompts as first-person statements. Our prompts were:
  • These are the places where I feel like I belong.
  • This is my community at its best.
  • This is something that I would like to change.
  2. Utilize Your Resources. Ask program staff to assist with recruitment. Solicit feedback on project materials, from the project flyer to the informational packets. We also found success in partnering with a former participant and professional photographer who shared tips with participants and helped host the meetings.
  3. Point-and-Shoot, Disposable, or Cell Phone? There are pros and cons to the type of camera you select. Factors include budget, pixels, product availability, and photograph collection method.
  4. Harvest the Feedback. Design a participatory meeting that offers space and time for participants to reflect on their photos and those of others. Encourage participants to discuss and interpret the photos – identifying trends, themes, and what can be learned.
  5. Share the Results in Multiple Ways. In addition to informing strategic planning, use the results to impact conversations in multiple forums. Ensure participants have copies of all their work to take with them. Display the photos at the program site. Create a photo gallery that is open to the public. Bring photos to spark conversation at a community convening. Write an advocacy report.

Rad Resource:

“I Bloomed Here,” a guide created by the National Indian Child Welfare Association, has more helpful tips and ideas for designing your own Photovoice project.


I’m Amy Campbell, an evaluator at Centerstone Research Institute in Nashville, TN. While I work on several evaluation projects, one of the most rewarding is the Tennessee Healthy Transitions Initiative, where I’m able to work closely with youth and young adults (Y/YA) who have or are at risk of developing mental health or co-occurring disorders.

This year, we had an opportunity to conduct a Youth Participatory Action Research (YPAR) project with a Healthy Transitions Young Adult Leadership Council to generate information that would inform the services offered to Y/YA in Chattanooga, TN. We were able to engage Y/YA, train them in research methods, and collaboratively develop and implement a research project. Members of our YPAR team presented at Evaluation 2016 and shared how their findings will impact the design of the Tennessee Healthy Transitions Initiative.

Moving forward, this YPAR team will be meeting with Healthy Transitions leaders at the local and state levels to share their findings and collaboratively develop program solutions based on the data. Their findings will also influence the Leadership Council’s actions in the future; they are discussing social media campaigns focusing on issues they identified and other data-informed projects.

Lessons Learned:

  • Stakeholder buy-in is crucial. If you think you might not have this buy-in, start by having conversations aimed at building it.
  • Y/YA need adequate support and training to be able to effectively engage. Share the expectations you have for your research team early (e.g., commitment, deliverables, etc.), and provide a solid foundation of the basic tenets of research in your first trainings. Use every interaction with your team as an opportunity to teach them about good research design and processes.

Hot Tips:

  • Free food is one of the best recruitment and retention tools you have at your disposal. We fed our team at every meeting, and we offered food for our data collection “event.”
  • Utilize social media and technology! I am located about 150 miles away from our research team, so we had to find creative ways to stay in contact. We used Facebook Messenger to stay connected between meetings, Evernote to keep track of meeting notes and action items, and Google Hangouts to meet remotely when we couldn’t meet in person.

Rad Resources:

The University of California, Berkeley’s YPAR Hub is a great resource for training and preparation exercises for your YPAR teams.
