AEA365 | A Tip-a-Day by and for Evaluators

Category: Youth Focused Evaluation

We are Julie Poncelet, Kim Sabo-Flores, and Kai Fierle-Hedrick with Algorhythm, a company that builds solutions to ignite ongoing learning and drive impact in the nonprofit sector. We recently conducted a participatory evaluation with the Nellie Mae Education Foundation that focused on the impact of its Amplifying Student Voice and Leadership (ASVL) grant fund: a fund that supports youth organizing and aims to put youth at the center of conversations about education reform in New England, particularly in the area of student-centered learning.

One of our goals was to build ASVL groups’ internal monitoring and evaluation capacity, and we did this by establishing Local Evaluation Teams (LETs): teams of Youth Organizers and Adult Allies who were interested in learning about participatory evaluation, and in gathering key data on their organizing efforts. We wanted to help groups better understand their impact on youth peers, schools, and the community. And we wanted to help them effectively and compellingly share the story of that impact with their supporters.

We launched the LETs with an Orientation Webinar that introduced ASVL groups to the concept, goals, and purpose of an LET. Each group then had the option to participate, and the foundation made it clear that no one would be penalized for opting out. In the end, all but two groups engaged in the LET process, which included personalized support and some rad resources to support their work:

Rad Resources. We provided LETs with two key tools: an LET Handbook and an Event Monitoring Form. The handbook outlined detailed instructions for four easy-to-use, low-tech evaluation activities that could be implemented before, during, or after a community event, workshop, or action. The monitoring form was a simple worksheet that prompted Youth Organizers to track key information about an event/action, participants, and basic evaluation data.

Hot Tips. Our team also provided LETs with personalized support via two site visits and virtual coaching sessions. We met with LETs prior to events/actions to help them identify goals and determine what they wanted to learn, from whom and when. We helped them choose 1-2 simple evaluation methods from the LET Handbook they could use to gather relevant data. And, after events/actions, LETs shared their completed monitoring forms with us (and in some cases, the raw data). Last but not least, at the end of the process we met with LETs to help them “make meaning” of their data and determine how to use it for ongoing improvement, planning and communication about future organizing efforts.

Overall, feedback from the LETs was incredibly positive. Youth Organizers appreciated the accessible format of the LET Toolkit, as did Adult Allies. And we hope it will be a useful resource in your own youth-led evaluation efforts.

The American Evaluation Association is celebrating Youth Focused Evaluation (YFE) TIG Week with our colleagues in the YFE Topical Interest Group. The contributions all this week to aea365 come from our YFE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org.

We are Sara Plachta Elliott, executive director of the Youth Development Resource Center (YDRC) in Detroit, and Alicia McCormick, youth development director at Urban Neighborhood Initiatives (UNI), a community development organization in Detroit that provides a variety of youth programs. We collaborate to improve youth program quality, making the improvement process more youth-led along the way.

Many youth programs use the Youth Program Quality Assessment (YPQA), or other program quality assessments. YDRC runs the Youth Program Quality Intervention for youth programs in Detroit. We believe the quality improvement process is even more powerful when organizations like UNI engage youth and empower them to lead a quality improvement process for the programs they are involved in.

This summer, UNI deepened youth engagement in the evaluation process by having youth design their own program quality observation tool to complement the organization’s use of the YPQA. UNI hired a team of three high-school-aged youth, guided by a young adult intern, through the city’s summer youth employment program. YDRC partnered with UNI to provide a day of training to the team on youth-led evaluation, and the youth team then spent the summer designing their own program observation tool and testing it with UNI’s youth programs. The youth used the results to make recommendations to the organization for future program improvements. The youth are now serving on the board’s Youth Development Committee, further deepening their leadership and influence within the organization.

Hot Tips:

  • Prepare the adults. An organization’s leaders need to create structures that allow for deeper youth participation. Inviting youth to join UNI’s Youth Development Committee was a structural change that allowed for more impactful youth participation.
  • Give youth agency. While the adults asked the youth to focus on program quality for one aspect of the project, the youth team also selected their own research focus, “Why Youth?” They chose to produce a video, and this increased their engagement in the evaluation work.
  • Pay youth for their work. In the case of UNI, paying youth through summer youth employment was critical for meaningful work and engagement. Adults get paid for their work on evaluation, and youth should too.

Rad Resources:

  • The Youth Engaged in Leadership and Learning curriculum from the John W. Gardner Center provides a lot of activities and meeting agendas to support youth-led research.
  • The Weikart Center for Youth Program Quality’s Youth Program Quality Assessment is a widely used quality improvement assessment tool. Their aligned Youth Work Methods trainings, such as Youth Voice, help build a foundation of readiness.
  • The Neutral Zone’s Youth Driven Spaces initiative provides a variety of resources for increasing youth-driven work, including a TAC Guidebook, a Youth-Adult Partnership Rubric, and an Agency Self-Readiness and Capacity Assessment.

The American Evaluation Association is celebrating Youth Focused Evaluation (YFE) TIG Week with our colleagues in the YFE Topical Interest Group. The contributions all this week to aea365 come from our YFE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org.

Hi, I am Cassandra Jessee, and I serve as the director of the USAID-funded YouthPower Learning project, a youth-focused knowledge-creating, -curating, and -convening project. Through it, I represent Making Cents International and the International Center for Research on Women in Washington, DC.


We defined our core approach, positive youth development (PYD), to ensure relevance for low- and middle-income countries (LMICs): PYD builds and holistically supports the competencies, skills, and abilities of youth so that they are empowered to reach their full potential. PYD differs from many approaches in that it views youth as active partners in development efforts, rather than as problem actors or obstacles to be overcome.

To support youth-focused evaluation and program design in LMICs, we developed the PYD Measurement Toolkit, in which we outline a simple PYD measurement framework with four overlapping domains: assets, agency, contribution, and enabling environment. This framework provides a simple entry point into measurement and aims to make it easier to tell the story of how PYD programs work. We included illustrative PYD measurement indicators and possible tools, organized by these domains, that can be considered and adapted for varying LMIC contexts.
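
To make the four-domain framework concrete, here is a minimal, hypothetical sketch (in Python) of how a team might organize illustrative indicators by domain when drafting an M&E plan. The domain names come from the framework above; the indicator wording is invented for illustration and is not drawn from the PYD Measurement Toolkit itself.

```python
# Hypothetical sketch: organizing illustrative indicators by the four PYD domains.
# The indicator wording below is invented for illustration, not taken from the toolkit.
pyd_domains = {
    "Assets": ["% of youth demonstrating a targeted technical or vocational skill"],
    "Agency": ["% of youth reporting confidence in setting and pursuing their own goals"],
    "Contribution": ["% of youth engaged in community service or civic activities"],
    "Enabling environment": ["% of youth reporting at least one supportive adult in their lives"],
}

def draft_me_plan(domains):
    """Print a simple M&E plan stub: one line per domain and indicator."""
    for domain, indicators in domains.items():
        for indicator in indicators:
            print(f"{domain}: {indicator}")

draft_me_plan(pyd_domains)
```

Adapting a structure like this to a specific LMIC context would simply mean swapping in locally relevant indicators and the tools used to collect them.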

Hot Tips to advance PYD:

  • Engage youth meaningfully throughout all phases of programming and evaluation. Meaningful youth engagement is a key component of effective PYD programs. Training, supporting, and mentoring youth to participate meaningfully has direct skill-building benefits and helps ensure program effectiveness and evaluation relevance.

    Rad resources: We commissioned a series of videos that provide guidance and best practices on engaging youth. Also check out a recent webinar series on engaging youth that we did in collaboration with the AEA YFE TIG:

      • Youth Voice in Action: Tips, Strategies, and Advice from Youth Evaluators
      • Engaging Hard-to-Reach Youth in Research and Evaluation
      • Engaging Youth in Research

  • Use the term “Positive Youth Development” when evaluating programs that incorporate two or more domains of PYD. In our recently released systematic review of PYD programs in LMICs, only one program explicitly identified itself as practicing PYD. The more programs and evaluations recognize that they are employing PYD principles and use PYD terminology, the more the global community will understand the importance of PYD approaches, as well as their effectiveness.
  • Ensure consistent measurement of PYD outcomes. Many PYD programs measure sector-specific outcomes, such as knowledge of HIV, contraceptive use, job placement rates, or reduction in conflict. Very few assess PYD outcomes, such as self-efficacy, positive identity, or interpersonal skills. The PYD Measurement Toolkit can help integrate PYD principles into your M&E plans.

Get Involved

Join one of our communities of practice to share and learn with others and check out www.youthpower.org and its repository of more than 2000 resources and events dedicated to PYD.

The American Evaluation Association is celebrating Youth Focused Evaluation (YFE) TIG Week with our colleagues in the YFE Topical Interest Group. The contributions all this week to aea365 come from our YFE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org.


My name is Somongkol Teng, Extension Educator for Evaluation at the University of Minnesota Extension Center for Youth Development. Recently, I conducted a focus group evaluation of our 4-H Online Adventure Program, a collaborative, project-based learning program for Minnesota 4-Hers, ages 10 to 12. While many evaluators are familiar with focus groups with adults, conducting one with youth requires careful consideration and preparation.

Below are lessons learned and hot tips:

  • Pick the right facilitator. A good facilitator with adults might not be as good with youth. In our case, we had a colleague who had starred in our training videos facilitate the sessions. He was selected because he was not too involved in the program, but was recognizable by the youth.
  • Be attentive to the age range. Keep the age range no more than two years. Different age groups behave differently and require different strategies.
  • Keep the group small. Unlike focus groups with adults, we found conversation was easier and richer with a smaller group of youth, usually around 5-6.
  • Group youth participants thoughtfully. Find out in advance about the youth’s group dynamic and try to separate close friends. This strategy helped ensure a wider range of comments.
  • Start with fun icebreaking activities. Invest 10-15 minutes for some fun ice-breaking topics about celebrities, video games, etc. to get the conversation started.
  • Ask age-appropriate questions. Remember that youth will have fewer life experiences to draw from compared to adults. When developing questions, keep sentence structures simple, avoid yes/no questions, and be aware of questions that may threaten young people’s sense of freedom and independence (e.g., if you are interested in how decisions were made about their 4-H project selection, try not to stress “who” made the decision, since few youth like to admit in front of their peers that their parents decided for them).
  • Use interactive and participatory activities. Including technology or drawing kept the session lively and fun. We embedded live online polling using UMU, a free online platform for engaging learning experiences, into one of our activities.
  • Keep the session short. We found it effective to keep the focus groups to one-hour sessions using a short set of 6 to 8 questions.
  • Provide food. Food is the key to the heart. Find out what the youth like. Do not underestimate the power of food to keep them engaged.
  • Get consent. This is critical! Determine what the appropriate protocol might be to get parental or guardian consent. That said, it is equally important to get the youth’s assent to participate in the focus group. Communicate why their participation matters.
  • Be flexible. Things are bound to not go as planned. Have fun, and go with the flow.

The American Evaluation Association is celebrating Youth Focused Evaluation (YFE) TIG Week with our colleagues in the YFE Topical Interest Group. The contributions all this week to aea365 come from our YFE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org.


Hi, we are Abhijay Kumar and Jordan Scrimger, members of the Metropolitan Youth Policy Fellows (MYPF), a diverse group of youth working together for a better metropolitan Detroit. The MYPF, which includes members from multiple communities (across cities, neighborhoods, and suburbs), formed in order to engage with issues that impact our lives. We think it is important that youth have a voice in policy decisions. Youth know what is going on in their communities and have ideas for solutions.

The first project we undertook was a survey (N=1,191) asking youth to identify issues in their communities. Afterwards, we engaged youth in focus groups (N=53) to hear more about ideas they had for change. We gathered valuable information and created a report, making these findings and recommendations accessible to everyone.

Along our journey, the MYPF members have reflected on our experiences and compiled some tips that are valuable for anyone wanting to support the efforts of youth:

Hot Tips:

  • Reflect on your power and privilege. Acknowledge the resources where you hold privilege, such as networks, funding, and knowledge, and recognize how they can be used to support youth efforts.
  • Provide space, be easy to access, and invite conversation. The belief that adults and youth are equals and that the exchange of ideas is bidirectional creates a more enriching and productive atmosphere for all.
  • Be a “coffee filter”. Rather than change the flow of thoughts, facilitate the refinement process towards a goal.
  • Get youth voice and input. Youth should not be an afterthought. They should be provided a platform to directly influence policy-making.
  • Treat youth as allies, not as a burden. We want a voice in the way our lives are decided. Our viewpoints can provide valuable insight into the world, insight that can only be conveyed through collaboration with youth.
  • Don’t tokenize. Offer youth an authentic way for their ideas to be taken seriously, with the intent to problem-solve.
  • Ask us! Don’t make assumptions. Decisions must be representative and inclusive of candid youth perspectives. Youth should be actively involved in the decision-making process, as they can help ensure that good intentions reap positive impact.

Rad Resources:

[Group photo of the authors: members of our team after presenting at the Fourth International Conference of the Center for Culturally Responsive Evaluation and Assessment.]

The American Evaluation Association is celebrating Youth Focused Evaluation (YFE) TIG Week with our colleagues in the YFE Topical Interest Group. The contributions all this week to aea365 come from our YFE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org.

I’m Nicole Clark, social worker and owner of Nicole Clark Consulting (http://www.nicoleclarkconsulting.com), where I partner with community-based groups, local/national organizations, schools and more to design, implement, and evaluate programs and services geared toward women and girls of color.

In 2015, I conducted an impact evaluation of a 6-week intensive summer leadership program geared towards high school young women of color in the New York City (NYC) area. The program provided social justice classes, workshops, field trips, and leadership seminars with accomplished women of color leaders, and it is the flagship program of an NYC-based organization that provides social, political, and economic leadership programming for young women. The organization received funding to implement the program in another borough of NYC, and we set out to determine whether it could be implemented there successfully.

We used a mixed methods approach consisting of classroom observation, focus groups with the participants, in-depth interviews with on-site leadership, parents, and staff, and a post-intervention survey.

Lesson Learned #1: Conduct a community asset map to highlight the linkages, relationships, and resources located in a community. Prior to the start of the program, organizational staff conducted a community asset map to determine what resources, services, community organizations, and members were in the area. We concluded that this program presented a unique opportunity for the organization to bring a social justice leadership curriculum to the community.

Lesson Learned #2: Consider the accessibility of a program’s activities. A key factor in whether participants applied to the program was the commute time to the classroom site. While the program was open to self-identified young women in high school from all five of NYC’s boroughs, the flagship location is in Manhattan. Holding the program in participants’ home borough eased the concerns of parents and guardians who were worried about them traveling to Manhattan. Participants also shared feeling more connected to the program and to the other participants because they were all from the same borough.

Lesson Learned #3: Staff capacity plays a major role in how a program is implemented. While the participants recommended that the program be implemented in every borough, program staff identified a lack of capacity to do so. Because the organization is small, several on-site leaders and organization staff frequently traveled between the Manhattan site and the new borough. Staff also felt they had more community relationships in other boroughs than in the borough where the program and evaluation were implemented. This was a challenge for the staff, but also an opportunity to build community partnerships in a new borough.

The American Evaluation Association is celebrating Youth Focused Evaluation (YFE) TIG Week with our colleagues in the YFE Topical Interest Group. The contributions all this week to aea365 come from our YFE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org.

I’m Sondra Stegenga, an occupational therapist, home visitor, educational administrator, and Ph.D. student at the University of Oregon.  Evidence has shown that meaningful family involvement is key to long-term outcomes for children. In early intervention and early childhood (EC) systems we are charged with basing services, supports, and goals on family needs and priorities. Given the varied learning needs and contextual and cultural values of families, and the lack of research on involving families in data practices, this process may be unintentionally overlooked or underutilized. In a recent study, Brawley and Stormont found that although 82% of EC teachers identified sharing data with families as important, only 42% reported regularly doing so. Data collection in EC programs can become a rote task, completed without much meaning or family involvement. Failing to include families in data processes not only violates foundational tenets of early intervention and early childhood but more importantly deprives families of valuable learning and reflection, greater involvement in their child’s plan, and improved chances of successful outcomes.

Lessons Learned:

  • In 20+ years of working with children and families, I have learned the impact of involving families in data practices. This lines up with what researchers and evaluators have noted: involving families in data processes leads to increased communication and better outcomes.

Hot Tips:

  • To engage parents in data practices, we must first engage families in the whole educational process. Consider cultural, contextual, and family needs. Engagement may look different for each family, but it should be conveyed through mission, goals, and formal practices that explicitly outline the importance of family involvement and the practices that support it. Gathering input through a variety of methods (e.g., via smartphone, in person, and at times convenient for the family) is imperative to meaningful family engagement.
  • Involve families from the beginning as “partners” in data collection, reflection, and use. This will demystify the process and support full, meaningful family engagement. Explain the reasoning behind the data, the timelines, and how the data will be gathered. Take time to understand parents’ prior experiences, fears, and questions related to data. Ask parents what is meaningful to them and discuss how they would like to measure their child’s progress.
  • Use various modes of data presentation. Graphs and visualizations are shown to be powerful communicators of data. In addition, telling the story of the data and linking it to the family’s needs, priorities, and contexts is key to understanding.

Rad Resources:

The American Evaluation Association is hosting the Disabilities and Underrepresented Populations TIG (DUP) Week. The contributions all week are focused on engaging DUP in your evaluation efforts. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings, I am Brian Molina, a graduate student in Western Michigan University’s Industrial/Organizational Behavior Management doctoral program. I have conducted single subject research and implemented performance improvement projects across many different settings and organizations.

Lessons Learned:

  • Single subject research can be used to evaluate program effectiveness across large groups. The belief that single subject research can only be used with one person at a time is a common (yet understandable) misconception. The number of people in the analysis depends on the frame of reference we are interested in evaluating. For example, we might choose to evaluate the performance of a single individual, a single team, a single department, a single organization…and so on.
  • The methodology is friendly toward organizations with limited resources or experience in evaluation. Evaluation can be intimidating! For organizations that are new to the process, interpreting and understanding large-group statistical analyses may be difficult. Single subject research typically results in data that show behavior change over time, which can easily be interpreted by researchers and clients alike (see the sketch after this list). Easier data collection and analysis make it more likely that organizations will begin and continue evaluating their activities.
  • Single subject designs allow for maximum flexibility in implementing program changes. Conducting research is rarely an orderly process that goes precisely according to plan. Single subject methodology accommodates this unpredictability well. Changes in behavior are rapidly observable during the course of program implementation, not simply at the conclusion of sometimes-lengthy data collection. This allows leaders to make on-the-go changes to the intervention that best serve the client, without contaminating the results of an experimental evaluation.
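
To make “behavior change over time” concrete, here is a minimal, hypothetical sketch (in Python, using matplotlib) of how single subject data from a simple A-B design might be graphed. The phase labels, session counts, and scores are invented for illustration and do not come from any actual program.

```python
# Hypothetical sketch: graphing single subject (A-B design) data over time.
# The numbers below are invented for illustration only.
import matplotlib.pyplot as plt

baseline = [4, 5, 3, 4, 5]            # Phase A: performance before the program change
intervention = [7, 8, 9, 9, 10, 10]   # Phase B: performance after the change is introduced

sessions_a = list(range(1, len(baseline) + 1))
sessions_b = list(range(len(baseline) + 1, len(baseline) + len(intervention) + 1))

plt.plot(sessions_a, baseline, "o-", label="Baseline (A)")
plt.plot(sessions_b, intervention, "s-", label="Intervention (B)")
plt.axvline(x=len(baseline) + 0.5, color="gray", linestyle="--")  # marks the phase change
plt.xlabel("Observation session")
plt.ylabel("Observed behavior (e.g., tasks completed)")
plt.title("Single subject A-B design: behavior change over time")
plt.legend()
plt.tight_layout()
plt.show()
```

Because each new observation can be added to the graph as soon as it is collected, program leaders can see whether the intervention phase is trending in the right direction and adjust mid-stream, which is the flexibility described in the last tip above.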

Rad Resources:

[Image: single subject design example]

The American Evaluation Association is hosting the Disabilities and Underrepresented Populations TIG (DUP) Week. The contributions all week are focused on engaging DUP in your evaluation efforts. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello! My name is Nick Petten, owner of Petten Consulting in Toronto, Canada. I am a practicing program evaluator focused on child and youth programs and a children’s rights advocate. My work with children and young people over the last decade and a half has impressed upon me that they are fully capable of expressing and communicating their lived realities and that those realities are a valued focus of study.

Lessons Learned

When researching childhood, a critical methodological concern is the power dynamic between adults and children. Unequal power can compromise your data and severely distort the truth. More importantly, it can cause harm to your participants.

The new sociology of childhood is a scientific discipline that promotes obtaining assent from children before they participate in research. So, how do you seek assent from children?

Hot tip #1: Develop an assent and accountability framework that will help you explain the research process and its findings in a child-friendly manner and systematically ‘check in’ with children about their participation in the research.

Hot tip #2: Develop protocols to use when seeking children’s assent that consider as many factors as possible about why they might answer in a particular way.

Hot tip #3: Dedicate time to building relationships and having conversations with gatekeepers, and remember that such conversations need to strike a balance between providing critical information and offering so much detail that it leads to confusion.

Hot tip #4: Engage in explicit, systematic internal reflection throughout the research process, and think about how the insights gained may influence how you converse with children and their gatekeepers.

Rad resources:

If you are starting to think about how to genuinely involve children in research, you had better be ready to read. By reading some of the great material on research with children, you can begin to understand children’s position in research – even in our adult-centric world.

Here is some material to get you started on your reading adventure:

For more information, please go to my website: www.pettenconsulting.com

The American Evaluation Association is celebrating Youth Focused Evaluation TIG Week with our colleagues in the YFE AEA Topical Interest Group. The contributions all this week to aea365 come from our YFE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Susan Igras, and I am a Senior Advisor at the Institute for Reproductive Health, Georgetown University; my work is mainly focused in Africa and Asia. One of our projects operates in urban-poor areas of Kinshasa, Democratic Republic of Congo. The project is designed to evaluate scalable interventions that address social and normative factors limiting adolescent and youth choices and sexual/reproductive health outcomes, and we are challenged with fitting youth engagement into the research and evaluation process. Everyone likes the idea, but how can you operationalize it without jeopardizing the externally implemented research component?

Staff from participating organizations sat together to brainstorm a way forward; we decided we could focus youth evaluation on questions relating to improving program design, youth engagement, and implementation.  We ended up creating tables that allowed us to get practical:

Evaluation questions | Data sources | Youth role in designing data collection tools | Youth role in collecting data to answer the questions

This seemingly simple planning exercise was critical to move from a nice idea to an actionable evaluation activity. We are still working on making all steps as youth-led as possible – stay tuned for a blog from one of our youth evaluators!

Rad Resource – for those of us working in French-language contexts:

http://www.troussemj.ca/content/page-de-renvoi – See the chapter on youth-led evaluation in this practical toolkit for engaging youth in mental health issues.

The American Evaluation Association is celebrating Youth Focused Evaluation TIG Week with our colleagues in the YFE AEA Topical Interest Group. The contributions all this week to aea365 come from our YFE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
