Greetings. I’m Mary Murray, founder of MEMconsultants. For two decades my team has built the capacity of youth-serving organizations to use evaluation practices to strengthen programs. Consistently, this has involved gathering information from youth via end-of-program surveys, interviews, and focus groups. Often, we incorporate activities that go beyond simply collecting information from youth, as we believe Youth Participatory Evaluation (YPE) is a valuable method of fostering youth development while amplifying youth voice and strengthening the programs intended to serve them.
A favorite recent activity involved engaging youth in redesigning an end-of-program survey. Our client, a community-based organization that delivers out-of-school-time programming in public housing sites, recognized that their survey was overdue for review. It reflected a mix of items important to program managers, required by funders, and left over from bygone eras. The client shared our value of providing young people leadership roles and empowered us to facilitate a process that allowed for youth leadership in this survey review effort. We set out to engage youth in identifying which topics were relevant and worthy of inclusion on the survey, and to advise us on how to design it.
Prior to COVID-related shutdowns, we scheduled two focus groups, one with middle schoolers and one with high schoolers. We confirmed 12 participants for each (anticipating fewer would attend), offering pizza and a grocery gift card as thank-you gifts for their time and wisdom. When we met, we covered four major discussion topics:
- Youth described their experiences in the program.
- Youth clarified the topics they believe should be included on an end-of-program survey.
- Youth reviewed the existing survey and recommended edits.
- Youth gave advice on how to improve the design, format, or any other aspects of the survey.
Follow-up sessions were held over Zoom (since programs were being delivered this way). Youth reviewed the drafted survey, completed it as a pilot test, and discussed their additional feedback. Then we added an unanticipated topic: how do these newly designed surveys need to be adapted, given COVID?
Youth have consistent views on which survey items are relevant to their lives and program experience (and would elicit valid responses), and which are simply not pertinent (and would likely elicit a thoughtless response). They recommend more items about program experiences and short-term benefits and fewer items about connections to their future and long-term outcomes.
Youth engage in familiar debates over the relative merits of qualitative and quantitative survey items. This group recommended more open-ended questions, placed at the beginning of the survey so they would be answered before respondents became fatigued.
Youth advised that even the best survey will not elicit valid or complete results unless respondents are given reasons to take the survey seriously. They recommend that program participants also help develop survey administration scripts, and that in some cases, program participants themselves help deliver the survey.
- The Youth Participatory Research Practice Guide, prepared by Wilder Research in partnership with the Youth Leadership Initiative, offers an example of an intensive YPE process and useful tools. It is complemented by the report From Respondents to Evaluation Practitioners, which summarizes reflections on the Wilder Foundation’s Youth Leadership Initiative’s Youth Participatory Evaluation Workshop and Practice Guide.
- The Continuum of Youth Involvement in Research and Practice, developed by Act for Youth, demonstrates how Youth Participatory Evaluation can start small and be incrementally developed in your program.
The American Evaluation Association is celebrating Youth Focused Evaluation (YFE) TIG Week with our colleagues in the YFE Topical Interest Group. All contributions to aea365 this week come from our YFE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.