Hello! My name is Lizzie Esposito, and I am working with Courageous heARTS, a youth-led nonprofit art studio in the heart of south Minneapolis. I am serving with heARTS for one year as an AmeriCorps Promise Fellow, which gives me the opportunity to have many new experiences in the world of youth work, nonprofits, and education. Daily, I work with students at Roosevelt High School in a space we call the restART room. In this unconventional space, students can be found practicing self-care, whether that’s setting goals, journaling about things they’re grateful for, relaxing with some music and art, stretching their bodies, or reading a great book. This work comes out of Courageous heARTS’ dedication to caring for and illuminating young people, and it’s been amazing to see students take ownership of and responsibility for their own happiness and wellness this school year.
On November 15th, we had the honor of hosting evaluation professionals as part of the American Evaluation Association conference in Minneapolis, in a visit organized by the YFE TIG. This trip gave visiting evaluators a chance to step inside a youth organization in the city, and allowed us to pick their brains and soak in their expertise in evaluation. We started by sharing an experience of mindfulness and expression: repurposing playing cards as artistic messages to ourselves. I thank our guests for bringing out their creative side for the afternoon! After I presented a logic model for the program, the evaluators in attendance provided feedback on its anticipated outcomes, methods for evaluation, and the format and narrative of the model itself. So far this year, I have been able to hear from students about their experiences in the space by having them fill out a sign-in sheet that documents their moods as they enter and leave. However, I know there is room to get a clearer idea of how the programming is working! The restART room is in its pilot stages, and I am so appreciative of the advice and guiding questions I heard as we figure out how best to evaluate this program.
Through this exchange of experiences and ideas, I found two major takeaways as someone just dipping my toes into this type of thinking.
Hot Tips:
- Specify outcomes. My tendency in youth work is to see the million positive benefits a program could have for a young person. The initial logic model for our program was filled with a laundry list of things you hope to develop with the young people you care about: positive self-image, routines, great communication, problem-solving skills, and more. Yet with such broad aims for a program, how can anything get done? This is not to say these outcomes won’t exist, but getting specific and targeting fewer skills can make outcomes easier to measure and better guide curriculum and programming. For instance, the evaluators noticed that some of the outcomes presented bundled multiple skills together (learn strategies for self-care, develop habits and routines, prioritize wellness), each of which may need its own definition and evaluation strategy. With the hopes for the program streamlined, I can focus daily activities in the service of these goals.
- Use creative methods for data collection. This has been important to us at Courageous heARTS: we want to encourage discussion, collaboration, and art in all interactions with the studio. Because we work with young people on developing creative methods of expression, we wanted alternatives to surveys for measurement; it would be amazing to honor that creativity in our evaluation. The evaluators had some great ideas about using journaling, image analysis, story, focus groups, and collaborative projects as ways of evaluating. One idea was to have each student submit a piece of artwork or writing about their experience with the program, then create a book or zine to capture outcomes and surface common themes or takeaways. We are also excited to implement a color wheel: students will mark a color that represents their feelings as they enter and exit the room each day, then share why they chose that particular color and what, if anything, triggered a shift.
The American Evaluation Association is celebrating Youth Focused Evaluation (YFE) TIG Week with our colleagues in the YFE Topical Interest Group. The contributions all this week to aea365 come from our YFE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org.
Hi Lizzie,
My name is Gala Vlasic; I am a graduate student at Queen’s University in Ontario. I am currently completing a course on program evaluation and found your article on aea365. Your insights about evaluation within youth programs such as the restART room were interesting; in particular, the two ‘Hot Tips’ you mention both seem immediately valid for youth-driven evaluation.
Your acknowledgement of the need to specify direct outcomes for the evaluation is invaluable. With a youth-driven program, it is easy to find so many lofty goals and implementation ideas that sound viable and interesting. Once we try to put these ideas into practice, the evaluators, participants, and program organizers can all become overwhelmed by too much work, too many options, and too much information. Strategizing to focus on specific outcomes within a specific time frame seems like an appropriate solution. In our course we are learning the value of specificity, especially when creating program theories and logic models. We are also learning that evaluators must take care not to conduct utilization-driven evaluations in which a predetermined outcome is supported by biased data collection or review (Saunders, 2012). Contextual factors driven by evaluators who are inspired by certain ideals and ideas for a program can skew the results of an evaluation, even when their intentions are good. In his article on the usability of evaluation, Saunders explains that an evaluator’s “situation recognition, responsiveness, anticipation and [ability to] analyze people” (2012, p. 198) are all crucial to an effective evaluation. I therefore wonder how difficult it is for an evaluator to separate their own biases and emotions from the program under review. Considering your tip on scaling back the quantity of outcomes to focus on their quality, I am curious how an evaluation can support this while consciously recognizing evaluator bias or disposition.
Thank you for highlighting this information; I have even more to consider than before!
Saunders, M. (2012). The use and usability of evaluation outputs: A social practice approach. Evaluation, 18(4), 421–436. https://doi.org/10.1177/1356389012459113