My name is Lauren Bloem. I’m an internal evaluator at AchieveMpls. As the strategic nonprofit partner of Minneapolis Public Schools (MPS), AchieveMpls rallies community support to inspire and equip Minneapolis youth for careers, college, and life. I led a strategic project this year to learn how youth want to be consulted in Career and College Readiness programming. Naturally, I set out to talk with youth about this question.
We relied on principles of both Youth Participatory Evaluation and Human Centered Design as we planned youth listening sessions. We provided space for high schoolers to reflect on current programming and design solutions to their own self-described challenges. We planned an activity to facilitate conversations around each key question, including: What is it like to be in the middle of postsecondary planning in high school? Where do you go for information about careers and college? What is the likelihood you would engage with social media for career and college planning? What makes you feel heard and listened to?
Activities included Human Survey, Journey Mapping, and Voting with Candy (all found in Kim Sabo Flores’ Youth Participatory Evaluation). I’ve found that facilitating active conversations like these prompts youth to be open and honest, and to have a lot of fun.
Lessons Learned: Listen to Youth
We learned a lot from the 38 youth we spoke with! Youth appreciated being listened to, and they would all recommend attending a listening session to a friend. We are working to build capacity among school-based staff to facilitate listening sessions with youth throughout the next school year, rather than this being a function of only evaluation staff.
We also heard some practical concerns: students hear a lot about how to get into college but not necessarily how to pay for it, and they want to learn more about careers while still in high school. These opportunities will guide program development moving forward.
Check out this Stanford d.school Facilitator’s Guide and adjust the “challenge” for your audience. The overarching steps in this human-centered design process, Empathize, Define, Ideate, Prototype, and Test, align naturally with evaluative thinking. This framework resonates with youth, as it describes a process of learning and problem solving we all use daily. I also find it a flexible framework for guiding evaluation and learning with community stakeholders.
And we aren’t done! A truly iterative process will start over next fall, asking new questions informed by youth listening sessions this past year. If you have ever successfully scaled youth participatory evaluation, let me know how it went!
Hot Tip for visitors to Minnesota for Evaluation 2019:
Minnesota is home to many immigrant and refugee communities, including Mexican and Central American communities; Hmong, Vietnamese, Cambodian, Karen, Karenni, and other Southeast Asian groups; as well as Somali, Oromo, Ethiopian, and other East African communities. Check out the Midtown Global Market for a range of tastes and experiences, or get a little more specific at El Burrito Mercado, Hmongtown Marketplace, or Karmel Mall (a Somali market).
We’re looking forward to the fall and the Evaluation 2019 conference all this week with our colleagues in the Local Arrangements Working Group (LAWG). Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.
Hello Lauren,
I gravitated towards your post for two reasons: I am currently taking a course on program evaluation and inquiry and I am a college teacher who regularly wonders how students make the decisions they do regarding post-secondary choices.
How you went about your project really intrigued me. Relying on principles from both a Youth Participatory Evaluation approach and Human Centered Design was not only relevant to your target group but also yielded the insight needed to answer your research questions. Both Youth Participatory Evaluation and Human Centered Design are new concepts to me, and I couldn’t help but do some side research on both.
I see the Human Centered Design approach as a method of collecting data, but also as an act of problem solving at its finest! The idea of putting yourself in the shoes of the person you are “designing for” will bring about a solution that is directly and deeply representative of the user. It’s easy to make assumptions or think “we” know what would work best, but in actual fact that stance is far from empathetic.
You mention that the process outlined in Human Centered Design “aligns naturally with evaluative thinking.” I noticed that you said evaluative thinking instead of evaluation, which prompted me to compare the two terms side by side. My key takeaway was that evaluation can be seen as the what: it requires certain skills and resources. Evaluative thinking, by contrast, is the why: it requires certain attitudes and is an approach and a way of thinking. Human Centered Design is just that, a way of thinking, and a creative and innovative approach to problem solving.
Lastly, out of curiosity: other than the listening sessions you facilitated, did you collect feedback using other methods?
Thank you for your thought-provoking post, Lauren.
Good luck with your journey in evaluation. From what I am learning in this course, it is such a diverse topic with so much to digest!
-Diana Helferty