
YFE TIG Week: Purple slushy data points: lessons learned from program evaluation in high school settings by Scott Mengebier

Hi, I’m Scott Mengebier and I work with an Evaluation team for a national foundation.

For the past year, I have been leading our program evaluation in research compliance nationally, and driving data collection on a regional level for our largest project: measuring the impact of a social-emotional learning intervention on high schoolers’ academic performance and behavior.

We entered high schools as a first-year program in a large Midwestern urban school district where students had experienced trauma, suffered from chronic levels of stress, and were prone to mistrusting new and potentially temporary volunteers. In these dynamic applied settings, we set out to test students’ executive function using objective performance measures from the NIH Toolbox, administered on iPads to increase reliability and confidentiality.

Lessons Learned:

  1. Relationships are everything – cooperation follows trust, which can only be earned.

Start at the top: students often look to trusted adults (but not necessarily those in charge) for cues on how to react to our program and research.

We invested time and energy to befriend these key allies and engaged them to help promote our program’s value throughout the community. Their support greatly improved recruitment and retention throughout the testing period.

  2. Keep it short, keep it real – from the moment you start talking, you have 45 seconds.

We packaged essential information into quick, memorable nuggets that aligned the intellectual benefits from their participation (“get to know your flow”) with more practical rewards (“bring that form back, get some hot chips”).

  3. Come for the slushies, stay for the stories.

Short-term gains from food incentives can drive up initial interest and participation, but their appeal can also become saturated and collapse over time.

Purple slushies were popular in bringing participants in for recruitment, but promises of pizza parties did not guarantee students would turn up for testing. This led to frustration (a.k.a. stress-eating pizza) and inefficient use of resources (well – almost), whereas long-term relationship building yielded higher retention in our evaluation.

Hot Tips:

  1. Get a copy of the class rosters: knowing (and remembering!) students’ names when you wish to recruit them helps build rapport and develop those key relationships.
  2. Use an ‘exit ticket:’ checking students’ understanding through rapid-fire questions will reinforce the essential takeaways, especially for those students who may not have been listening until you mentioned ‘hot chips.’
  3. Keep incentives small, but be consistent: small rewards like food can help build initial interest but they are not as important as delivering on any promise you make, especially for long-term engagement and continued attendance.

Capturing data is a thrilling and challenging part of the evaluation process, but with youth, each potential data point is built on trust and some creative thinking.

The American Evaluation Association is celebrating Youth Focused Evaluation (YFE) TIG Week with our colleagues in the YFE Topical Interest Group. The contributions all this week to aea365 come from our YFE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org.

2 thoughts on “YFE TIG Week: Purple slushy data points: lessons learned from program evaluation in high school settings by Scott Mengebier”

  1. Hi Scott,

    Thank you for sharing this blog post. I am currently taking an introductory Program Evaluation and Inquiry course. I am new to the field and still developing an understanding of evaluation practices and programs. Your post resonated with me and my experiences working with youth as a teacher for the last decade. There is an abundance of programs targeted at youth; therefore, the involvement of youth in evaluations is imperative to understanding program strengths, weaknesses, and potential inequities that may exist within the program framework.

    The hot tips provided for tackling evaluations with youth are practical and resonate with my own experiences. The importance of trust in interactions with youth cannot be overstated. As an evaluator, it is important to have the support of the adults that students already trust. Walking into a room full of youth who do not know you or trust you will negatively impact your ability to conduct an evaluation. I agree that you can build trust by taking an active interest in learning names and interests quickly. Food and snacks work as a great incentive for creating intrigue around a program or encouraging initial participation. I think that the novelty wears off quickly, and evaluators must capitalize on the initial incentive-driven participation to build relationships and rapport with students so that they continue to come and participate.

    Placing the evaluator amongst the very community they are attempting to evaluate brings both positives and negatives. Evaluators who work closely with stakeholders can build genuine relationships, which in turn may increase participation in the program and provide greater insight into it (Shulha & Cousins, 1997). There is also evidence that misuse of findings can occur when the evaluator is too close to the program community. When evaluators begin building relationships with program stakeholders, they are more likely to be biased in their reporting due to pressures that may emerge from the program community (Shulha & Cousins, 1997).

    What types of checks and balances would you recommend to ensure that evaluators remain neutral in their evaluation and reporting, even after forging close relationships with stakeholders throughout the evaluation?

    Kind regards,
    Christina

    References

    Shulha, L., & Cousins, B. (1997). Evaluation use: Theory, research and practice since 1986. Evaluation Practice, 18, 195-208.

  2. Reisha Williams

    Hi Scott,

    Thank you so much for sharing your experiences! As a high school teacher myself and a Master’s student learning about program inquiry and evaluation, I connected with your article on two different levels.

    I found your insights regarding what helps adults connect with students to be very accurate. Relationship-building is key, but I imagine that it is far more challenging as an ‘outsider’ who is there for a short amount of time, as opposed to a teacher who interacts with the students every day. Knowing students’ names is a great tip! I can imagine some of my students responding with a judgmental “How do you know my name?” quip, but secretly feeling flattered and intrigued. And in terms of food incentives, you hit the nail on the head. I find that students are extremely responsive and engaged when candy, for example, is brought out on occasion, but they lose interest, and it doesn’t inspire long-term participation.

    On a more academic note, we recently explored the topic of evaluation use in one of my Master’s courses, and I was especially interested in how relationship-building and stakeholder involvement play a role. Shulha and Cousins (1997) argue that evaluators require strong interpersonal skills to build relationships and reduce stakeholders’ skepticism about their presence. As program evaluation can often appear very logical and mechanical, I think it is important to reflect upon the importance of communication and people skills as well.

    Thanks again for sharing your thoughts!
    -Reisha

    Shulha, L., & Cousins, B. (1997). Evaluation use: Theory, research and practice since 1986. Evaluation Practice, 18, 195-208.
