
Finding What Works When Working with Kids by Shannon Sharp and Moira Ragan

Greetings, AEA365 readers! Liz DiLuzio here, Lead Curator of the blog. To whet our appetites for this year’s conference in beautiful New Orleans, this week’s posts feature the perspectives of Gulf Coast Eval Network (GCEval) members, putting the uniqueness of doing evaluation in the Gulf South on display. Happy reading!


Greetings from the Center for Research Evaluation (CERE) at the University of Mississippi. In our roles as Evaluation Associate (Shannon) and Senior Evaluation Associate (Moira), we regularly work to engage children of various ages in research and evaluation, from program theory development to evaluating program implementation. Drawing on some approaches and strategies that worked well (and many that didn’t), we are sharing tips for giving youth voice in program evaluation and soliciting your help in compiling more.

While youth are key stakeholders in many evaluations, traditional data-collection methods can be challenging, especially for those under 12. Practical and logistical issues often produce data that are superficial or incomplete and do not accurately reflect young stakeholders’ points of view. When two unique and fun methods flopped (see below), we took a step back to figure out what to do differently in the future.

Lessons Learned (what didn’t work)

  • Photovoice: Having children take pictures and write about how those pictures made them feel would be fun and engaging, right? Yes, but most young children (1) didn’t know how to work a Polaroid camera, (2) lacked adequate writing skills, and (3) had a hard time expressing their feelings in words. Coming from psychology and education backgrounds, we were excited about the method but neglected to think it through. We salvaged data collection by taking pictures of things the children pointed to and writing their reactions for them. This didn’t help them express their feelings, but it was a start.
  • Sharing walls: Who doesn’t love sticky notes? Well, these kids might have, but it didn’t help us collect the data we needed. They struggled with the prompts we provided, mostly writing their own names or drawing (unrelated) pictures. In subsequent projects, we wrote the instructions for adults, encouraging them to write down responses shared by their child(ren). This helped somewhat, but adults didn’t always read or follow the prompts either.

But, we can’t always default to adults when working with children!

Hot Tips (what worked)

Recently, young stakeholders shared what helped them engage in both the program and the evaluation… spoiler alert: these also happen to be good ways to build relationships with children in general.

  • Show up. Be present during participation. This shows you care.
  • Connect over similarities. Get to know the children you’re working with and find ways to relate—i.e., take a personal interest. Children spoke highly of conversations with program leaders and the evaluation team about family and hometowns.
  • Simplify language. Don’t use jargon or terminology unfamiliar to children. For example, did you know many children (not just young children) don’t know what a focus group is? Call it a “group interview” or “a talk with them and their classmates.” This isn’t limited to children: a good rule of thumb for adults is to use language at a 6th-grade reading level or lower, depending on the audience (see the sketch after this list for one way to check).

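As a quick way to check that last rule of thumb, a short Python sketch like the one below can estimate the Flesch-Kincaid grade level of a prompt before it goes in front of participants. The helper names and the naive syllable counter are just one rough, illustrative approach, not a polished tool or anything we used in these projects.

    import re

    def count_syllables(word):
        # Rough heuristic: count groups of consecutive vowels as syllables.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_kincaid_grade(text):
        # Standard Flesch-Kincaid grade-level formula:
        # 0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59

    prompt = "We would like to talk with you and your classmates about the program."
    print(round(flesch_kincaid_grade(prompt), 1))  # aim for roughly 6 or below
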
Rad Resource?

Following these principles will set the stage for successful engagement, and we want to hear about it.

This is where you come in. Rather than minimize children’s place in our evaluations, we hope to build on lessons learned from a variety of evaluations (ours and others’) to give children a voice in evaluating programs. We’ve developed a survey for researchers and evaluators to share their experiences working with young children (ages 5-12). We want to know what worked, what didn’t, and what you learned through these experiences. If you’d like to contribute, please take a moment to complete this SURVEY by September 23.

We will share survey findings and tips at our Evaluation 2022 session: (Re)evaluating inclusive methodology: Best practices for hearing young voices.


