Greetings from McGregor, Minnesota! I am Cheryl Meld, Afterschool Program Director at McGregor Public School. McGregor is located in Greater Minnesota about 2.5 hours from the Twin Cities (120 miles).
I am excited to share my experience about how to approach evaluating the complex 21st Century Community Learning Centers (21st CCLC) program. This competitive grant program allows community learning centers to provide high-quality year-round out-of-school time programming for young people.
Hot Tip: Partner with your Afterschool Network.
In a collaborative effort to demystify program evaluation, several out-of-school-time intermediaries created an engaging continuous improvement process anchored by a workshop called “Making Meaning with Multiple Data Sets (M3).” M3 invites program staff to come together as teams to look at 2-4 sets of data and intentionally reflect, plan, and identify action steps to improve their afterschool programs. All of Minnesota’s 21st CCLC grantees are required to attend once a year.
Lesson Learned: Young People Know Surveys.
Underestimating the value of youth engagement in evaluation may limit your program’s quality and growth. Until recently, I assumed the word “evaluation” would flip an off switch in those who attend afterschool programs. Young people are bombarded with surveys about academic progress, school climate, lunch menu offerings, assets, relationships, and social interaction. I wondered, “What would a 15-year-old know about evaluation?” Quite a bit, as it turns out.
Hot Tip: Do Youth Advisory Boards.
Asking high school students to become involved in evaluation processes seemed to me unlikely to garner much attention. I was proven wrong when several high school juniors on our Youth Advisory Board appealed to me to let them help with evaluation of our 21st CCLC afterschool programs. We began by reviewing the analysis of the previous year’s surveys.
Hot Tip: Bring Data Back.
The Youth Advisory Board’s consensus was that the summary statements did not accurately reflect students’ program experience. As they reflected on survey respondent errors, they believed that if they had a role in introducing an evaluation survey about program experience, they could help their peers understand why survey data is important, and thus respond more accurately and honestly.
Lesson Learned: Young People Care.
The young people were all seasoned participants of afterschool programs and surveying. More importantly, they really cared about afterschool programs. They valued the opportunities they had experienced and understood that authentic data would help make a case for continued programming. Their program insights were important to program improvement. Their role in introducing the survey inspired discussion about evaluation ethics and the need for accurate positive and negative input, and they played an informative yet neutral role in motivating students to give honest responses.
Hot Tip: Engage Everyone!
The end-of-year survey process took more time this year. Scheduling the student volunteers required teacher and principal buy-in. The analysis of the outcome data we collected is ready for review, and we are working as an intergenerational team to build better programming opportunities for next year.
Hot Tip for visitors to Minnesota for Evaluation 2019:
Minnesota is known as the Land of 10,000 Lakes (but we actually have more like 12,000!). I live by Mille Lacs Lake, which is one of our biggest lakes. We’re about 2 hours north of the Twin Cities, in Central Minnesota. We have great resorts, walleye fishing, two state parks, the Mille Lacs Indian Reservation, Grand Casino, and the Mille Lacs Indian Museum and Trading Post. If you can spare the time before or after the AEA 2019 conference, we’d love to have you visit (or better yet, come back in June!! :) )
We’re looking forward to the fall and the Evaluation 2019 conference all this week with our colleagues in the Local Arrangements Working Group (LAWG). Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.
Hi Cheryl,
I want to thank you for sharing your findings with all of us. I found it both relevant to the graduate course I am currently taking and particularly inspiring, as I am working on a program evaluation design focused on work with teens. I will be collaborating with some of my colleagues to lead a health and wellness committee, and I have been pondering how to integrate the student participants into the evaluation process. These evaluative processes include surveys, questionnaires, and direct communication. I had always assumed that young people lack motivation and interest in participating in evaluation, but the more experience I gain, the more I realize they care about matters that are directly related to them.
I think you touch quite well on why students care and want to become involved, and this is something I will keep in mind as I move forward in my own professional practice and evaluation procedures. Moving forward, I would like to involve students as partners to their adult counterparts. In these roles, adults/instructors will initiate the process and the students will assist in gathering the information. Based on the maturity of the group you are working with, I suppose you could tailor these roles as necessary.
I think it is fascinating that you were able to witness and record that students felt the summary statements did not accurately reflect their experience of the program. This is telling for the mere fact that adults do not always get it right. By allowing students an active role in the evaluation process, the results become more transparent and accurate. This is something program stakeholders should keep in mind as they continue to evaluate programs where youth are involved.
Thanks for the great read!
Emily Wood
Hi Cheryl,
I really enjoyed your post on Young People as Evaluation Experts. I currently teach senior high school and have also taught grades 7-9 in the past, and you are very correct: young people really do know surveys, and they sure do care about the things that matter to them! Young people are much more capable than we give them credit for at times, and they want to take part in things more than people expect. This school year I had the honour of working with some fantastic students who were part of both internal and external advisory boards and councils. When we were looking to undertake an internal school survey, these students took the initiative to engage their peers in the survey regarding our school programs. I would absolutely agree that their participation in the process not only increased the number of respondents in our survey but also increased the level of detail in the observations made and encouraged more honest and insightful responses. When young people truly believe that we actually care about and value their input, they will provide us with responses that are key to a true evaluation.
Thank you for your post; it helped me articulate something I had observed myself, and it reminded me, as I head into another school year, just how important it is to listen to those who want to talk.
Kind regards,
Jessica
I appreciate hearing about your experience involving young people in the internal survey process. They have a perspective and context essential to program quality!
Hi Cheryl,
Your blog post was so interesting to me as an educator. I am constantly looking for ways to bring student voice into my planning and assessment, but perhaps I do not take into account how much knowledge I can truly gain from students in terms of evaluation. Similar to Amanda, I am also a student at Queen’s University learning about program design and evaluation. The evaluation part is quite new to me, and I am finding many parallels between what I do as a teacher in my classroom and what evaluators do when evaluating a program. What stands out to me the most is how engaged the students were with this afterschool program and that they came to you wanting to help with its evaluation, showcasing how important it is to hear the voices of your ‘stakeholders’ and take into account what is important to them when evaluating a program. Creating those longer sessions where you intentionally met to make meaning of the data you collected is something I find truly valuable, and it was great that you had the principal and teachers buy into that process and understand how important it was to the success of your program.
Thank you for a great read,
Brittney Lehmann Scherr
Brittney,
Thank you for your comments! The team of young people continues to impress us with their understanding of our many uses of data and their willingness to improve the quality of data we collect and how it impacts our programming. Cheryl
As a student learning the various uses and processes of program evaluation, I couldn’t agree with you more that there is a need to “demystify” the intricacies of the evaluation process. While looking at the “Making Meaning with Multiple Data Sets (M3)” workshop details and structure, I appreciate the benefit of grouping multiple collaborators together, not only to make sense of data, but specifically to use the interpretations of that data to determine next steps. Evaluation cannot be effective without involving a great variety of stakeholders in multiple aspects of the evaluation.
I think that you have quite aptly touched on an important aspect of improving educational programs: that young people care. It has been repeated to me through the Faculty of Education, and reinforced in my many years of teaching, that students are more engaged when they feel empowered. Therefore, your statement that “underestimating the value of youth engagement in evaluation may limit your program’s quality and growth” is doubly impactful. Allowing students to voice their satisfaction or concerns, or giving them a voice in determining “next steps,” can have a significant impact on a program’s success. I might also add that when the findings of an evaluation are communicated to various stakeholders, engaging students might also enhance the use of those findings in determining action plans. You mention the impact of teacher and principal buy-in; most certainly, having students participate in the evaluation process will also encourage student buy-in.
Thanks for your “food for thought” HOT TIP of creating Youth Advisory Boards. Given the informed nature of our students, it seems very natural to ask them to share their knowledge and experiences by participating in the evaluation process. You have given me quite a bit to think about as I complete my own evaluation of school programs. Moving forward, I will consider the impact of this type of process participation from more than just youth groups.
Hi Cheryl,
I really enjoyed reading your article, as its message about getting youth involved in the evaluation process resonated with me. If you are evaluating a program that you would like to improve for young children/adults, it is best to get their advice and opinions too. As you mentioned, they do care, and it is important to give them a voice in programs they participate in. On that note, I am glad you mentioned that they need to understand the importance of data in order to take it seriously. It is best to have a conversation with them about what you are looking to find out and why, and what those results will yield overall. Great insights and tips – thanks for sharing.
Best,
Victoria
Victoria- Yes, they do care, and have a context for understanding how good data impacts program quality. They are our number one stakeholders!
Cheryl
Hi Cheryl,
I really enjoyed your article on the AEA365 Blog. I am currently completing my master’s and investigating program evaluation further. Your blog post caught my eye and attention because I am a teacher too. I agree with your comment, “I assumed the word ‘evaluation’ would flip an off switch in those who attend afterschool programs.” Even as an adult, I think that at PDs or meetings, those end-of-meeting surveys can often be answered without much thought just to “get them done.” So, I loved that you allowed the students to have a “role in introducing an evaluation survey related to program experience, [so] they could help their peers understand why survey data is important, and thus be more accurate and honest in responding.” If I am being honest, this master’s class has actually opened my eyes to responding clearly and honestly on evaluation surveys. I always thought that if I didn’t give positive answers or feedback on a survey, the speaker/presenter/program/etc. would be fired for doing a poor job. I never realized there was so much more being looked at and how important the data was for improvement.
I think having students involved in the evaluation of afterschool programs is a great idea. Allowing students to help create that authentic data will also help lead you to more positive outcomes for the program. It also gives the students responsibility and buy-in with the program. Just like you said, “young people care!”
Thank you for your awesome idea and shared experience. I hope I can transfer this idea of involving youth in our program evaluations in my own school!
Thank you again,
Riley Wood
Hello Cheryl,
I really enjoyed your recent post on the AEA365 Blog from August 1, 2019.
In my Program Evaluation master’s course at Queen’s University, we are learning about evaluation practices, use, and design. One area we have discussed is how the evaluation process can better enhance the use of evaluation. As an elementary school teacher, I have been shifting my focus to a more qualitative perspective, but your article reminds me that the use of surveys can offer valuable insight. Your HOT TIP: Engage Everyone! is a great reminder of our lessons on ensuring all stakeholders are part of the process to build more buy-in. Setting up discussion time to make use of the information from a survey may require more time, but better planning and design can result from these efforts.
Your example reminded me to set up smaller programs in my classroom and involve students, as they offer valuable insight into the design and evaluation process.
Thank you for the insight.
Respectfully, Amanda Stene
Hi Cheryl,
I enjoyed reading your article because I think it is important to involve youth in the evaluation process. Student voice is a very valuable part of education, and in all forms of evaluation. As you mentioned, they do care. They are often willing to give their opinion/suggestions on how to make things better for their experience, so they are a great resource to use in the process. As long as they know what you will be looking for in the data and what those results will yield, they can then take it seriously and help provide thoughtful insight.
Great tips – thanks for sharing,
Victoria