
ACM TIG Week: Collaborative Partnerships with Clients as a Foundation for Evaluation and Program Improvement by Yvette Clinton & Sharika Bhattacharya

Hello! We are Drs. Yvette Clinton and Sharika Bhattacharya from ICF, and we partner with Young Audiences of Louisiana (YALA) to evaluate their arts-integrated professional development programs implemented in Louisiana schools. These programs seek to increase teachers’ knowledge of and capacity for arts-integrated instruction in order to positively impact student outcomes. As part of our formative and summative evaluation activities, we administer teacher surveys and conduct focus groups or interviews with teachers, principals, and teaching artists to better understand their experiences in the program. We would like to share lessons learned from partnering with YALA to help them identify areas for program improvement based on evaluation findings.

Lessons Learned:

If one of your evaluation goals is to help the client make improvements to their program, there are important steps to take before you design the evaluation and collect data!

  • Be intentional about fostering client buy-in for the evaluation, and draw on the client as a resource for understanding the program. If your client is committed to and invested in the evaluation, your job as an evaluator will be easier.
    • Actively engage your client in discussions about their program and objectives before designing your evaluation. A deep understanding of what your client envisions for the program will help you design an evaluation that best meets their needs.
    • Build client ownership in the evaluation by sharing your evaluation design and proposed instruments, and tapping into their knowledge. Your client’s feedback could make the evaluation stronger.
  • Ensure your data collection scope is broad enough to capture feedback from a range of stakeholders.
    • Develop instruments to capture perspectives of participants and program implementers. These stakeholders often have unique insights on program challenges and areas for improvement, based on their different vantage points.

Hot Tips:

  • Think beyond the written evaluation report for sharing findings. Verbally sharing key findings with your client can help maintain engagement and foster valuable discussion that can lead to program improvement.
  • Plan to share interim findings on a schedule that maximizes the potential for actions based on evaluation findings. Be aware of the program cycle so that program implementers have ample time to refine the program based on findings. 
  • Consider your relationship with the client and the stakes of the evaluation when determining how to share program challenges or weaknesses in a productive way.
    • Use stakeholder voices to identify and share recommendations for program improvement. If your participants identify challenges or suggestions for improvement, consider incorporating quotes when sharing these findings and offering solutions.
    • If the program did not meet an expected outcome, be transparent about the evidence supporting that finding and identify potential solutions. These might include trying a different approach, suggesting who should be involved in making changes to the program, and explaining how you as the evaluator can provide ongoing feedback.

The American Evaluation Association is celebrating Arts, Culture, and Museums (ACM) TIG Week. The contributions all week come from ACM TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

1 thought on “ACM TIG Week: Collaborative Partnerships with Clients as a Foundation for Evaluation and Program Improvement by Yvette Clinton & Sharika Bhattacharya”

  1. Hi Drs. Yvette Clinton and Sharika Bhattacharya,

    Thank you for sharing your article on AEA365 about collaborative evaluation suggestions for an arts-based program. I enjoyed reading it, and after taking some time to look at YALA’s website, I absolutely love their program philosophy. I am an elementary school teacher in Canada, currently taking a course in program evaluation and development to complete my Professional Master of Education. I connected with your article not only for the evaluation component, but also because I taught music for a few years and have recently switched permanently to general classroom teaching in the middle grades (6/7). I fully believe in and use an arts-centered approach in many of the ‘traditional’ academic areas such as math and language arts, as I believe it gives students a holistic learning experience that results in deeper understanding. I would love it if a program like YALA popped up in my area!

    One of the requirements in my program evaluation and development course is to create a program evaluation, start to finish, for a nonprofit of my choice. To say this has been a challenge for me would be an understatement. I learned quickly that evaluation is complex, with many interconnected moving parts. I appreciate the areas that you highlight in your “Lessons Learned” section, particularly being intentional about fostering stakeholder buy-in before the evaluation is even designed, so that I have a strong understanding of what my stakeholder envisions for their program. This suggestion is something I have come to understand better of late: as I have moved through my coursework, I am becoming more aware of how significantly the relationship between evaluator and stakeholder can affect an evaluation’s success and use.

    An area where I am not as strong in the evaluation design phase is my statement of evaluation use and how I would achieve it. In your Hot Tips section, you both raise suggestions that I believe would have a positive impact on my own evaluation if used. I am using a participatory approach in my outcome evaluation, and I believe your suggestion of using quotes from my stakeholders when reporting findings about challenges and weaknesses would greatly strengthen my evaluation’s use; doing so also aligns with the ethical guidelines for evaluators (i.e., truly letting the people affected by the program have their voices heard). I also underestimated the power of verbal dialogue for sharing report findings with my stakeholders; using conversation as a reporting method makes complete sense for my program. Lastly, I think I had a hole in my understanding of evaluation use, as I did not consider it part of the evaluation scope to provide solutions for the challenges and weaknesses identified in the recommendations. I previously thought that would be the responsibility of the program operators after hearing the evaluation’s findings. Being an evaluator who can offer solutions and guide my stakeholder toward them will, again, strengthen that relationship of trust and increase the use of the evaluation. I have not turned in my final program design yet and am going to edit it to add these great recommendations, so thank you!

    This evaluation business is complex! As I continue to unravel the layers and learn a little more each day about the ins and outs of program evaluation, I am thankful for forums such as AEA365 and experts like yourselves who share best practices to help rookies like me better inform our practice.

    All the best and stay safe,

    Julia Douglas
