Hey there! I’m Paisley Worthington, a doctoral candidate studying program evaluation at Queen’s University (Kingston, Canada). Today’s post is about research on evaluation that digs into how the Collaborative Approaches to Evaluation (CAE) principles can be used to foster high levels of participation and co-creation in an online environment. This project was led by me, Cheryl Mak (M.Ed. graduate), Michael Holden (doctoral candidate), and Dr. Michelle Searle (professor), all of Queen’s University.
From October 2020 to July 2021, our team evaluated a program within our university. Due to the pandemic, every interaction between the evaluation team and the client team was virtual. We were concerned about how we might develop strong relationships with one another and foster collaboration in this challenging new environment.
Both teams wanted high levels of collaboration, so we decided to use the CAE principles to frame our work. Happily, our evaluation turned out to be very collaborative, and in the months that followed we noticed the evaluation results were put to good use by the client. This research looks back retrospectively to identify how we operationalized the CAE principles.
Rad Resource
Check out Shulha et al. (2015) for the research behind the eight principles and how each can be interpreted.
We completed a secondary analysis of evaluation materials, namely: meeting minutes, presentation materials and activities, final reports, and check-in reflections. Our coding strategy focused on pinpointing the ways our activities aligned with and diverged from the CAE principles; we achieved this by using CAE as our analytical framework.
We found that we used many small practices that aligned with the principles. Additionally, each of the eight CAE principles was linked to at least one activity throughout the evaluation.
Lessons Learned
We sorted the CAE activities and saw that they naturally fell into four broad categories that you might also be able to try in your own evaluation practice:
- Hosting structured and interactive meetings – Sticking to a somewhat familiar meeting agenda but using different activities each time, and always including an ice breaker!
- Posing purposeful questions that elicit engagement – Directly asking the client team about their opinions and preferences on evaluation design, having explicit conversations about program goals in 1, 2, 5 years…
- Soliciting feedback – Collaboratively editing instruments and proposals in real time during meetings.
- Creating opportunities to collaboratively analyze data – Hosting two data parties to involve clients directly in some analysis activities in a totally online setting!
Although I shared only a few examples, we used many techniques that consistently aligned with these four practices, and we noticed how important they were in fostering the collaboration and productivity we were all after. One throughline of these four practices is intentional and nurturing engagement with the client team.
Hot Tip
Treating clients’ time and energy as critical resources not to be over-exploited goes a long way! We conserved clients’ time by carefully constructing agendas that made it easy to collaborate during scheduled meetings. Our meetings followed a somewhat consistent format, always including ice breakers, evaluation updates, and purposeful engagement (e.g., discussing strategy/mission/vision, collecting feedback, or doing collaborative data analysis).
Although evaluators may have been using practices like these for many years, learning how they can work in online spaces continues to be important after the height of COVID-19. We used various digital tools (e.g., Zoom, Padlet, Sketchboard) to facilitate these processes virtually; we touch on these tools and provide more examples of how we used the CAE principles in our upcoming practice note.
Rad Resource
Keep an eye out for our practice note in the Canadian Journal of Program Evaluation!
Relationship building is flexible and unique to each context. There is no way to guarantee a productive partnership, even if you use all the hot tips and cool tricks out there. However, we offer these four practices as a starting place to bring the CAE principles to life.
The American Evaluation Association is hosting Research on Evaluation (ROE) Topical Interest Group Week. The contributions all this week to AEA365 come from our ROE TIG members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.