Hi, I’m Courtney Clingan, a Senior Research Analyst with The Improve Group, a research and evaluation consulting firm based in St. Paul, Minnesota. To wrap up our week of posts on evaluation teams, I want to share a memorable experience I had when staff from a client organization truly became part of the evaluation team.
We recently started working with HOPE Coalition, a Red Wing, Minnesota-based agency that serves people who have lived through domestic violence, sexual assault, child abuse, and homelessness. Like other nonprofits that serve clients with complex needs and have small budgets, HOPE Coalition was looking to establish sustainable and realistic evaluation methods.
To empower HOPE Coalition staff to use their knowledge of client needs and to give them skills they could apply after our project is finished, we designed a workshop that made the staff part of the team. We started with a clear overview of logic models. Then, at the critical “stepping back” point, we shifted into a more hands-off coaching role and handed responsibility over to staff, who applied what we had taught them to design logic models for other agency programs.
Lesson Learned: This workshop design created space for a valuable collaboration. While my colleague Stacy Johnson and I contributed our expertise in evaluation design, HOPE Coalition staff contributed their experience from working with a hard-to-survey population.
Jo Seton, Grant Writing and Communications Coordinator for HOPE Coalition, said she liked how we explained logic models by building one for HOPE Coalition on the wall with sticky notes, and kept it clear and simple, without being condescending. Then we tapped into the true experts on our evaluation team – HOPE Coalition staff – and they took the reins.
“By logic model number five we had it down to a fine art,” Jo said.
Hot Tip: We studied up on the client in a way that complemented the client’s own expertise. Jo sent us information on HOPE Coalition to read beforehand, and we took full advantage of it. She said she appreciated that we arrived with a “nuts and bolts” understanding of her agency – even as her staff provided the deep understanding of programs and services. It meant we could spend less time learning about the client and more time on evaluation strategy.
As a result of our work with HOPE Coalition, the agency has developed a “culture of evaluation” – one of its goals before starting the work. By engaging client staff as team members, we helped them build a sense of ownership in the process that will sustain HOPE Coalition’s evaluation efforts.
The American Evaluation Association is celebrating Evaluation Teams Week. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to firstname.lastname@example.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.