Hey all! We are Olivia Melvin, Gray Flora, and Yasmin McLaurin. This blog is written from the perspective of an evaluator-client dyad: a student success program served by a University-Based Evaluation Center at the University of Mississippi. The year-long evaluative capacity building partnership formed during early-stage program development. The content for this blog post was generated through an unstructured reflective interview.
Finding Evaluation
Q: Why seek out an evaluative capacity building partner?
A trusted source recommended that the program seek out formative evaluation in its early stages to inform future iteration and refinement. This seemed like a feasible opportunity that could have a lasting impact on programmatic success. If evaluation can support data-driven decision-making, why not develop the skills to carry it out in-house? Capacity building made sense as a way to ensure evaluation would continue beyond the partnership.
Initial Anticipated Use
Q: What did the program team expect to get out of the evaluative partnership?
The team was looking for a way to streamline internal processes. They were already doing great work and keeping records of that work; however, they knew they could improve efficiency with dedicated effort. They wanted a way to track ongoing interactions with students internally and to communicate effectiveness to external partners. The team knew the program needed depth and strategy to sustain, scale, and demonstrate impact.
On-site Evaluation
Q: Why choose an evaluator based at your institution?
Partnering with an evaluation center at your own institution lets you skip a lot of steps because you already speak a common language. The evaluation team made connections across the University and demonstrated shared values in place-based and asset-based approaches aligned with Mississippi identity. There has also been a mutual emphasis on meeting students where they are and eliminating the additional burden that can come with data collection and reporting.
Benefit to Students
Q: What have been the students’ impressions of the evaluative process?
This is relational work. The program team builds community with each of the students served, and the evaluative process must mirror that commitment to community building. The program team shares, “the students really like being heard.” Each data collection activity has started with quiet, semi-nervous faces that ease into long talks about every tangentially related topic you can imagine. Students aren’t rushing out the minute we reach the end of a protocol; they’re popping by the program team’s offices to say “it wasn’t that scary after all.” We want to demystify the feedback loop. The only way to see change and improvement is to uplift your experience, but you have to feel safe and trust the folks you’re sharing it with. These students trust the program team and, by proxy, the evaluators, and now they’ve added two more familiar faces to their campus directory.
Lessons Learned
Q: Is there anything else you’d like to share about how this process has gone over the last year or so?
“The evaluative work has made the program much stronger. I think that it provides a level of legitimacy to the work that can be hard to quantify without an evaluative perspective.” It’s been an amazing process working with such a dedicated program team, and going into this with a capacity-building lens made each activity all the more valuable and actionable. Constant check-ins and a responsive approach to pivoting and adapting contributed greatly to the partnership’s success.
The program team recommends working with a University-Based Center for programs housed at universities, particularly because “they have a better understanding of the processes and context of these institutions.” There’s a learning curve to navigating larger institutions if you’re not familiar with them.
The American Evaluation Association is hosting University-Based Centers (UBC) TIG week. All posts this week are contributed by members of the UBC Topical Interest Group.