
University-Based Centers TIG Week: When Evaluation Doesn’t Mean Evaluation by Elizabeth Winchester

I’m Elizabeth Winchester (she/her) from the Social Research and Evaluation Center at Louisiana State University. As a research associate, I’ve spent over 20 years creating data collection and evaluation plans with non-profits, foundations, and federal, state, and local agencies. For today’s University-Based Center TIG week post, I want to share my thoughts on working with agencies that ask for evaluations when that’s not what they really want.

University-based centers (UBCs) usually operate on soft money. Their university doesn’t provide guaranteed funding for full-time faculty and staff positions, only office space and perhaps a single full-time position. UBCs must rely instead on grants and contracts to fund faculty and staff positions, travel, and supplies. Our UBC is mission-based, and we are in the fortunate financial position of only taking projects that fit that mission. We still need to generate funds, though. Like other UBCs, we must nurture relationships with our community, state, and federal partners and deliver excellent work products to keep our center thriving.

Michael Quinn Patton explains evaluation science as the “systematic inquiry into how, and how well, interventions aimed at changing the world work.”[1] In our office, we generally use the term evaluation to mean the examination of a program’s effectiveness or impact. We provide the information generated to our partners so they can improve the program or make decisions about it. Over my years of working with non-profits and government agencies, I’ve learned that when a partner asks for an “evaluation,” what I’ve described above isn’t always what they have in mind.

Many times, what these partners really want is someone to collect, organize, and restate data. These simple measures may be a funding requirement for a grant, and so reporting them becomes the entire scope of work in our “evaluation” contract. There is a place for counting participants or program activities, but often, when a partner asks for an “evaluation,” these numbers are all they want. They reach out to us for an evaluation, but they really hire us to provide numbers to satisfy a funder.

Hot Tip:

We recognize that not every project needs a comprehensive evaluation, but we must let our partners know that the number of participants or program activities won’t tell them about their program’s success, effectiveness, or fidelity to a program model. Nor will it tell them whether the program’s goals were met, exceeded, or missed. This information won’t give our partners much to inform their decision making.

As we move forward with our partners, a new part of our job is to educate them about what an evaluation is and is not, and how a true evaluation could benefit their program.


[1] Patton, M. Q. (2018). Evaluation Science. American Journal of Evaluation, 39(2), 183–200. https://doi.org/10.1177/1098214018763121


The American Evaluation Association is hosting University-Based Centers (UBC) TIG week. All posts this week are contributed by members of the UBC Topical Interest Group. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association and/or any contributors to this site.
