Hi! I’m Hildie Cohen, Research Director at NORC at the University of Chicago. Listening and pausing is a skill that I am constantly practicing in my work managing research and evaluation projects. I must listen to clients in order to design evaluation plans, reports, and deliverables that meet the goals of the project. I must listen to staff who are assigned to carry out the multitude of tasks involved in preparing, implementing, and analyzing the results of the evaluation. I must listen to stakeholders who will receive, read, or utilize the evaluation findings. And, I must listen to subject matter experts who provide advice on methodology, survey design, and analytics.
But the most salient piece of wisdom I have received about listening came from conducting an evaluation of a curriculum designed to train future adolescent healthcare workers to address substance abuse with their adolescent patients. At a workshop I attended, a leader from a youth-service organization stated that when she designs a program or evaluation, she follows the rule, “Nothing about them, without them.” Since hearing this phrase, I have changed the way I approach evaluation work.
This rule of thumb applies both to designing an intervention and to collecting data. For me, this means providing a space for adolescents to test aspects of the intervention and give their feedback. On another project, evaluating an HIV-prevention program for urban, black men who have sex with men, it meant inviting members of this community to pilot surveys, data collection methods, and trainings.
Planning an effective evaluation that will lead to high-quality, representative results takes time. It takes even more time when I build in steps to listen to and reflect upon the perspectives of all constituents involved. But it is a necessary step to reduce bias in the research process and to ensure that we collect the most accurate information.
Rad Resource: Based on Patton’s Utilization-Focused Evaluation (2012), The Evaluation Center has a checklist for the steps of practicing this type of program evaluation. You can see the checklist at https://wmich.edu/evaluation/checklists. Under step 3, they note that identifying, organizing, and engaging primary intended users “optimizes the personal factor, which emphasizes that an evaluation is more likely to be used if intended users are involved in ways they find meaningful, feel ownership of the evaluation, find the questions relevant, and care about the findings.” I have found, over and over again, that approaching evaluation from a place of curiosity and a willingness to learn how the intended users of an intervention or its findings feel about the work leads to useful information that advances the common goal of the project for all involved.
The American Evaluation Association is celebrating Arizona Evaluation Network (AZENet) Affiliate Week. The contributions all this week to aea365 come from our AZENet members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.