My name is Chris Michael Kirk, and I am a Doctoral Candidate in the Community Psychology Program at Wichita State University. During my time here, I’ve had the opportunity to work with Dr. Rhonda K. Lewis on the evaluation of a federally-funded youth-serving program at the University.
In programs like this, the demands of program implementation can easily overshadow the need for methodologically rigorous evaluation. While annual reports are required, the questions they address may fail to capture true program outcomes. In this context, evaluators may need to negotiate with program staff to balance the needs of the program with the need for strong evaluation.
Hot Tips for Negotiating the Value of Evaluation
Clearly Show the Value of your Work: While data may be limited, evaluators can find small ways to demonstrate how evaluation results can be valuable. In our situation, this entailed the creation of high-quality brochures for distribution to funders, partners, and community leaders. While research articles were simultaneously written and published, the brochures held greater value for program staff and allowed the evaluation work to continue and expand.
Make Their Lives Easier: Program staff are busy people, and any suggestion for change may be interpreted as an additional task to be completed. One way to overcome this resistance is to frame needed changes as ways to save staff time. In our case, we were able to streamline a survey collection process, which improved response rates and data fidelity while making survey data collection simpler for staff.
Compromise: Even with proper framing, program staff may make requests that require compromise on the part of the evaluation team. In our case, this entailed shortening the baseline survey. While this was not ideal, we worked with program staff to address their concerns while maintaining the key elements needed to effectively measure proximal program outcomes.
Uncover the Shared Question: Most critically, evaluators should work with program staff to find a shared question. This may involve helping staff move beyond asking “What is required for the annual report?” to questions more central to the efficacy of the program. In our case, staff wanted to know more about the ways students felt the program helped them. By identifying this question, the door was opened for the inclusion of qualitative interviews with program participants and staff, which greatly strengthened the evaluation efforts.
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.