AEA365 Curator note: Back in January, AEA365 readers asked to learn how evaluators deliver negative findings to clients and stakeholders. This week, we feature five articles offering four evaluator perspectives on this topic.
Hello! I’m Kylie Hutchinson, independent evaluation consultant and trainer with Community Solutions Planning & Evaluation and author of A Short Primer on Innovative Evaluation Reporting.
The following is Part 2 of practical tips for delivering negative evaluation findings to stakeholders, gleaned from my own experience. (Note that these ideas won’t work in all circumstances; their use depends on the context of a specific evaluation.)
Hot Tips:
- Give constructive feedback rather than criticism. Criticism comes from a place of judgement and is focused on the past, e.g., “The program didn’t meet its target.” (Evaluation is admittedly about making judgements, but we can still be sensitive when presenting bad news.) People can’t change the past, and dwelling on it doesn’t motivate anyone to move forward. Constructive feedback, on the other hand, is future-focused and comes from a place of caring and respect. Statements such as, “Let’s talk about ways to better meet the program’s target,” are more empowering and position the evaluator as working alongside staff.
- Alternate between the active and passive voice. Consider using the second person and active voice for positive results, e.g., “You met the program targets,” and, if necessary, the passive voice for negative ones, e.g., “The targets were not met.” This can help soften the blow.
- Give them a decent sandwich. The sandwich technique is a well-known method for giving feedback: slip a negative finding between two positives. However, make sure the second positive is as substantial as the first and not a lame compliment tacked on at the end; otherwise, people will still leave discouraged.
- Be prepared to be wrong. I have regularly had to go back and review my conclusions and recommendations in light of new information provided by stakeholders. Is there additional information about the program, or the context in which it operates, that might affect the results? This is where additional stakeholder interpretation and an interactive data party come in very useful.
- Be sensitive. Sometimes I get so caught up in the data analysis and findings that I forget that real people have put a lot of blood, sweat, and tears into their program to get where they are. It’s relatively easy to evaluate a program, but a lot harder to work in the non-profit trenches day in and day out for little pay. The incredible daily commitment that non-profit staff demonstrate is humbling given the challenging complexity of most social change interventions. Whenever I mess up presenting negative findings, it’s because I’ve forgotten that even minor negative news can come across as discouraging to hard-working staff.
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.