AEA365 | A Tip-a-Day by and for Evaluators

TAG | negative findings

AEA365 Curator note: Back in January, AEA365 readers asked to read about how evaluators deliver negative findings to clients and stakeholders. This week, we feature five articles with four evaluator perspectives on this topic.

Hello! I’m Kylie Hutchinson, independent evaluation consultant and trainer with Community Solutions Planning & Evaluation and author of A Short Primer on Innovative Evaluation Reporting.

The following is Part 2 of practical tips for delivering negative evaluation findings to stakeholders, gleaned from my own experience. (Note that these ideas won’t work in all circumstances; their use depends on the context of a specific evaluation.)

Hot Tips:

  • Use constructive feedback versus criticism. Criticism comes from a place of judgement and is focused on the past, e.g., “The program didn’t meet its target.” (Never mind that evaluation is admittedly about making judgements; we can still be sensitive when presenting bad news.) People can’t change the past, and dwelling on it doesn’t motivate anyone to move forward. Constructive feedback, on the other hand, is future-focused and comes from a place of caring and respect. Statements such as, “Let’s talk about ways to better meet the program’s target,” are more empowering and position the evaluator as working alongside staff.
  • Alternate between the active and passive voice. Consider using the second person and active voice for positive results, e.g., “You met the program targets,” and if necessary, the passive voice for negative ones, e.g., “The targets were not met.” This may help to soften any blows.
  • Give them a decent sandwich. The sandwich technique is a well-known method for giving feedback – slip a negative finding between two positives. However, ensure the second positive is as substantial as the first and not a lame compliment at the end, otherwise people will still leave discouraged.
  • Be prepared to be wrong. I have regularly had to go back and review my conclusions and recommendations in light of new information provided by stakeholders. Is there additional information about the program or the context in which it operates that might affect the results? This is where additional stakeholder interpretation and an interactive data party come in very useful.
  • Be sensitive. Sometimes I get so caught up in the data analysis and findings that I forget that real people have put in a lot of blood, sweat, and tears into their program to get where they are. It’s relatively easy to evaluate a program, but a lot harder to work in the non-profit trenches day in and day out for little pay. The incredible daily commitment that non-profit staff demonstrate is humbling given the challenging complexity of most social change interventions. Whenever I mess up presenting negative findings it’s because I’ve forgotten that even minor negative news can come across as discouraging for hard-working staff.


Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello! I’m Kylie Hutchinson, independent evaluation consultant and trainer with Community Solutions Planning & Evaluation and author of A Short Primer on Innovative Evaluation Reporting.

Delivering negative evaluation findings is possibly one of the hardest things an evaluator has to do. Accomplishing it effectively is a bit like Goldilocks’ porridge. Too harsh and direct and you’ll make people defensive. Too indirect and they may not take action. Just right and they’ll use the results moving forward. I definitely admit to totally bungling this task at times. Here are a few practical tips for presenting bad news gleaned from my own experience. (Note that these ideas won’t work in all circumstances; their use depends on the context of a specific evaluation.)

Hot Tips:

  • Build trust. People can’t learn when they feel nervous or threatened, and we want them to fully absorb and understand what we’re saying. Effective stakeholder engagement, at the beginning and throughout an evaluation, is critical for building trust and developing stakeholder ownership of the final results, both good and bad.
  • Prepare them early. Prepare stakeholders for the possibility of negative results by engaging them in an informal discussion early on in the evaluation, e.g., “How well do you think the program is doing?” or “What would you do if the results were not as you expected?”
  • Drop clues. During the data analysis phase, consider giving the organization small warning signs, such as, “It’s still early days, but we’re seeing lower than expected scores. Can you think of why this might be so?”
  • Be clear. Be prepared to explain in detail how the negative findings were derived and ensure the lines of evidence are crystal clear. The more lines of evidence you can demonstrate, the more easily a bitter pill will go down.
  • Let others do the talking. When the news is particularly bad, I usually include a greater number of quotes from the qualitative data so people can hear it straight from the horse’s mouth and not me.
  • Consider participatory data analysis. Rather than the evaluator being the bearer of bad news, let people “discover” the bad news themselves by inviting them to a data party and asking for their assistance with interpretation.
  • Don’t send an email bomb. As Chari Smith says in her post, never, ever email a final report without going through it with people first. In some instances, program managers may appreciate a heads up and the opportunity to meet privately to digest the news and plan their response prior to meeting with staff. Nobody wants a nasty surprise or to be put on the spot during a public presentation.
  • Give people time to digest. In other instances, you might wish to give the results to everyone ahead of time so they can fully process the evidence before meeting with you. Then you’re not faced with a barrage of defensive questions from people who haven’t had time to read the full report and understand how you reached your conclusions.

Stay tuned for more tips in Part 2!


Chari Smith


A client called after reading the evaluation report and said, “The data are wrong.” My curiosity rose, as well as my blood pressure, and I wanted to understand more.

Hi, I’m Chari Smith with Evaluation into Action. This phone call was a turning point in my understanding of how to shift clients toward seeing program evaluation as a learning opportunity. The rest of the conversation went like this:

Me: “Can you elaborate?”

Client: “Participants say we aren’t communicating with them about the activities. But we are. We cannot send this to our funder.”

My thoughts: They’re afraid of losing funding, and emotion is driving their reaction. How do I shift them from a state of fear to a state of learning?

Hot Tips:

Validate, Educate, Collaborate

©2018. Chari Smith. Evaluation into Action. All Rights Reserved.

  • Validate their concerns: “I understand this is alarming to you. We will discuss how to use the data and share with the funder.”
  • Educate: “The data aren’t wrong; this is what participants said. This means your communication methods with them need to change. Let’s discuss what that can look like. Instead of emails, how about an initial phone call to all nine organizations? Set up a Google Group so they can discuss as well?”
  • Collaborate: We worked together to create a one-page improvement plan, highlighting the finding and providing a brief description of how communications would change and then be measured. This was sent along with the full evaluation report to the funder.

Results:

  • The funder was happy to see the transparency.
  • The new communication methods worked; participants reported in a later survey that they felt well-informed and appreciated the change.
  • The client was relieved (me too!) and leveraged the experience to secure additional funds by highlighting how they used data to improve their program.

Lessons Learned: Never, ever email a report. Always go through it with them in person first, and then email it after the meeting.

Rad Resources: I am passionate about this topic; it prompted a white paper: Building a Culture of Evaluation. Please let me know about other resources on this topic. Thanks!


Bernadette Wright


Greetings! I’m Bernadette Wright, founder of Meaningful Evidence. We help nonprofits to leverage research to tackle complex issues. Being in the public speaking organization Toastmasters taught me a few useful tips for presenting negative (and positive) evaluation findings. You can even use these techniques for giving feedback to friends, family members, and co-workers!

Collect useful data.

To give a useful evaluation, you need to start by collecting useful data.

In Toastmasters, evaluators plan ahead by reading the speaker’s speech manual to understand the purpose of the speech. Is it to persuade, to entertain, to inform? You can also talk with the speaker to find out their personal goals for their speech.

Similarly, planning a useful evaluation first requires learning about the program. Read the program materials, review the literature, and ask stakeholders how they plan to use evaluation results. That lets you shape your evaluation strategies to fit the purpose.

Start and end with something positive.

In Toastmasters, no matter how much work a speech might need, you always want to start and end your evaluation with something positive and specific (the “sandwich” technique). That lets the speaker know what to keep or do more of. It also gives them encouragement to try again.

For example, you might start with, “I loved the expression in your voice—I felt the emotion!” You might close with, “By making that change, I feel your speech will be highly entertaining. I look forward to your next speech!”

In delivering evaluation results, I always like to start and end with something that went well. It could be the progress made in carrying out planned activities, the strategies that were most beneficial, or the positive effects that were found. That lets program directors know what to keep or expand. It also gives them encouragement to incorporate your evaluation findings to increase their program’s success.

Don’t be all positive.

In Toastmasters, even the most polished speakers are always looking to get better. If a speaker hears nothing but praise, they might wonder whether going to meetings is worth the time. Evaluators are challenged to find at least one small idea for improvement in every speech. It may be as minor as changing a word here or adding a longer pause there.

In evaluation, when a manager wants to maximize their program’s potential, they might feel they’re not getting their money’s worth if an evaluation is nothing but praise. So, always include ideas on how to do even better.

Rad Resource:

You can download Toastmasters International’s guide on “Effective Evaluation” in the Resource Library on their website.

Rad Resource:

If you are interested in learning more about Toastmasters, you can find a club near you to visit.




Hello, AEA365 readers! I’m Glenn Landers, the Director of Health Systems at the Georgia Health Policy Center (Andrew Young School of Policy Studies, Georgia State University). A large portion of our work is evaluation, and we’ve been fortunate enough to work in every state and many of the territories. No one likes being the bearer of bad news, but sometimes it can’t be helped.

Recently, I was engaged in a developmental evaluation of a collective impact initiative that was intended to last ten years with ample funding. Five months in, we realized the initiative was in trouble. One year in, the project was basically over. Several techniques helped incorporate the bad news into the process as learning.

Hot Tip:

Evaluation Advisory Groups! We always try to have an advisory group made up of those whose work is being evaluated and those who will use the products of the evaluation. This way, we can test what we are learning with a small group for feedback before sharing with a wider audience.

Hot Tip:

Feedback loops! We also set up several feedback loops with the funder, the facilitator, and the work’s steering committee. This way, we shared information in small packets and gained the benefit of group sensemaking, so that everyone understood why things weren’t working as planned.

Hot Tip:

Evaluation as Learning! We were fortunate to have a project sponsor who was interested in learning from what was not working just as much as what was working. Knowing this upfront helped us to be more comfortable in being candid.

Lesson Learned:

There’s no substitute for being present with the people who are doing the work. Relationships and trust develop over time. The more present you are with them, the better positioned they will be to hear the results, whether good or bad.

What’s worked for you in delivering bad news?


