Hi, I’m Sara Vaca, independent consultant, helping Sheila curate this blog, and an occasional Saturday contributor. I haven’t been an evaluator for long (about 5 years now), but I have facilitated or been part of 16 evaluations, so I am starting to get over the initial awe of the exercise and to attend to other dimensions beyond just “surviving” (that is: understanding the assignment, agreeing on the design, leading the data collection process, simultaneously doing the data analysis, validating the findings, debriefing the preliminary results, and finally digesting all that information to package it nice and easy in the report).
I like to think that I incorporate (or at least try to) elements of Patton’s Utilization-Focused Evaluation during the process, but until recently my role as evaluator ended with the acceptance of the report (which is usually exhausting and challenging enough). I took no concrete actions once I had delivered it, partly because: a) it was not specified in the Terms of Reference (or included in the contracted days), or b) I usually didn’t have the energy or clarity to go further after the evaluation.
However, I have understood since the beginning of my practice that engaging in evaluation use is an ethical responsibility of the evaluator, so I have recently started making some tentative attempts to engage in it. Here are some ideas I have just begun implementing:
Cool Trick: Include a section in the report called “Use of the evaluation” or “Use of this report,” so that you (and they) start thinking about the “So what?” once the evaluation exercise is finished.
Hot Tip: Another thing I did differently was to elaborate the Recommendations section, but not in a prescriptive manner. Usually I would analyze all of the evaluation’s ideas for improvement and prioritize them according to their relevance, feasibility, and impact. This time, I pointed out the priority areas I would focus on and listed ideas for improving each area, without spelling out exactly what to do. Then I invited the organization to discuss and make those decisions internally, perhaps forming internal teams to address each recommendation and gain more ownership.
Although clients have occasionally reached out months or years after an evaluation for additional support, this time I proactively offered my commitment to support them beyond the contract, in case they think I could be of help later down the road.
Rad Resource: Do proactive follow-up. I have read about this before but haven’t yet done it systematically. So, I will set a reminder for 3–6 months after the evaluation and check in on how they are doing.
As you can see, I’m quite a newbie at introducing mechanisms and practical steps to foster use. Any ideas are welcome. Thanks!
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to firstname.lastname@example.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.