AEA365 | A Tip-a-Day by and for Evaluators


Greetings! I’m Beverly Serrell, museum exhibition consultant, evaluator, and developer with Serrell & Associates in Chicago, Illinois. As a practitioner, I am very interested in finding helpful information to improve my practice in the planning, development, and assessment of exhibits. When the Building Informal Science Education (BISE) project invited me to look at their database and investigate a question of my choice, I was most curious about recommendations in summative evaluation reports. How did the advice (e.g., recommendations or suggestions for improvements) compare to mine? Were there trends that could be shared and applied?

I started my report by looking at 50 summative evaluation studies in the BISE database that were coded as including “recommendations.” Further sorting brought the list down to 38, covering a diverse selection of science disciplines (e.g., botany, zoology, astronomy, biology, ecology, geology, and health sciences).

Lesson Learned: Orientation was often the single biggest challenge to get right in exhibitions. Using a bottom-up method of review, I found that the most common issue was the need for better orientation within an exhibition. Recommendations for improving orientation stemmed from problems related to the various physical and psychological needs of museum visitors. Two other suggestions were closely tied to orientation: more clarity in conceptual communication and better delineation of exhibit boundaries. These recommendations and more are discussed, with examples, in my full report, “A Review of Recommendations in Exhibition Summative Evaluation Reports.”

Hot Tip: Criticism is about the work, and the work can always be improved. Whether to include a recommendations section in an exhibition’s summative evaluation is somewhat controversial. Some evaluators think that it is the client’s job, not the evaluator’s, to interpret the data, and that making recommendations for improvements can cast a negative light on the institution and hurt its reputation with funders. It is important for evaluators to make sure at the outset of a project that the client is eager to hear the thoughts of an experienced evaluator.

My advice for making recommendations in summative evaluation reports is to go ahead and make them. Don’t couch them in meek tones; be specific and give the context and evidence for why each recommendation is being made. Evaluation is recognized today as a valuable part of the process; it’s no longer us (evaluators) against them (designers, curators, etc.).

My favorite example of an exhibition report with numerous indicators of success and a balanced offering of practical suggestions for improvements is Sue Allen’s 2007 summative evaluation of “Secrets of Circles” at the San Jose Children’s Museum.

The American Evaluation Association is celebrating Building Informal Science Education (BISE) project week. The contributions all this week to aea365 come from members of the BISE project team. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m Kylie Hutchinson, independent evaluation consultant and trainer with Community Solutions Planning & Evaluation. I also tweet regularly at @EvaluationMaven.

Have you ever wondered what evaluation recommendations and SpongeBob SquarePants have in common? Well, in my opinion, a lot.

Think about why we make recommendations. We want stakeholders to take action on our evaluation findings. But we all know this doesn’t happen by magic. And it doesn’t occur as soon as we submit our final report either. In fact it can be months or years before managers and policy-makers are actually in a position to make decisions based on our findings.

In order for utilization to happen, I think recommendations need to:

  • be easily absorbed (at the time of first reading)
  • be sticky (so they stay in the minds of decision-makers)
  • have ‘legs’ (so they prompt action).

Hmmm…now think…what has good absorption, is sticky, and has legs? Exactly! SpongeBob SquarePants!


Rad Resource: Here’s a tip sheet on Recommendations That Rock!

Hot Tip: Well-written recommendations don’t have to tick every box, but they do deserve significant attention. Don’t leave them to the end or the last minute. Instead, keep a running list of your initial ideas as they occur, even if that starts at the beginning of the evaluation. And always run them by your stakeholders to increase ownership and the chances of implementation. Better yet, develop them collaboratively during a data party.

Rad Resource: You can find a Pinterest page with other resources for writing better recommendations here.


Hello Colleagues! My name is Judith Kallick Russell. I am an independent evaluation consultant in civic engagement, community development, and peace building. My clients are national organizations (NGOs) and international organizations (UN agencies, international NGOs, and foundations). In my work, I have found that it can be challenging at the report-writing stage to provide findings and recommendations that are easily translated into actions for clients. The following are some ideas to address this.

Hot Tip: Include boxes or sidebar comments in the findings section that suggest questions for reflection. There may be findings that raise questions you feel require further consideration by the client. Including thought-provoking questions or comments in the report (visually separate, but linked to a finding) can encourage the client to explore the issue after the consultancy is completed.

Hot Tip: Frame your recommendations in stages or levels. Some organizations are not ready or able to make big changes at the moment of the evaluation. Once you learn what they feel capable of or interested in doing, you can structure your recommendations as options with stages or levels. For example, you might describe recommendations for a particular issue according to good, better, and best.

Hot Tip: Make time for dialogue when finalizing the report. Consider establishing a process for finalizing the report at the very beginning. You might gather informal feedback from a few key stakeholders, then provide a finalized draft to a representative group within the organization. You might also conduct a workshop on the main findings and recommendations, encouraging participation and collective thinking to deepen the organization’s understanding of the issues it faces. Incorporate the input into the final report as you see fit. Be sure to be specific about whom you ask for input and what input you are asking for, give clear deadlines, and phrase communications so that you are not stuck waiting for someone’s response.

If you want to learn more from Judith, check out the sessions sponsored by the Independent Consulting TIG on the program for Evaluation 2010, November 10-13 in San Antonio. Hope to see you there!
