International Evaluation Academy (IEA) Week: “Salmon Fishing in the Yemen”: Swimming Upstream to Transform Evaluation Systems by Candice Morkel


My name is Candice Morkel and I am the Director of the Centre for Learning on Evaluation and Results – Anglophone Africa (CLEAR-AA). We support countries in building and strengthening national M&E systems, working with governments, civil society and development partners.

One of my favourite movies is “Salmon Fishing in the Yemen”, probably because I relate to storylines highlighting the universal challenges, power dynamics and trade-offs in decision-making and agenda-setting at the political-administrative interface of government. It portrays a theatrically rigid public official, Dr. Jones, faced with the impossible task of introducing salmon fishing in the arid Yemeni desert, where the (imported) fish are sure to die. The funder, a wealthy Yemeni sheikh, convinces the reluctant bureaucrat to “have faith” in the mission, engendering hope for the impossible. The movie represents the stand-off between political agendas and contextual realities – and how the whims of the powerful and wealthy often fuel the former and attempt dominion over the latter.

Working at a systems level so that evaluations are transformative feels a bit like the movie plot, and a lot like salmon swimming upstream in the desert. Competing agendas, which are not always explicit, may underpin the rationale for establishing national evaluation systems. Whilst there are genuine efforts to use evaluations for improved decision-making, some systems may be driven by demands such as regulatory compliance, or simply assimilation.

To this end, I have observed a trend towards isomorphic mimicry in the establishment of national evaluation systems in Africa – essentially, copying the structures and functions of existing evaluation systems. This approach ignores the fact that there is no “ideal-type” national evaluation system, and that many of the recommended components and functions are based on upward accountability requirements.

Increasing attention is being paid to transforming evaluation, without considering the need to transform the bureaucratic machinery, the epistemic systems in which evaluations are embedded, and the constraints these pose to the transformative potential of evaluation. More effort is needed to understand and (re)design the systems that might constrain the way evidence from evaluations is used for decision-making and prioritisation, particularly in government.

Lessons Learned

We are learning that it is critical to build a better understanding of the development context within which evaluation is practiced. This includes considering the historical conditions that produce contemporary conceptualisations of development in Africa, and their connection to the current form of development aid and dependency. We must acknowledge how the persistent hegemony of powerful nations over less powerful ones influences the evaluation “industry”, as well as the raison d’être of evaluation systems in African countries.

Building national evaluation systems should provide room for reflecting on how institutions can divest themselves from simply mimicking what has been done elsewhere, and commit to bottom-up, citizen-focused structures and functions for evaluation. This is especially relevant today, not only in Africa but across the planet, given the current trust deficit plaguing all of our social institutions, not least of all governments.

In the final scene of the movie, a fish buoyantly jumps out of the water signalling a transformational breakthrough, despite the impossible conditions. To achieve our breakthrough, evaluation professionals need to pay closer attention to the written and unwritten “rules of the game”, and work to build indigenous, endogenous evaluation systems that will truly transform the continent.

The American Evaluation Association is hosting International Evaluation Academy (IEA) Week. The contributions to AEA365 this week are all related to this theme. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.
