
APC TIG Week: Navigating Uncharted Waters: Design, Monitoring and Evaluation of Policy Influence Campaigns with Marc Brown

Greetings from my nation’s capital – Ottawa, eh! My name is Marc Brown and I’m the Design, Monitoring & Evaluation (DME) manager for our government policy influence campaigns at World Vision Canada. WVC has spent the past 15 years engaging government stakeholders directly in policy creation and implementation that affect the well-being of the most vulnerable children around the world.

Three years ago, an internal evaluation position was created to help us plan and monitor progress on our policy influence campaigns. This is a summary of our key learnings from the past few years.

Lessons Learned:

  • Policy influence campaigns are a bit like an ancient, exploratory sea voyage – uncertain destination, shifting winds, unanticipated storms and a non-linear pathway. Policy change happens in a complex environment with rapidly changing decision-makers, shifting priorities and public opinions, unpredictable time frames, forces beyond our control and no guaranteed pathway to the desired policy change. Campaigns are unlikely to be implemented as planned and unlikely to be replicable, so design, monitoring and evaluation must be done differently than in traditional development programming.
  • A developmental evaluation approach is internally focused, with the purpose of providing rapid feedback for continual program adaptation in fluid contexts. We document our original objectives and plans, along with implementation results, in hopes of discovering how to adapt our ongoing campaigns – taking advantage of what’s working well and of emerging opportunities, or doing something different in response to obstacles encountered.
This graphic illustrates the DME framework we’ve developed – starting with a DE paradigm, drawing on the Rad Resources mentioned below, and learning from our own experience.
  • An evaluator:
    • facilitates problem analysis to identify root causes and create contextual understanding;
    • helps develop a theory of change, ensuring a logical strategy is developed to address the root causes;
    • documents the results of implementation; and
    • creates space for reflection, where the team discusses evidence and results to inform program adaptation.
  • The overall framework is circular because reflection on the evidence collected during implementation leads us back to examining our context and adapting our engagement strategy to guide future implementation.

Rad Resources:

  1. ODI, Rapid Outcome Mapping Approach – ROMA: We’ve used many of these tools for issue diagnosis and for designing an engagement strategy. Developing a theory of change is foundational; it helps evaluators identify the desired changes for specific stakeholders, create indicators and set targets.
  2. The Asia Foundation, Strategy Testing: An Innovative Approach to Monitoring Highly Flexible Aid Programs: This is a good comparison of traditional vs. flexible M&E and includes some great monitoring templates. Documenting changes to the theory of change, and the reasons for those changes, demonstrates responsiveness. That’s the value of the reflection on evidence facilitated by the internal evaluator!
  3. Patton’s book, Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use, provides a valuable paradigm in creating an appropriate monitoring framework.

The American Evaluation Association is celebrating APC TIG Week with our colleagues in the Advocacy and Policy Change Topical Interest Group. All of the contributions this week to aea365 come from our APC TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
