Roadblocks to Change Within Collective Impact Initiatives and How Evaluators Can Help by Elayne McIvor

Hi there! I’m Elayne McIvor, an evaluation consultant with Catalyst Consulting. We help non-profits, government agencies and foundations create social change more effectively and efficiently.

Over the last decade, we have seen extensive uptake of the ‘Collective Impact’ (CI) approach in an attempt to solve complex social problems at scale. I’ve had the opportunity to support a number of CI initiatives to learn from their efforts, shift their strategies, and assess the differences they have made. While these initiatives experienced varying degrees of success, they often struggled to realize the very systems- and population-level changes CI efforts are designed to influence.

Reflections

In my experience, the following factors commonly hold collaboratives back from achieving system- and population-wide impacts.

  • Implementing traditional program strategies that are mapped out in multi-year work plans, rather than crafting systemic interventions that are dynamic and iterative. 
  • Focusing on collaboration amongst non-profits, as opposed to fostering meaningful alignment with cross-sector players.
  • Insufficient time and financial investment to realize deep systems- and population-level changes.
  • Funders holding collaboratives accountable to achieve rigid deliverables and use standardized indicators of success, limiting their agility and responsiveness.
  • Traditional evaluation approaches and practices continue to be requested (and provided). We know that complex systems are messy and rapidly evolving, and that conventional evaluation methods cannot support emergent learning and strategy development in such contexts. Yet business as usual remains the norm.

Hot Tips:

Here are some ideas for how evaluators can better support CI initiatives to realize their intended impacts.   

  • Be reflective.

To effectively support CI initiatives, evaluators need strategic learning tools and approaches. These new ways of working require us to embrace ambiguity, contribute to strategy discussions, and frequently adapt evaluation designs. Traditional evaluation approaches grounded in linear thinking aren’t going to cut it. Take the time to ask yourself whether this is a way of working you are genuinely comfortable with.

  • Push back on requests and requirements.

Clients and funders often specify evaluation requirements that can actually hinder momentum and learning within CI initiatives, as well as the initiatives’ ability to achieve deep change. I have seen requests that CI evaluations measure a set of predetermined indicators, focus on assessing outcomes, use certain data collection tools, and produce mid-term and final reports. To best serve CI initiatives, we need to advocate for more responsive approaches.

  • Help define systems change.

There are multiple definitions of ‘systems change’ out there, and CI partners can quickly become overwhelmed trying to understand what they have been tasked to do and how to do it. We can help minimize confusion by coaching them to focus on the levers, or conditions, of systems change outlined in The Water of Systems Change by Kania, Kramer and Senge.

Rad Resources:

For further guidance on evaluating CI and systems change efforts, check out:

Evaluating Collective Impact: Five Simple Rules by Mark Cabaj and Evaluating Complexity: Propositions for Improving Practice by Preskill and colleagues.  

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.
