Systems- and Complexity-informed Evaluation Week: Design-Driven Evaluation and System Change by Cameron Norman

I’m Cameron Norman. I am a professional evaluator, educator, and designer, and President of Cense Ltd. My work focuses on helping organizations learn, grow, and develop using design-driven evaluation (DDE).

Design-driven evaluation isn’t a technique or method but a means of strategic learning and development that connects the practice of design with utilization-focused approaches such as developmental evaluation. It grew out of situations where my clients had all the data and feedback they wanted yet didn’t know how to use it to adapt and evolve their programs. Their evaluation data was disconnected from the design process, even though that very data was needed to guide the development of the program.

Design and evaluation are symbionts in the service of innovation. They are Han Solo and Chewbacca, Thelma and Louise, or peanut butter and chocolate. Maybe anything and chocolate.

Design is about creating something to fill a need or want and incorporating what we know and learn into what we make. We design all the time (even if we don’t realize it). A DDE approach connects design with evaluation. For those programs focused on innovation, doing a utilization-focused, systems, or developmental evaluation without incorporating design is like finding Michael Quinn Patton attending an AEA conference without wearing one of his trademark sweaters. It’s an incomplete experience.

A design-driven evaluation approach brings data and synthesis together with learning, program design, and adaptation as a single package. This is an important distinction. A DDE is both an innovation service and a product. It’s not applied after the fact or in stages but is an embedded part of a program’s development throughout its life cycle, with evaluation providing a means of feedback. It blends evaluative thinking with design thinking and enlists data to help imagine and reimagine a program’s needs, value, and impact and how it can best serve people and operate within systems.

Hot Tip

Design is for everyone. The skills of design can be learned, and many of them build on familiar experiences we had as children making things in our bedrooms, at kitchen tables, and in schoolyards. The first step is to cultivate a mindset for design. Open yourself to new ways of seeing things, framing problems, and making things. Connect with the part of you that, as a child, liked to explore and try things without caring (or even knowing) whether they would succeed. Kids are amazing innovators for that reason. We can be like that again. Good design creates products and services that are enjoyed, are resilient, and make a positive impact on people and the planet. Who doesn’t want that? The tools and techniques are easy to learn once you have this mindset.

Rad Resources

  • Habit Design: One of the best ways to start is to apply design thinking to your habits and practices. These simple steps can help you build your design awareness.
  • Storyboards: This practice, borrowed from filmmaking, is a useful way to envision how people use a program and where you might apply new insight and evaluation to make it better.
  • Supporting Systems Transformation Through Design-Driven Evaluation: This New Directions for Evaluation article, published in 2021, goes into detail about what DDE is and illustrates how it’s been used.

The American Evaluation Association is hosting Systems- and Complexity-informed Evaluation Week. The contributions to AEA365 this week are all related to this theme. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.
