Systems Week: Glenda Eoyang on Complexity Demands Simplicity

I’m Glenda Eoyang, Executive Director of the Human Systems Dynamics (HSD) Institute. We help people thrive in the middle of chaotic environments. Our associates draw lessons from the complexity sciences to help their clients and colleagues adapt to unpredictable change. We believe that the most complex challenges require the simplest solutions. If you’re already drowning, the last thing you need is to stir the waters with even more complexity! Many of our associates are professional evaluators who use HSD theory, models, methods, and tools to assess the results of systemic and emergent interventions. One in particular, Beverly Parsons, has developed a simple map for evaluating complex initiatives. I’m going to give you an overview of that map, describe how we’ve used it on projects, and send you to her paper to see for yourself how simple complexity can be.

Lessons Learned: Complex systems tend to exist in one of three states. Each state needs a different evaluation design.

Systems that are in the “organized” state are well understood, predictable, and reliable. Outcomes can be predicted, performance against those outcomes can be measured, and conclusions can be drawn about the effectiveness or efficacy of an intervention. We have used such approaches on (some aspects of) skill development and process improvement.

Systems that are in the “unorganized” state appear to be random. Outcomes are not predictable at all, and you can’t even see coherent patterns. In these early, unpredictable stages, exploratory evaluation methods have to take the place of outcome measures. We used this approach to track unexpected consequences from the development of a community of practice.

Systems that are in the “self-organizing” state generate and resolve tensions over time. Coherent patterns form, but they continue to evolve. Community development opportunities are often self-organizing in nature. On the one hand, the patterns are predictable, but on the other hand, you never know what to expect. Evaluating self-organizing patterns requires a third evaluation design—one that responds to the emerging patterns and learning processes among the players. We are using this approach now to do a retrospective systemic assessment of a federally funded employment initiative.

Hot Tip: Many human systems that need to be evaluated include aspects of all these states. Some aspects are organized, but others are unorganized or self-organizing. Effective evaluation designs for these systems will be multi-faceted with aspects of outcomes (where they are possible), exploration (where it is necessary), and emergence (where it is a good fit). We are currently working with an international agricultural research project that requires this fourth, integrated design.

Rad Resource: Beverly Parsons has defined each of these states and outlined evaluation approaches for each in a wonderful booklet. You can get copies of it from

Rad Resource: For more on the Human Systems Dynamics Institute, visit our website or email me.

The American Evaluation Association is celebrating Systems in Evaluation Week with our colleagues in the Systems in Evaluation AEA Topical Interest Group. All of this week’s contributions to aea365 come from our Systems TIG members, and you may wish to consider subscribing to our weekly headlines and resources list, where we’ll be highlighting Systems resources. You can also learn more from the Systems TIG via their many sessions at Evaluation 2010 this November in San Antonio.
