AEA365 | A Tip-a-Day by and for Evaluators

TAG | comparative analysis

Hello! We are Maureen Hawes from the University of Minnesota’s Systems Improvement Group, Arlene Russell, independent consultant, and Jason Altman from the TerraLuna Collaborative. We are writing to share our experience with fuzzy set Qualitative Comparative Analysis (QCA).

You may have faced questions similar to ones that we grappled with as evaluators, using quantitative analysis as part of a mixed methods approach. We wondered:

  1. Is there a method better at addressing nuance and complexity than more traditional methods?
  2. Can quantitative efforts uncover the causes of future effects for developmental and formative work, or can they only prove impacts, the effects of past causes?
  3. Does regressing cases to means misalign with our values and efforts to elevate the voices of those who are often not heard?
  4. Should we be removing outlier cases before analysis? Note: see Bob Williams’ argument that we should approach “outlying data with the possibility of it being there for a reason” rather than dismissing it as chance.

In supporting our partner, we knew from the outset that each of our cases (school buildings) was a complex system. Two major considerations were particular sticking points for us:

  1. Equifinality: we expected that there would be more than one pathway to implementation.
  2. Conjuncturality: we expected that variables would exert influence in combination rather than in isolation.


Hot Tip: Our solution was QCA, which is based on set theory and logic rather than statistics. QCA is a case-oriented method that allows systematic, scientific comparison of any number of cases as configurations of attributes and set memberships. We loved that QCA helped answer the question “What works best, why, and under what circumstances?” using replicable empirical analysis.

QCA comes in two varieties: crisp-set QCA, in which conditions are judged to be simply present or absent, and the more recent fuzzy set QCA (fsQCA). fsQCA allows for sets in which elements are not limited to being either members or non-members, but can instead hold different degrees of membership.
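The contrast can be sketched in a few lines of code. This is a minimal illustration, not the authors' analysis: the building names, the "strong leadership" condition, and all scores are hypothetical.

```python
# Crisp set: a case is either in the set (1) or out of it (0).
crisp_leadership = {"Building A": 1, "Building B": 0}

# Fuzzy set: membership is a degree between 0 and 1, with 0.5 as the
# crossover point of maximum ambiguity about whether a case is in or out.
fuzzy_leadership = {
    "Building A": 0.9,  # fully in the set
    "Building B": 0.6,  # more in than out
    "Building C": 0.3,  # more out than in
}

def negate(score):
    """Membership in the complement set (e.g., NOT strong leadership)."""
    return 1 - score

for name, score in fuzzy_leadership.items():
    print(name, score, round(negate(score), 2))
```

Calibrating raw data into these 0-to-1 membership scores is itself a substantive, theory-driven step in fsQCA, not a mechanical rescaling.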

Lessons Learned: Our fsQCA analysis of a medium-sized sample of 21 buildings (in 6 districts) uncovered a message our partners could act on. Among other findings, the analysis identified a pathway to positive program outcomes that relied on ALL 3 of the following factors being in place:

  1. Project engagement
  2. Leadership/infrastructure
  3. Data collection/use
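An "ALL 3 factors" pathway is a fuzzy logical AND: a case's membership in the combined condition is the minimum of its memberships in the component conditions, and the pathway's consistency with the outcome can be checked with Ragin's standard sufficiency measure. The sketch below uses hypothetical membership scores, not the study's data:

```python
# Hypothetical fuzzy membership scores: (engagement, leadership, data_use, outcome)
buildings = {
    "Building A": (0.9, 0.8, 0.7, 0.8),
    "Building B": (0.6, 0.9, 0.8, 0.7),
    "Building C": (0.2, 0.4, 0.9, 0.3),
}

def conjunction(*scores):
    """Fuzzy AND: joint membership is the minimum of the component scores."""
    return min(scores)

def consistency(pathway, outcome):
    """Degree to which the pathway is a subset of (sufficient for) the
    outcome: sum(min(X, Y)) / sum(X)."""
    return sum(min(x, y) for x, y in zip(pathway, outcome)) / sum(pathway)

path = [conjunction(e, l, d) for e, l, d, _ in buildings.values()]
out = [o for *_, o in buildings.values()]
print(round(consistency(path, out), 2))
```

In practice a consistency score near 1.0 supports reading the conjunction as sufficient for the outcome; dedicated tools (such as Ragin's fs/QCA software) also perform the truth-table minimization that identifies which conjunctions to test.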

Worth Considering: The number of QCA applications has increased during the past few years, though applications remain relatively few. Since Charles Ragin introduced it in 1987, QCA has been modified, extended, and improved, making it increasingly applicable to evaluation settings.

Rad Resources:

  1. We have a longer read (complete with references) available.
  2. Charles Ragin’s website houses information he finds pertinent to the technique, along with tools he has developed to complete the analysis.
  3. Compass hosts a bibliographical database where users can sort through previous applications of fsQCA.


Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send us a note of interest. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
