Hi. I’m Heather Krause, founder of Datassist. I’ve been doing data analysis around the globe for a long time and am an advocate for using ethics and equity as the foundations of all our work. My presentation on Feminist Data Analysis focused on step-by-step processes for avoiding sexism, racism, homophobia, and more when using data. Feminist data analysis requires us to examine the many assumptions embedded in our habitual data practices: where power dynamics come into play, which assumptions and values are prioritized over others, and who benefits from all aspects of our choices around data and analysis.
Data products are so often thought of as objective, scientific ways of figuring out what’s really going on and what is working. However, data and statistics are never actually objective. I think about data projects in terms of the seven steps of the data life cycle. Every single step of data work and evaluation is deeply embedded with the worldviews and hidden implicit biases of the people involved, and each step presents opportunities to increase equity, inclusion, and fairness.
My talk walked through the steps of the data life cycle and provided key questions and tools for identifying and correcting sexism, bias, and more at each step. For example, in the Project Design step, constructing the methodology of any data project has many potential equity pitfalls. Probably the most prevalent bias here is toward comfort: what do the people involved already know how to do? The number of accidentally sexist or racist method choices is staggering, and they are often due simply to limits of understanding, training, and level of comfort. The design of a data project is inherently subjective because it runs up against the limits of what the people running it think to measure. Big donors also tend to direct their funds toward what’s comfortable and often monolithic. You might almost forgive someone for running RCTs to try to answer every question, but you simply can’t.
The Data Analysis step is often seen as the most objective and free from bias. In reality, a huge number of assumptions, interpretations, and conceptual biases are an inextricable part of data analysis. A statistical impact evaluation model, for example, can easily be built to be technically correct and simultaneously biased against women or other vulnerable groups.
Many participants in the event shared their own experiences, suggested tools and resources, and stood in long lines after the event to ask questions. These conversations led to the development of We All Count, a project for equity in data. We’re sharing tools, tips, and stories on this site, and we would love for you to add your story or comments to the community.
The full presentation, including the lists of bias-awareness questions to ask at each step, is available for anyone to see here.
There is also a draft of a really great new book about feminist data from Catherine D’Ignazio and Lauren Klein here.
The American Evaluation Association is celebrating Feminist Issues in Evaluation (FIE) TIG Week with our colleagues in the FIE Topical Interest Group. The contributions all this week to aea365 come from our FIE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.