Hello, I’m Jacqueline Singh, MPP, PhD (she/her), an evaluator and program design advisor from Indianapolis, Indiana. With 30+ years in evaluation, I’ve worked across higher education, government, and nonprofit settings. The 2023 AEA conference theme, The Power of Story, inspired me to explore autoethnography for my round-table session as a way to bring personal stories into evaluation. This approach values people as narrators of their own experiences, letting their voices shape their stories. In this post, I’ll share some thoughts on autoethnography and resources to consider for its use in evaluation practice.
Lessons Learned
Autoethnography is a method that blends personal storytelling with cultural analysis. It’s an ideal approach for evaluators who want a deeper understanding of the impact and meaning of programs. In Essentials of Autoethnography, Christopher Poulos describes the method as a way to connect our own experiences with larger cultural, political, and social issues. It’s especially helpful for evaluations that need to go beyond numbers and dig into what people really feel and experience.
At the heart of autoethnography is reflexivity, which means thinking carefully about how our personal backgrounds, values, and beliefs shape what we see and understand. Using autoethnography, evaluators can reflect on their own experiences to add a unique, personal view. This can help bridge individuals’ personal stories with the bigger picture, adding depth to evaluations that need cultural and social context.
Autoethnography works well in many evaluation settings:
1. Program Evaluation: Evaluators can use autoethnography to get an insider’s view on program impact. For example, an evaluator involved in a volunteer probation program might share personal reflections on how the program affected them, adding a rich, participant-centered perspective.
2. Educational Evaluation: In schools, teachers or students can use autoethnography to explore their classroom experiences. This might show deeper insights into instructional practices and student engagement, bringing more life and context to typical evaluation data.
3. Policy Evaluation: Autoethnography is also helpful for seeing how policies affect people’s lives. By sharing personal stories, evaluators reveal the real-world consequences of policies, showing both intended and unintended effects.
Autoethnography also works especially well with certain evaluation approaches:
1. Utilization-Focused Evaluation (UFE): UFE aims to give practical, user-centered findings. Autoethnography helps by connecting personal experiences to insights that are useful and relatable to those involved in the program.
2. Culturally Responsive Evaluation (CRE): CRE focuses on understanding cultural contexts. Autoethnography allows evaluators to examine their own cultural assumptions, making it easier to connect with diverse communities and evaluate with cultural sensitivity.
3. Developmental Evaluation: In programs that are new or rapidly changing, autoethnography supports continuous learning. Evaluators can capture how the program evolves, documenting unexpected changes and adaptations as they happen.
4. Transformative Evaluation: For evaluations aimed at social justice, autoethnography’s reflective approach helps evaluators understand power dynamics and systemic issues. By highlighting personal stories from marginalized perspectives, autoethnography supports evaluations that focus on advocacy and change.
Rad Resources
For those interested in using autoethnography, these resources can help: Poulos’s Essentials of Autoethnography provides a great starting point, while Cooper and Lilyea’s article I’m Interested in Autoethnography, but How Do I Do It? offers step-by-step guidance. The Handbook of Autoethnography, edited by Stacy Holman Jones, Tony E. Adams, and Carolyn Ellis, is also full of examples and insights for evaluators.
Autoethnography lets evaluators connect personal experiences with broader insights, creating evaluations that feel real and resonate with both participants and collaborators. By blending personal reflection with broader analysis, this method offers a nuanced, human-centered way to understand program and policy impacts.
The American Evaluation Association is hosting Indiana Evaluation Association (IEA) Affiliate Week. The contributions all this week to AEA365 come from IEA members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.