AEA365 Contributor, Curated by Elizabeth DiLuzio

Spurious Precision – Leading to Evaluations that Misrepresent and Mislead by Burt Perrin

Sometimes it is helpful to be very precise. But in other cases, precision can be irrelevant at best and quite likely misleading. It can destroy, rather than enhance, the credibility of your evaluation – and of you. Hi, I’m Burt Perrin, and I’d like to discuss what considerations such as these mean for evaluation practice.

If one is undergoing brain surgery, one would hope it is performed with precision, based upon established knowledge about how it should be done. But no analysis can be more precise than the underlying data permit, and attempting greater precision than the data support is where too many evaluations go wrong.

No More Crappy Survey Reporting – Best Practices in Survey Reporting for Evaluations by Janelle Gowgiel, JoAnna Hillman, Mary Davis, and Christiana Reene

Janelle, JoAnna, Mary, and Christiana here, evaluators from Emory Centers for Public Health Training and Technical Assistance. We had the opportunity to present a session entitled No More Crappy Surveys at last year’s AEA Summer Evaluation Institute. We are on a mission to rid the world of crappy surveys, and are here to share some of our Hot Tips and Rad Resources to do so.

If you haven’t already, check out the first and second blog posts in this series, No More Crappy Surveys – Best Practices in Survey Design for Evaluations and No More Crappy Survey Analysis – Best Practices in Survey Analysis for Evaluations. Today, we’ll follow up with tips on how to report your survey findings to different audiences and how to engage partners throughout the survey process.

Reflections from a Youth Evaluator by Yasemin Simsek

Greetings! I am Yasemin Simsek, a master’s candidate in American University’s Measurement and Evaluation program. The Quantitative Methods in Evaluation course required me to partner with an organization to identify a research need, collect and analyze data, and write a report. I had the incredible opportunity to work with the Neema Project, a nonprofit organization dedicated to empowering women experiencing poverty, gender-based violence, or teen pregnancy in Kitale, Kenya through services such as skills training, counseling, and faith-based support.

Measuring DEI in Our Own Workforce: Lessons from Four Studies Across Two Years by Laura Kim and Brooke Hill

We are Laura Kim (Senior Consultant at the Canopy Lab) and Brooke Hill (Senior Program Manager at Social Impact). Laura is part of the team that works on Canopy’s Inclusion and Leadership series, which explores the forces that influence who gets to advance in international development and why. Brooke is the technical lead for the BRIDGE survey and co-leads the Equity Incubator, a lab studying equity and inclusion through data.

Sharing How the Inaugural AEA Student Evaluation Case Competition Went by Dana Linnell, Steve Mumford, Carolina De La Rosa Mateo, Julian Nyamupachitu, Rana Gautam, Jennifer Yessis, Christine Roseveare, and Asma Ali

We are the Student Evaluation Case Competition Working Group (Dana Linnell, Steve Mumford, Carolina De La Rosa Mateo, Julian Nyamupachitu, Rana Gautam, Jennifer Yessis, Christine Roseveare, and Asma Ali). We’re excited to tell you about the inaugural competition!

No More Crappy Survey Analysis – Best Practices in Survey Analysis for Evaluations by Janelle Gowgiel, JoAnna Hillman, Mary Davis, and Christiana Reene

Janelle, JoAnna, Mary, and Christiana here, evaluators from Emory Centers for Public Health Training and Technical Assistance. We had the opportunity to present a session entitled No More Crappy Surveys at last year’s AEA Summer Evaluation Institute. We are on a mission to rid the world of crappy surveys, and are here to share some of our Hot Tips and Rad Resources to do so.

If you haven’t already, check out the first blog post in this series, No More Crappy Surveys – Best Practices in Survey Design for Evaluations. Today, we’ll follow up with some tips on how to analyze your surveys (which, of course, you’ve made sure are not crappy!). Stay tuned for our final post of this series, on how to report your findings to different audiences.

In recognition of Indigenous Peoples’ Day: Recognizing Indigenous Rights and Sovereignty by The IPE TIG Leadership

Happy Indigenous Peoples’ Day! Indigenous Peoples’ Day occurs on the second Monday of October in the United States. It recognizes the resilience of Indigenous peoples and the fact that Indigenous peoples still exist, and it makes space for and honors the contributions that Indigenous peoples have made and continue to make. Other settler colonial states, e.g. …
The Power of Story by Corrie Whitmore

Hello AEA friends –

This is Corrie Whitmore, the 2023 President of AEA. I’m thrilled to be in Indianapolis for EVAL2023: The Power of Story and hope to see many of you here, attending workshops and conference sessions and enjoying the great local attractions.

Our conference week kicks off on October 9, which is also Indigenous Peoples’ Day. Indigenous Peoples’ Day is special to me, because I’m the mother of Indigenous children, and also because it offers such a clear example of The Power of Story. Let me explain…

Decolonization in Evaluation Week: Knowledge asymmetries and struggles for space: Towards a decolonial turn in the evaluation of ‘development’ and ‘conservation’ programmes by Linda Khumalo & Gert Van Hecken

We are Linda Khumalo and Gert Van Hecken. We collaborate on a project to reimagine M&E from a decolonial perspective. Here we share some lessons learned. Linda is an evaluation practitioner and scholar contributing to the Made in Africa Evaluation (MAE) discourse who sees the challenges of applying a transformative evaluative lens. Gert is an …
Decolonization in Evaluation Week: Finding Place: Grounding Evaluation by Decolonizing Context by Katie Boone

Hi there! I’m Katie Boone, a PhD student in Organizational Leadership Policy Development – Evaluation Studies at the University of Minnesota. Our Guiding Principles, Evaluator Competencies, and Statement on Cultural Competence all affirm that context has a crucial role in evaluation practice. One of the greatest blindspots I see in the field of evaluation is our consciousness to …