LEEAD Fellows Alumni Curated Week: Considerations for Using Comparison Groups to Examine Equality, Equity or Social Justice by Jaymie Lorthridge

Hi! My name is Jaymie Lorthridge and I use a culturally responsive and equitable evaluation (CREE) approach to assess human service programs. The “C” in CREE requires attention to culture during all phases of the evaluation and, when combined with the “E”, requires examination of cultural experiences by assessing outcomes related to equality, equity, and ideally …

LEEAD Fellows Alumni Curated Week: Advancing Data Equity through Culturally Responsive and Equitable Data Parties by Chandria Jones

Hello! I’m Chandria Jones, Principal Research Scientist in Public Health and Affiliate Staff at the Center on Equity Research at NORC at the University of Chicago. I’m also one of the editors of the book Culturally Responsive and Equitable Evaluation: Visions and Voices of Emerging Scholars. In the realm of public health and social sciences, …

LEEAD Fellows Alumni Curated Week: Magnifying Culturally Responsive and Equitable Evaluation (CREE) and Culturally Responsive Indigenous Evaluation (CRIE) in Connection to Diversity, Equity, Inclusion, and Belonging (DEIB) by Tamarah Moss, Kimberly Harris, Rachel Powell, Jochebed Gayles, and Jennifer Garcia

Hi, we are Tamarah Moss, Kimberly Harris, Rachel Powell, Jochebed Gayles, and Jennifer Garcia, and we welcome you to this week’s blog series. Originally coming together in the Leaders in Equitable Evaluation and Diversity (LEEAD) Program, we hail from varying backgrounds in applied economics, business, education, human development, public health, and social work. Our evaluation …

Undermining the Intervention Design Effect (IDE) Puts an Intervention into the External Reinforcing Factor's Trap (ERFT) by G.M. Shah and Farid Ahmad

Hello, AEA365 community! We are G.M. Shah (Principal Evaluation Specialist) and Farid Ahmad (Chief, Strategic Planning, Monitoring, Evaluation and Learning). We both work with the International Centre for Integrated Mountain Development (ICIMOD), a regional intergovernmental knowledge organization serving eight Regional Member Countries (RMCs) of the Hindu Kush Himalaya (HKH) region. Our headquarters is in Kathmandu, Nepal.

Reflections of a Pipeline PhD: Reaching Postgraduation Aspirations by Jacqueline Singh

Hello, I’m Jacqueline Singh, MPP, PhD (she/her), an evaluator and program design advisor based in Indianapolis, Indiana. I identify as a first-generation, low-income (FGLI) and nontraditional college graduate. What does it mean to be an FGLI and nontraditional college student? For me, it means that I experienced most of the higher education pipeline as a single parent and received degrees from different types of higher education institutions (i.e., a community college, a state university, and a prestigious private university). Work responsibilities, at all stages of life, prevented me from participating in extracurricular activities. My focus had to be on the workplace, coursework, GPA, paying bills, and providing for and raising two children, not necessarily in that order. I also attended the University of Pennsylvania’s Graduate School of Education (GSE) to earn a PhD in higher education. The pathway I took was tough, but it was a viable one. Consequently, I refer to myself as a “Pipeline PhD.”

The Evidence Act & The Need to Facilitate Policymakers’ Access to Policy Position Statements by Quisha Brown

Greetings, I’m Quisha Brown, the mind behind the Progressive Outcomes Scale Logic Model (POSLM), a methodology designed to uncover systemic issues stemming from unfair systems and policies. Unlike traditional methodologies, the POSLM relies on the real-time experiences of individuals in marginalized communities. When communities seek to initiate policy and system changes, comprehensive engagement is crucial. The POSLM steps in by analyzing qualitative data to grasp community sentiment and converting those insights into actionable indicators for organizations.

Benefits of Becoming an AEA Member by Mike Zapata

My name is Mike Zapata, and I am the Membership Coordinator at the American Evaluation Association (AEA). I will have been serving AEA for four years this coming April. In my role, I focus on membership engagement, acting as the main point of contact for membership-related questions and leading monthly strategy meetings where my colleagues and I brainstorm ideas to offer greater value to our members.

Evaluating Initiatives to Increase Retention in STEM Fields: Lessons Learned by Adriana Cimetta and Rebecca Friesen

Hello everyone! We are Adriana Cimetta and Rebecca Friesen with the Center for Educational Assessment, Research, and Evaluation at the University of Arizona. Many of the initiatives that we evaluate aim to increase retention of STEM majors, particularly those from underrepresented populations, through research experiences. Traditional lab apprenticeships are limited and usually reserved for upperclassmen, by which time many aspiring STEM students have changed majors or dropped out altogether. In response, many college science departments have sought to expand access to authentic research opportunities.

Checklist for Sustaining an International Evaluation Community of Practice by Kim Norris

Hi, I’m Kim Norris, Monitoring, Evaluation and Learning (MEL) Director for American Institutes for Research (AIR)’s International Development Division. I’ve had the joy of establishing, contributing to, and benefiting from a number of Communities of Practice (CoPs) in the field of evaluation since beginning professional work too long ago to mention here. From exchanging ideas and perspectives with peers from around the world to improving skills and learning from our unique and sometimes shared experiences, CoPs have been invaluable to my own development. Like any evaluator, I also recognize that for CoPs to be successful, guidelines and measures help everyone involved make full use of CoP resources and opportunities. So I developed a checklist and set of questions drawn from my own experience, discussions with colleagues, and the many resources available on the subject.