Lymari Benitez of Pace Center for Girls here, with my coauthor Katie Smith Milway of MilwayPLUS social impact advisors. Through our experience and research across 15 nonprofits and evaluation experts committed to blending feedback measures with empirical evidence, we are learning that feedback not only helps us improve programs; it also influences participant engagement and advocacy, helps gauge outcomes, and deeply supports equity, in line with three principles of The Equitable Evaluation Framework.
Lesson 1: Involving participants in defining research and gathering data boosts equity
In service of equity at every step of evaluation, nonprofits can invite those most affected by community programs to define research questions and help gather data.
For example, SOLE Colombia, based in Bogotá, hosts learning spaces where participants self-organize to address community challenges such as education or safety. It invites community members to suggest their own questions, listen to one another, and adapt the questions until participants agree they are ready to learn and dialogue around the themes in a self-organized manner.
Think of Us, a U.S. service-delivery and advocacy group for foster youth, hires young people who have experienced foster care as members of its research teams, where they apply their lived experience to gather testimony. Such testimonials about foster youth experiences in group homes have prompted calls for policy change. Nonprofits building evidence this way are shifting power and helping participants economically by hiring and paying them.
Lesson 2: Systematically disaggregating perceptual data can surface structural and systemic barriers
Disaggregating feedback data by race, age, gender, or other relevant variables provides insight that can challenge structural and systemic barriers in the context of historic and cultural diversity. Pace Center for Girls, a girls’ empowerment program, introduced Listen4Good surveys in 2016, allowing participants to share their experiences and rate how likely they would be to recommend Pace services. When Pace disaggregated responses by race, it found that Black, White, and Hispanic girls experienced Pace services with very different levels of satisfaction. Follow-up discussions with each group surfaced suggestions for enhancing processes like trauma counseling, creating safe spaces, and establishing mechanisms for ongoing feedback. Pace committed to these changes, and five years later girls of all races gave the same high ratings for likelihood to recommend Pace to others, bringing the experiences of the least satisfied level with those of the most.
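For evaluators who work with survey exports, the kind of disaggregation Pace performed can be sketched in a few lines of pandas. The data, column names, and values below are hypothetical, purely for illustration; they do not reflect Pace's or Listen4Good's actual schema or results.

```python
import pandas as pd

# Hypothetical survey responses; "recommend_score" stands in for a
# 0-10 likelihood-to-recommend rating. All values are invented.
responses = pd.DataFrame({
    "race": ["Black", "White", "Hispanic", "Black", "White", "Hispanic"],
    "recommend_score": [6, 9, 7, 5, 10, 8],
})

# Disaggregate: average rating and response count per group, so that
# differences in experience between groups become visible.
by_race = responses.groupby("race")["recommend_score"].agg(["mean", "count"])
print(by_race)
```

The same pattern extends to any relevant variable (age band, gender, program site) by changing the grouping column, and small group counts signal where follow-up discussions matter more than the averages themselves.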
Such analysis is also growing across large-scale evaluation shops. For example, RTI International is evaluating employment social enterprise (ESE) organizations. A first set of analyses revealed that participating in an ESE was associated with better employment and economic outcomes 18 months later. By disaggregating the data, the team confirmed that the benefits of ESE participation were equitable across racial, ethnic, and gender groups.
Lesson 3: Feedback not only empowers participants, but can also predict their outcomes
Feedback supports participants in owning their narratives and fosters empowerment. This, we are learning, can actually advance outcomes.
After seeing how feedback sparked program improvements, our collaborator, the Center for Employment Opportunities (CEO), began to change its own decision-making structures to share more power with participants. In 2019, it appointed former participants to advisory committees across the United States, and a year later, to its board of directors. With a research grant from Fund for Shared Insight, CEO began exploring how participants’ input connected to obtaining permanent jobs, and learned that those who gave feedback during the first four weeks of the program were more likely to meet the organization’s job search and placement goals three and six months later. CEO’s practices, aligned with principles of equitable evaluation, created a predictor of outcomes.
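In spirit, the association CEO explored can be sketched as a simple rate comparison: placement rates among participants who gave early feedback versus those who did not. Everything below is invented for illustration and does not represent CEO's actual data, methods, or findings.

```python
import pandas as pd

# Hypothetical participant records: whether each person gave feedback in
# the first four weeks, and whether they met a placement goal at 3 months.
df = pd.DataFrame({
    "gave_early_feedback": [True, True, True, False, False, True, False, False],
    "placed_at_3_months":  [True, True, False, False, True, True, False, False],
})

# Placement rate within each feedback group; a gap between the two rates
# is the kind of signal that would warrant more rigorous follow-up analysis.
rates = df.groupby("gave_early_feedback")["placed_at_3_months"].mean()
print(rates)
```

A comparison like this only surfaces an association; establishing that early feedback predicts outcomes, as CEO did, requires larger samples and controls, which is what the funded research made possible.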
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to firstname.lastname@example.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.