Hello! We are Jessica Saucedo and Tatiana Elisa Bustos, PhD students studying community psychology with interests in health equity research, implementation science, and equitable evaluation with Indigenous and Latinx communities.
We wanted to share our experiences completing community-engaged evaluation internships as first-generation doctoral students working with marginalized communities. We worked on a prior evaluation project that aimed to increase racial equity in Lansing, Michigan. We’ve also participated in other evaluation projects, including Building Bridges, a program designed to increase college retention among Latinx students, and a training evaluation of a culturally responsive classroom assessment scoring system for Tribal Head Start programs. We want to share lessons learned and tips for other students working on evaluation projects for the first time.
Lesson Learned #1: Be Engaged!
Community-engaged evaluation projects require genuine relationship-building with partners. Creating strong foundations with our community partners over time has increased trust in the partnership and project productivity. As first-generation Latinas who grew up having strong ties with our own communities, we work best in trusting, positive environments. Building relationships included pausing the discussion on evaluation, sharing meals, attending community events, and shifting focus to real-world discussions on what it means to be a Latina navigating the nonprofit sector.
- Resources for staying engaged during COVID-19
Lesson Learned #2: Responsive, not Reactive
We’ve learned the value of staying responsive to our partners’ needs. In working with community organizers, we emphasize the importance of placing community needs first. Find a balance between the values and needs of your partner and those of the evaluation project. Aligning with the current needs of the project has led to more positive outcomes in our experience, even when those needs have shifted. This could mean switching frameworks midway through the evaluation to reflect the new direction of the project. For example, we’ve had to switch from developmental evaluation to advocacy-impact and community-organizing evaluation frameworks. Another example includes creating “alternative” reports that use visuals, blogs, or drawings to better engage Latinx teens.
- Here’s a resource library and strategies for maintaining partnerships that have worked for us.
- Resources on creating engaging reports and cultural responsiveness
Lesson Learned #3: Not everything will work out
Community-engaged evaluation is a wildcard! You can show up every day and give it your best, and it still may not work out as you expected. As first-generation Latina graduate students, we tend to question our abilities and lose faith in what we’re capable of doing once things go wrong. In an ideal world, we would want everything to be under control. We’ve had to remind ourselves and each other that we cannot push the process; we cannot control community context. We learned not to take things personally and to continue giving our best without compromising the quality of our work, even when that meant straying from project timelines.
- Reading about Evaluation Failures provided more insights into what can go wrong in evaluation projects and how common “failures” are in the field overall.
The American Evaluation Association is celebrating Latinx Responsive Evaluation Discourse TIG Week. The contributions all this week to aea365 come from LA RED Topical Interest Group members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
1 thought on “LaRED TIG Week: Lessons Learned from Community-Engaged Evaluations by Jessica Saucedo and Tatiana Elisa Bustos”
Thank you for this very personal posting. I’m working on a project for USAID to identify evaluations that have used participatory evaluation in the various phases of the evaluation process. Would you have some examples of evaluations you’ve studied or worked on that used participatory methods, and how did it work?