Hi, we’re Frances Carter-Johnson, Felicia Fullilove, Mark Leddy, Regina Sievert, Simone Soso and Eric Stone. We facilitated an Evaluation 2019 Think Tank entitled “Methods and Practices for Evaluation Capacity Building Projects in STEM Education Communities.” Our team included Data Scientists, AAAS Science and Technology Policy Fellows, Program Officers and an AEA GEDI in the Directorate for Education and Human Resources (EHR), Division of Human Resource Development (HRD). The team shared a mutual interest in broadening participation in STEM and building evaluation capacity in communities historically underserved and underrepresented in STEM.
Our team saw the need for a session where experts could work together to share high-level solutions. After analyzing evaluation capacity building projects supported by HRD core programs, such as the Alliances for Graduate Education and the Professoriate, the Historically Black Colleges and Universities Undergraduate Program, and the Tribal Colleges and Universities Program, we identified four common challenges. The needs and strategies related to each are presented as lessons learned:
1. Evaluation Resources
Need: Internal and external institutional support; culturally responsive evaluation training.
Strategy: Fund professional development in culturally responsive evaluation for evaluation researchers, encouraging both face-to-face and virtual training and relationship building.
2. Culturally Responsive Approaches
Need: Understanding that cultural competence is not a destination, but a journey.
- Evaluators view themselves as learners as well as knowers and invest in building community relationships (e.g., advisory boards, human-centered design, or cultural liaisons).
- Evaluators negotiate time to build relationships, sharing with the community how data will be used and its value to and for the community.
3. Collaboration and Continuity
Need: Money and time to conduct the evaluation, as well as planning for staff turnover and for how evaluation is viewed.
- Place collaboration at the forefront through evaluator/PI proposal partnerships focused on stakeholder buy-in, logic model development, and time to improve evaluation continuity.
- Engage PIs and community members with intimate knowledge of, or lived experience in, a culture as collaborators to help break down barriers and promote collaboration.
4. Expanding Evaluation Paradigms
Need: Recognition that emergent frameworks have a theoretical basis.
- Engage community liaisons/advocates to support removing bias.
- Funding agencies can identify and decrease bias against new and emerging frameworks for evaluation. Funders can also add cultural responsiveness requirements to requests for proposals.
Prioritizing culturally responsive evaluation, training stakeholders, and community building were common themes of the proposed strategies. We are committed to applying these lessons learned so that NSF continues to be Where Discoveries Begin in science, evaluation, and broadening participation in STEM. We look forward to hearing from you about innovative approaches to building evaluation capacity in communities historically underserved and underrepresented in STEM.
- Programs of Interest to Researchers and Evaluators in Broadening Participation in STEM at MSIs
- Learn more about NSF evaluation capacity building awards
The American Evaluation Association is celebrating STEM Education and Training TIG Week with our colleagues in the Science, Technology, Engineering, and Mathematics Education and Training Topical Interest Group. The contributions all this week to aea365 come from our STEM Education and Training TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.