Greetings, AEA365. Liz DiLuzio here, lead curator of the blog. Although today’s post is from the AEA365 archives, it is as relevant now as it was when it was originally posted in 2019. Whether this is your first read or your tenth, I hope you find something new and thought-provoking in this post.
Greetings AEA Colleagues! When we learned of the 2019 AEA conference theme, Speaking Truth to Power, we instantly knew our evaluative work with Historically Black Colleges and Universities (HBCUs) held meaningful resonance.
Our team includes: John Chikwem – NSF PI Lincoln University of Pennsylvania; Toks Fashola – American University; Kevin Favor – Lincoln University of Pennsylvania; and Monica Mitchell – MERAssociates, LLC.
In our STEM equity and access work, we have always recognized that although HBCUs represent only 3% of all US institutions of higher education (IHEs), they continue to defy the odds by producing a disproportionate share of the STEM bachelor’s degrees earned by African Americans. Our student focus groups echo the 2017 New York Times On Campus Opinion piece by a Spelman student, who recounted that her decision to attend an HBCU was based on its culturally affirming, high-expectations environment.
Institutional diversity does exist among HBCUs (e.g., public, private, small, large) and affords ample opportunities to conduct culturally responsive evaluation.
In our work as evaluators, we use culturally responsive strategies and frameworks to examine broadening participation interventions. While the availability of STEM evaluation instruments has been increasing, we often find ourselves adapting available instruments in order to explicitly address cultural responsiveness.
A new STEM Evaluation Repository is available on the AEA website to identify STEM-related instruments for adaptation. Remember, always seek permission to adapt from the instrument’s author/developer.
We adapted mentor and mentee versions of the Mentoring Competency Assessment (MCA) to assess the extent of cultural responsiveness in the mentoring of undergraduate STEM students at Lincoln University. We created a culturally responsive mentoring construct by combining existing survey items with items we developed. The new construct provided insight into the extent to which STEM mentoring accommodated different communication styles, considered cultural differences, and accounted for bias and prejudice. Unexpectedly, we found striking gender differences: female faculty mentors and female students were more attuned to cultural responsiveness than their male counterparts. This finding highlighted the need for STEM faculty development in culturally responsive mentoring, and the strengths of female STEM faculty can support the design and delivery of that development.
Culturally responsive evaluation in the STEM context is not meant to produce a silver bullet, but rather to provide greater insight and to frame additional, unforeseen but useful questions.
We acknowledge the need to develop validated culturally responsive instruments for evaluating STEM broadening participation interventions. In the meantime, we recommend instrument adaptation as a useful mechanism for conducting culturally responsive STEM evaluation.
The American Evaluation Association is hosting STEM Education and Training TIG Week with our colleagues in the STEM Education and Training Topical Interest Group. The contributions all this week to AEA365 come from our STEM TIG members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.