
MSI Fellowship Week: Strategies for Equitably Evaluating Research Productivity during COVID-19 by Erica T. Sosa

My name is Erica T. Sosa, and I am an Associate Professor in Public Health and Associate Dean of Research in the College for Health, Community and Policy at the University of Texas at San Antonio. I am also a very proud American Evaluation Association MSI Fellow.

As the COVID-19 pandemic shifted the way we live, changes in work modalities and research-related restrictions followed. As a public health researcher, I was happy to see us put public health and the safety of research participants ahead of the desire to sustain research productivity. However, I also realize the pandemic did not affect all faculty members equally. I read popular media articles on how the pandemic disproportionately burdened women due to the increased need for childcare at home and their larger role in providing that care. Yet several other factors likely shaped the pandemic's impact on research productivity in an academic environment. Briefly, we can think of individual factors, family factors, discipline factors, university capacity and policy factors, and geographic factors.

Beyond work/life balance issues, faculty members needed to set up home offices quickly to work remotely. Faculty members who primarily use secondary data for their research did not face the same restrictions on human subjects research as faculty who collect primary data, especially those who work with populations that lack internet access. University resources and accommodations also varied across the nation: while some universities were able to provide remote access and support to their staff, others lacked the infrastructure for an efficient transition, creating a gap in services. And as the focus of funding and publications shifted to COVID-19, faculty members who do research in a related field were able to benefit. These and other factors contributed to varied productivity among faculty members over the past year.

Lessons Learned:

Evaluating policies implemented to address reduced research productivity must be as complex and multi-level as the factors that impede productivity. Through the MSI program, I improved my skills in using a culturally responsive and equitable evaluation approach. My mind kept coming back to how I could use this framework to evaluate these policies within the context of the sub-cultures we have in academia, which vary by university, discipline, and research support. Culturally Responsive Evaluation places culture and community at the center and appreciates culture and context when evaluating programs. I am now focusing on incorporating this lens to evaluate how the policies we implemented helped, or perhaps did not fully address, preexisting inequities exacerbated by the pandemic.

Get Involved:

I believe the pandemic provides us a unique opportunity to highlight the importance of culturally responsive evaluation and its role in advancing social justice. By shaping our strategies accordingly, we can not only inform future policies but also shed light on the many ways we can support and improve research productivity among all faculty.

This is the American Evaluation Association's Minority Serving Institution (MSI) Fellowship Experience week. The contributions all this week to aea365 come from AEA's MSI Fellows. For more information on the MSI fellowship, see this webpage: https://www.eval.org/Education-Programs/Minority-Serving-Institution-Fellowship/MSI-Fellows Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the aea365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.
