Hello, AEA365 community! Liz DiLuzio here, Lead Curator of the blog. This week is Individuals Week, which means we take a break from our themed weeks and spotlight the Hot Tips, Cool Tricks, Rad Resources and Lessons Learned from any evaluator interested in sharing. Would you like to contribute to future individuals weeks? Email me at AEA365@eval.org with an idea or a draft and we will make it happen.
Hello everyone! We are Adriana Cimetta and Rebecca Friesen with the Center for Educational Assessment, Research, and Evaluation at the University of Arizona. Many of the initiatives that we evaluate aim to increase retention of STEM majors, particularly those from underrepresented populations, through research experiences. Traditional lab apprenticeships are limited and usually reserved for upperclassmen, by which time many aspiring STEM students have changed majors or dropped out altogether. In response, many college science departments have sought to expand access to authentic research opportunities.
Lessons Learned
Effective undergraduate research experiences…
- Include both structure and autonomy. Although offering autonomy can increase motivation (Ryan & Deci, 2000), scaffolding is also necessary to foster learning and build confidence (Vygotsky, 1978).
- Emphasize the broader impact of the research. Conveying the meaningfulness of the work has been shown to increase interest in, and ultimately retention in, the subject area.
- Provide supportive relationships, both within peer groups and with mentors and instructors. Both theory and our own empirical results indicate that feeling a sense of relatedness in the course fosters retention.
Effective evaluations…
- Measure changes in sense of belonging in the scientific community, confidence in scientific skills, and other outcomes retrospectively, both quantitatively and qualitatively, to understand the extent and mechanisms of change. Retrospective measurement helps you avoid the ceiling effect associated with traditional pre and post surveys, which often fail to capture change. Use validated instruments such as the Persistence in the Sciences Survey (Hanauer et al., 2016) or the Scientific Identity Scale (Estrada et al., 2011) so you can compare your results nationwide.
- Administer the Laboratory Course Assessment Survey (Corwin et al., 2015) to determine if a course includes authentic research experiences.
- Consider the relational, political, discursive, and structural dimensions of power affecting participants and the program.
- Understand the importance of intersectionality and disaggregate results by student characteristics, including gender, race, ethnicity, and socio-economic status.
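As one concrete illustration of the disaggregation tip above, here is a minimal Python sketch (assuming pandas is available; the data, column names, and subgroup labels are all hypothetical) that computes retrospective pre/post change scores and summarizes them by demographic subgroup:

```python
# Hypothetical example: disaggregating retrospective pre/post survey gains
# by student demographics using pandas.
import pandas as pd

# Each row is one student, with retrospective "before" and "after" ratings
# of sense of belonging on a 1-5 scale (made-up data for illustration).
responses = pd.DataFrame({
    "gender":        ["F", "M", "F", "M", "F", "M"],
    "first_gen":     [True, False, True, True, False, False],
    "belong_before": [2, 3, 2, 1, 3, 4],
    "belong_after":  [4, 4, 3, 3, 4, 5],
})

# Change score for each student.
responses["belong_gain"] = responses["belong_after"] - responses["belong_before"]

# Disaggregate: mean gain and subgroup size for each characteristic.
by_gender = responses.groupby("gender")["belong_gain"].agg(["mean", "count"])
by_first_gen = responses.groupby("first_gen")["belong_gain"].agg(["mean", "count"])

print(by_gender)
print(by_first_gen)
```

Reporting the subgroup size alongside each mean, as above, helps readers judge how much weight a subgroup's result can bear before drawing conclusions.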
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.