We are Callie Dean of CYBER.ORG, Kimberly Cowley of ICF, and Tom Withee of Goshen Education Consulting. We serve as internal (Callie) and external (Kim and Tom) evaluators of three federally funded programs, each addressing a different sector of the STEM educational ecosystem: pre-K-12 schools, extracurricular initiatives, and higher education. The Federal STEM strategy prioritizes the development of shared metrics and definitions throughout this ecosystem in order to track success, while recognizing that metrics and approaches will necessarily vary based on contextual factors. This post highlights three programs and the ways they have engaged with existing instruments to measure common outcomes: STEM self-efficacy and career awareness.
CYBER.ORG hosts an annual institute for K-12 teachers to learn to integrate cybersecurity into their classrooms. One program goal is to increase teachers' awareness of different cybersecurity careers so they can introduce their students to relevant career paths. Because we initially struggled to find existing instruments specific to cybersecurity education (one notable exception being the Cybersecurity Engagement and Self-Efficacy Scale), we adapted items from the University of Phoenix's Cybersecurity Study to measure teachers' familiarity with specific job titles from the NICE Workforce Framework. We also adapted items from the Computer Science Attitude survey to develop a pre/post survey for teachers' students. This approach allowed us to build on an existing body of research while still measuring our highly specific program goals.
The First2 Network (First2) is a statewide alliance striving to improve the persistence of rural, first-generation STEM college students in West Virginia. First2 was initiated as a pilot in 2016 and received an NSF INCLUDES award in 2018. One way the network is building knowledge of STEM persistence is by examining the self-efficacy of students who participate in a two-week summer research internship prior to their first semester of college. As part of the external evaluation of First2, students complete an online pre/post survey at the beginning and end of their internship; the survey includes subscales focusing on STEM career, STEM efficacy, STEM identity and plans, and school belonging.
STEMKAMP is an informal STEM education opportunity funded by DoD NDEP to increase STEM career awareness and identity in military-connected students in 10 K-12 public schools across the nation. The camp focuses on showcasing STEM professionals who represent minority groups. To assess the self-efficacy of KAMPers, we are utilizing both the Growth Mindset scale and the S-STEM survey. These instruments were chosen because they are recognized across the STEM education community for their psychometric properties and ease of implementation. They have been used in a variety of STEM-related programs, demonstrating that their language is both accessible and precise. Furthermore, using these instruments allows us to compare our results with those of similar programs using the same measures.
Don't reinvent the wheel. Look for instruments that measure the same or similar goals as your program. Most programs' goals align with broad themes across STEM education, and many authors are happy to share their work.
Check out online evaluation repositories from Informal Science, the U.S. Department of Education, and the STEM Learning and Research Center (STELAR) to find survey instruments, templates, reports, and more.
The American Evaluation Association is hosting STEM Education and Training TIG Week with our colleagues in the STEM Education and Training Topical Interest Group. The contributions all this week to aea365 come from our STEM TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to firstname.lastname@example.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.