Sarah Hug on Evaluating STEM Pipeline Programs

My name is Sarah Hug and I am a Research Associate with the Alliance for Technology, Learning and Society (ATLAS) at the University of Colorado at Boulder. I am going to give a few tips regarding “pipeline evaluation” for programs aimed at increasing enrollment and diversity in science, technology, engineering, and math (STEM) fields.

Pipeline programs aim to change the career trajectories of young people in the long term. These goals for youth often lie beyond the scope of evaluation and program timelines; for example, an academic science program targeting middle school students will not be able to collect career data for at least five years, once those students have graduated from high school. What should an evaluator study in the meantime?

  1. Student aspirations and interest: Evaluators can focus on student interest in the STEM fields, and their changing or continuing aspirations for STEM careers.
  2. Student knowledge of the fields: Knowledge about careers in technical areas is essential for advancement in STEM, particularly for underrepresented and under-resourced students. Evaluators can focus on changes in program participants’ career awareness at all academic stages. Elements of career awareness evaluators might measure include: knowledge of the depth and breadth of science careers, knowledge of what scientists do, and familiarity with the level of education needed to attain specific careers.
  3. Students’ “next-step behaviors”: Students’ college and career readiness can be influenced by early academic experiences. Discover what students might do at each level of the academic pipeline to further their STEM career readiness. This is often specific to context and even to school district; for example, are there clubs or camps students might join to advance toward STEM careers? What high school course choices could indicate students’ preparation for STEM careers? It is particularly important to consult program directors and school partners to gather ideas for measuring “next-step behaviors”. Comparing your program findings to local or national data for similar groups is essential for showing program impact.
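Interim measures like the ones above often reduce to a simple paired pre/post comparison on a survey scale. As a minimal sketch in Python (the survey item, the 1–5 Likert scale, and all numbers below are illustrative assumptions, not data from any actual program):

```python
# Hypothetical sketch: summarizing pre/post change in a career-awareness
# survey score, one of the interim measures an evaluator might track.

def mean(xs):
    return sum(xs) / len(xs)

def pre_post_change(pre_scores, post_scores):
    """Return (pre mean, post mean, mean gain) for paired student scores."""
    if len(pre_scores) != len(post_scores):
        raise ValueError("pre and post lists must be paired by student")
    gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
    return mean(pre_scores), mean(post_scores), mean(gains)

# Illustrative 1-5 ratings of "I know what scientists do in their jobs"
pre = [2, 3, 2, 4, 3]
post = [3, 4, 3, 4, 4]
pre_mean, post_mean, gain = pre_post_change(pre, post)
print(f"pre={pre_mean:.2f}  post={post_mean:.2f}  mean gain={gain:+.2f}")
```

In practice the interesting step is the last one in the post: putting the gain next to a local or national benchmark for a comparable group, rather than reporting the change in isolation.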

Rad Resource: The National Center for Education Statistics provides national data on student enrollment, graduation, and academic behavior at all levels of the education pipeline. Check out “The Condition of Education 2010” report, and the data exploration tools, to understand how students are progressing at different stages of the pipeline.

This contribution is from the aea365 Daily Tips blog, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to

2 thoughts on “Sarah Hug on Evaluating STEM Pipeline Programs”

  1. The real trick is to find and/or develop appropriate measures or assessments for the grade level. Anyone got a pre-post assessment of engagement in science for kindergarten?

  2. Hi Sarah, thanks for posting.

    I work for a research consultancy in London, UK called FreshMinds. We’ve done some v. similar work at this end for the National Audit Office, modelling the UK STEM pipeline (primary and secondary data, qual and quant). We’re also doing four European country STEM pipeline profiles (UK, Sweden, Italy and Ireland) for EngineeringUK.

    I’d love to find out more about your work and share details of ours. Please drop me a line at if you’re interested.

    Kind regards, Louis
