AEA365 | A Tip-a-Day by and for Evaluators

April 6, 2016

CREATE Week: Jim Van Haneghan and Jessica Harlan on Interest in STEM: Lessons Learned

We are Jim Van Haneghan (Professor at the University of South Alabama) and Jessica Harlan (Senior Program Evaluation Specialist at the Johns Hopkins University School of Medicine). Over the past several years we have been studying a middle school integrated STEM curriculum called Engaging Youth through Engineering (EYE). We have identified one element of impact that is much more complicated to determine than initially thought: the influence of the program on student interest in STEM.

We would like to share two lessons learned from our work. First, interest develops through different stages, and it is challenging to measure these as distinct phases. Hidi and Renninger (2006) differentiate more fleeting situational interest, which has an external locus of causality, from sustained interest, which has an internal locus of causality. Considering students’ specific level of interest is important for evaluators because students involved in STEM programs (especially ones where students can choose whether to participate) may need a program to address “interest” differently depending upon their phase of interest development. A program that creates initial interest may differ in focus and impact at each interest level from a program that sustains interest. Additionally, when designing assessments of “interest,” evaluators need to go beyond items that ask only about initial interest.

The second lesson is that, when looking at interest’s role, the program being evaluated is only one of many influences that might facilitate or detract from students developing sustained interest. For example, our EYE modules spanned 6th, 7th, and 8th grade for the students we examined, but represented at most about 5% of the days of an entire middle school career. While the modules might have led to stronger interest in STEM, as we continue to investigate EYE in a larger-scale study we have to ask whether EYE’s effects could be moderated by other factors (e.g., other STEM opportunities, having high- or poor-quality teachers in STEM areas, or a poorly sequenced or ill-developed curriculum in regular math and science). When we asked students which had the greater impact on their interest in STEM, EYE or their regular math and science classes, most agreed their regular classes had the greater impact. Failure to consider these other factors could lead evaluators to make a Type III error: erroneously attributing differences between groups to program participation rather than to other factors.

Rad Resources:

The National Academies Press has several books that helped us frame important questions about EYE:

The American Evaluation Association is celebrating Consortium for Research on Educational Assessment and Teaching (CREATE) week. The contributions all this week to aea365 come from members of CREATE. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

1 comment

  • Ashleigh King · April 7, 2016 at 12:21 pm

    Thanks for this post! I am currently consulting on several evaluations that intend to assess changes in students’ “interest” in STEM. Are there any resources you’ve found particularly helpful regarding the measurement aspect of this work (i.e., best practices, validated instruments, and/or pitfalls to avoid with respect to measuring youth interest in a particular discipline/field)?
