STEM Education & Training TIG Week: Issues in Scaling out a STEM Summer Camp for Students in JROTC by Jim Van Haneghan and David Reider

We are Jim Van Haneghan (Professor and Chair, Dept. of Counseling and Instructional Sciences, University of South Alabama) and David Reider (Principal Partner with eDUCATIONDESIGN).

Over the past several years we have been part of a project examining a STEM summer camp that combines STEM and workforce development considerations for a diverse and understudied population: students in JROTC programs. Both grassroots interest from district JROTC programs and top-down interest from the armed forces led to the development of STEM programming. The program was initially locally funded but was systematically examined through funding from the National Science Foundation (NSF). It involved integrating STEM challenges and workforce development into a traditional JROTC summer leadership camp. Two cycles of small-scale experimental trials showed significant short-term impacts on STEM content knowledge, confidence in 21st century skills, and knowledge of local STEM workforce opportunities. Long-term impacts were mixed, varying across different aspects of this complex intervention.

Part of the work done during the NSF project involved developing a logic model for replicating the project elsewhere. That opportunity was realized in a project funded by the Department of Defense to move the camp to additional JROTC programs. The program involves several components that require local input to tie it to local STEM workforce development, along with a standardized curriculum that integrates STEM challenges and traditional JROTC activities. In the district where the program originated, a transformational leader worked well with district administration, local businesses, foundations, state and regional career technical education organizations, and the national JROTC organization. When we started looking at transporting the model to other JROTC programs, we found that this web of relationships and district support was not always present. Further, our vision of the program involved tailoring it to local and regional STEM-related industries. Hence, we needed models for evaluating not only program outcomes, but also the various people, systems, and contexts involved in transporting the program.

Rad Resource

One useful perspective we found arose from work in implementation science. Aarons and colleagues discuss "scale-out" for evidence-based health programs (https://implementationscience.biomedcentral.com/articles/10.1186/s13012-017-0640-6). They emphasize "scaling out" rather than "scaling up" (which they view as viable only when an intervention is simply replicated in similar settings). They discuss expansion through a focus on similarities and differences between the populations served and the systems of activities that are part of the intervention. We found the work of Aarons and colleagues useful in conceptualizing the evaluation of the STEM camp scale-out.

Lessons Learned  

Aarons and colleagues focus on different aspects of scale-out depending on the population and the delivery system context. In our work, we are focusing on the delivery system context, because multiple systems are involved in the program: the local JROTC programs, the school districts, and community workforce needs all vary as the program is developed in new locations. Consequently, we are examining both the characteristics of the local implementation context and the ways in which those characteristics translate to new sites. For example, in contrast to the originating district, the districts that initially dropped out of the expansion project may not have had the persistent leadership that enabled the camp's development and long-term implementation. There were also district-level issues (e.g., different relationships between the JROTC programs and the districts they served). Using this framework helped us think through how to systematically examine the camp as it is extended and tailored to differing organizational, school, and community contexts.

The American Evaluation Association is hosting STEM Education and Training TIG Week with our colleagues in the STEM Education and Training Topical Interest Group. The contributions all this week to AEA365 come from our STEM TIG members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.
