I’m Andrew Hayman, Research Analyst at Hezel Associates and Project Leader for Southern Illinois University Edwardsville’s National Science Foundation (NSF) Innovative Technology Experiences for Students and Teachers (ITEST) project, Digital East St. Louis.
The ITEST program was established in 2003 to address shortages of technology workers in the United States, supporting projects that “advance understanding of how to foster increased levels of interest and readiness among students for occupations in STEM.” The recent revision of the ITEST solicitation incorporates components of the Common Guidelines for Education Research and Development to clarify expectations for research plans, relating two types of projects to that framework:
- Strategies projects develop new learning models; their research plans should align with Early-Stage/Exploratory or Design and Development studies.
- Successful Project Expansion and Dissemination (SPrEaD) projects build on an intervention with documented successful outcomes that warrants further examination and broader implementation, making them well suited to Design and Development or Impact studies.
Integration of the Common Guidelines into the NSF agenda presents opportunities for evaluators with research experience, because grantees may not have the internal capacity to fulfill research expectations. Our role in a current ITEST Strategies project includes both research and evaluation responsibilities designed to build our partner institution’s research capacity. To accomplish this, we carry significant research responsibilities in Year 1 of the grant, including on-site data collection; these decrease annually until the final grant year, when we serve as a research “critical friend” to the grantee.
At a recent ITEST conference, I presented on our dual research and evaluation role to an audience made up primarily of evaluators. As expected, some questioned whether we can serve effectively in both roles, while others, including NSF program officers, supported the model. These differences of opinion about research responsibilities among ITEST stakeholders suggest it may take time for evaluators to carve out a significant research role in ITEST. However, NSF’s commitment to rigorous research as framed by the Common Guidelines, coupled with the limited research capacity of some institutions, suggests possibilities for partnerships.
Lesson Learned:
- Define research responsibilities clearly for both the institution and the evaluators. Separating research and evaluation activities is critical, with distinct study protocols, instruments, and reports mapped out for the entire project. A third party may be needed to evaluate the research partnership itself.
Rad Resource:
- The STEM Learning and Resource Center (STELAR) offers a great repository of ITEST project materials, including research guidance and other resources.
The American Evaluation Association is celebrating Research vs Evaluation week. The contributions all this week to aea365 come from members whose work requires them to reconcile distinctions between research and evaluation, situated in the context of STEM teaching and learning innovations. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.