Hi Everyone. I am Melissa K. Demetrikopoulos, Ph.D., Chair of the Division of Program Development and Assessment at the Institute for Biomedical Philosophy. My research interests include partnership formation, broadening participation, scientific literacy, and student success in research. I provide professional development training on inter-institutional collaborations, building consensus, and partnership formation. Just as it is important for the various stakeholders of a STEM education project to work within a collaborative partnership, it is critical for the evaluator to have regular access to program staff to be able to support an iterative and ongoing evaluation process.
Lesson Learned #1: Everyone is busy
While approaches, requirements, and budgets vary, the National Science Foundation (NSF), National Institutes of Health (NIH), Centers for Disease Control and Prevention (CDC), and Department of Education all require evaluation of their grant-funded STEM education projects. Some PIs and project staff understand the importance of evaluation to their work, while others view it as a distraction and a drain on their budget. The evaluation effort should be integrated with the research effort both to reduce participant burden and to engage program leadership in evaluation.
Lesson Learned #2: Change happens
Grant proposals are just that: an attempt to map out what might occur over a 3-to-5-year timeline. Once the STEM education program begins, more detailed decisions need to be made, logistics need to be worked out, and participants and staff need to be engaged. Inevitably, this leads to slight or major shifts from the original proposal that will continue over the multi-year course of the program. Having regular standing meetings between the evaluator and the program manager/coordinator provides ongoing opportunities to collect process evaluation data, as well as ongoing opportunities to reexamine the evaluation plan, timing, and instruments so the project can be responsive to the changes.
Hot Tip #1: Develop Program Data Collection Tools
Provide custom data collection tools to the program manager/coordinator to report on program data and capture their approach and success for each stage of the project (e.g., recruitment, selection, and attrition of participants; engagement of partners; calendar of events). I have found that Excel and Word documents work well for this effort, and that it is important to meet and review the tools both before they are implemented and after data collection. These program data collection tools should capture data at least quarterly and should include data that will allow you to assess how well the program meets its goals and objectives.
Hot Tip #2: Communicate Regularly
Most STEM education projects require the submission of an annual evaluation report to NSF, NIH, CDC, or Dept of Ed funders, but benefit from more regular evaluation feedback to ensure that evaluation is an iterative and ongoing effort. I have found that both of these needs can be met without duplication of effort by producing separate reports on the phases of the project that can then be integrated into an annual report. I have also found it helpful to give a brief (20-minute) evaluation update presentation to all of the program staff at a project staff meeting at least annually, and ideally semiannually.
It is critical to maintain a balance between the objectivity of being an external evaluator and having sufficient access to a program to be able to evaluate it effectively and make data-based recommendations for ongoing improvement. Building a strong working relationship with program staff, such as program coordinators and project managers, ensures more complete capture of program data over the course of the year, resulting in a more accurate evaluation effort and more useful and timely recommendations.
The American Evaluation Association is hosting STEM Education and Training TIG Week with our colleagues in the STEM Education and Training Topical Interest Group. The contributions all this week to AEA365 come from our STEM TIG members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.