My name is Rachel Becker-Klein and I am an evaluator and a Community Psychologist with almost a decade of experience evaluating programs focused on STEM (science, technology, engineering, and math) education, citizen science, place-based education, and climate change education. I’ve worked with PEER Associates since 2005. PEER Associates is an evaluation firm that provides customized, utilization-focused program evaluation and educational research services for organizations nationwide.
Recently I have become very interested in how to measure the impacts of climate change education and other STEM programs on youth. These programs often use out-of-school time and outdoor settings to teach important content and skills, so traditional surveys and standardized tests may not be appropriate ways of assessing youth learning. Embedded assessment offers an innovative way of capturing student content knowledge, skills, and science dispositions that can complement traditional standardized tests and surveys in formal educational settings. Embedded assessments are one form of alternative assessment and can be defined as “opportunities to assess participant progress and performance that are integrated into instructional materials and virtually indistinguishable from day-to-day program activities” (Wilson & Sloane, p. 181). This assessment technique allows learners to demonstrate their STEM and climate change competence in informal settings without undermining the voluntary nature of learning in such settings.
Lessons Learned: While there is considerable interest in embedded assessment (gauged from conversations with other evaluators and a review of the assessment literature), few published articles examine these assessment strategies for their validity, reliability, or correlation with more traditional assessment techniques. There is a strong need to understand more about how embedded assessment approaches can be used for climate change and STEM education.
Designing embedded assessments is challenging and time-intensive, and it requires close collaboration with program staff to ensure that the task is truly embedded in the program activities. When done well, however, it is worth the effort.
Cool Trick: One example of an embedded assessment comes from a project PEER evaluated that taught youth how to develop and create their own Augmented Reality (AR) games. To assess their AR game development skills, we created an AR challenge activity for them to complete, along with an accompanying rubric that evaluators used to rate each participant’s skill level.
- Network with other evaluators at the AEA 2014 conference in Denver. Seek out evaluators like Karen Peterman, who has done a lot of thinking about and development of embedded assessments.
- Check out Assessing Student Work, a review of student assessment by the Rural School and Community Trust, or Mark Wilson and Kathryn Sloane’s paper, “From Principles to Practice: An Embedded Assessment System.”
The American Evaluation Association is celebrating Climate Education Evaluators week. The contributions all this week to aea365 come from members who work in a Tri-Agency Climate Education Evaluators group. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to firstname.lastname@example.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.