
CREATE Week: Tips for Navigating the Shifting Landscape of Educational Evaluation by Jim Van Haneghan

Happy New Year, readers! Liz DiLuzio here, lead curator of AEA365. We are excited to kick off 2022 with a “best of” week sponsored by the Consortium for Research on Educational Assessment and Teaching (CREATE). Every blog this week is a revival of posts with evergreen content that was so thought-provoking the first time around that we just needed to give it another day in the sun. We hope you enjoy.

My name is Jim Van Haneghan. I am a Professor in the Department of Professional Studies at the University of South Alabama and Past President of the Consortium for Research on Educational Assessment and Teaching Effectiveness (CREATE). CREATE is an organization focused on both educational assessment and educational program evaluation in the service of effective teaching and learning (createconference.org). Our group brings together practitioners, evaluators, and researchers at our annual conference. One of our main concerns has been the consequential validity of educational policies, classroom assessment practices, organizational evaluation, and program evaluation evidence. This concern is especially important in today's dynamic environment, where policy changes can alter the potential impact of a program and shift the nature of evaluation activity. The recent changes in the administration and in the Department of Education may require educational evaluators to be facile in adapting their evaluations to potentially radical changes. Hence, my goal in this post is to provide some tips for navigating the educational evaluation landscape over the next few years.

Hot Tips for Navigating the Shifting Sands of Educational Policies and Practices:

  1. Pay closer attention to contextual and system factors in evaluation work. Contextual analyses can call attention to potential issues that may cloud the interpretation of evaluation results. For example, when No Child Left Behind was implemented, a project I was evaluating that focused on a cognitive approach to teaching elementary arithmetic was changed. Instead of being able to concentrate on the intended program, the trainers and coaches shifted their attention to the specifics of how to answer questions on standardized tests. The new policy moved the focus from the intended program to testing. This problem of “initiative clash” has shown up many times over my career as an evaluator.
  2. Be vigilant of unintended consequences of programs and policies. Often there are unintended consequences of programs or policies. Some can be anticipated, whereas others cannot.

Rad Resource:  Jonathan Morell’s book Evaluation in the Face of Uncertainty provides a number of heuristics that can help evaluators anticipate and design their evaluations to address unintended consequences.

  3. Revisit and refresh your knowledge of the Program Evaluation Standards.

In an era of “fake news” and disdain for data, evaluators need to ensure that stakeholder interests are considered, that the data are valid and reliable, that the evaluation has utility in making decisions about and improving the program, and that an honest accounting of program successes and failures has been included. The mentality that only “winning” and positive results should be shared makes it difficult to improve programs or weed out weaker ones.

Rad Resources:  The Program Evaluation Standards and AEA’s Guiding Principles for Evaluators.

  4. Enhance efforts toward inclusion of stakeholders, particularly members of traditionally poorly served groups. Methods and approaches that take into account the perspectives of less empowered groups can help support equity and social justice in the context of educational policies and programs.

The American Evaluation Association is hosting Consortium for Research on Educational Assessment and Teaching (CREATE) week. The contributions all this week to AEA365 come from members of CREATE. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.
