Hi there! I’m Maggie Schultz Patel, an independent evaluation consultant and doctoral student in Research Methods and Statistics at the University of Denver.
Program evaluation is still a relatively new discipline, and there remains a need to deepen our knowledge of ourselves as a field. Further, according to a recent needs assessment, there are key research questions about the field that evaluators would like to see answered through systematic inquiry. By and large, these questions relate to research on evaluation’s impact, methods, and context. While debates over the merits of various methodological approaches are frequent and well documented, it is less clear how often those approaches are actually used or how methodological decisions are made in practice. Though we may each feel certain we know when and why particular methods tend to be used in the field, we also know the risks of generalizing too heavily from our own perspectives.
A systematic review of the peer-reviewed evaluation literature suggests that there have been few systematic examinations of methods use. As part of my dissertation research, I am developing a mixed-methods study to investigate whether there are observable trends in the use of methods and to explore how evaluators go about selecting them. The study will include a systematized review of evaluations published in the past ten years, potentially some latent growth curve modeling of observed trends to explore growth trajectories and inter- and intra-group differences, and interviews with practitioners to shed light on their decision-making processes when selecting evaluation methods. This research on evaluation is expected to have important implications for the field: the findings will document and contextualize the ebb and flow of methodological trends, and they will bolster future efforts to balance methodological rigor with client need. Ultimately, my hope is that the findings will help us move beyond favored methods by encouraging reflective and responsive practice in the field.
The American Evaluation Association is celebrating RoE TIG Week with our colleagues in the Research on Evaluation TIG. All of the blog contributions this week come from our RoE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to firstname.lastname@example.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.