Hello! My name is John LaVelle. I am on the Evaluation Studies faculty at the University of Minnesota, and one of the things I get to do is help my students conceptualize and implement original research on evaluation (RoE). We organize our projects by Mel Mark’s research on evaluation taxonomy: contexts, activities, outcomes, professional issues, and evaluators themselves. From there, once we have an idea of the topic and a few good questions, all we have to do is figure out how to answer our questions with good evidence. It is both exhilarating and daunting, much like evaluation practice itself.
Recently, my teams have been working to understand evaluators through a psychological lens. Who are we? What things do we believe are important? Do those beliefs and values relate to evaluation practice, and if so, how? At this stage, even descriptive data on evaluators are sparse, much less predictive data, so we are starting by describing the phenomenon as best we can and seeing how existing frameworks apply (or do not apply) to evaluators. Our first study was on the Impostor Phenomenon in evaluators, and we are now writing up a descriptive/predictive study of evaluators’ values (we hope to share this at AEA2020).
Hot Tip: I conduct a lot of research on evaluation/evaluators studies, and I try to find ways to compensate my participants. However, it doesn’t seem like giving an evaluator $5 or even $15 for participating in a study really makes a big difference to the individual. Instead, I tell my participants that I will donate to the AEA Student Support Fund on their behalf. I’ve donated close to $2,000 so far, and I know AEA would be thrilled if others would follow suit. Inquiries about matching funds should be sent to Zachary Grays.
Rad Resource: I don’t think it is necessary to reinvent the wheel every time I want to do a RoE study. Plenty of tools and measures are already available. A resource I particularly like is the book Taking the Measure of Work by Dail Fields, which contains dozens of validated scales that can be adapted to an evaluation context.
Cool Trick: I try to find ways to integrate research into the work I already get to do. For example, my students write reflective essays about their key learning experiences in evaluation, and after a few short years, I have built a collection of over 400 reflections I can use to improve my teaching. Easy data on teaching, learning, and anticipated retention!
Hot Tip: Ask questions and check your assumptions! Evaluation is a dynamic field filled with interesting people. Many of my research ideas start with concepts that I apply to evaluators (e.g., values – practice link), but I’ve also had success asking evaluators what they are interested in or what is challenging them in practice (e.g., Impostor Phenomenon).
What questions do you have about evaluation or about evaluators? Reach out, and perhaps we can collaborate!
The American Evaluation Association is celebrating RoE TIG Week with our colleagues in the Research on Evaluation TIG. All of the blog contributions this week come from our RoE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
1 thought on “RoE TIG Week: Integrating Me-search and Re-search by John LaVelle”
Great post, John! I also collect information and stories that students and new evaluators have shared with me over the years. We have so much to learn from both new and seasoned evaluators! Thank you also for the resource, Taking the Measure of Work by Dail Fields. I was unfamiliar with that book and have added it to my reading list for this summer. Thank you again for a great post!
Sondra LoRe, Ph.D.
Manager | National Institute for STEM Evaluation and Research (NISER)
Adjunct Professor | Evaluation, Statistics, and Measurement Program, Department of Educational Psychology & Counseling
The University of Tennessee, Knoxville
Office of Research & Engagement
114 Philander P. Claxton Education Building