Washington Evaluators Affiliate Week: Evidence-Based Medicine to Evidence-Based Policymaking by Esther Nolton

This week members of Washington Evaluators are sharing stories from their careers. From their pathways to evaluation to reflections from the field, these anecdotes, recommendations, and lessons learned remind us of the power of the story and the important storytelling role we play as evaluators.


My name is Esther Nolton, and I am the Immediate Past President of Washington Evaluators. Like many of us in the field, I identify as an “accidental evaluator”—a person who stumbled upon evaluation and stuck around. As I reflect on my journey to and through evaluation, I am amazed by how intentional and logical my path has actually been. I hope my story provides an example of the fascinating ways in which people become evaluators.

Both my bachelor’s and master’s degrees were in sports medicine. For a decade, I was a Certified Athletic Trainer practicing in collegiate settings. I was deeply committed to being a great clinician and to practicing evidence-based medicine. I had many questions about how we were making clinical decisions (e.g., readiness to return to play following an injury), decisions that often had major implications for short- and long-term outcomes (e.g., inability to walk up stairs 20 years later due to chronic knee pain). It bothered me that our policies and programs could affect athletes’ lives far more than our clinical decisions often accounted for.

My questions led me to pursue a doctorate in measurement methodology because I wanted to develop better clinical measures of readiness. What began as questions about return-to-school readiness following a concussion evolved into questions about other types of readiness (e.g., college and career readiness). I became steeped in psychometrics and studied certification and licensure testing (e.g., the Board examination for Athletic Trainers). Then I began asking questions about professional training and development, ranging from literacy to competencies to training program outcomes. A mentor commented, “You’re asking evaluation questions!”, which meant little to nothing to me since I was also teaching a course on Lower Extremity Evaluation at the time.

Soon, I learned this person was talking about a different type of evaluation: program and policy evaluation. I realized many of my previous questions were evaluative in nature, but I didn’t have the language or the knowledge to distinguish evaluation from (clinical) research. When I finally named my interests and decided to make this my new career path, I formally switched my doctoral studies to research & evaluation methods with a secondary specialization in health & education policy. I fell in love with evaluation as a tool for evidence-based policymaking. Now, I don’t just conduct and teach evaluations. I specialize in leading evaluation strategy, planning, and policy to help organizations be more proactive, better equipped, and more rigorous in their evaluative activities. While it may seem like I’ve traveled a great distance on an aimless trajectory, the truth is that my desire for better evidence is both what started me on this path and what keeps me moving down it as an evaluator.

Lessons Learned

  • There are many paths to evaluation, and no single route is more correct or better than the others. Being evidence-based or data-driven is a mindset that translates naturally to evaluation.
  • The unique backgrounds that we all bring to evaluation are what make the field beautifully diverse and wonderfully innovative. It’s essential to our practice to have this type of diversity and collaboration.
  • There are many different types of training opportunities to bolster your skills in research and evaluation. They range from “formal” to “non-formal” modes, and all are valuable!

Hot Tips

  • Plug Into Evaluation Communities – AEA (and its Topical Interest Groups), local affiliates, and other professional communities offer great opportunities to meet other professionals.
  • Conduct Informational Interviews – Build your network and knowledge by having conversations with evaluators to learn about their work, pathways to evaluation, and areas of expertise.
  • Practice and Learn from Experiences – Most of us learn best by doing. Try out your skills, especially in safer environments like internships and fellowships, then learn and grow from those experiences each time.
  • Consume and Produce Research on Evaluation – A great way to advance the profession is by conducting research studies on the field and practice of evaluation. These studies help us learn about the methods, contexts, circumstances, and professional issues related to evaluation that are important both to investigate and to apply in practice.

The American Evaluation Association is hosting Washington Evaluators (WE) Affiliate Week. The contributions all this week to AEA365 come from WE Affiliate members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.
