I’m Wendy Wolfersteig, Director of the Office of Evaluation and Partner Contracts at the Southwest Interdisciplinary Research Center (SIRC), Research Associate Professor at Arizona State University, and President of the Arizona Evaluation Network. I focus on evaluating effective prevention programs, so I often discuss evidence-based practice (EBP) and how to use it in community and government settings. I have explained EBP many times in many ways, and lately it is a hot topic.
It has taken years to bring the term “evidence-based” into the vocabulary of Arizona state and local government officials. The push for accountability over the past 8-10 years from federal, private business, and foundation sources has slowly but surely led officials to use words like evidence-based, or at least evidence-informed, in selecting programs to be funded.
Yet the fate of evidence-based decision-making was not clear as the year came to an end. When does evidence count as evidence? What is the evidence that something is a fact? How are science and evidence to be considered in practice and policymaking?
Even the use of EBP terminology was being questioned, with reports that government staff were encouraged not to use certain words, including “science-based.” Further, the National Registry of Evidence-Based Programs & Practices (NREPP), a database of prevention and treatment programs with evidence-based ratings, had its funding ended prematurely.
I gain hope from my graduate students when we discuss evidence-based practice – in practice. We talk about the research, when data count as facts, when programs account for participants’ cultural and other differences, and how to make these judgments. This focuses us on what research and evidence can and cannot determine, and on how we each make personal and professional decisions every day. We are left to ponder the outcome when the NREPP website says that “H.H.S. will continue to use the best scientific evidence available to improve the health of all Americans.”
- Relate, and avoid jargon. Put the reasoning for evidence-based evaluation and practice into the terms used by your client or potential client.
- Talk about desired outcomes, and about how the assessments, practices, programs, strategies, and activities selected would affect what happens.
- Ask questions before giving answers: Why do they want a specific strategy? How do they know whether it will work? Are they willing to keep doing “what we’ve always done” without evaluation or data showing that they are spending money and time in their clients’ best interest? Do they need data to show success? Who decides?
I learned a lot about professional efforts to enhance evidence-based decision-making by participating in the EvalAction 2017 visit to my local Congressperson’s office during the AEA Conference. Here are a few resources that came to my attention:
- Report of the Commission on Evidence-Based Policymaking, The Promise of Evidence-Based Policymaking, 2017.
- Evaluation Roadmap for a More Effective Government (2013); and the AEA letters in support of legislation to use scientific methods and practices.
- Keep the discussion on evidence-based decision-making going at the Arizona Evaluation Network’s 2018 From Learning to Practice: Using Evaluation to Address Social Equity conference taking place this April in Tucson, AZ.
The American Evaluation Association is celebrating Arizona Evaluation Network (AZENet) Affiliate Week. The contributions all this week to aea365 come from our AZENet members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to firstname.lastname@example.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.