Oregon Evaluators Week: Avoiding Bias and Maintaining Objectivity by Kristi Manseth and Regina Wheeler

Hello! We are Dr. Kristi Manseth and Regina Wheeler, Portland-based evaluators working for Pacific Research and Evaluation (PRE).  

The Oregon Legislature enacted House Bill 3499 (HB 3499) in 2015 to develop and implement a statewide education plan for English Learners (EL) in the K-12 education system. Forty Oregon school districts were identified in Spring 2016 as having the greatest need to improve outcomes for EL students. For the past year, our team at PRE has been working with the Oregon Department of Education (ODE) to evaluate HB 3499, both in terms of implementation efforts and outcomes for English Learners.

This project has reminded our team of how easy it is to become biased during our data collection activities and of the importance of including all stakeholders in evaluation efforts. When the evaluation was initially designed, it was heavily weighted towards the voice of the school districts receiving the HB 3499 funds but did not include any data collection with ODE staff or the EL advisory committee. The EL advisory committee is made up of HB 3499 stakeholders, including an EL parent representative, district stakeholders, representatives from advocacy groups and nonprofit organizations, and educators and community members who advocated for the passage of the house bill. We quickly learned that there was more to this evaluation than understanding the districts’ experience, how they used their funding, and how this money has impacted outcomes for English Learners. It is also about understanding the story behind HB 3499, how the law can be successfully upheld moving forward, how districts can be more fairly evaluated, and how ODE can effectively support these efforts.

Lessons Learned: Our team at PRE, like many other evaluators, has made it a practice to stop and explore our own biases and perspectives as a key step in our evaluation process. Although many of us were trained to be objective researchers, we understand that the perspectives we bring to the work cannot truly be separated from the evaluation. Expanding data collection efforts and allowing time to recognize and process our biases and perspectives has resulted in a more well-rounded and meaningful evaluation process.

Rad Resource: Check out MQP’s blog for more about confusing empathy with bias. Another fun resource we have been using for this project is Canva to create quick visually appealing deliverables. Canva is an affordable and user-friendly online application that allows those who are not savvy graphic designers (like us) to create visual content.

This week, AEA365 is featuring posts from evaluators in Oregon. Since Evaluation 2020 was moved from Portland, OR to online, a generous group of Oregon evaluators got together to offer content on a variety of topics relevant to evaluators. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

1 thought on “Oregon Evaluators Week: Avoiding Bias and Maintaining Objectivity by Kristi Manseth and Regina Wheeler”

  1. Greetings!
  My name is Shawn Skalinski, and I am currently doing my master's in the PME program at Queen's University. We are just finishing up a course called “Program Inquiry and Evaluation,” where I have, for the first time, learned about many aspects of the fascinating world of evaluation. One comment you made in this particular blog which caught my eye was, “Expanding data collection efforts and allowing for time to recognize and process our biases and perspectives has resulted in a more well-rounded and meaningful evaluation process.” Of course we learned about the importance of recognizing bias and some ways to help make an evaluation more credible and less biased; however, I am very curious about specific strategies or procedures you use during an evaluation that allow you to lessen the bias in an evaluation. Aside from including more than one evaluator in a program evaluation, what other specific actions can be taken, in your experience? I am looking forward to hearing your insights!
