I’m Sandy Horn, the Senior Educator Support Specialist for SAS® EVAAS®, which has been providing Value Added reporting for states, districts, schools, and teachers for more than twenty years.
Value Added analyses have become part of the accountability model in many states and districts in the past few years, due at least in part to the desire to bring a measure of fairness and reason to a system that has relied primarily on raising all students to certain levels of attainment, a practice that gives already advantaged schools an additional edge over those serving disadvantaged students. Certainly, not all so-called Value Added models are sophisticated enough to provide valid and reliable measures of student progress, but there are studies and papers that speak to that issue. I will only say that a few, of which EVAAS is one, have been found fully capable of providing those measures.
When the focus is on progress, as it is when looking at effectiveness through a value added lens, the playing field is leveled. A sufficiently sophisticated value added model can uncouple progress from demographics: although there is a direct correlation between Achievement and various demographic characteristics (poverty, the number of minority students, etc.), no such relationship exists between Progress and those characteristics. So, in addition to supporting accountability measures, value added can provide reporting that, used appropriately, can lead to improved progress for students, provided practitioners understand what the data has to offer.
Here are some things I’ve learned from thousands of presentations, training sessions, and interactive web conferences with district and school administrators and with individual educators and teams:
- Know your audience and adapt to its needs. One size does not fit all. Language and content appropriate for statisticians are of little use to a principal attempting to improve the progress of students in Biology or to a district-wide planning committee charged with addressing the needs of failing schools.
- Know the difference between data and policy. Ensure your audience knows it, too, should policy issues be broached in a discussion about data.
- Regardless of your audience, what they want to know is:
  - Is this fair and reliable?
  - What does it mean?
  - How can I use it to accomplish my goals?
  - How can I teach and encourage others to use it?
- Support is vital if data is to make a difference. People need a person to ask. Provide your contact information. Be responsive.
The American Evaluation Association is celebrating Consortium for Research on Educational Assessment and Teaching (CREATE) week. The contributions all this week to aea365 come from members of CREATE. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.