AEA365 | A Tip-a-Day by and for Evaluators


Greetings, I am June Gothberg, Ph.D., from Western Michigan University, Chair of the Disabilities and Underrepresented Populations TIG and co-author of the Universal Design for Evaluation Checklist (4th ed.). Historically, ours has been a 'working' TIG, collaborating with AEA and the field to build capacity for accessible and inclusive evaluation. Several terms describe our philosophy: inclusive, accessible, perceptible, voice, empowered, equitable, and representative, to name a few. As we end our week, I'd like to share major themes that have emerged over my three terms in TIG leadership.

Lessons Learned

  • Representation in evaluation should mirror representation in the program, yet this is often overlooked in evaluation reports. The example below, from a community housing evaluation, shows data that overrepresented some groups and underrepresented others.

 [Figure: HUD Participant Data Comparison]

  • Avoid using TDMs.
    • T = tokenism, or giving participants a voice in evaluation efforts but little to no choice about the subject or style of communication, and no say in how the effort is organized.
    • D = decoration, or asking participants to take part in evaluation efforts with little to no explanation of why they are involved or how their contribution will be used.
    • M = manipulation, or coercing participants into evaluation efforts. One example, presented in 2010, involved food stamp recipients who were required to answer surveys that included identifying information or lose eligibility for continued assistance.
  • Don’t assume you know the backgrounds, cultures, abilities, and experiences of your stakeholders and participants. If you plan for all, all will benefit.
    • Embed the principles of Universal Design whenever and wherever possible.
    • Utilize trauma-informed practice.
  • Increase authentic participation, voice, recommendations, and decision-making by engaging all types and levels of stakeholders in evaluation planning efforts. The IDEA Partnership depth-of-engagement framework for program planning and evaluation has been adopted in state government planning efforts across the United States.

 [Figure: IDEA Partnership Leading by Convening Framework]

  • Disaggregating data helps uncover and eliminate inequities. This example uses data from Detroit Public Schools (DPS). DPS is often in the news, cited as having dismal outcomes. If we compare state data with DPS data, do the outcomes really look dismal?

 [Figure: 2015-16 Graduation and Dropout Rates]

 

Disaggregating by one level would uncover some inequities, but disaggregating by two levels shows areas that can and should be addressed.

 [Figure: 2015-16 graduation and dropout rates for Detroit, disaggregated by gender]
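For evaluators working with similar outcome data, here is a minimal sketch of one- and two-level disaggregation in Python with pandas. The file and column names are assumptions for illustration, not the actual DPS data:

```python
import pandas as pd

# Assumed student-level outcomes file; names are illustrative.
df = pd.read_csv("grad_outcomes.csv")  # columns: district, gender, graduated (0/1)

# The statewide rate hides differences between districts.
print(df["graduated"].mean())

# One level of disaggregation: rates by district.
print(df.groupby("district")["graduated"].mean())

# Two levels: rates by district and gender, where gaps that a
# single-level view masks become visible.
print(df.groupby(["district", "gender"])["graduated"].mean())
```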

 

 

We hope you’ve enjoyed this week of aea365 hosted by the DUP TIG.  We’d love to have you join us at AEA 2017 and throughout the year.

The American Evaluation Association is hosting the Disabilities and Underrepresented Populations TIG (DUP) Week. The contributions all week are focused on engaging DUP in your evaluation efforts. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · · ·

My name is Catherine Callow-Heusser, and I'm the President of EndVision Research and Evaluation. I sometimes provide pro bono statistical services to a local school district that is one of Utah's highest-performing districts in elementary reading and math. The district builds student success despite having one of the lowest per-pupil expenditures in the country.

Hot tip: Underfunded districts appreciate expert help from credible evaluators! They rarely have the staffing resources to run anything beyond descriptive statistics on the large volumes of assessment data they collect. Get involved!

Hot tip: Combining multiple data files, potentially with different structures, is not trivial. For example, Dynamic Indicators of Basic Early Literacy Skills (DIBELS) data files typically have a row for each testing period (beginning, middle, and end of year) for each student, resulting in multiple rows per student, while state test data files may have one row per student.

Rad resources: SPSS allows me to quickly restructure data so each student is represented on only one row, and to merge data files based on selected variables (e.g., student_ID). I'll show you how at my AEA demonstration! For those working outside SPSS, a rough analogue is sketched below.
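Here is a minimal sketch of the same restructure-and-merge workflow in Python with pandas. The file names, column names, and period labels are assumptions for illustration, not the district's actual exports:

```python
import pandas as pd

# DIBELS export (long format): one row per student per testing
# period. File and column names are assumptions for illustration.
dibels = pd.read_csv("dibels.csv")  # columns: student_ID, period, composite_score

# Restructure so each student occupies a single row, with one
# score column per testing period (beginning/middle/end of year).
dibels_wide = dibels.pivot(index="student_ID",
                           columns="period",
                           values="composite_score")
dibels_wide.columns = [f"dibels_{p}" for p in dibels_wide.columns]
dibels_wide = dibels_wide.reset_index()

# State test export: already one row per student.
state = pd.read_csv("state_test.csv")  # columns: student_ID, sage_score, ...

# Merge the two files on the shared student identifier.
merged = state.merge(dibels_wide, on="student_ID", how="inner")
print(merged.head())
```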

Rad resources: Crosstabs show at a glance the percentages of students in each category, particularly when color coding is added to make the data more visible. The crosstabs table below is color coded to show cells where the two assessments are most mismatched (red) or reflect accurate predictions (green).

The crosstabs analysis showed that students in the lower left cells of the table (included below) did poorly on both the DIBELS Next assessment and on Utah’s Student Assessment of Growth and Excellence (SAGE) state assessment, while students in the upper left cells did well on the DIBELS but poorly on the SAGE assessment.

[Table: color-coded crosstab of DIBELS Next and SAGE proficiency categories]
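For readers who want to reproduce this kind of color-coded crosstab outside SPSS, here is a minimal sketch in Python with pandas; the column names and category labels are assumptions for illustration:

```python
import pandas as pd

# Assumed structure of the merged student-level file: one
# proficiency category per assessment (labels are illustrative).
merged = pd.DataFrame({
    "dibels_category": ["Below", "At/Above", "Below", "At/Above", "At/Above"],
    "sage_category":   ["Below", "Below", "Below", "At/Above", "At/Above"],
})

# Counts of students in each combination of categories.
ct = pd.crosstab(merged["dibels_category"], merged["sage_category"])

# Green where the two assessments agree (diagonal cells, i.e.,
# accurate predictions); red where they disagree.
def color_cells(table):
    css = pd.DataFrame("", index=table.index, columns=table.columns)
    for row in table.index:
        for col in table.columns:
            css.loc[row, col] = ("background-color: #c6efce" if row == col
                                 else "background-color: #ffc7ce")
    return css

ct.style.apply(color_cells, axis=None).to_html("crosstab.html")
```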

Lessons Learned: The visual display paired with matched data files helped district literacy specialists identify students not making adequate progress and plan successive interventions to help them become more proficient readers. Literacy specialists also used this evidence to push for higher reading goals for "bubble students," those near cutoffs, and to share concerns about the cutoffs with test developers.

Rad resources: These simple statistical techniques influenced district goals and interventions. Yet, too few districts have staff resources to do this. Be that resource! Train district staff or volunteer to help with data analysis! As evaluators who value providing evidence to improve programs, we can donate useful services that may change policies and help more program recipients succeed.

Rad Resource: Come to my session at Eval 2015.

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org. Want to learn more from Catherine? She’ll be presenting as part of the Evaluation 2015 Conference Program, November 9-14 in Chicago, Illinois.

Hi, I am Katherine Drake, Evaluation Specialist for the Saint Paul Public Schools, an urban district serving a diverse community of 38,000 students.  In my role as an evaluator, I regularly encounter requests for test scores and other data on our students.

Hot Tips:

Identify a contact person in the district office rather than approaching individual schools.  Many districts have research department staff who are knowledgeable about the available data sources and can give you proper guidance.  Most will require you to submit a research proposal for review (equivalent to a local Institutional Review Board, or IRB, process) and/or sign a data agreement or memorandum of understanding.

Do not ask for more data than you need.  However, do ask for all that you need in your initial request.  School districts have a great appreciation for efficiency and parsimony.

Detail how you will secure and protect the data.  Districts place a high value on research but an even higher value on protecting students.  Names, identification numbers, birthdates, and other identifying information will be stripped from data files before they are shared.  The Family Educational Rights and Privacy Act (FERPA) is the federal law that protects these privacy rights.

Consider the time demands that your request will have on school and district staff.  Try to design studies that do not add to the already full workloads of teachers and other staff by using existing data as much as possible.

Tie your request to the district’s mission statement and strategic plan.  If you can demonstrate that your research aligns with the district’s priorities, you are in a good position to gain entry to the organization.

Commit to sharing your research findings with the district upon completion of your report.  Educational institutions are genuinely interested in the results of research studies that may inform and guide their work.  Do not forget to include them in your report distribution list.

Rad Resources:

Details on the FERPA law:

Data Quality Campaign:  Using Data to Improve Student Achievement—Resource Library

Twin Cities Hot Tip:

When you are attending the conference in Minneapolis, do not forget that you are in just one of the “Twin Cities.”  Saint Paul has a vibrant arts community and several world-class museums.  Top your visit off with a meal at Heartland Restaurant, featuring regional cuisine prepared by James Beard Best Chef Midwest nominee, Lenny Russo.  For more ideas on things to do in Saint Paul, go to http://www.visitsaintpaul.com.

The American Evaluation Association is celebrating Minnesota Evaluation Association (MN EA) Affiliate Week with our colleagues in the MNEA AEA Affiliate. The contributions all this week to aea365 come from our MNEA members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
