AEA365 | A Tip-a-Day by and for Evaluators

April 7, 2016

CREATE Week: Doug Wren on Developing K-12 Performance Assessments

Greetings. I’m Doug Wren, Assistant Adjunct Professor at Old Dominion University and Assessment Specialist with Virginia Beach City Public Schools (VBCPS). Extensive state-mandated testing of public school students has been roundly criticized in recent years; at the postsecondary level, more than 850 four-year institutions across the nation—including Bowdoin College, Brandeis University, and George Washington University—accept applicants without ever seeing their ACT or SAT scores (see 850+ Colleges and Universities That Do Not Use SAT/ACT Scores). One shortcoming the ACT and SAT share with NCLB-era state tests is their multiple-choice format.

The use of performance assessment in Virginia K-12 classrooms has increased since 2014, due primarily to a law that reduced state-mandated multiple-choice testing and encouraged “age-appropriate, authentic performance assessments and portfolios with rubrics.” Beginning in 2008, VBCPS experienced an upsurge in performance assessment after the school board approved a strategic plan that called for the implementation of a balanced assessment system.

Lessons Learned:

  • Proceed with caution. Most educators do not have a comprehensive understanding of performance assessment. Recruit teachers who use performance assessments and rubrics regularly to help you with the development process. If you plan to use assessments and rubrics that were developed externally, find out if there is validity evidence for the assessment (i.e., does it measure what it’s supposed to measure?).
  • Conduct pilots and field tests. Regardless of whether you use someone else’s assessment or create your own, “try out” the assessment on a small sample of students before you administer it to a large group. Students will provide you with some of the best feedback you’ll ever receive. In addition, give the rubric a tryout by scoring sample student products. If your teachers have sufficient training and a common understanding of the rubric, independent scores from different teachers for the same products will be similar.
  • Revise, retest, and repeat. No assessment is perfect. Even established tests require modifications (e.g., the Stanford Achievement Test, now in its tenth edition). Consider every comment, suggestion, and critique from students, teachers, and scorers, and find an outside expert to review your work. Make revisions and conduct additional field tests before the assessment is rolled out to the masses. Based on your large-scale administration, is there anything you could change to make your assessment or rubric even better for the next go-around?
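The claim that well-trained scorers should produce similar independent scores can be checked quantitatively during a rubric tryout. As a minimal sketch (the teacher scores and the 1–4 rubric scale below are invented for illustration, not drawn from VBCPS data), two common inter-rater statistics are percent agreement and Cohen’s kappa, which corrects agreement for chance:

```python
# Hypothetical example: two teachers independently score the same ten
# student products on a 1-4 rubric. Percent agreement and Cohen's kappa
# give a quick read on whether scorers share a common understanding.

from collections import Counter

def percent_agreement(scores_a, scores_b):
    """Fraction of products on which both scorers gave the same rubric level."""
    matches = sum(a == b for a, b in zip(scores_a, scores_b))
    return matches / len(scores_a)

def cohens_kappa(scores_a, scores_b):
    """Chance-corrected agreement: (p_o - p_e) / (1 - p_e)."""
    n = len(scores_a)
    p_o = percent_agreement(scores_a, scores_b)
    count_a = Counter(scores_a)
    count_b = Counter(scores_b)
    # Expected chance agreement from each scorer's marginal score distribution.
    p_e = sum((count_a[k] / n) * (count_b[k] / n)
              for k in set(count_a) | set(count_b))
    return (p_o - p_e) / (1 - p_e)

teacher_1 = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]  # made-up scores for illustration
teacher_2 = [3, 2, 4, 2, 1, 2, 3, 4, 3, 3]

print(round(percent_agreement(teacher_1, teacher_2), 2))  # 0.8
print(round(cohens_kappa(teacher_1, teacher_2), 2))       # 0.71
```

There is no single cutoff for “similar enough,” but low kappa after training is a signal that the rubric language needs revision or the scorers need recalibration before a large-scale administration.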


The American Evaluation Association is celebrating Consortium for Research on Educational Assessment and Teaching (CREATE) week. The contributions all this week to aea365 come from members of CREATE. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

1 comment

  • Enes Selimovic · April 10, 2016 at 7:59 am

    An excellent article. I wish this type of testing would become mandated in Georgia. Thank you.

