
E. Jane Davidson on Evaluative Rubrics

My name is Jane Davidson and I run an evaluation consulting business called Real Evaluation Ltd. In my work, I advise and support organizations on strategic evaluation; provide evaluation capacity building and professional development; develop tools and templates to help organizations conduct, interpret, and use evaluations themselves; and conduct independent and collaborative evaluations and meta-evaluations.

Over several years of working with clients and reviewing (at clients’ request) disappointing evaluation reports, I have noticed several critically important elements that make or break evaluation work but are often missing from evaluators’ methodological toolkits.

Hot tip: Clients find it incredibly frustrating to wade through an evaluation report full of evidence and still be none the wiser at the end about whether the documented outcomes (let alone the entire program, policy, etc.) are any good or not. A key part of an evaluator’s work is to say clearly and explicitly how practically, educationally, socially, or economically (not just statistically) significant outcomes are (severally, and as a set). This is what makes evaluation ‘e-VALU-ation’!

Hot tip: A useful tool for generating real evaluative conclusions is an evaluative rubric. This is a table describing what different levels of performance, value, or effectiveness ‘look like’ in terms of the mix of evidence on each criterion. Grading rubrics have been used for many years in student assessment. Evaluative rubrics make transparent how quality and value are defined and applied. I sometimes refer to rubrics as the antidote to both ‘Rorschach inkblot’ (“You work it out”) and ‘divine judgment’ (“I looked upon it and saw that it was good”)-type evaluations.
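To picture the structure of such a rubric, here is a minimal sketch in Python. The criterion, performance levels, and descriptors are hypothetical illustrations, not drawn from the post; a real rubric would be developed with stakeholders and would describe the mix of evidence on each criterion in far more detail.

```python
# A minimal sketch of an evaluative rubric as a lookup table.
# All levels and descriptors below are hypothetical examples.

rubric = {
    "Excellent": "Clear evidence of strong, sustained outcomes for nearly all participants",
    "Good": "Evidence of worthwhile outcomes for most participants, with minor gaps",
    "Adequate": "Some evidence of outcomes, but uneven or modest in practical terms",
    "Poor": "Little or no credible evidence of practically significant outcomes",
}

def describe(level: str) -> str:
    """Return the descriptor for a given performance level."""
    return rubric[level]

print(describe("Good"))
```

The point of making the table explicit is transparency: anyone reading the report can see exactly how a judgment of ‘Good’ was defined before the evidence was weighed against it.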

Hot tip: Collaborative development of rubrics is a great way to get stakeholders thinking about how ‘quality’ and ‘value’ should be defined for the work they do. It helps build the evaluative thinking needed to generate, understand, accept, and use evaluation findings.

Rad resources:

This contribution is from the aea365 Daily Tips blog, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org. Want to learn more from Jane? She’ll be presenting as part of the Evaluation 2010 Conference Program, November 10-13 in San Antonio.

