PD Presenters week: John Owen on Rapid Response Evaluation: Concepts and Practice

Hello, I am John Owen, Principal Fellow at the Centre for Program Evaluation in Melbourne, Australia.

For quite a while I have been interested in how evaluation can influence the quality of social policy and program delivery.  The lesson I wish to convey is that, fundamentally, evaluation should be seen as a servant of good decision-making about policy and programs.

Mapping evaluation influence has been a key conceptual theme over the past decade or so.  Attention initially concentrated on use at the end of a program cycle in pursuit of external validity.  Here, influence can be thought of in terms of traditional approaches to evaluation practice.

However, we could claim that utilization within the traditional context has been problematic.  In many instances policies or programs are ‘one-off’, unlikely to be offered again by the sponsoring agency.

Lesson Learned: Sound evaluative enquiry has a greater chance of influencing a given intervention as it is being rolled out, because adaptations can be made in real time.  An obvious advantage is that evaluations of this nature can have immediate pay-off in terms of the implementation of an intervention and its effects on participants.  These are basic principles that underlie Responsive Approaches to evaluative enquiry.

Responsive Approaches adopt an epistemological stance that stands between that of traditional approaches, and that which underlies models more consistent with action research.

How does this stance affect evaluator roles?  Adoption of a Responsive Approach brings a range of challenges.  If you are thinking of becoming a Responsive Evaluator, you must be comfortable adopting roles that differ from those of traditional evaluation practice.

Hot Tip: A sound evaluation design is key to success.  You must decide:

  • How you will interact with the client over the period of delivery of the intervention, with the possibility of taking both active and passive stances at different stages
  • What methods of data collection are feasible, given time pressures associated with the need to affect program rollout
  • Whether the frame of reference you bring will be important in making decisions about the adequacy of program delivery
  • What ethical standards of practice will apply, given that findings may have implications for delivery personnel
  • The degree to which the client is disposed to use the information to make changes to the intervention during delivery.

These design issues should be negotiated and agreed upon with the client before data collection begins.

For most of these issues, there is a paucity of practical advice in the evaluation literature.

Rad Resource: The workshop I am offering, Rapid Response Evaluation: Concepts and Practice, at Evaluation 2013 in Washington, DC is designed to improve the knowledge of participants wishing to undertake evaluations consistent with a Responsive Approach.

This week, we’re featuring posts by people who will be presenting Professional Development workshops at Evaluation 2013 in Washington, DC. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
