AEA365 | A Tip-a-Day by and for Evaluators

July 19, 2010

CAP Week: Sandra Eames on Utilization Focused Evaluation

My name is Sandra Eames, and I am a faculty member at Austin Community College and an independent evaluation consultant.

For the last several years, I have been the lead evaluator on two projects from completely different disciplines. One of the programs is an urban career and technical education program, and the other is an underage drinking prevention initiative. Both programs are grant funded, yet they require very different evaluation strategies because of the reportable measures that each funding source requires. Despite the obvious differences between these two programs, such as their deliverables and target populations, they still have similar evaluation properties and needs. The evaluation design for both initiatives was based on a utilization-focused (UF) approach, which has universal applicability because it promotes the theory that program evaluation should make an impact that empowers stakeholders to make data-grounded choices (Patton, 1997).

Hot Tip: UF evaluators want their work to be useful for program improvement and to increase the chances of stakeholders utilizing their data-driven recommendations. Following the UF approach can keep your work from ending up on a shelf or in a drawer somewhere. Including stakeholders in the early decision-making steps is crucial to this approach.

Hot Tip: Begin a partnership with your client early on to lay the groundwork for a participatory relationship; it is this type of relationship that helps ensure the stakeholders utilize the evaluation. What good has all your hard work done if your recommendations are not used for future decision-making? This style helps to get buy-in, which is needed in the evaluation's early stages. Learn as much as you can about the subject and the intervention your client is proposing, and be flexible. Joining early can often prevent wasted time and effort, especially if the client wants feedback on the intervention before implementation begins.

Hot Tip: Quiz the client early about what they do and do not want evaluated, and help them determine priorities, especially if they are on a tight budget or short on time for implementing strategies. Part of your job as evaluator is to educate the client on the steps needed to plan a useful evaluation. Informing the client upfront that you report all findings, both good and bad, might prevent some confusion come final report time. I have had a number of clients who thought that the final report should include only the positive findings, and that the negative findings should go to the place where negative findings live.

This aea365 contribution is part of College Access Programs week sponsored by AEA’s College Access Programs Topical Interest Group. Be sure to subscribe to AEA’s Headlines and Resources weekly update in order to tap into great CAP resources! And, if you want to learn more from Sandra, check out the CAP Sponsored Sessions on the program for Evaluation 2010, November 10-13 in San Antonio.
