My name is Michael Quinn Patton and I am an independent evaluation consultant. That means I make my living meeting my clients’ information needs. That’s how I came to engage in Utilization-Focused Evaluation. Utilization-focused evaluation does not depend on or advocate any particular evaluation content, model, method, theory, or even use. Rather, it is a process for helping primary intended users select the most appropriate content, model, methods, theory, and uses for their particular situation. Drawing on the rich and varied menu of evaluation options, utilization-focused evaluation can include any evaluative purpose (formative, summative, developmental), any kind of data (quantitative, qualitative, mixed), any kind of design (e.g., naturalistic, experimental), and any kind of focus (processes, outcomes, impacts, costs, and cost-benefit, among many possibilities). Utilization-focused evaluation is a process for making decisions about these issues in collaboration with an identified group of primary users, focusing on their intended uses of the evaluation.
Hot Tip: Involve primary intended users in methods decisions. This enhances their understanding of the findings and their capacity to make sense of and use them. Different methods involve different timelines and require different amounts of resources; qualitative inquiry, for example, is especially labor-intensive because of the fieldwork involved. Methods decisions should therefore be deliberated and negotiated collaboratively, not made autonomously by the evaluator, despite his or her methodological expertise.
Hot Tip: Present methods options. For primary intended users to participate in methods and design deliberations and make informed decisions about priority evaluation questions and appropriate methods, the utilization-focused evaluation facilitator must be able to present the primary data collection options, their strengths and weaknesses, and what makes them more or less appropriate for the evaluation issues at hand.
Hot Tip: Qualitative inquiry is always on the menu of options. The evaluator needs to understand qualitative methods and be sufficiently proficient at conducting qualitative evaluations to present qualitative inquiry as a viable option and to explain its particular niche and potential contributions to the evaluation being designed.
Hot Tip: Keep up to date with new developments in qualitative inquiry. New directions include the use of social media for data collection and for reporting findings, increased use of visual data, and many new purposeful sampling options.
Rad Resources:
- Patton, M. Q. (2014). Qualitative inquiry in utilization-focused evaluation. In Goodyear, L., Jewiss, J., Usinger, J., & Barela, E. (Eds.), Qualitative inquiry in evaluation: From theory to practice (pp. 25–54). Jossey-Bass.
- Patton, M. Q. (2015). Qualitative research and evaluation methods (4th ed.). Sage Publications.
- Patton, M. Q. (2014). Top 10 Developments in Qualitative Evaluation for the Last Decade.
The American Evaluation Association is celebrating Qualitative Evaluation Week. The contributions all this week to aea365 come from evaluators who do qualitative evaluation. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Hi Michael,
I am currently taking a Program Analysis and Evaluation course as part of my Professional Master of Education degree at Queen’s University in Canada. We have been reading a variety of articles and resources from different evaluators in the field, and I have become a fan of your work 🙂 For one of the final assignments in our course, we have to respond to one of the blog posts on the AEA 365 site, so I chose yours!
We are supposed to write something thoughtful, so here goes…
Having watched your presentation “Utilization-focused evaluation for equity-focused and gender-responsive evaluations,” I reflected on the importance of involving primary users at every stage of the evaluation process. This is of particular relevance to my work as a teacher in a high school setting. For the most part, my work with evaluations will involve engaging with students and parents to identify and build on areas for development in different aspects of school life. One challenge I have experienced is how best to win the support of students and staff for initiatives that go against our traditional ethos. I work for a century-old, academically selective independent school, so there is a strong element of “if it has always worked okay, why change it?”
Do you have any advice on how to motivate and engage stakeholders in order to foster a collective growth mindset for future improvements?
Another aspect of your approach that struck a chord with me is your focus on the immediate usability of any recommendations. I read an additional blog post from this site to deepen my understanding of potential methods for this.
Post-Eval Action Plan Week: Enhance Utilization with Post-evaluation Action Planning by Kylie Hutchinson https://aea365.org/blog/post-eval-action-plan-week-enhance-utilization-with-post-evaluation-action-planning-by-kylie-hutchinson/
I was wondering whether you had any other suggestions for how to engage primary users in the action planning phase, to help them create their own road map for implementing any recommendations?
Thank you in advance – I am sure I will continue to revisit some of your work for ideas on future projects.
Kind regards,
Laura Ross