Greetings! I’m Katrina Bledsoe and I’m a research director at the Missouri-based DeBruce Foundation. The Foundation is currently working on starting a research institute that addresses issues related to education, community, and economic development. In my years of working with and in communities, I’ve found that qualitative inquiry is a foundational tool for providing a good overview and for telling a solid, systematic, and robust story. I use the theory-driven evaluation (TDE) approach quite a bit, and over time I’ve learned how to make full use of qualitative inquiry within that approach.
Hot Tip #1: Meeting with Stakeholders is Part of the Qualitative Inquiry Process. Many folks use meetings as a way to keep in touch, update funders and stakeholders, hammer out the contract and scope of work of the evaluation, and get a sense of the organization. But interactions and regular meetings with stakeholders also provide valuable qualitative information that can be used in the analysis of the data. They’re also great for developing the program theory. Meeting notes can capture an overall picture, describe key players and key concerns, and provide a qualitative baseline against which change can be tracked.
Hot Tip #2 and a Rad Resource: The ubiquitous logic model has morphed and changed over time. Logic models now can be, and often are, designed to represent the complex systems in which programs operate. The linear logic model is becoming a bit 20th century; the trend now is to show multiple levels, multiple inputs, and multiple outcomes all in one system. My colleague and AEA member Tarek Azzam of Claremont Graduate University has some great models that are more interactive; see http://interactiveconcepts.info/files/LACOE_Logic_Model_try_2.swf for an example.
Hot Tip #3: Norms and Values Undergird Any Theory-driven Evaluation, and A Good Graphic Facilitator Can Help Tease Them Out! Norms and values are really at the heart of an evaluation, and the TDE approach is often used to articulate the qualitative and descriptive norms and values of the community, the context, and the program itself. Strategies that get stakeholders working on a more descriptive and qualitative level can guide them to develop more accurate outputs and outcomes. Because people often operate visually, we’ve been using graphic facilitation to map the values and norms that stakeholders hold. A Rad Resource I’ve recently come across is ImageThink, a group devoted to understanding visual facilitation and providing related resources and services.
Rad Resource: Bledsoe, K. (2014). Qualitative inquiry within theory-driven evaluation: Perspectives and future directions. In Goodyear, L., Jewiss, J., Usinger, J., & Barela, E. (Eds.), Qualitative inquiry in evaluation: From theory to practice (pp. 77-98). Jossey-Bass.
The American Evaluation Association is celebrating Qualitative Evaluation Week. The contributions all this week to aea365 come from evaluators who do qualitative evaluation. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Hello Sheila and Katrina! Thank you for posting this blog regarding qualitative inquiry in the evaluation process. My name is Kyle, and I am currently learning program evaluation through Queen’s University. As part of the course, we are to connect with evaluators in order to expand our knowledge regarding evaluation design. I really appreciate the concept of meeting with stakeholders as a means of qualitative inquiry. I agree with you that it is an important evaluation step to build a positive relationship and gain some insight into the context of the program design. The context of the program is an integral component in determining the methods of evaluation and what data is important to report. I am curious as to what sort of qualitative evaluation you would suggest when evaluating the mature implementation stage of a program. Do you tend to use surveys, interviews, round table discussions, or meetings? How do you go about presenting that data?
Thank you for your discussion regarding logic models. It is interesting to learn about the evolution of the logic model. As logic models grow in complexity and comprehensiveness, are there any limitations to those models? Is it important to use different types of models when presenting to different stakeholders?
Thank you for the post and the information; the resources are very helpful! Hope to hear from you.
Kyle
The link for Tarek Azzam’s interactive logic models does not work. Can the post be updated with a working link? Thank you.