Hi! I’m Don Glass, a learning designer and developmental evaluator. I’ve had the privilege of serving on the Kennedy Center’s internal Research and Evaluation team for several years. Of special interest to me has been working with our collective impact initiative to develop shared measurement for equitable access to arts learning opportunities.
It has become very clear to me that this work is not the same as typical program evaluation because of variation in the system and continuous change across the network. The early collective impact work at Big Thought with WolfBrown laid the foundation for our field. Since then, FSG’s Collective Impact Forum has promoted shared measurement for strategic learning, and the Carnegie Foundation’s networked improvement science approach provides principles, processes, and tools for building a technical network hub to support learning and improvement.
- Understand the User-Based Problem in the System: Participatory and utilization-focused approaches have long advocated for the participation of the evaluation user in the process. Engage users in defining the problem and understanding it in the system. Use group interviews and tools like fishbone diagrams and process maps to better understand how something is or is not working in the system. Gather any available baseline data: numbers and stories are a good start!
- Organize Disciplined Inquiry to Systematically Build Know-How: Let’s admit that we don’t know how to do everything at the beginning. Set a well-intentioned, specific aim to address the problem, and then develop a provisional theory of action. Start with some well-informed change ideas, but then test, adapt, improve, or even abandon them based on user feedback and data from Plan-Do-Study-Act (PDSA) cycles. Protocols are great tools for making meetings into active data analysis and learning sessions!
- Value Productive Human Variance, and Reduce Non-Productive System Variance: Remove barriers to inclusion to value diverse voices, experiences, and perspectives around the problem and its solutions. Work to utilize human and cultural diversity as assets. Use disciplined inquiry to find the core practices that work for many, and then systematically pay attention to those in the margins to adapt the solutions so they work for everyone.
- Pay Attention to Relationships and Network Well-Being: Just as we are now paying more attention to social and emotional learning (SEL) in education, we need to pay more attention to it in networks. Doing collaborative work over time is hard. What can we learn about the health of our networks by paying attention to who knows whom, how ideas flow, and what the levels of interest, motivation, and persistence are toward the collective aims? Social Network Analysis could be our best friend here!
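The "who knows whom" question above can be made concrete with even a small tie census. Here is a minimal, dependency-free Python sketch (the names and tie list are hypothetical, purely for illustration) that counts each member's direct collaboration ties; dedicated SNA tools such as NetworkX add richer measures like betweenness centrality for spotting idea brokers:

```python
# A minimal sketch of one network-health check: counting each
# member's direct collaboration ties. Names and ties are hypothetical.
from collections import Counter

# Hypothetical "who collaborates with whom" tie list
ties = [
    ("Ana", "Ben"), ("Ana", "Cruz"), ("Ben", "Cruz"),
    ("Cruz", "Dee"), ("Dee", "Eli"),
]

# Degree: how many direct collaborators each person has
degree = Counter()
for a, b in ties:
    degree[a] += 1
    degree[b] += 1

# Members with the most ties often act as hubs for idea flow;
# members with very few may be at risk of disengaging.
for person, n_ties in degree.most_common():
    print(f"{person}: {n_ties} direct ties")
```

Even a simple count like this, repeated over time, can flag members drifting toward the margins of the network before they disengage.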
- More Than Measuring: Setting the stage for rethinking the focus of arts education evaluation with communities.
- Evaluating Complexity: Principles to consider when evaluating complex social initiatives.
- Evidence for Improvement: The role of the analytic partner in the hub of a Networked Improvement Community.
- Practical Measurement: A process for creating rigorous yet manageable measures using Item Response Theory.
- Strategic Learning: Protocols to support intentional group learning from data.
The American Evaluation Association is celebrating Arts, Culture, and Museums (ACM) TIG Week. The contributions all week come from ACM TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.