Evaluation without learning is akin to dutifully planting and watering a garden-to-be, without staying to reap the benefits of its harvest. My name is Elizabeth McGee (she/her), and I am the Founder and Senior Consultant at LEAP Consulting.
Throughout my evaluation career, I have witnessed learning efforts fall short because of the tendency to see learning as an item to be checked off a to-do list rather than as a process that requires time, attention, expertise, and depth.
I routinely see groups that focus too heavily on process (and not outcomes) or vice versa, groups that rely too heavily on quantitative measures without using qualitative measures to contextualize their findings, and groups that ask many learning questions but none that offer insight into root causes.
Surface learning and deep learning are concepts that grew out of the research that found that learners approach learning in different ways based on a series of internal and external motivating factors. In revisiting these concepts, I was struck by how often we as evaluators and learning practitioners facilitate learning processes that produce surface learning results.
| Surface Learning | Deep Learning |
| --- | --- |
| Learners focus upon details and parts of the information deemed important. | Learners look at the significance of what is being taught and attempt to make sense of it. |
| There is an emphasis upon memorizing pieces of information to signify comprehension. | Learners look for the overall meaning and attempt to process information in a holistic way. |
| Learners focus on unconnected facts instead of a holistic narrative. | Learners develop interpretations of the content by integrating it with existing knowledge and new thinking. |
Here are four ideas to get us started in transforming our evaluative learning practices to facilitate deeper learning.
1. Leave the blame game for children. When mistakes in our work materialize, assigning blame can be a natural impulse. But blame occupies the space that should be filled by analysis and by the desire to understand the structural and relational elements at play in a given problem.
2. Motivate the learning. Why should people want to learn about this topic? Why should they make time in their busy schedules to do this? Why should they want to be vulnerable and ask hard questions? Don’t underestimate the importance of building buy-in and motivation for learning within your learning group.
3. Ask different questions. Dissatisfaction with snacks at a community support group is a finding, but it’s not particularly profound. Don’t be afraid to ask deeper questions that get at more systemic causes of organizational and community challenges, such as poor leadership, power dynamics, lack of community engagement or relationship building, ineffective facilitative processes, lack of time and space for learning, hierarchy, and countless others. Working to address these findings could help to truly challenge the status quo.
4. Pay attention to your external and internal learning environments. Remember that the systemic causes of organizational and community challenges (referred to in #3) are at play not only in the context of the problem we are trying to solve, but also in the contexts in which the group of learners is learning. So, for example, be mindful of power dynamics that impact the problem in question (e.g., homelessness), but also those shaping the dynamics among the people involved in the learning process (e.g., key community players at a municipal town hall).
The American Evaluation Association is hosting Organizational Learning and Evaluation Capacity Building (OL-ECB) Topical Interest Group Week. The contributions all this week to AEA365 come from our OL-ECB TIG members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.