Hello! We’re Barbara Szijarto and Kate Svensson, graduate students at the University of Ottawa. We’re involved in a Government of Canada funded knowledge mobilization project called ‘Engagement: Mobilizing Community Involvement’ through the Centre for Research in Educational and Community Services (CRECS). Our role is to aid in the evaluation of the initiative.
Evaluating a knowledge mobilization initiative is challenging, and we’d like to share some of what we’ve learned so far!
Knowledge mobilization (KMb) seeks to help people and organizations create, share and use knowledge in order to achieve an outcome. It tends to focus on active exchange between stakeholders with a view of learning as a social process; iterative co-production of knowledge in the context of use; and systems-level change.
Hot Tip: Multiple terms are used for ‘knowledge to practice’ initiatives, such as knowledge transfer, translation, and exchange. These terms are linked to different conceptual models of how the process takes place, models that may be implicit and embedded in an organization’s culture. Because the terms are often used interchangeably and the models themselves have evolved over time, different stakeholders in the same project may have very different perspectives on what is going on, even when they use the same terms. Take time to reach a shared understanding of what is taking place. In particular, make explicit what is meant by ‘knowledge’ and ‘use’, and what determines success.
Hot Tip: These initiatives often refer to ‘knowledge products’ as objects. This puts the focus on explicit knowledge, sometimes confusing ‘knowledge’ with ‘information’. ‘Knowledge products’ need to be interpreted, and may be adapted by users before they can be put into practice. Help stakeholders agree on the extent to which knowledge can change in this process and still be considered successfully ‘used’.
Hot Tip: Most evaluations of such initiatives that we have read have focused on instrumental use, often with disappointing findings. Studies point to conceptual, symbolic and process use as more common and potentially just as interesting. Be sure to consider these other forms of use.
Hot Tip: Be realistic about the evaluation’s objectives. These interventions are complex, especially when they are long-term, iterative, disseminated through multiple networks, and adapted to local contexts. Effects can be unpredictable and difficult to trace. Demonstrating cause and effect (attribution) for an impact is unlikely to be a realistic objective. With Professor Brad Cousins, we’re developing a Contribution Analysis approach to evaluating this KMb project.
Rad Resource: Using Evidence by Nutley, Walter & Davies (2007) – a superb primer on theory and evidence behind knowledge exchange.
Rad Resource: A 2009 New Directions for Evaluation volume, Knowledge utilization, diffusion, implementation, transfer, and translation: Implications for evaluation.*
Rad Resource: The 2012 special issue of Evaluation on contribution analysis.
*American Evaluation Association (AEA) members have free access to this, and other archival issues of NDE, through the members-only section of the AEA website.
AEA is celebrating GSNE Week with our colleagues in the Graduate Student and New Evaluators AEA Topical Interest Group. The contributions all this week to aea365 come from our GSNE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.