AEA365 | A Tip-a-Day by and for Evaluators


We are Sonia Worcel, VP for Strategy and Research, and Kim Leonard, Senior Evaluation Officer, from The Oregon Community Foundation. Our post shares highlights from our roundtable session at Evaluation 2015, where we discussed how we are working within, and assessing the success of, several OCF learning communities.

The Research team at The Oregon Community Foundation (OCF) is currently conducting program evaluations of several OCF funding initiatives, including an out-of-school-time initiative, an arts education initiative, and a children’s dental health initiative. Learning communities are an integral component of each of these initiatives.

“A learning community is a group of practitioners who, while sharing a common concern or question, seek to deepen their understanding of a given topic by learning together as they pursue their individual work….”

– Learn and Let Learn, Grantmakers for Effective Organizations

Hot Tip: Learning communities are a tool grantmakers use to support ongoing learning and sharing among grantees and other stakeholders.

One goal for learning communities is evaluation capacity building. Through learning community events, the evaluation team provides training and technical assistance to grantees on logic modeling, evaluation planning, data collection, and data use.

Lesson Learned: Learning communities are also an important resource for evaluators: they give us access to grantees to plan evaluation activities inclusively, to gather data, and to make meaning of and disseminate evaluation results.

OCF’s evaluations of these initiatives are utilization-focused and aim to be as culturally responsive as possible. As such, we rely heavily upon the learning communities to communicate with grantees and to ensure that evaluation activities and results are appropriate and useful.

Rad Resource: The learning communities are also subject to evaluation themselves. In addition to focusing on outcomes for the grantee organizations and participating children and youth, OCF is evaluating the success of the overall initiatives, including their learning communities. As we explore what success will look like, we are developing a framework and rubric to define and assess the quality and value of each learning community. The draft rubric is included in the materials from our session that we’ve posted in the AEA e-library.

Lesson Learned: One important takeaway for us from the roundtable session came through questions about how we will engage our grantees in using the rubric, rather than using it primarily internally at the foundation, as we’ve done so far. (Answer: we don’t know yet!)

Hot Tip: There are a number of potentially handy resources that can help evaluators work with learning communities that are part of funding initiatives. Here are two of our recent favorites:

· · ·

My name is Beverly Parsons, and I’m the executive director of InSites, a non-profit research, evaluation, and planning organization. We use a systems orientation and conduct evaluations related to education, social services, community change, and health. I’m an AEA board member. I have a tip about how to build evaluation capacity through a type of Community of Practice.

Hot Tip: Consider using Communities of Learning, Inquiry, and Practice (CLIPs) to build evaluation capacity and develop a culture of inquiry across an organization.

CLIPs (a type of Community of Practice) are informal, dynamic groups of organizational members who learn together about their professional practice. They gather and analyze data about a question of importance to them. CLIP members learn an evaluative inquiry process with three basic steps: (1) design the inquiry; (2) collect data; and (3) make meaning and shape practice. The process has some special features that create continual renewal in the organization. At Bakersfield College, where we developed this process under a National Science Foundation grant, the CLIP members are faculty and staff. They focus their inquiries on student learning and success.

Typically, each CLIP consists of three to seven people with one person as the group facilitator. An overall CLIP Guide supports the work of multiple CLIPs at the organizational level, builds strategic linkages among the CLIPs, and connects the whole process appropriately to the organization’s other processes and initiatives. CLIPs support, and are supported by, the broader organization’s goals. CLIPs are adaptable for use in a variety of settings.

Hot Tip: The following features of CLIPs are especially important:

  • Within general parameters, including a focus on the organization’s core mission, CLIPs have the freedom to select their own members and topics; set their schedules; determine their budget allocations; and tailor the inquiry process. This freedom builds internal motivation among participants and helps ensure use of results.
  • The CLIPs simultaneously focus on collaboration and inquiry, building a synergy that motivates completion of their investigation.
  • The CLIPs use guiding principles that create an energizing learning environment and promote a natural flow from inquiry to change in practice. The CLIP members are learning at all stages of the inquiry process and readying themselves for a natural shift in practice.

Rad Resources: An overview video and modules about the CLIP process are available for free from InSites at www.insites.org/clip. An article, Evaluative Inquiry in Complex Times, which addresses the link to complexity science, is also available at http://www.insites.org/clip/clip_reports.html.

Feel free to contact me if I can be of assistance (bparsons@insites.org). I love working with the CLIP process. Perhaps part of the reason is that it’s the only time I’ve gotten a standing ovation from faculty (CLIP members) for my work related to evaluation!

· · ·


Catherine Jahnes on Learning Communities

Hi, my name is Catherine Jahnes. I am the Research and Evaluation Associate at the Virginia G. Piper Charitable Trust. Over the last three years, the Trust has attempted to develop a learning community around evaluation capacity and use. I would like to share some of the lessons learned from my experience with this group.

  1. Require commitment. – As funders, we were concerned about the funder-nonprofit power differential and therefore required little from the organizations we invited to participate in this project. In the end, we were disappointed by the lack of commitment they showed to the data and to the group. Make it easy for nonprofits to decline an invitation without repercussions; even when this is done well, nonprofits have a difficult time saying no. Without requiring some commitment, it is too easy for nonprofits to say they are interested but then not commit the necessary staff time and resources to the project. Reward commitment.
  2. Hire an independent facilitator. – All of the nonprofits invited to participate had long-term relationships with the involved funders, so we dismissed the idea of a power differential between funders and their grantees. Not so. When the funders were present, the nonprofits were quiet and the group wasn’t able to grow together. The role of funders in developing a learning community is to start it and then remove themselves from the process.
  3. Let the group lead. – Let the group speak! It is easy for meetings to become a series of presentations rather than a learning circle.
  4. Meet frequently. – Err on the side of meeting too frequently rather than too rarely, especially in the early stages of the learning circle.
  5. Make sure the group has a common level of capacity. – Organizations at vastly different levels of capacity have difficulty learning from each other. Building individual organizations’ capacity might need to be a first step to ensure a common foundation among the group (and a common need around which the group is organized). Each organization must be ready to learn.
  6. Collaboration is hard and cannot be forced. – Even if it seems to make more sense to the funders.
  7. Prioritize goals first, then strategies. – Flexibility is key.
  8. Clearly define deliverables and expectations. – This is true for the organizers of a learning circle as well as for the participants. Revisit expectations frequently.

