AEA365 | A Tip-a-Day by and for Evaluators

TAG | organizational learning

Hi! I’m Kelci Price, Senior Director of Learning & Evaluation at the Colorado Health Foundation. I used to think learning was an event that happened when evaluation results were ready, but I’ve come to realize that learning is actually at the core of good strategy, and it’s the critical link to whether evaluation actually gets used. We evaluators pay a lot of attention to assessing organizational strategy, and we need to pay just as much attention to creating effective learning practices in our organizations to inform strategy.

Lesson Learned: Implement an effective learning practice.

We wanted a learning approach that would help us assess evidence and connect that to our decisions. It needed to be simple, flexible, accommodate multiple types of evidence, and link insights directly to action. We came across a platform called Emergent Learning, and it has become the indispensable core of our learning practice. Emergent Learning isn’t just a set of tools. It’s a practice that deepens our ability to make thinking visible so that we can more effectively test our thinking and evaluate our results.
Lesson Learned: Start small, stay focused.
Don’t start with a huge plan for learning – focus on smaller learning opportunities to begin with. We started by understanding what strategic decisions staff needed to make in the very near future, and we offered to use our learning practice to help them reflect on past work and assess what might be effective approaches moving forward. They loved it! The most successful learning will happen when you focus on questions that are relevant right now in your organization – these may be related to internal processes, culture, funded programs, or strategies. Approaching learning this way keeps it compelling and relevant to current decisions.

Lesson Learned: Learn deliberately.
The most effective learning takes planning and prioritization. You can start by capitalizing on emergent opportunities, but over time you should move towards being planful about how your organization learns. Know what decisions you want to inform, then work backwards to determine what you need to learn, when, and how you’ll do that. Seek to build learning into organizational practices and routines so it’s not an add-on item. Look for opportunities to change the content of meetings, documents, planning processes, etc. to embed better learning.

Rad Resource: Emergent Learning is an incredibly powerful learning framework and integrates seamlessly with evaluation practices.

Rad Resource: This paper from Fourth Quadrant Partners provides an overview of learning in foundations, FSG discusses learning and evaluation systems, and this piece gets you thinking about foundation learning as central to strategy under conditions of complexity.

The American Evaluation Association is celebrating Nonprofits and Foundations Topical Interest Group (NPFTIG) Week. The contributions all this week to aea365 come from our NPFTIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Hi, we are Julie Slay and Marissa Guerrero, and we specialize in evaluation and learning at Arabella Advisors, a philanthropic consulting firm that helps donors and impact investors be more strategic and effective. Many of our clients come to us to learn from their grant making and to share those lessons throughout their organization and with the broader community of funders, researchers, and nonprofits.

Lessons Learned: We know from experience that it’s not always easy to create a culture of learning and implement learning systems within an organization, but we’ve identified several essential elements that, when present, create an environment that’s far more conducive to learning. The following are critical to fostering learning in any organization, but particularly in philanthropic ones.

  • Flexibility in approach: There is no gold standard for learning systems, and as such, successful systems can range from highly structured and predictable learning plans to ones that are responsive, reactive, and organic. We always prioritize developing a learning system that reflects the needs and personality of the organization.
  • Staff and leader buy-in: Setting aside time to reflect and process what you are learning requires resources, so it is critical that leaders buy into the process and prioritize it. Additionally, staff must be engaged and interested learners to not only support but also benefit from a learning system.
  • Permission to be vulnerable: We respect that program officers and board members are learning all the time in both formal and informal ways. We find that organizations are often curious and want to hear more about the experiences of their grantees, as well as their peer organizations. Deepening learning culture requires inviting staff to be vulnerable and open up to new ways of learning, particularly in ways that might threaten their assumptions about what is working.
  • Change in processes and culture: We have found that, to create an environment where learning is a primary goal, it is crucial to have and follow a set of procedures that guide learning and reinforce a learning culture. Procedures such as regular and scheduled reviews or reflection will institutionalize organizational learning, giving staff a clear path to learn and share those lessons with others.

Rad Resource: We found the graphics in this article to be effective tools in helping staff visualize and understand what a learning culture requires. Source: Katie Smith Milway & Amy Saxton, “The Challenges of Organizational Learning.” Stanford Social Innovation Review, 2011.

 


Hello. I am Karen Widmer, a 4th-year doctoral student in the Evaluation program at Claremont Graduate University. I've been developing and evaluating performance systems (in business, education, healthcare, and nonprofits) for a long time. I think organizations are a lot like organisms. While each organization is unique, certain conditions help them all grow. I get enthusiastic about designing evaluations that optimize those conditions!

Theme: My master’s research project looked at evaluation-related activities shared by high-performing organizations. For these organizations, evaluation was tied to decision making. Evaluation activity pulled together knowledge about organizational impact, direction, processes, and developments, and this fed the decisions. The challenge for evaluation is to pool the streams of organizational knowledge most relevant for each decision.

Hot Tip:

  • Evaluative thinking identifies the flow of organizational knowledge, which gives decision makers a point of reference for quality decisions.
  • In technical language, Knowledge Flow may mediate or moderate the relationship between evaluative thinking and decision quality. Moreover, the quality of the decision could be measured by the performance outcomes resulting from the decision!

[Graphic: quality decisions]
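The mediation idea in the Hot Tip above can be made concrete with a small statistical sketch. The data below are entirely hypothetical and invented for illustration: we simulate evaluative thinking (ET) driving Knowledge Flow (KF), which in turn drives decision quality (DQ), and then show the classic mediation signature — the effect of ET on DQ shrinks toward zero once KF is controlled for.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Hypothetical synthetic data (full mediation by construction):
# evaluative thinking -> knowledge flow -> decision quality
ET = rng.normal(size=n)
KF = 0.8 * ET + rng.normal(scale=0.5, size=n)
DQ = 0.7 * KF + rng.normal(scale=0.5, size=n)

def slope(x, y):
    """OLS slope of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

def partial_slope(x, m, y):
    """OLS slope of y on x, controlling for the mediator m."""
    X = np.column_stack([np.ones_like(x), x, m])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

total = slope(ET, DQ)               # total effect of ET on DQ
direct = partial_slope(ET, KF, DQ)  # direct effect, holding KF constant
print(f"total effect = {total:.2f}, direct effect = {direct:.2f}")
# When KF fully mediates, the direct effect is near zero while the
# total effect stays substantial.
```

This is only a sketch of the mediation logic, not a claim about real data; testing moderation instead would mean adding an ET × KF interaction term to the regression.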

Cool Trick:

  • Design your evaluation to follow the flow of knowledge throughout the evaluand lifecycle.
  • Document what was learned when tacit knowledge was elicited; when knowledge was discovered, captured, shared, or applied; and when knowledge regarding the status quo was challenged. (To explore further, look to the work of M. Polanyi, I. Becerra-Fernandez, and C. Argyris and D. Schon.)
  • For the organizations I looked at, these knowledge activities contained the evaluative feedback desired by decision makers. The knowledge generated at these points told what’s going on.
  • For example, tacit perceptions could be drawn out through peer mentoring or a survey; knowledge captured on a flipchart or by software; or a team might “discover” knowledge new to the group or challenge knowledge previously undisputed.

Conclusion: Whether through a full evaluation design or a single snapshot, evaluative thinking can trace the flow of knowledge critical to decisions about outcomes. Knowledge Flow offers a framework for connecting evaluation with the insights decision makers want for reflection and adaptive response. Let’s talk about it!

Rad Resource: The Criteria for Performance Excellence is a great government publication that links evaluative thinking so closely with decisions about outcomes that you can’t pry them apart.

Rad Resource: A neat quote from Nielsen, Lemire, and Skov in the American Journal of Evaluation (2011) defines evaluation capacity as “…an organization’s ability to bring about, align, and sustain its objectives, structure, processes, culture, human capital, and technology to produce evaluative knowledge [emphasis added] that informs on-going practices and decision-making to improve organizational effectiveness.”

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Hello! We are Johanna Morariu, Kat Athanasiades, and Ann Emery from Innovation Network. For 20 years, Innovation Network has helped nonprofits and foundations evaluate and learn from their work.

In 2010, Innovation Network set out to answer a question that was previously unaddressed in the evaluation field—what is the state of nonprofit evaluation practice and capacity?—and initiated the first iteration of the State of Evaluation project. In 2012, we launched the second installment. A total of 546 representatives of 501(c)(3) nonprofit organizations nationwide responded to our 2012 survey.

Lessons Learned–So what’s the state of evaluation among nonprofits? Here are the top ten highlights from our research:

1. 90% of nonprofits evaluated some part of their work in the past year. However, only 28% of nonprofits exhibit what we feel are promising capacities and behaviors to meaningfully engage in evaluation.

2. The use of qualitative practices (e.g. case studies, focus groups, and interviews—used by fewer than 50% of organizations) has increased, though quantitative practices (e.g. compiling statistics, feedback forms, and internal tracking forms—used by more than 50% of organizations) still reign supreme.

3. 18% of nonprofits had a full-time employee dedicated to evaluation.

Morariu graphic 1

4. Organizations were positive about working with external evaluators: 69% rated the experience as excellent or good.

5. 100% of organizations that engaged in evaluation used their findings.

Morariu graphic 2

6. Large and small organizations faced different barriers to evaluation: 28% of large organizations named “funders asking you to report on the wrong data” as a barrier, compared to 12% overall.

7. 82% of nonprofits believe that discussing evaluation results with funders is useful.

8. 10% of nonprofits felt that you don’t need evaluation to know that your organization’s approach is working.

9. Evaluation is a low priority among nonprofits: it was ranked second to last in a list of 10 priorities, only coming ahead of research.

10. Among both funders and nonprofits, the primary audience of evaluation results is internal: for nonprofits, it is the CEO/ED/management, and for funders, it is the Board of Directors.

Rad Resource—The State of Evaluation 2010 and 2012 reports are available online for your reading pleasure.

Rad Resource—What are evaluators saying about the State of Evaluation 2012 data? Look no further! You can see examples here by Matt Forti and Tom Kelly.

Rad Resource—Measuring evaluation in the social sector: Check out the Center for Effective Philanthropy’s 2012 Room for Improvement and New Philanthropy Capital’s 2012 Making an Impact.

Hot Tip—Want to discuss the State of Evaluation? Leave a comment below, or tweet us (@InnoNet_Eval) using #SOE2012!



I’m Karen Anderson, AEA’s Diversity Coordinator Intern, and in this role I support AEA’s diversity programs, TIGs, and the AEA Public Statement on Cultural Competence in Evaluation Dissemination Working Group.

The baton has been passed from the Cultural Competence in Evaluation Task Force, the Statement developers, to the Cultural Competence in Evaluation Dissemination Working Group to translate the Statement from paper to practice. One strategy for its broader dissemination and use is integrating the Statement into the policies and procedures of organizations that conduct and commission evaluations.

The AEA Public Statement on Cultural Competence in Evaluation has several core concepts, among them the implications culture has for all phases of evaluation, from staffing evaluation efforts to ensuring that members of the evaluation team collectively demonstrate cultural competence in the context of each evaluation. How does an evaluation practitioner or commissioner begin to do this?

Hot Tips:

  • Share the Statement and supplemental resources like Building Culturally Competent Organizations, Key Components to a Culturally Competent System, and It All Starts At The Front Desk with human resources and decision makers in organizations. Recommend the development of a cultural competence committee to monitor and make recommendations for policy revision, program development, and evaluation.
  • Include cultural competence language in the development of and response to requests for proposals (RFPs). Check out the post How to Spot a Lip Service Approach to Culturally Responsive Evaluation from Patricia Rogers and Jane Davidson’s Genuine Evaluation blog for tips on pointing out when a client may not be walking the walk in relation to culture and program development, theory, and evaluation.
  • If you or other employees at your organization belong to an AEA affiliate, organize an event at your office around the theory or practical applications of the Statement. The Atlanta Area AEA affiliate group hosted one recently, Taking a Stance Toward Culture: Cultural Competence in Evaluation. Reflections from the event can be found in the AEA Newsletter diversity article.
  • Set up a series of lunch and learns to begin a dialogue with colleagues, increase awareness, and encourage relationship building; or start a book club discussion using the Statement, and branch out to other reading material to light the spark for cultural competence in evaluation.

The American Evaluation Association will be celebrating Cultural Competence Week. The contributions all this week come from the Cultural Competence committee. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m Sarah Griffiths, Senior Partner, Wholonomy Consulting, in Tucson, AZ. I’m going to share some highlights of the recent AZENet conference, which focused on Building Capacity for Organizational Effectiveness. The learning goals for participants were to:

  • Understand what research shows about organizational effectiveness in three key areas: Adaptive Capacity, Leadership, and Management Practices.
  • Explore the connection between organizational effectiveness and sustainability.
  • Learn about elements of successful capacity building.
  • Think about application in the social sector in Arizona.

Co-sponsors from multiple local organizations joined AZENet in this effort, including the Young Nonprofit Professionals Network, Phoenix Chapter; the Grant Professionals Association, Arizona Chapter; the Community Foundation of Southern Arizona; the ASU Lodestar Center for Philanthropy and Nonprofit Innovation; and the Arizona Alliance of Nonprofits. Participants reported a deeply engaging experience, brought on by both the content and the opportunity to share learning and reflection with others from different parts of the social sector. This type of cross-sector involvement promotes community learning – a goal of the conference.

Keynote speaker Peter York from the TCC Group shared his current work, which builds on the TCC-authored publication The Sustainability Formula: How Nonprofit Organizations Can Thrive in the Emerging Economy. From it, we learned the following formula:

Effective Leadership + Adaptability + Program Capacity = Sustainability.

To summarize, effective organizations exhibit the following behaviors:

    • Leadership (capable of timely strategic decision making);
    • Adaptability (able to constantly learn and change, both financially and programmatically); and
    • Program Capacity (having the resources to run effective programs).

A key element of sustainability is strategic learning. Peter shared some guiding principles of strategic learning, which began with “listen to the client,” followed by these seven steps:

  1. Determine the “real” outcomes,
  2. Find or create the metrics,
  3. Gather data, quantitative then qualitative,
  4. Make meaning out of patterns,
  5. Create/modify program models,
  6. Re-design programs, and
  7. Repeat steps 2-6.

Peter’s introduction was followed by small group discussions, led by conference participants, who shared examples of strategic learning in their work. Hosts included funders, consultants, university center staff, state agency staff, and YNPN Phoenix Board members.

Hot Tip: Partner with young leaders near you – find them through the Young Nonprofit Professional Network.

Rad Resource: The Sustainability Formula.

Rad Resources: To find nonprofit associations in your area to collaborate on learning activities, go to the Council of Nonprofits.

The American Evaluation Association is celebrating Arizona Evaluation Network (AZENet) Affiliate Week with our colleagues in the AZENet AEA Affiliate. The contributions all this week to aea365 come from our AZE members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Jeff Sheldon and I am a doctoral student at the School of Behavioral and Organizational Sciences at Claremont Graduate University concentrating in evaluation and applied research methods.  A challenge I’ve encountered in my own practice is organizations that simply aren’t ready to be engaged in the evaluation process.

Rad Resource: Readiness for Organizational Learning and Evaluation (ROLE). The ROLE (Preskill & Torres, 2000) was designed to help us determine an organization’s level of readiness for implementing organizational learning, evaluation practices, and supporting processes. I’ve implemented the ROLE with schools and other types of organizations, using the results to: 1) identify the existence of learning organization characteristics; 2) diagnose interest in conducting evaluation that facilitates organizational learning; 3) identify areas of strength to leverage evaluative inquiry processes; and 4) identify areas in need of organizational change and development. ROLE items reflect the research on organizational learning, evaluative inquiry processes, and evaluation practices, and suggest that an organization must have certain internal elements in place if it is to support and encourage organizational learning. Evaluation research indicates that an organization’s culture and context influence the extent to which evaluative inquiry occurs in support of learning and decision making. The ROLE consists of 78 items grouped into six major constructs: 1) Culture, 2) Leadership, 3) Systems and Structures, 4) Communication, 5) Teams, and 6) Evaluation. Four of these – Culture, Leadership, Systems and Structures, and Communication – are further divided into subconstructs. Individual responses to ROLE items create a composite perspective of the internal context of an organization, which determines the extent to which organizational learning, evaluation practices, and systems are present.
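The composite scoring described above can be sketched in a few lines. Everything below is invented for illustration — the item-to-construct mapping, item names, and Likert ratings are hypothetical, not the actual ROLE items — but it shows the general pattern: average each respondent’s ratings within a construct, then average across respondents for an organizational composite.

```python
# Hypothetical item-to-construct mapping (not the real ROLE instrument).
construct_items = {
    "Culture": ["q1", "q2"],
    "Leadership": ["q3", "q4"],
    "Evaluation": ["q5", "q6"],
}

# One dict per respondent; values are 1-5 Likert ratings (made up).
responses = [
    {"q1": 4, "q2": 5, "q3": 3, "q4": 4, "q5": 2, "q6": 3},
    {"q1": 3, "q2": 4, "q3": 4, "q4": 4, "q5": 3, "q6": 2},
]

def composite_scores(responses, construct_items):
    """Organizational composite per construct: mean of per-respondent means."""
    scores = {}
    for construct, items in construct_items.items():
        per_person = [sum(r[i] for i in items) / len(items) for r in responses]
        scores[construct] = sum(per_person) / len(per_person)
    return scores

print(composite_scores(responses, construct_items))
# e.g. Culture 4.0, Leadership 3.75, Evaluation 2.5 for the sample data above
```

With this toy data, the low Evaluation composite is the kind of signal that would flag an area in need of organizational change and development before launching evaluative inquiry.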

The ROLE instrument is available as an appendix in Russ-Eft, D. & Preskill, H. (2001) Evaluation in organizations: A systematic approach to enhancing learning, performance, and change.  New York, NY: Basic Books.

This contribution is from the aea365 Daily Tips blog, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org.


My name is Stephen J. Gill. I’m author of Developing a Learning Culture in Nonprofit Organizations (SAGE, 2010)*. This might not sound like a book for evaluators, but it is. The book is about how the process of collecting, reflecting on, and learning from evaluative information can help nonprofits become more successful organizations. Nineteen evaluation tools from the book are available for downloading at http://www.sagepub.com/learnculstudy/chapters.htm.

Rad Resource: One of these tools that you can put to use immediately with any organization is the Organizational Learning Readiness Worksheet. This tool asks your clients to indicate to what extent seven principles are characteristic of their organizations. Do they observe each principle “not at all; never see it,” “occasionally see evidence,” or is “evidence all around us”? Then, most importantly, you should discuss the meaning of the ratings with your clients. Help them decide what they need and want to do to become more ready for organizational learning and change.

The seven statements are:

  1. We integrate and align our organization’s mission, people, processes, resources, structures, and culture.
  2. Each of our organization’s activities is an element of a process that is continuously improved through knowledge enhancement.
  3. We don’t rely on quick fixes to our performance deficits.
  4. Learning is continuous over the long term in order to achieve meaningful results.
  5. Learning is leveraged so that relatively small interventions result in long-term major changes for the organization.
  6. Each of our employees and volunteers is responsible for the system in which he or she works.
  7. The collective learning of all employees and volunteers is an essential aspect of capacity building.

You can download a pre-formatted PDF worksheet with the above seven items here: http://bit.ly/orglearningreadiness

*Remember that as an AEA member you receive 20% off this title if you order directly from SAGE – sign into the AEA website with your username and password to look up the passcodes.

