AEA365 | A Tip-a-Day by and for Evaluators

Category: Nonprofits and Foundations Evaluation

Hello! I am Dawn Helmrich, Director of Research and Evaluation at United Way of Greater Milwaukee & Waukesha County. I work on program evaluation with over 100 nonprofit programs in a four-county area. I train nonprofit organizations on how to create and implement logic models, how to design evaluation plans, and which outcome measures work best for their organization, not only to demonstrate impact but also to improve program quality and the services provided to the community.

Over the past 10 years the demand for outcomes evaluation has grown rapidly. During the 2008 recession, funders asked programs to diversify their funding in an effort to sustain them. Many funding sources had to pull back money, leaving organizations scrambling for dollars. While this was happening, funders began to seek greater accountability from organizations, while also providing less money and little to no training on how to provide that accountability.

From 2008 to the present day, funders have not always recognized the burden on organizations to provide quality data and analysis. Funders often fail to take into account that organizations may be supported by upwards of five different funding sources, each looking for different things. This problem is two-fold: an organizational capacity issue and a funder issue.

Hot Tips:

It is important to recognize capacity as a real and relevant issue for organizations. Oftentimes, evaluation is put on the back burner and/or is being done by someone as an “other duties as assigned” task. There are some very simple things that can be done to rectify this situation.

  • First, encourage your agency to identify whose role will include providing evaluation and add a few sentences to the job description. This alerts the person applying for the job that program evaluation is a component of their job and it helps the agency get the right person in the door.
  • Second, talk to your local universities and find out what kind of evaluation courses they offer for human service disciplines. We know that students majoring in sociology, social work, psychology, and other human service disciplines often find themselves seeking work in nonprofits. If students in these disciplines are given a foundation in program evaluation, both the student and the hiring organization will have a better chance of improving capacity.
  • Finally, talk with funders about working with one another to reduce the burden of overlapping funding requirements. If funders ask for the same accountability measures and/or provide extra training and technical assistance, we can increase the quality of the data and information being collected.

Accountability standards and practices are not going away anytime soon. Most evaluation practitioners are concerned about the quality of information being provided. By increasing the capacity of organizations and helping funders understand the need for consistency, we can improve the overall impact nonprofits have on their community.

The American Evaluation Association is celebrating ¡Milwaukee Evaluation! Week with our colleagues in the Wisconsin statewide AEA Affiliate. The contributions all this week to aea365 come from our ¡Milwaukee Evaluation! members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi! I’m Kelci Price, Senior Director of Learning & Evaluation at the Colorado Health Foundation. I used to think learning was an event that happened when evaluation results were ready, but I’ve come to realize that learning is actually at the core of good strategy, and it’s the critical link to whether evaluation actually gets used. We evaluators pay a lot of attention to assessing organizational strategy, and we need to pay just as much attention to creating effective learning practices in our organizations to inform strategy.

Lesson Learned: Implement an effective learning practice.

We wanted a learning approach that would help us assess evidence and connect that to our decisions. It needed to be simple, flexible, accommodate multiple types of evidence, and link insights directly to action. We came across a platform called Emergent Learning, and it has become the indispensable core of our learning practice. Emergent Learning isn’t just a set of tools. It’s a practice that deepens our ability to make thinking visible so that we can more effectively test our thinking and evaluate our results.
Lesson Learned: Start small, stay focused.

Don’t start with a huge plan for learning – focus on smaller learning opportunities to begin with. We started by understanding what strategic decisions staff needed to make in the very near future, and we offered to use our learning practice to help them reflect on past work and assess what might be effective approaches moving forward. They loved it! The most successful learning happens when you focus on questions that are relevant right now in your organization – these may be related to internal processes, culture, funded programs, or strategies. Approaching learning this way keeps it compelling and relevant to current decisions.

Lesson Learned: Learn deliberately.

The most effective learning takes planning and prioritization. You can start by capitalizing on emergent opportunities, but over time you should move towards being deliberate about how your organization learns. Know what decisions you want to inform, then work backwards to determine what you need to learn, when, and how you’ll do that. Seek to build learning into organizational practices and routines so it’s not an add-on. Look for opportunities to change the content of meetings, documents, and planning processes to embed better learning.

Rad Resource: Emergent Learning is an incredibly powerful learning framework, and integrates seamlessly with evaluation practices.

Rad Resource: This paper from Fourth Quadrant Partners provides an overview of learning in foundations, FSG discusses learning and evaluation systems, and this piece gets you thinking about foundation learning as central to strategy under conditions of complexity.

The American Evaluation Association is celebrating Nonprofits and Foundations Topical Interest Group (NPFTIG) Week. The contributions all this week to aea365 come from our NPFTIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Greetings! We are Kelly Hannum, Jara Dean-Coffey, and Jill Casey. As the Luminare Group, we offer strategy and evaluation services and regularly work with foundations. Evaluation is part of the work to advance the aims of foundations. Ideally, evaluation is part of and flows from a foundation’s strategy. Evaluation provides the means to understand and demonstrate value, and to do so in a manner that reflects the values of the foundation. The connections between the value created and the values by which that impact is created should be meaningful and explicit. This isn’t a new or particularly revolutionary idea, but the practice of explicitly incorporating values into evaluation work continues to lag. While making values an explicit part of work is important for any kind of organization, it is critical for foundations. Why? Because by their nature foundations are values-driven organizations, and that should be explicitly reflected in all facets of their work. Doing so in evaluation work is particularly important because it enables foundations to hold themselves accountable for living their values in meaningful ways and to demonstrate to others that they are doing so. It also sets an example for others to do the same.

Hot Tip: Be intentional and explicit about your organizational or programmatic values

Hot Tip: Incorporate values into frameworks used to guide or describe efforts, such as your Theory of Change

Hot Tip: Reflect on whether the ways you currently engage in evaluation are aligned with your organizational values and your grantmaking strategy

Hot Tip: Think about how you have conducted and used evaluation findings in the past and identify what worked, why, and how to get more of that (and less of the other stuff). What implicit values are suggested by how you’ve conducted or used evaluation – both in terms of the processes used and in terms of which stakeholders were involved and how they were involved?

Hot Tip: Clarify how evaluation will be used and regularly communicate that to stakeholders (it also never hurts to be transparent when using evaluation so stakeholders see you doing what you said you would and so they see how data they provided are being used)

Rad Resource: Developing an evaluative mindset in foundations – This two-part post provides more information about our perspective on why having an evaluative mindset and being explicit about it is important in foundations.

Rad Resource: The Role of Culture in Philanthropic Work – This is a collection of resources from Grantmakers for Effective Organizations.

Rad Resource: Raising the Bar – This article discusses how philanthropy can use an equitable evaluation approach to apply the principles of the AEA statement, presents the concept of equitable evaluation alongside an approach for building equitable evaluation capacity, and applies equitable evaluation capacity building to philanthropy.

 

The American Evaluation Association is celebrating Nonprofits and Foundations Topical Interest Group (NPFTIG) Week. The contributions all this week to aea365 come from our NPFTIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings, we are Kristina Jamal and Jacqueline Singh. In addition to being NPFTIG members, we serve on the PDTIG leadership team. Kristina is founder of Open Hearts Helping Hands (OH3), a nonprofit that collaborates with student-focused organizations and community members. Jacqueline is an evaluation/program design advisor and founder of Qualitative Advantage, LLC. We started working together to help OH3 move from being a young nonprofit “flying by the seat of its pants” to becoming a viable organization that competes for funds to bring about common outcomes between formal and informal secondary education organizations.

Because foundations and grantors look for promising programs that can get results, we wanted to move beyond logic model linearity and show, in a complementary and easy-to-understand way, how a nonprofit program is strategic and intentional. From a nonprofit’s perspective, this AEA365 article addresses the utility of conceptual frameworks and models for front-end evaluation activities, measurement, and strategic planning.

Lesson Learned: The pressure to collect evidence for improvement, decision-making, and accountability continues to intensify. Funders expect recipients to partner with other organizations and provide evidence of program outcomes. Young nonprofits can be overwhelmed at the thought of where to begin. Indeed, navigating disciplinary fields, paradigms of inquiry, and complex environments that commingle evaluation with research can be daunting. Conceptual frameworks can reveal program alignment with other operating mechanisms that logic models alone may miss, and they help bridge the relationship evaluation has with strategic planning, measurement, program management, and accountability. They are often used within the context of evaluability assessment (EA) and prospective evaluation synthesis (PES), as exemplified within these links. Similarly, nonprofits can use conceptual frameworks to clarify their purpose and questions, and to build evaluation capacity.

Program designs are merely abstractions unless conceptualizations are made explicit and understood by stakeholders. Creating conceptual frameworks is developmental and experiential. The process involves document analysis, reading literature, asking questions, describing and defining relationships, capturing or proposing plausible links between components or emerging factors—dependent upon what is to be evaluated. Conceptual frameworks such as the OH3 Conceptual Framework take “context” into account and help nonprofits to expand their view of what logic models capture.
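
The process of making those links explicit can be supported with very lightweight tooling. As a purely hypothetical illustration (the component names below are ours, not drawn from the OH3 Conceptual Framework), a framework’s components and plausible links can be captured as a small directed graph in Python and the pathways between them traced, which is one way to surface relationships that a strictly linear logic model might not show:

```python
# A minimal, hypothetical sketch of a conceptual framework captured as a
# directed graph. Component names are illustrative placeholders only; they
# are not drawn from the OH3 Conceptual Framework.

framework = {
    "community context": ["partner organizations", "student needs"],
    "partner organizations": ["program activities"],
    "student needs": ["program activities"],
    "program activities": ["student engagement"],
    "student engagement": ["intended outcomes"],
}


def trace_paths(graph, start, end, path=None):
    """Enumerate plausible pathways from one component to another."""
    path = (path or []) + [start]
    if start == end:
        return [path]
    paths = []
    for nxt in graph.get(start, []):
        if nxt not in path:  # avoid revisiting a component (no cycles)
            paths.extend(trace_paths(graph, nxt, end, path))
    return paths


if __name__ == "__main__":
    # Each printed pathway is a proposed chain of relationships that
    # stakeholders can confirm, revise, or reject.
    for p in trace_paths(framework, "community context", "intended outcomes"):
        print(" -> ".join(p))
```

Even a small artifact like this makes proposed relationships easier to question and revise with stakeholders than a diagram that lives only in a slide deck.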

Hot Tip: Do not undervalue or overlook conceptual frameworks. They come in a variety of forms, serve different purposes, and help you figure out what is going on. Conceptual frameworks provide an aerial view and are useful for connecting multiple areas of disciplinary work (e.g., research, theory, policy, technology). They help guide the selection of useful data collection tools and evaluation strategies.

Rad Resources: Resources we have found useful for understanding how to create conceptual frameworks, think through overlapping aspects of program design and measurement, and focus future evaluations include: 1) James Jaccard and Jacob Jacoby’s Theory Construction and Model-Building Skills, 2) Joseph Maxwell’s Qualitative Research Design: An Interactive Approach, 3) Joseph Wholey’s Exploratory Evaluation approach in the Handbook of Practical Program Evaluation, and 4) Matthew Miles and Michael Huberman’s Qualitative Data Analysis: An Expanded Sourcebook.

 

The American Evaluation Association is celebrating Nonprofits and Foundations Topical Interest Group (NPFTIG) Week. The contributions all this week to aea365 come from our NPFTIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Dr. Shanesha Brooks-Tatum. I am Vice President of Creative Research Solutions, and I have worked with foundations such as the United Nations Foundation and UNCF. In my work with foundations, I have found that many struggle to create a more solid internal evaluation framework, especially with regard to shifts in internal leadership. Today, I will describe ways that evaluators can work with foundations to strengthen their evaluation frameworks, which is especially important given the current social and political climates.

Hot Tip:  Highlight the organization’s successes beyond awarded grant dollars. What does the foundation uniquely bring to the table? Identify any processes the foundation uses to engage with grantees and other stakeholders within its areas of focus.

Hot Tip:  Help executive staff test and refine the organization’s fluency in its model of engagement. It is extremely important for leadership and all staff to become fluent in the unique model an organization offers. For example, how does the foundation support its grantees, and has this process been tested over time and in different contexts? Related questions: Which philanthropic models have been most successful for the foundation, and how have they been tested with rigorous research and evaluation?

Hot Tip:  Clearly document evaluation outcomes to communicate impact. Oftentimes evaluators are asked to produce a report or presentation that few if any of the internal constituents ever see. Think creatively about other ways to present data in a user-friendly fashion, such as infographics with an accompanying executive summary, or data charts with photos that illustrate the major outcomes or project activities. Likewise, ensure that these materials are housed on the foundation’s website and are easily and clearly reachable from the homepage.

To delve more into the nitty-gritty of presenting evaluation outcomes, evaluators are encouraged to answer the following questions in partnership with foundation leadership:

1) In what ways might the philanthropic model need to shift based on evaluation outcomes?
2) How can we best articulate any shifts in our approach or model internally and externally in order to leverage lessons learned?
3) How can we refine our current grantee reporting system to save time, money and energy on understanding impact on a regular (i.e., quarterly or semi-annual) basis?
4) How can we best highlight what variables most contribute to our organization’s success?

Making sure that the answers to these questions are clear and that the aforementioned documentation components are in place will better ensure the visibility of foundations’ progress and successes.

The American Evaluation Association is celebrating Nonprofits and Foundations Topical Interest Group (NPFTIG) Week. The contributions all this week to aea365 come from our NPFTIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi, we are Julie Slay and Marissa Guerrero, and we specialize in evaluation and learning at Arabella Advisors, a philanthropic consulting firm that helps donors and impact investors be more strategic and effective. Many of our clients come to us to learn from their grant making and to share those lessons throughout their organization and with the broader community of funders, researchers, and nonprofits.

Lessons Learned: We know from experience that it’s not always easy to create a culture of learning and implement learning systems within an organization, but we’ve identified several essential elements that, when present, create an environment that’s far more conducive to learning. The following are critical to fostering learning in any organization, but particularly in philanthropic ones.

  • Flexibility in approach: There is no gold standard for learning systems, and as such, successful systems can range from highly structured and predictable learning plans to ones that are responsive, reactive, and organic. We always prioritize developing a learning system that reflects the needs and personality of the organization.
  • Staff and leader buy-in: Setting aside time to reflect and process what you are learning requires resources, so it is critical that leaders buy into the process and prioritize it. Additionally, staff must be engaged and interested learners to not only support but also benefit from a learning system.
  • Permission to be vulnerable: We respect that program officers and board members are learning all the time in both formal and informal ways. We find that organizations are often curious and want to hear more about the experiences of their grantees, as well as those of their peer organizations. Deepening a learning culture requires inviting staff to be vulnerable and to open up to new ways of learning, particularly in ways that might threaten their assumptions about what is working.
  • Change in processes and culture: We have found that, to create an environment where learning is a primary goal, it is crucial to have and follow a set of procedures that guide learning and reinforce a learning culture. Procedures such as regular and scheduled reviews or reflection will institutionalize organizational learning, giving staff a clear path to learn and share those lessons with others.

Rad Resource: We found the graphics in this article to be effective tools in helping staff visualize and understand what a learning culture requires. Source: Katie Smith Milway & Amy Saxton, “The Challenge of Organizational Learning,” Stanford Social Innovation Review, 2011.

 

The American Evaluation Association is celebrating Nonprofits and Foundations Topical Interest Group (NPFTIG) Week. The contributions all this week to aea365 come from our NPFTIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

I’m Cheryl Milloy, Associate Director of Evaluation at Marguerite Casey Foundation in Seattle. We believe no family should live in poverty and that those who experience poverty know best what the solutions are. We provide consistent, significant, long-term general operating support grants to community-based organizations to work together across issues, regions, race and ethnicity, and egos to bring about long-term change that has a positive impact on the lives of families.

Foundations strive to be learning organizations, and one of their best sources of learning is the organizations they support.

Hot Tip: “Ask. Listen. Act.” This is our brand promise and our approach to learning. Grantees are our partners on the ground, and we are committed to asking them and listening to them in order to learn before we act. We cannot completely eliminate the power imbalance between funder and grantee, but we can be conscious of it and mitigate this differential as much as possible. One important way Marguerite Casey Foundation does this is by providing grants almost exclusively as multiyear general operating support. This demonstrates trust in organizations and their “big ideas” and allows them to decide how to spend the funds. We encourage organizations to invest in their own infrastructure – leadership, staff, governance, evaluation and learning, technology, etc. – to build their capacity and effectiveness.

Hot Tip: Learn by asking grantees for their feedback! Any evaluation of a funder’s work should include feedback from grantees. For Marguerite Casey Foundation’s 15th anniversary, we commissioned a summative evaluation to understand stakeholders’ perceptions of our operations to facilitate learning going forward. An article summarizing the evaluation findings and our initial responses was recently published in The Foundation Review (A Rad Resource for nonprofit and foundation evaluators!). We learned that while we had achieved substantial progress, we could further strengthen our relationships with grantees and clarify our messages to them.

Rad Resource: The Center for Effective Philanthropy’s Grantee Perception Report® (GPR).  Again, there is much foundations can learn from grantees – what funders are doing really well that has a positive impact on organizations as well as areas in which they need to improve.  The GPR is a tested survey that asks grantees to give confidential feedback and suggestions. It also gives funders helpful comparative data so they can make assessments relative to the field and customized cohorts of other foundations. We commissioned a GPR earlier this year. Our grantees took the time to respond and include thoughtful comments, and we intend to incorporate their recommendations into our work.

The American Evaluation Association is celebrating Nonprofits and Foundations Topical Interest Group (NPFTIG) Week. The contributions all this week to aea365 come from our NPFTIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi, we are Andrew Taylor from Taylor Newberry Consulting and Ben Liadsky from the Ontario Nonprofit Network. For the past couple of years we have been working to identify and address the systemic issues that impede useful evaluation in the nonprofit sector. We’ve shared part of our journey with AEA365 previously and now we want to share our latest report.

Nonprofits do a lot of evaluation work. However, it isn’t always meaningful or useful. The purposes and intended uses of evaluation work are not always made clear. The methodologies employed are not always aligned well with purposes or with available resources. Sometimes, there is more focus on the process of data collection and reporting than on learning and action.

While a lot of attention has been paid to the ways in which nonprofits can alter their practices internally to improve their evaluation capacity, there has been less discussion of the ways in which external factors enable or inhibit good nonprofit evaluation. Funding practices, reporting requirements, information sharing channels, and evaluation standards all help to shape the “ecosystem” within which nonprofit evaluation work takes place.

Rad Resource:

Making Evaluation Work for the Nonprofit Sector: A Call to Action consists of seven recommendations designed to improve the nonprofit evaluation ecosystem. It also includes existing examples of action for each recommendation that can be built on or provide a starting point for next steps.

These recommendations have emerged from over two years of dialogue with nonprofits, public and private funders, government, evaluators, provincial networks, and other evaluation stakeholders around the challenges and opportunities in cultivating evaluations that work.

Lessons Learned: 

Evaluation has the potential to do much more than hold nonprofits accountable. It can enable them to be constantly listening and learning. It can equip them to gather and interpret many types of information and to use that information to innovate and evolve.

Without a serious rethinking of the current evaluation ecosystem, nonprofits, governments, and other funders may be unintentionally ignoring key questions that matter to communities and failing to equip the sector to respond in more impactful ways. Ultimately, this position paper should be seen as a conversation starter and a way for all users of evaluation to begin to envision an evaluation ecosystem that, at its core, is more rewarding and engaging for good evaluation work to take place.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings fellow evaluators! We are Veena Pankaj, Kat Athanasiades, Deborah Grodzicki, and Johanna Morariu from Innovation Network. Communicating evaluation data effectively is important—it can enhance your stakeholders’ understanding of evaluative information and promote its use. Dataviz is an excellent way to communicate evaluation results in an engaging way!

Today’s post provides a step-by-step guide to creating effective, engaging dataviz, using Innovation Network’s visual State of Evaluation 2016 as an example. State of Evaluation 2016 is the latest in our series documenting changes in evaluation capacity among nonprofits across the U.S.

Step 1: Identify your audience. For State of Evaluation, our audience was nonprofits, foundations, and their evaluators across the U.S.

Step 2: Select key findings. Analyze your data. Which findings are most relevant to your study and your audience? As evaluators, this is the easy part! We found that organizations funded by philanthropy are more likely to measure outcomes, and thought that would be interesting to our readers.

Step 3: Grab paper and pencil. Start drawing different ways to display your data. What images or concepts does your data evoke? Thinking beyond generic chart formats may help your audience better understand the meaning behind the data. Brainstorming as a team can really help keep creative ideas flowing!

 

Step 4: Gather feedback. Show your initial sketches to others and get their first impressions. Ask questions like:

  • What does this visualization tell you?
  • How long did it take you to interpret?
  • How can it be tweaked to better communicate the data?

Third party feedback can provide additional insights to sharpen and fine-tune your visualizations.

Step 5: Think about layout and supporting text. Once you’ve selected the artistic direction of your visualization, it’s time to add supportive text, label your visualization features, and think about page layout.


Hot Tip: For inspiration, check out Cole Nussbaumer’s Storytelling with Data gallery.

Step 6: Digitize your drawings. If you are working with a graphic designer, it’s helpful to provide them with a clear and accurate mock-up of what you want your visualization to look like. We worked with a designer for State of Evaluation, but for the bulk of dataviz needs this is unnecessary. Digitizing simply means translating your initial renderings into a digital format. Basic software such as PowerPoint, Word, or Excel is often all you need.
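
For teams comfortable with a little scripting, a charting library can do the same digitizing work as PowerPoint or Excel. The sketch below uses Python and matplotlib as one possible alternative; the categories and percentages are made-up placeholders for illustration, not figures from State of Evaluation 2016:

```python
# Minimal sketch: digitizing a hand-drawn bar chart idea with matplotlib.
# The category labels and percentages are placeholders for illustration only,
# not figures from State of Evaluation 2016.
import matplotlib.pyplot as plt

categories = ["Funded by philanthropy", "Not funded by philanthropy"]
pct_measuring_outcomes = [75, 55]  # hypothetical values

fig, ax = plt.subplots(figsize=(6, 3))
bars = ax.barh(categories, pct_measuring_outcomes, color="#4C72B0")
ax.set_xlim(0, 100)
ax.set_xlabel("Organizations measuring outcomes (%)")
ax.set_title("Outcome measurement by funding source (illustrative data)")

# Label each bar directly so readers don't have to trace gridlines.
for bar, value in zip(bars, pct_measuring_outcomes):
    ax.text(value + 2, bar.get_y() + bar.get_height() / 2, f"{value}%", va="center")

# Remove chart clutter the hand sketch never had.
for side in ("top", "right"):
    ax.spines[side].set_visible(False)

plt.tight_layout()
plt.savefig("outcomes_by_funding.png", dpi=150)
```

Once a sketch lives in a short script like this, revising it after Step 4 feedback is a matter of changing a few lines rather than redrawing by hand.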


Rad Resource: Interested in seeing how our dataviz creations evolved? Check out State of Evaluation 2016!

The American Evaluation Association is celebrating Data Visualization and Reporting (DVR) Week with our colleagues in the DVR Topical Interest Group. The contributions all this week to aea365 come from DVR TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

 


Hello, fellow data enthusiasts! I’m Jennifer Glickman, manager on the research team at the Center for Effective Philanthropy (CEP). Over the past two years, CEP has partnered with the Center for Evaluation Innovation (CEI) to answer the question, how are foundations assessing their performance?

Rad Resource: Benchmarking Foundation Evaluation Practices

This past month, CEP and CEI released the most comprehensive review to date of evaluation systems at foundations. Our report presents data collected from the most senior staff with evaluation-related responsibilities at 127 foundations. Although a variety of information was gleaned from this research, I found two findings particularly noteworthy.

Lesson Learned: Foundation Leadership Matters

We asked respondents how engaged their foundation’s senior management is in certain aspects of evaluation. Only about half of respondents say senior management engages the appropriate amount in modeling the use of evaluation information in decision making, and even fewer say senior management engages the appropriate amount in supporting adequate investment in the evaluation capacity of grantees. This level of engagement may pose a problem, given that respondents who say their foundation’s senior management engages less than the appropriate amount in evaluation also say their foundation has found aspects of its evaluation efforts more challenging.

Board support for evaluation plays a role in the challenges foundations face, as well. When respondents say their foundation’s board is less supportive of evaluation, they also say the foundation is significantly more likely to experience challenges in its evaluation efforts. Yet, only 40 percent of respondents say there is a high level of board support for the role of evaluation staff at their foundation, and only one-third say there is a high level of board support for foundation spending on evaluation.

Lesson Learned: Information is not shared externally

Over three-quarters of respondents say evaluation findings are shared quite a bit or a lot with their foundation’s CEO, and two-thirds say evaluation findings are shared quite a bit or a lot with their foundation’s staff. This transparency, however, does not seem to extend beyond foundation walls.

Nearly three-quarters of respondents say their foundation invests too little in disseminating evaluation findings externally. This lack of dissemination applies to grantees, other foundations, and the general public. In fact, only 28 percent of respondents say evaluation findings are shared quite a bit or a lot with their foundation’s grantees, and fewer than 20 percent say evaluation findings are shared with other foundations or the general public.

These two findings represent only some of the data discussed in our report. To learn more about the structures foundations have in place for evaluation, including staffing practices and the use of evaluation results, download the report on CEP’s website here.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
