I’m Paul Mattessich, Executive Director of Wilder Research. Our staff includes about 30 applied research professionals who study program effectiveness and community trends related to children, education, public health, housing, homelessness, aging, social disparities, and other topics. The mission of Wilder Research is to improve the lives of individuals, families, and communities through human services research. We often have philanthropists as evaluation clients.
Most evaluators have evaluated a foundation grant-funded program; fewer have evaluated a foundation as a whole. Foundations constitute a type of program: they engage in activities (both grant making and other activities) intended to produce effects. We, as evaluators, might think that giving away money seems easy. However, foundation board members and staff face the same issues that other organizations face in delivering products and services – and they ask the same basic evaluation questions: How effective are we? What return do we receive on our investment? How can we improve?
Hot Tip: A good search engine will reveal much literature on “evaluation and foundations” covering topics ranging from effective evaluation of individual grants, to the evaluation of portfolios of grants, to the evaluation of a foundation as a whole.
Hot Tip: Foundations don’t just deliver money. They sometimes manage initiatives. They sometimes influence policy. Consider these activities with your foundation clients to determine whether to evaluate the outcomes they produce.
Lessons Learned:
- Logic models can organize thinking, helping foundations understand both the impacts of their individual grants and the impacts of their organizations as a whole.
- Foundation board members and staff usually have little time to read evaluation reports; they experience “information overload” from many sources. Help both groups see themes quickly. Creative reporting styles will increase the use of your evaluation.
- Foundations understand that they often cannot measure the independent effects of their own grants on community outcomes, relative to the many other factors influencing those outcomes. Nonetheless, they value understanding how many lives they touch directly through their grants and what initial outcomes result.
- Foundations LOVE to learn and to promote learning. Design research and organize results so that they tangibly inform decisions and increase understanding of best practices.
Hot Tip: Conference session: “The Use of Effective Evaluation for Responsive Philanthropy.” Staff from three major foundations will discuss how they like to work with evaluators.
Hot Tip: Restaurants at this year’s conference: Consider The Bachelor Farmer, The Butcher and the Boar, Saffron, Sanctuary, The News Room, The Dakota (Jazz), and Vincent. Or venture toward Saint Paul, where Asian food abounds on University Avenue (take the #16 bus); consider Hoa Bien, Ngon, Cheng Heng, and Mai Village.
Foundations also receive a great deal of evaluation data from grantees. Many foundations are interested in ways to look across grantees, funding streams, and initiatives to find common lessons. Some have internal capacity to do so; others seek guidance from evaluators or experts in the fields they fund. I have found logic models to be helpful here, too – they can help focus analysis across disparate sources of information.
The American Evaluation Association is celebrating with our colleagues from Wilder Research this week. Wilder is a leading research and evaluation firm based in St. Paul, MN, a twin city for AEA’s Annual Conference, Evaluation 2012. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.