AEA365 | A Tip-a-Day by and for Evaluators

October 10, 2016

NPF Week: Dana Powell Russell on Guess Who’s Coming to Dinner? Engaging Funders Around the Data Interpretation Table

Hello! I’m Dana Powell Russell, Ed.D., a planning and evaluation consultant supporting nonprofits in the arts, museum, and K-12 sectors. I’m here to promote the value of engaging funders together with other stakeholders in making meaning of program evaluation results.

When interpreting data, I facilitate gatherings I call “Interpretation Workshops.” The purpose is to:

  • Create shared understanding of the results among stakeholders;
  • Ground conclusions in the data and in stakeholder wisdom;
  • Identify realistic recommendations; and
  • Generate buy-in and motivation around program improvement strategies.

HOT TIPS

Understand the existing client/funder relationship.
The invited funders can fall anywhere on the spectrum from prospective funders to longtime allies. It’s important to understand the client/funder backstory in advance in order to engage the funder effectively in the conversation.

Ensure that everyone has reviewed the data.
Send out a data preview report well in advance and open the workshop with a refresher on the data. The data preview report does not propose conclusions or recommendations—the group will generate these during the workshop.

Clarify the process and intent of the conversation, then let it unfold.
Keep everyone’s eyes on the prize, and understand that this group may rarely (if ever) have engaged face-to-face. They will undoubtedly hit on a mother lode topic that requires a time-consuming mining expedition; a spacious agenda allows complex conversations to play out.

LESSONS LEARNED

Reframe the conversation.
Inviting the funder to the table can broaden the conversation. Funders often have multiple grantees in the same space and actively follow trends in the field. As such, they can offer insights and solutions from a bird’s eye view.

Redefine a funder’s idea of program success.
The more a funder understands the inner workings of a program, the less likely they are to over-value indirect or generic measures (e.g., test scores or participation numbers). The workshop can help funders grasp the importance of program-specific indicators that chart a course to program growth and improvement.

Recommit and refocus a funder’s support.

An Interpretation Workshop helped the CMA Foundation shift its focus from musical instruments to teacher professional development in Metro Nashville Public Schools. CMA CEO Sarah Trahern told the Tennessean, “All of our dialogue kept coming back to the importance of music teachers. You can put an instrument in a school, but if nobody knows how to play it, it goes quiet.” (Read the full story)

RAD RESOURCES

  • There are many approaches to group facilitation and meeting design; I use Technology of Participation methods created by the Institute of Cultural Affairs.
  • Want more details on what an Interpretation Workshop looks like? Download an overview.

The American Evaluation Association is celebrating Nonprofit and Foundations TIG Week with our colleagues in the NPF AEA Topical Interest Group. The contributions all this week to aea365 come from our NPFTIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

3 comments

  • Michelle · November 17, 2016 at 5:58 pm

    Dear Dana Powell Russell,

You make an excellent point about how important it is to involve stakeholders in the interpretation of data! In many programs there may be underlying causes or concerns that aren’t brought to light through quantitative data alone, or even errors in the data collected that funders and stakeholders should be able to catch. Reviewing the data with stakeholders increases their commitment to identifying the gaps and then using the findings to improve the program (Shulha & Cousins, 1997).

I am currently in the midst of a process evaluation of a university Careers/Internship Program. I have been collecting and tracking data, including engagement with career prep resources, jobs applied to, interviews attended, etc. The data I was collecting pointed to gaps in the current program, so I decided to involve the students and program staff in reviewing it. Before the meeting, I was left speculating: maybe students are not interested in the positions posted, or maybe students do not find the available career resources valuable. After the meeting, however, the group unanimously agreed that the main obstacle to student engagement was the lack of time students were able to commit (something I would have weighted far less heavily). In addition, this was the first time the staff had heard this feedback directly, and they were able to suggest solutions. This discussion will significantly change my program recommendations.

    From my experience, I wanted to share a few additional suggestions:

  • Ensure that when reviewing the data as a team, you have plenty of quantitative data comparing similar programs, and create visuals where possible to show the gaps. Understand that some members attending the meeting may not have looked at the data ahead of time.
  • If you are looking for a positive change to the program to address a concern (perhaps a change in behavior from a stakeholder), hold this meeting as soon as you can. Involving stakeholders in the evaluation process can create unintended positive results (Alkin & Taut, 2003). I am hopeful that the students will begin investing more time and that the staff will free up time elsewhere. Sharing results like these with funders may even lead to additional support in cases similar to my example.

    Thank you for sharing your Rad resources. I found the link https://icausa.memberclicks.net/ very helpful!

    Warm regards,

    Michelle

Alkin, M. C., & Taut, S. (2003). Unbundling evaluation use. Studies in Educational Evaluation, 29, 1-12.

    Shulha, L., & Cousins, B. (1997). Evaluation use: Theory, research and practice since 1986. Evaluation Practice, 18, 195-208.


  • Kylie Hutchinson · October 12, 2016 at 9:19 am

    This is a great resource Dana, thanks so much.


    • Dana · October 13, 2016 at 11:45 pm

      Thank you so much, Kylie!
