
TRE TIG Week: Results-Based Accountability (RBA): An Innovative Framework to Empower Collaborative Evaluation and Continuous Improvement by Keith Herzog, Jennifer Cooper, and Kristi Holmes

We are Keith Herzog, Jennifer Cooper, and Kristi Holmes, and we are excited to share some lessons learned from our experience implementing Results-Based Accountability (RBA) in two large, interdisciplinary programs: the Northwestern University Clinical and Translational Sciences (NUCATS) Institute and the Chicago Cancer Health Equity Collaborative (ChicagoCHEC), a cancer health equity partnership among three academic centers in Chicago.

Initially developed by Mark Friedman of the Fiscal Policy Studies Institute for governmental/social services sectors, the RBA framework has been implemented across a wide range of sectors and organizations, including the 64 institutions that make up the NIH-funded Clinical and Translational Science Awards (CTSA) Program as part of the NCATS Common Metrics Initiative. Locally, our two evaluation teams collaborated to implement RBA to inform evaluation and continuous improvement, while fostering a community of practice.

Rad Resources: Headline Metrics & Turn the Curve Frameworks

RBA is an intuitive, practical, and broadly applicable framework that empowers interdisciplinary team members to collaboratively identify meaningful, actionable performance metrics that convey the scope and impact of the team’s efforts to key audiences. RBA provides approachable frameworks for identifying program-level performance metrics, also known as headline metrics, and for developing evidence-based Turn the Curve plans to inform strategic management efforts.

At its core, Results-Based Accountability empowers teams to identify program-level performance metrics (headline metrics) by considering three performance measures:

  • How much are we doing?
  • How well are we doing it?
  • Is anyone better off?

These three simple questions enable teams to identify powerful and actionable performance metrics that convey the scope (how much), satisfaction (how well), and impact (better off) of programs and initiatives internally and to key audiences.
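
For readers who like to see the structure made concrete, here is a minimal, purely illustrative Python sketch of how a team might record candidate headline metrics under the three questions. The program and the metrics shown are hypothetical, and nothing in RBA prescribes this (or any) data structure.

```python
from dataclasses import dataclass, field

# Illustrative sketch only -- RBA is a planning framework, not software.
# The program and metrics below are hypothetical examples of how a team
# might sort candidate headline metrics under the three RBA questions.

@dataclass
class HeadlineMetrics:
    program: str
    how_much: list = field(default_factory=list)    # scope: "How much are we doing?"
    how_well: list = field(default_factory=list)    # quality: "How well are we doing it?"
    better_off: list = field(default_factory=list)  # impact: "Is anyone better off?"

pilot_program = HeadlineMetrics(
    program="Hypothetical pilot grant program",
    how_much=["Number of pilot awards funded per year"],
    how_well=["Percent of awardees satisfied with application support"],
    better_off=["Percent of pilots leading to external grant submissions"],
)

print(pilot_program)
```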

The RBA Turn the Curve (TTC) framework then enables teams to work from “ends” to “means” through an evidence-based, step-wise process that improves strategic management, enhances accountability and reporting, and maximizes impact. Through the TTC exercise, teams assess progress to date on a particular metric, identify contributing and constraining factors underlying performance to date (the “story behind the curve”), and brainstorm strategies and partners to leverage contributing factors and/or overcome constraining factors.
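
Again purely as an illustration, the sketch below shows one hypothetical way to capture the elements of a Turn the Curve conversation (progress to date, the story behind the curve, partners, and strategies). The field names and example content are ours, not official RBA terminology.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of what a Turn the Curve discussion might capture;
# field names and example content are illustrative, not official RBA terms.

@dataclass
class TurnTheCurvePlan:
    metric: str
    progress_to_date: str
    story_behind_the_curve: list = field(default_factory=list)  # contributing and constraining factors
    partners: list = field(default_factory=list)                # who could help turn the curve
    strategies: list = field(default_factory=list)              # what to try next

plan = TurnTheCurvePlan(
    metric="Percent of pilots leading to external grant submissions",
    progress_to_date="Flat over the past three years",
    story_behind_the_curve=[
        "Contributing: strong mentorship for early-career awardees",
        "Constraining: little grant-writing support after the pilot ends",
    ],
    partners=["Office of research development", "Prior awardees"],
    strategies=["Add a post-award grant-writing workshop"],
)

print(plan.metric, "->", plan.strategies)
```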

Lessons Learned: RBA Empowers Collaborative Evaluation

Our programs have seen direct benefits of RBA:

  • Fosters collaboration. RBA is straightforward, intuitive, and jargon-free. As a result, we find that colleagues across organizational levels and sectors are quick to embrace RBA and utilize the framework to engage in practical and productive conversations about evaluation and continuous improvement.
  • Broadly applicable. Although initially developed for governmental/social services sectors, RBA is broadly applicable across sectors (including foundations, academic institutions, and grant-funded centers and programs).
  • Flexible. RBA is flexible and may be applied at all levels of an organization, informing both program-specific evaluations and top-level strategic management efforts. This framework was created to foster a living evaluation strategy, meaning performance metrics can evolve as the needs and mission of the organization develop.
  • One tool in the toolbox. Although RBA can stand on its own as a comprehensive framework, its simplicity and flexibility also enable evaluators to combine aspects of RBA with other approaches (e.g., logic models, the balanced scorecard).

The American Evaluation Association is celebrating Translational Research Evaluation (TRE) TIG week. All posts this week are contributed by members of the TRE Topical Interest Group. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 
