Hi, we are Catherine Kelly and Jeanette Tocol from the Research, Evaluation and Learning Division of the American Bar Association Rule of Law Initiative (ABA ROLI) in Washington D.C.
Democracy, rule of law, and governance practitioners often speak about the benefits of “holistic” and “systems-oriented approaches” to designing and assessing the effectiveness of programming. Yet in the rule of law community, there is a tendency for implementers, who are often knowledgeable legal experts, to focus on the technical legal content of programs, even if these programs are intended to solve problems whose solutions are not only legal but also political, economic, and social.
While technical know-how is essential for quality programming, we have found that infusing other types of expertise into rule of law programs and evaluations helps generate more accurate learning about the wide range of conditions that affect whether desired reforms occur. Because of their state- and society-wide scope, systems-based approaches are particularly helpful for structuring programs in ways that improve their chances of gaining local credibility and sustainability.
Hot Tip #1: Holistic program data collection should include information on alternative theories of change about the sources of the rule of law problems a program seeks to solve. For instance, theories of change about judicial training are often based on the assumption that a lack of legal knowledge is what keeps judicial actors from advancing the rule of law. A holistic, systems-oriented analysis of justice sector training programs requires gathering program data beyond the data that facilitates analysis of improvements in, for example, training participants’ knowledge, which is theorized to improve their enforcement of the law. Additional data on other factors likely to influence the rule of law reforms sought through the program, such as judges’ perceptions of pressure from the executive branch to take certain decisions or citizens’ perceptions of the efficacy of formal justice institutions, should also be gathered. Analyzing such data can facilitate adaptive learning about whether the favored factor in a program’s theory of change is the one that most strongly correlates with the desired program outcomes, or whether alternative factors are more influential.
Hot Tip #2: Multidisciplinary methods add density and richness to DRG research. This enhances the rigor with which evaluators can measure outcomes and illustrate a program’s contributions to long-term objectives. Multidisciplinary work often combines the depth of qualitative understanding with the reach of quantitative techniques. These useful but complex approaches are sometimes set aside in favor of less rigorous evaluation methods due to constraints of time, budget, or expertise. Holistic research does indeed require an impressive combination of activities: unearthing documentary sources from government institutions (where available), conducting interviews with a cross-section of actors, surveying beneficiaries, and analyzing laws. Participatory evaluations are useful in this context: they bring diverse stakeholders, beneficiaries, and program analysts into productive, interdisciplinary, and intersectional conversations.
The American Evaluation Association is celebrating Democracy & Governance TIG Week with our colleagues in the Democracy & Governance Topical Interest Group. The contributions all this week to aea365 come from our DG TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.