This week is sponsored by our colleagues in the Research, Technology, and Development Evaluation (RTD) TIG. The contributions this week are evergreen posts from RTD TIG members on topics so important they're worth a second read.
-Liz DiLuzio, Lead Curator
Hello! I am Yaw Agyeman, Program Manager at the Lawrence Berkeley National Laboratory. I am joined by my writing partner Kezia Dinelt, Presidential Management Fellow at the U.S. Department of Energy’s Office of Energy Efficiency and Renewable Energy (EERE), to share how EERE developed and institutionalized a rigorous evaluation practice to quantify the impacts of its programs and investments.
Here’s the premise: Imagine you are brought into a federal agency with multiple energy programs, each of them with multiple portfolios encompassing investments in research, development, demonstration, and deployment (RDD&D) projects. Now you’re tasked with developing a rigorous evaluation process. What would you do?
We developed a holistic framework for program evaluation—a systemic approach that borrows from organizational psychology, institutional change, and principles of persuasion. Elements of the framework include:
- Developing resources—guidance and tools for conducting and reviewing evaluation studies, including a guide on program evaluation management, a peer review method guide, a uniform method for evaluating realized impacts of EERE R&D programs, a non-RD&D evaluation method guide, and a quality assurance protocol to guide evaluation practice.
- Providing program evaluation training for organizational staff.
- Developing institutional links with the organization’s technology offices, budget office, communications team, stakeholder engagement team, project management office, and others.
- Developing data collection protocols for ongoing tracking of routine evaluation data.
- Developing an impact results repository and reporting tool for use across the organization.
- Partnering with the technology offices to plan and conduct evaluations involving third-party experts, feed the results back into program improvement, and communicate findings to target stakeholders.
Lessons Learned:
Seeding these pillars of evaluation practice within the federal organization has involved varying applications of the principles of organizational change, which scientists at the Lawrence Berkeley National Laboratory have distilled into a dynamic interaction among the “roles, rules, and tools” for behavioral change within an institution. Implementation has been nonlinear, unfolding in fits and starts over more than eight years. But EERE’s evaluation team successfully built evaluation capacity within EERE by tapping the vast pool of evaluation expertise across the nation to help frame and mold this institutional change.
Over time, the victories have piled up:
1. Nearly one-third of all R&D portfolio investments across EERE have been evaluated, revealing spectacular returns on investment.
2. Program staff are increasingly conversant in the language of evaluation, and there is an active and abiding interest in commissioning evaluations and using the results.
3. The organization has established a set of core evaluation metrics and measures that are adaptable for use by most program investments.
4. The guides and tools developed for evaluation are in regular use.
5. A growing culture of evaluation, fostered by those guides and tools, is leading to innovations in evaluation practice, such as the “Framework for Evaluating R&D Impacts and Supply Chain Dynamics Early in a Product Life Cycle,” the first of its kind in the federal government.

It can be done.
The American Evaluation Association is celebrating Research, Technology and Development (RTD) TIG Week with our colleagues in the Research, Technology and Development Topical Interest Group. The contributions all this week to aea365 come from our RTD TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.