
RTD TIG Week: Synthesis Across Copy Cat Studies Helps Answer Really Big Policy Questions by Gretchen Jordan

I’m Gretchen Jordan, a long-time advocate of logic modeling. I do logic models on weekends for fun. My almost 30 years of work as an evaluator of government-funded science, technology, and innovation programs has convinced me that answering the big questions we are asked requires moving toward agreed-upon logical evaluation frameworks, tailored to program type and context, so that we can learn by synthesizing findings across evaluations. Here is an overview of what that means, why it is important, and how a manager would go about implementing the recommendation. My example is for technology programs, but the ideas likely apply to your area of evaluation work as well.

The U.S. Government Accountability Office defines evaluation synthesis as a method that brings together a group of existing evaluation studies, screened for relevance, quality, and strength of evidence, and organizes their data to answer evaluation questions. These questions are often broader in scope, pertaining to larger portfolios and related policy. They may focus on overall effectiveness, identifying which areas have been found to work better or worse than others; on comparisons of programs and portfolios of programs; or on specific program or portfolio features.
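To make the screening step concrete, here is a minimal sketch, purely illustrative rather than drawn from GAO guidance. It assumes each candidate study has already been rated for relevance, methodological quality, and strength of evidence; the ratings, column names, and thresholds are all hypothetical.

```python
# Illustrative sketch only: screening candidate studies before synthesis.
# Ratings, column names, and thresholds are hypothetical, not GAO-defined.
import pandas as pd

candidates = pd.DataFrame([
    {"study": "Study A", "relevance": 5, "quality": 4, "evidence": "strong"},
    {"study": "Study B", "relevance": 2, "quality": 5, "evidence": "strong"},
    {"study": "Study C", "relevance": 4, "quality": 3, "evidence": "moderate"},
])

# Keep only studies that clear minimum bars on all three screening criteria.
screened = candidates[
    (candidates["relevance"] >= 4)
    & (candidates["quality"] >= 3)
    & (candidates["evidence"].isin(["moderate", "strong"]))
]
print(screened["study"].tolist())  # -> ['Study A', 'Study C']
```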

There are persistent challenges to carrying out evaluation to inform policy, challenges that agreed-upon evaluation frameworks and evaluation synthesis can largely overcome. A review by RAND Europe that examined 40 years of studies of how scientific research drives innovation identified these challenges:

  • Apparent contradictions between conclusions due to differences in study design, such as the types of innovations studied and the timeframes considered;
  • Biases in the selection of cases to examine;
  • A lack of clarity and unity in the definitions of explored concepts across studies;
  • Unclear descriptions of study methodology for data collection and analysis;
  • The challenge of setting boundaries for data collection and analysis;
  • Issues of sector idiosyncrasies with respect to innovation processes; and
  • Challenges in impact attribution.

Five Steps to a Synthesis Evaluation to Inform Research and Innovation Policy

1. Agree on the question(s) to be answered by the synthesis evaluation. An example is “What technical, economic, and societal impacts have occurred, and what program and other factors contributed to them?”

2. Design a set of evaluations in such a way that their findings can be credibly synthesized. See the generic logic models, indicators and glossary proposed in “Evaluating Outcomes of Publicly Funded Research…”, the Research, Technology and Development Topical Interest Group 2015 paper.

3. Choose multiple R&D programs and/or program areas to participate with involvement of key stakeholders, particularly program managers.

4. Implement the individual evaluations using the common evaluation design. Ideally there would be as many as 30 individual studies.

5. Put all of the findings of the individual studies, including the data on context, into a database. Conduct a synthesis analysis; a minimal sketch of what that might look like follows below. Present the new evidence and conclusions.
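As one illustration of a synthesis analysis over such a database, here is a minimal sketch. It assumes each study reports a standardized effect estimate and its variance against the common design, and it applies a standard inverse-variance (fixed-effect) pooling rule from meta-analysis. The column names and records are hypothetical; the RTD TIG paper cited in step 2, not this sketch, defines the actual framework.

```python
# Illustrative sketch only: pooling findings from individual studies that
# all used the common evaluation design. Records, column names, and the
# inverse-variance (fixed-effect) pooling rule are assumptions for this
# example, not part of the RTD TIG framework.
import pandas as pd

# One row per individual evaluation, reported against shared definitions.
studies = pd.DataFrame([
    {"study": "A", "program_area": "solar",   "effect": 0.42, "variance": 0.04},
    {"study": "B", "program_area": "solar",   "effect": 0.31, "variance": 0.02},
    {"study": "C", "program_area": "storage", "effect": 0.10, "variance": 0.05},
])

# Weight each study by the inverse of its variance, then pool within each
# program area: pooled effect = sum(weight * effect) / sum(weight).
studies["weight"] = 1.0 / studies["variance"]
studies["weighted_effect"] = studies["effect"] * studies["weight"]
pooled = studies.groupby("program_area")[["weighted_effect", "weight"]].sum()
pooled["pooled_effect"] = pooled["weighted_effect"] / pooled["weight"]
print(pooled[["pooled_effect"]])
```

With roughly 30 studies reporting against the same definitions, the same pooling can be broken out by program feature or context variable, which is what lets the synthesis speak to the broader policy questions named in step 1.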


The American Evaluation Association is celebrating RTD TIG Week with our colleagues in the Research, Technology and Development TIG. All of the blog contributions this week come from our RTD TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
