Hello! I am Asher Beckwitt, Research and Evaluation Project Director with Ripple Effect, and I have expertise in qualitative methods including interviews, focus groups, and content analysis. Hi! I am Alexis (Lex) Helsel, Lead Program Evaluator with Ripple Effect, and I have extensive experience in quantitative methods including survey design and analysis, administrative data analyses, and bibliometric analysis. Today we are sharing some tips based on our complementary expertise and experience integrating mixed methods in support of large- and small-scale holistic evaluations of federally funded biomedical research programs.
Incorporating multiple and mixed analytic methods into your evaluations helps to provide a comprehensive understanding of program outcomes and impacts. Each method has its own strengths and limitations, and using multiple methods helps offset the limitations of any single method. Comparing and contrasting findings across multiple methods is referred to as triangulation. Triangulating findings across methods allows evaluators to demonstrate the validity of data, enhance confidence in the results, and gain deeper insight into the data from multiple perspectives.
- Hot Tip: Start from the beginning with your evaluation questions and choose the methods best suited to address them. Use an evaluation matrix that maps which methods address each evaluation question, so you can confirm there is sufficient coverage to allow for triangulation. Below is a snapshot of a sample evaluation matrix that we create for these purposes.
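To make the matrix idea concrete, here is a minimal sketch in Python of the kind of check such a matrix supports. All question and method names below are hypothetical placeholders, not from an actual evaluation; the point is simply that once questions are mapped to methods, you can flag any question covered by too few methods to triangulate.

```python
# Hypothetical evaluation matrix: each evaluation question is mapped to the
# methods planned to address it. Names are illustrative only.
evaluation_matrix = {
    "EQ1: Did the program expand the research workforce?": [
        "survey", "administrative data analysis",
    ],
    "EQ2: Did funded projects advance the scientific field?": [
        "bibliometric analysis", "interviews",
    ],
    "EQ3: How did participants experience the program?": [
        "interviews",  # only one method -- flagged by the check below
    ],
}

def coverage_gaps(matrix, minimum=2):
    """Return the questions mapped to fewer than `minimum` methods,
    i.e., questions where triangulation is not yet possible."""
    return [q for q, methods in matrix.items() if len(methods) < minimum]

for question in coverage_gaps(evaluation_matrix):
    print(f"Needs another method: {question}")
```

In practice this lives in a spreadsheet rather than code, but the logic is the same: every row (question) should have at least two columns (methods) checked before data collection begins.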
- Lesson Learned: Schedule regular weekly or bi-weekly check-ins with all team members supporting the evaluation to share findings and insights that inform ongoing data collection and analysis. Don’t wait until the end of the project, when it’s time to write the report, to bring everything together! Triangulation is most effective and meaningful as an ongoing process, not a throwaway final step.
- Hot Tip: Organize reports and findings according to evaluation questions rather than by separate methodologies. Help your audience understand the findings by using visualizations to show how data from one method supports data from other methods. Explore and try to explain why any findings contradict the others; interviews with stakeholders, or having them review your results, can be a great resource for interpreting the findings and investigating any discrepancies.
- Hot Tip: If you don’t have the in-house expertise, skills, or resources to include multiple methods, you can compare and contrast your findings with the published literature. Are they the same or different? If they are different, ask why. This can provide additional evidence and support in the absence of multiple methods, and can also serve as an additional resource for understanding any discrepant findings.
The American Evaluation Association is celebrating RTD TIG Week with our colleagues in the Research Technology and Development TIG. All of the blog contributions this week come from our RTD TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to firstname.lastname@example.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.