AEA365 | A Tip-a-Day by and for Evaluators

July 10, 2012

MME Week: Leanne Kallemeyn, Daniela Schiazza, and Ann Marie Ryan on Using Mixed Methods to Conduct Integrated Data Analysis

Our evaluation team from the Loyola University Chicago School of Education (Leanne Kallemeyn, Assistant Professor; Daniela Schiazza, doctoral candidate and project coordinator; and Ann Marie Ryan, Associate Professor) has been working on an external evaluation of a U.S. Department of Education Teaching American History grant. For two years, we used mixed methods both to report GPRA indicators (surveys, tests, tracking databases) and to address the information needs of program providers (interviews, observations, case studies). We found that historians appreciated the approach of building arguments about implementation and impact from both qualitative and quantitative data sources, but we were mixing only at the level of interpretation. So we experimented with conducting an integrated data analysis. Daniela used this opportunity to document and study our process for her dissertation.

Lessons Learned:

  • Develop the question appropriately. Develop an evaluation question that points to the evaluand and requires both qualitative and quantitative methods to address it.
  • Decide the purpose. Decide on the mixed methods purpose (refer to Chapter 6 in Greene’s book).
  • Use visual tools. Utilize a Venn diagram to display the overlapping and unique facets of the program that the qualitative and quantitative methods will address to guide integrated analyses.  Refer to it often.
  • Analyze carefully. Initially analyze each data source based on techniques in its own tradition. Organize preliminary findings by evaluation question(s), displaying qualitative and quantitative data side-by-side to engage in iterative reflections. Include stakeholders in these reflections, as they gain valuable insights.
  • Expect dissonance. Do not be concerned when quantitative and qualitative results fail to corroborate one another. Dissonance provides an opportunity to explore why the conflict exists, which can lead to new insights. We found dissonance especially helpful during preliminary data analysis.
  • Map findings. When conducting a final integrated data analysis, consider ways in which the findings from one method can be mapped to the findings of the other method. For example, we had four case studies of teachers. We conducted a cluster analysis of survey responses from all participating teachers, then identified which survey cluster each case-study participant fell within.
  • Be patient and creative. There are no roadmaps for integrated data analysis, and not every analytic approach will yield useful results. For example, compared with the cluster analysis, we did not find it as helpful to quantify codes from the case studies and compare them to survey responses.
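The cluster-to-case mapping described above can be sketched in code. This is a minimal illustration only: the post does not say what clustering software, survey instrument, or settings the team used, so the invented Likert-style data, teacher IDs, and the hand-rolled k-means below are all hypothetical stand-ins.

```python
import numpy as np

def kmeans(X, k, iters=50):
    """Toy k-means with deterministic farthest-point initialisation."""
    centers = [X[0]]
    for _ in range(k - 1):
        # next center: the point farthest from all centers chosen so far
        d = np.min([np.linalg.norm(X - c, axis=1) for c in centers], axis=0)
        centers.append(X[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(iters):
        # assign each respondent to the nearest cluster center
        dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # move each center to the mean of its assigned respondents
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Hypothetical survey responses, one row per teacher, five items each.
rng = np.random.default_rng(1)
survey = np.vstack([
    rng.normal(2.0, 0.3, size=(20, 5)),   # e.g., a lower-implementation profile
    rng.normal(4.0, 0.3, size=(20, 5)),   # e.g., a higher-implementation profile
])
teacher_ids = [f"T{i:02d}" for i in range(len(survey))]

labels = kmeans(survey, k=2)
cluster_of = dict(zip(teacher_ids, labels))

# Locate the four case-study teachers (hypothetical IDs) within the clusters.
case_study_teachers = ["T03", "T11", "T25", "T37"]
for t in case_study_teachers:
    print(t, "-> survey cluster", cluster_of[t])
```

Reading each case study alongside its cluster's survey profile is what lets the qualitative findings be interpreted in light of where that teacher sits within the full sample.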

The American Evaluation Association is celebrating Mixed Method Evaluation TIG Week. The contributions all week come from MME members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
