AEA365 | A Tip-a-Day by and for Evaluators

Hello, I am Maxine Gilling, Research Associate for Gaining Early Awareness and Readiness for Undergraduate Programs (GEAR UP). I recently completed my dissertation entitled How Politics, Economics, and Technology Influence Evaluation Requirements for Federally Funded Projects: A Historical Study of the Elementary and Secondary Education Act from 1965 to 2005. In this study, I examined the interaction of national political, economic, and technological factors as they influenced the concurrent evolution of federally mandated evaluation requirements.

Lessons Learned:

  • Program evaluation does not take place in a vacuum. The field and profession of program evaluation have grown and expanded over the past four decades and eight administrations in response to political, economic, and technological factors.
  • Legislation drives evaluation policy. The Elementary and Secondary Education Act (ESEA) of 1965 established policies to provide “financial assistance to local educational agencies serving areas with concentrations of children from low-income families to expand and improve their educational program” (Public Law 89-10, April 11, 1965). The legislation had another consequence as well: it helped establish educational program evaluation and the field of evaluation as a profession.
  • Economics influences evaluation policy and practice. For instance, in the 1980s, evaluation activity declined under stringent economic policies, and program evaluators turned to documenting lessons learned in journals and books.
  • Technology influences evaluation policy and practice. The rapid emergence of new technologies contributed to changing the goals, standards, methods, and values underlying program evaluation.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · · · · · · · ·

Our evaluation team from the Loyola University Chicago School of Education (Leanne Kallemeyn, Assistant Professor; Daniela Schiazza, doctoral candidate and project coordinator; and Ann Marie Ryan, Associate Professor) has been working on an external evaluation of a U.S. Department of Education Teaching American History grant. For two years, we used mixed methods to report GPRA indicators (surveys, tests, tracking databases) and to address the information needs of program providers (interviews, observations, case studies). We found that historians appreciated the approach of building arguments about implementation and impact from both qualitative and quantitative data sources, but we were only mixing at the level of interpretation. So we experimented with conducting an integrated data analysis. Daniela used this opportunity to document and study our process for her dissertation.

Lessons Learned:

  • Develop the question appropriately. Develop an evaluation question that points to the evaluand and requires both qualitative and quantitative methods to address it.
  • Decide the purpose. Decide on the mixed methods purpose (refer to Chapter 6 of Greene’s Mixed Methods in Social Inquiry).
  • Use visual tools. Use a Venn diagram to display the overlapping and unique facets of the program that the qualitative and quantitative methods will address, and let it guide the integrated analyses. Refer to it often.
  • Analyze carefully. Initially, analyze each data source using techniques from its own tradition. Organize preliminary findings by evaluation question(s), displaying qualitative and quantitative data side by side to support iterative reflection. Include stakeholders in these reflections, as they can offer valuable insights.
  • Expect dissonance. Do not be concerned when quantitative and qualitative results fail to corroborate one another. Dissonance provides an opportunity to explore why the conflict exists, which can lead to new insights. We found dissonance especially helpful during preliminary data analysis.
  • Map findings. When conducting a final integrated data analysis, consider ways in which the findings from one method can be mapped to the findings of the other. For example, we had four case studies of teachers. We conducted a cluster analysis of survey responses from all participating teachers, then identified which survey cluster each case study participant was situated within (see the sketch after this list).
  • Be patient and creative. There are no roadmaps for integrated data analysis, and not every analytic approach will yield useful results. For example, in comparison to the cluster analysis, we did not find it as helpful to quantify codes from the case studies and compare them to survey responses.
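
For readers who want a concrete starting point, here is a minimal sketch of the cluster-to-case mapping described above. It assumes hypothetical survey data in a CSV file with invented column names and teacher IDs, and it uses scikit-learn's KMeans purely for illustration; the original analysis may have relied on different clustering techniques and variables.

```python
# A minimal, hypothetical sketch of mapping case-study teachers onto survey
# clusters. File name, column names, cluster count, and teacher IDs are all
# invented for illustration; they are not from the original evaluation.
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Survey responses from all participating teachers, one row per teacher.
surveys = pd.read_csv("teacher_surveys.csv", index_col="teacher_id")
items = ["content_knowledge", "primary_source_use", "instructional_change"]

# Standardize the survey items so no single scale dominates the clustering.
X = StandardScaler().fit_transform(surveys[items])

# Cluster all survey respondents (three clusters is an arbitrary choice here).
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
surveys["cluster"] = kmeans.fit_predict(X)

# Locate the four case-study teachers within the survey clusters.
case_study_ids = ["T04", "T11", "T23", "T37"]  # hypothetical IDs
print(surveys.loc[case_study_ids, "cluster"])
```

Juxtaposing each case-study teacher's cluster membership with the qualitative case findings is one way to examine whether the two strands of data tell a consistent story.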

The American Evaluation Association is celebrating Mixed Method Evaluation TIG Week. The contributions all week come from MME members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
