Hi! I’m Jodie Galosy, Ph.D., an internal evaluator at the Knowles Teacher Initiative, a nonprofit organization that supports a national network of approximately 450 mathematics and science educators in the US. Each year, we choose a cohort of about 35 beginning high school math and science teachers to participate in our five-year Teaching Fellows program that emphasizes collaborative study of teaching and teacher leadership development. Knowles Teaching Fellows who complete the five-year program become Senior Fellows who continue to participate in the Knowles community and receive support for education leadership and improvement.
The internal evaluation component of our programs is a joint effort between evaluation and program staff to monitor, improve, and measure the impact of Knowles. One structure we use for collaborative data analysis is something we call a “data huddle,” held at the end of each evaluation cycle during the year (summer, fall, and spring). During the data huddle, evaluation and program staff/leadership review data together and draw implications for practice. I plan the data huddle with program staff and facilitate the huddle process. As we have tried out the process together, we’ve landed on some key elements that make data huddles an evaluation plus.
Key elements of a data huddle:
- Focusing question: We use a focusing question to structure the data discussion. A good focusing question is related to a program goal and phrased as a yes-or-no question. For example, one of our program goals for Fellows’ spring work in year 4 of the Fellowship is: Fellows develop teacher leadership skills and dispositions to support collaboration with school colleagues. The focusing question for this year’s spring data huddle is: Are Fellows able to describe what they have learned about teacher leadership skills for collaborating and working with colleagues in their local context? An important note: The focusing question is agreed upon at the beginning of the evaluation cycle, so that we are sure to have the data needed to answer it at the huddle.
- Curated data: We carefully select and prepare the data we review for the focusing question so that it is manageable, informative, and provocative. Whenever possible, we include multiple data sources, such as quantitative survey responses, Fellows’ reflections, or program artifacts. We decide prior to the data huddle what data to use, whether preliminary analysis is needed and who will do it, and what analysis questions we will use. Well-curated data gives us multiple perspectives on the focusing question and doesn’t take more than about 15 minutes to explore in the huddle.
- Claims and evidence: We answer the focusing question with claims backed by evidence from the data. We often do this work during the huddle in small groups, with each group working either with the same set of data or with data from a different source. For example, after reviewing the data, a claim in response to the focusing question above might be: Most Fellows are able to describe leadership dispositions, but fewer describe leadership skills. The evidence might include Fellows’ survey ratings about their leadership skills/dispositions and excerpts from Fellows’ plans to collaborate with school colleagues.
- Protocol: We use a protocol for each focusing question taken up in the data huddle. The protocol includes: overview (to review the focusing question, background information, and data sources/analysis); data review; discussion; and implications. Each segment of the protocol is timed, and the entire process for each question is limited to approximately 30 minutes. The protocol helps keep participants on track and ensures we come away with insights and action steps.
We really enjoy huddling up around data and have found the structure very useful for collaborative evaluation! Feel free to contact me with questions or for further information: email@example.com.
The American Evaluation Association is hosting Internal Evaluation (IE) Topical Interest Group Week. The contributions all this week to AEA365 come from our IE TIG members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.