Wendy Tackett and Joseph Trommater on Local Evaluation Capacity Building
Posted by jgothberg in Independent Consulting, Organizational Learning and Evaluation Capacity Building, Prek-12 Educational Evaluation
Hi! I’m Wendy Tackett, President of iEval, an evaluation consulting firm in Michigan, and I’m Joseph Trommater, Project Director for S.P.A.R.K.S., a 21st Century Community Learning Centers (21st CCLC) program in northern Michigan funded through the Michigan Department of Education. iEval has served as the external evaluator for S.P.A.R.K.S. for ten years. The S.P.A.R.K.S. program operates on the belief that evaluation should be part of the way we do business and should drive change. In fact, when the original Project Director was hired, she was given Wendy’s business card as a starting point. Carpe diem!
Lesson Learned: When you build evaluation into the process from the very beginning, it becomes a necessary part of the program instead of an “add-on” later. Staff and clients see the integration of evaluation as seamless, which means they’re more likely to respond to requests for data (e.g., surveys, interviews, focus groups) and to be open and honest in their sharing. When the evaluation team for 21st CCLC conducted its first statewide surveys of afterschool staff, several questions asked about staff knowledge and use of evaluation. Our program sites scored low compared to the rest of the state. When we dug into those results, we found that staff simply didn’t label their data collection and data use as “evaluation” – they saw it as an integral part of their work. Results from this intentional focus on using data show improvements in student attendance, behavior, and academic achievement.
Hot Tip: Instead of sharing evaluation data in huge reports and then walking away, we take the time to plan carefully how the data will be used. We receive three primary evaluation reports throughout the year from iEval, as well as supplemental evaluation data from the state evaluation and technical assistance teams. That’s a lot of data! We go through the reports when we first receive them, but we’ve also created a data calendar for the year. Each month we guide staff to operationalize the data around a specific theme (e.g., student recruitment, student retention, academic achievement), and we identify where in each report they can find supporting data for that theme. We discuss the theme at staff meetings and set small monthly goals that build toward our overall goal of students being successful in school. The use of data to improve programs is always on our staff meeting agendas!
Rad Resources:
- The “21st Century Community Learning Centers Local Evaluator Guide – Second Edition,” by Michigan’s Local Evaluator Advisory Committee, offers tips for working with your local evaluator in your afterschool program.
- The Evaluation Exchange, from the Harvard Family Research Project, has great information, including a section on participatory evaluation.
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to firstname.lastname@example.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.