My name is Susan Kistler. I am AEA’s Executive Director and I contribute each Saturday’s post to aea365. I sent the final version of the Evaluation 2010 hardcopy conference program off to the printers. After days of typing, proofing, paginating, and picking nits, I want to say a big ‘thank you’ to all of the staff who worked so diligently to pass this annual milestone. Now, it’s time to lighten the day and let off a little steam. But we’re still at work, so I’m going to focus this week’s post on evaluation humor. If you are receiving this via email, you may need to click through to the site to see the cartoons, but it is worth the trip!
Hot Tip – From the Archives: Alexey Kuzmin reminded us back in January of the value of humor for evaluation reporting (see his post). Humor is also a great icebreaker and can offer a ‘breather’ and transition during presentations.
Hot Tip: Patricia Rogers and Jane Davidson’s Genuine Evaluation Blog regularly features the “Friday Funny” – always worth a look.
Hot Tip – Dilbert: This past week, Dilbert featured a great strip on the “Enhanced Assessment Methodology.”
Hot Tip – Wondermark: I love the 19th century woodcuts that serve as the basis for the Wondermark comic. And Wondermark’s talented author, David Malki ! (yes, the exclamation point is part of the name) allows embedding of his work in blogs. He has also been kind enough to note via email: “You may consider my online archive free for use by your members or yourself in your presentations, so long as attribution is given.” Regular readers know of my love of a good font (see July post) when developing reports and presentations. In August, David shared this take on fonts…er, typefaces:
I’ll return to the issue of humor in a later post to tell you about two other strips of interest and a project we’re working on at AEA to bring a little levity to us all.
The above represents my own interests and ideas and not necessarily that of the American Evaluation Association.
Posted by Susan Kistler in Collaborative, Participatory and Empowerment Evaluation, College Access Programs, Evaluation Managers and Supervisors
My name is Michelle Jay and I am an Assistant Professor at the University of South Carolina. I am an independent evaluator and also an evaluation consultant with Evaluation, Assessment and Policy Connections (EvAP) in the School of Education at UNC-Chapel Hill. Currently, Rita O’Sullivan and I serve as Directors of AEA’s Graduate Education Diversity Internship (GEDI) program.
Lessons Learned: A few years ago, EvAP served as the external evaluators for a federally-funded Gaining Early Awareness and Readiness for Undergraduate Programs (GEAR UP) state-wide grant housed at University of North Carolina (UNC) General Administration. Part of our work involved assisting project coordinators in 20 North Carolina counties to collect student-level data required for their Annual Performance Review reports as well as for program monitoring, assessment, and improvement. For various reasons, project coordinators experienced numerous difficulties in obtaining the necessary data from their Student Information Management Systems (SIMS) administrators at both the school and district levels. As collaborative evaluators, we viewed the SIMS administrators not only as “keepers of the keys” to the “data kingdom,” but also as potentially vested program stakeholders whose input and “buy-in” had not yet been sought.
Consequently, in an effort to “think outside the box,” the EvAP team seized an opportunity to help foster better relationships between our program coordinators and their SIMS administrators. We discovered that the administrators often attended an annual conference for school personnel. The EvAP team sought permission to attend the conference, where we sponsored a boxed luncheon for the SIMS administrators. During the lunch, we provided them with an overview of the GEAR UP program and its goals, described our role as the evaluators, and explained in detail how they could contribute to the success of their districts’ programs by providing the important data needed by their district’s program coordinator.
The effects of the luncheon were immediate. Program coordinators who had previously experienced difficulty getting data had it on their desks later that week. Over the course of the year, the quality and quantity of the data the EvAP team obtained from the coordinators increased dramatically. We were extremely pleased that the collaborative evaluation strategies that guided our work had served us well in an unanticipated fashion.
Hot Tip: The data needs of the programs we serve as evaluators can sometimes seem daunting. In this case, we learned that fixing “the problem” was less a data-related matter than it was a “marketing” issue. SIMS administrators, and other keepers-of-the-data, have multiple responsibilities and are under tremendous pressure to serve multiple constituencies. Sometimes, getting their support and cooperation is merely a matter of making sure they are aware of your particular program, the kinds of data you require, and the frequency of your needs. Oh, and letting them know they are appreciated doesn’t hurt either.