Jean King and Laura Pejsa, Minnesota Evaluation Studies Institute (MESI), here, with broad smiles on our faces. We are the proud coaches wrapping up this week of posts written by our creative student consultants about ways to evaluate a conference (using exit surveys of Ignite sessions, network visualization, Twitter, and video clips). Progressive educators long ago documented the value of experiential learning, or "learning by doing," and our experiences during this year's AEA conference again support that idea as a means of teaching evaluation. Thoughts on how to use a conference setting to engage evaluation students follow.
Hot Tips:
- Create an evaluation team. Our experience at MESI confirms the value of having students collaborate on projects. Not only do they learn how to do evaluation tasks, but they also learn how to collaborate, an important skill set for evaluators, regardless of their eventual practice.
- Encourage innovation. Our charge was to think broadly about conference evaluation. At our first meeting, students brainstormed many possible ways to collect data at the conference, no holds barred; the more creative, the better. As we sought to be "cutting edge," technology played a role in each of the four methods selected.
- Make assignments and hold people accountable. Social psychology explains the merit of interdependence when working on a task. We divided into four work groups, each of which operated independently, touching base with us as needed. Work groups knew they were responsible for putting their process together and being ready at the conference. As coaches, we did not micromanage.
- Make the process fun. University of Minnesota students take evaluation seriously, but their conference evaluation work generated a great deal of laughter. In one sense it was high-stakes evaluation work (we knew people would use the results), but without the pressure of a full-scale program evaluation.
Lessons Learned:
- Students can learn the evaluation process by collecting data at a conference or other event. Short-term events offer an evaluation venue with multiple data-collection opportunities and fewer complexities than a full-scale educational or social program.
- A week-long conference offers numerous opportunities to engage in creative data collection. It is a comparatively low-stakes operation since most conference organizers opt for the traditional post-conference “happiness” survey, and any data gathered systematically will likely be of value.
- Innovative data collection can generate conversation at an evaluation conference. Many people interacted with the students as they collected data. Most were willing to engage in the process.
- Minnesota evaluation students really are above average. Garrison Keillor made this observation about Minnesota’s children in general, but this work provided additional positive evidence.
We’re learning all this week from the University of Minnesota Innovative Evaluation Team from Evaluation 2012. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.