DEOET TIG Week: Michael Culbertson on Getting Beyond Satisfaction with a Rubric for Webinar Quality

Hello! I’m Michael J. Culbertson, team member on the Technical Evaluation Assistance in Mathematics and Science (TEAMS) project and research associate at RMC Research in Denver. We at TEAMS noticed that most webinar evaluations rely almost exclusively on post-webinar surveys of participant satisfaction. It’s true that satisfaction is an important indicator of webinar quality, but we felt that evaluators also need a more comprehensive picture of the different aspects of quality that come together in a great webinar.

We devised a rubric that covers seven key components:

  • Recruitment: Exciting participants before the event begins.
  • Technology: Wrangling the software and hardware that bring us together.
  • Content: Starting with a solid foundation.
  • Organization: Knitting everything together to make sense.
  • Delivery: Conveying a captivating message.
  • Visual Aids: Stimulating both sides of the brain.
  • Participant Interaction: Bringing the best out of the people in the virtual room.

We welcome feedback on the rubric and on whether it was helpful (or not so helpful) in your own project or evaluation! Tweet @teams4msp or visit the TEAMS project website to get in touch!

Lesson Learned: Make sure the technology is working. It is too easy for participants at their desks to get distracted, check an email, and 10 minutes later decide they are hopelessly lost.

Lesson Learned: A bad presentation makes for a bad webinar, but a good in-person presentation doesn’t necessarily translate directly into a good webinar without some modification. The webinar provides unique opportunities to connect at a distance, but also demands attention to flow, timing, imagery, content, and interaction.

Lesson Learned: One key quality of webinars is the potential for participants to interact with one another and with the presenters. The rubric examines whether each component of the webinar (from invitations to software) supports or hinders participant engagement.

Cool Trick: The TEAMS Webinar Rubric package includes a user’s guide, the rubric itself, and a set of considerations and self-reflection questions for webinar developers and evaluators.

Rad Resource: Check out 18 Tips on How to Conduct an Engaging Webinar for quick ways to up your webinar game.

Rad Resource: To put on a great webinar from start to finish, look at Best Practices for Webinars, which provides a thorough tour and great examples.

The American Evaluation Association is celebrating DEOET TIG Week with our colleagues in the Distance Education and Other Educational Technologies Topical Interest Group. The contributions all this week to aea365 come from our DEOET TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

1 thought on “DEOET TIG Week: Michael Culbertson on Getting Beyond Satisfaction with a Rubric for Webinar Quality”

  1. Hi Michael,

I agree with you that the delivery of the speaker and the interaction with the audience have a large impact on the quality of a webinar.

I’ve found that if I start with an icebreaker and prompt engagement early in the webinar by asking open-ended questions and using polls, I end up getting much more participation from the audience.

    I’ve put on more than 500 webinars with my team and have written down what has helped me. Thought you might find it helpful: https://danielwaas.com/expert-webinar-tips-and-tricks-for-presenters/

    Daniel
