AEA365 | A Tip-a-Day by and for Evaluators


Hello. I am Angie Aguirre from the INDEX program of the Eunice Kennedy Shriver Center at the University of Massachusetts Medical School. INDEX specializes in designing websites, online courses, learning-management systems, and online databases, all of which are accessible to people with disabilities. Many of you are developing similar products as part of your evaluation efforts. At INDEX, accessibility comes first! Since my colleagues and I concentrated on accessibility issues in a previous blog (see here), I’m continuing with that theme!

Webinars enable you to present, lecture, or deliver a workshop over the web. They incorporate audio and visual elements, and can sometimes include audience interaction. It can’t be asked too often: what makes your webinar accessible? And it can’t be said enough that accessibility supports people with disabilities and promotes a culture of inclusion. The more people we can bring to the table, the better our evaluation efforts and the better we become as a society. Moreover, it’s the law!

Whatever kind of webinar you’re providing and whoever your audience is, ask participants at registration whether they need accommodations.

Hot Tips: Choosing the Right Platform

Several features are needed for a webinar platform to be accessible. Be sure to look for:

  • integrated captioning;
  • screen reader compatibility; and
  • multiple ways of communicating with and engaging participants.

Providing Accommodations

  • For Auditory
    • Use Remote CART (Communication Access Real-time Translation). It is a service in which a certified CART provider listens to the webinar presenter and participants, and instantaneously translates all the speech to text. Most CART services are familiar with various types of webinar platforms, and can walk you through set-up.
    • If you are showing a video, be sure you provide captions.
  • For Visual
    • Webinar platform controls should be operable with keyboard commands alone.
    • All content should be readable by a screen reader, including the text content of PowerPoint slides (see the sketch after this list).
    • Provide accessible copies of the entire presentation, including handouts, before the webinar. This enables webinar participants to review the information ahead of time so they can focus on listening to the presenters.
  • For Cognitive
    • Provide a way for participants to respond verbally by phone/microphone, or by typing in a chat pod.
    • Participants should have the ability to:
      • use the caption pod and adjust it to their liking;
      • listen to the recorded content at a later time; and
      • control the speed at which content is delivered (presenters/moderators may need to slow down a bit).
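
Hot Tip: One quick way to check the “readable by a screen reader” point above is to scan your slide deck for slides that contain no machine-readable text at all, since text baked into images is invisible to screen readers. Below is a minimal sketch using the python-pptx library; the file name is hypothetical, and this is a rough heuristic, not a substitute for a real accessibility review.

```python
# Flag slides with no machine-readable text, using the python-pptx library.
# A heuristic sketch only: slides made entirely of images (e.g., screenshots
# of text) will be flagged, since a screen reader has nothing to announce.
from pptx import Presentation

def slides_without_text(path: str) -> list[int]:
    """Return 1-based numbers of slides that expose no extractable text."""
    prs = Presentation(path)
    flagged = []
    for number, slide in enumerate(prs.slides, start=1):
        texts = [shape.text_frame.text.strip()
                 for shape in slide.shapes if shape.has_text_frame]
        if not any(texts):
            flagged.append(number)
    return flagged

print(slides_without_text("webinar_deck.pptx"))  # hypothetical file name
```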


Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Stephanie Evergreen and I am the eLearning Initiatives Director at the American Evaluation Association and the stand-in for Susan Kistler today. I am the host of our Coffee Break Demonstrations, 20-minute webinars designed to teach about a tool or resource of use to evaluators.

Last month our Coffee Break lineup featured a series of webinars on international monitoring and evaluation. It was cosponsored by Catholic Relief Services, American Red Cross, USAID, and AEA’s International and Cross-Cultural Topical Interest Group. Each week, Scott Chaplowe, Guy Sharrock, Susan Hahn, or Alice Willard walked the audience through a set of tools to help evaluators and evaluation managers successfully plan their work.

But our audience was a little different this time around. While our usual Coffee Break webinars are only accessible to AEA members, for this series we opened registration to the general public. And some cool things happened:

  • our average audience size more than doubled (no small feat in vacation season),
  • we consistently attracted viewers from more than 15 countries, and
  • our international attendance was 4 times higher than normal.

Rad Resource: The recordings of each webinar, though normally also a benefit of AEA membership, are available for anyone to view. The ICC TIG has graciously posted all 4 of them in one location for your convenience.

Rad Resource: The webinars were based on a complete set of modules originally developed by Catholic Relief Services and American Red Cross with funding from USAID. Even if your field of focus is not international monitoring and evaluation, take a look: I was impressed by the versatility of the materials and by how easily module content could be adapted to a variety of evaluation contexts.

Hot Tip: Some of these awesome presenters will be discussing international monitoring and evaluation at greater length at the conference. Get in on the conversation with Guy Sharrock and Alice Willard at their panel session or stop in to the multipaper session where Scott Chaplowe is discussant.


Hi, we are Tom Archibald and Jane Buckley with the Cornell Office for Research on Evaluation. Among other initiatives, we work in partnership with non-formal educators to build evaluation capacity. We have been exploring the idea of evaluative thinking, which we believe is an essential, yet elusive, ingredient in evaluation capacity building (ECB). Below, we share insights gained through our efforts to understand, describe, measure, and promote evaluative thinking (ET)—not to be confused with the iconic alien!

Lesson Learned: From evaluation

  • Michael Patton, in an interview with Lisa Waldick from the International Development Research Centre (IDRC), defines evaluative thinking as a willingness to ask: “How do we know what we think we know? … Evaluative thinking is not just limited to evaluation projects…it’s an analytical way of thinking that infuses everything that goes on.”
  • Jean King, in her 2007 New Directions for Evaluation article on developing evaluation capacity through process use, writes “The concept of free-range evaluation captures the ultimate outcome of ECB: evaluative thinking that lives unfettered in an organization.”
  • Evaluative thinkers are not satisfied with simply posing the right questions. According to Preskill and Boyle’s multidisciplinary model of ECB in the American Journal of Evaluation in 2008, they possess an “evaluative affect.”

Lesson Learned: From other fields

Notions related to ET are common in both cognitive research (e.g., evaluativist thinking and metacognition) and education research (e.g., critical thinking), so we searched the literature in those fields and came to define ET as comprising:

  • Thinking skills (e.g., questioning, reflection, decision making, strategizing, and identifying assumptions), and
  • Evaluation attitudes (e.g., desire for the truth, belief in the value of evaluation, belief in the value of evidence, inquisitiveness, and skepticism).

Then, informed by our experience with a multi-year ECB initiative, we identified five macro-level indicators of ET:

  • Posing thoughtful questions
  • Describing and illustrating thinking
  • Active engagement in the pursuit of understanding
  • Seeking alternatives
  • Believing in the value of evaluation

Rad Resource: Towards measuring ET

Based on these indicators, we have begun developing tools (scale, interview protocol, observation protocol) to collect data on ET. They are still under development and have not yet undergone validity and reliability testing, which we hope to accomplish in the coming year. You can access the draft measures here. We value any feedback you can provide us about these tools.
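
To make the scale idea concrete, here is a minimal sketch of how Likert-type responses could be rolled up into the five macro-level indicators listed above. The item-to-indicator mapping and the 1-5 response format are purely illustrative assumptions, not the draft instrument itself.

```python
# A sketch of scoring a hypothetical evaluative-thinking (ET) scale:
# average each respondent's 1-5 item ratings within each indicator.
from statistics import mean

# Illustrative mapping of scale items to the five macro-level indicators.
INDICATORS = {
    "posing thoughtful questions": ["q1", "q2"],
    "describing and illustrating thinking": ["q3", "q4"],
    "active pursuit of understanding": ["q5", "q6"],
    "seeking alternatives": ["q7", "q8"],
    "believing in the value of evaluation": ["q9", "q10"],
}

def score_respondent(responses: dict) -> dict:
    """Return the mean item rating per indicator for one respondent."""
    return {indicator: mean(responses[item] for item in items)
            for indicator, items in INDICATORS.items()}

example = {"q1": 4, "q2": 5, "q3": 3, "q4": 4, "q5": 2,
           "q6": 3, "q7": 5, "q8": 4, "q9": 4, "q10": 5}
print(score_respondent(example))
```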

Rad Resource: Towards promoting ET

One way we promote ET is through The Guide to the Systems Evaluation Protocol, a text that is part of our ECB process. It contains some activities and approaches which we feel foster ET, and thus internal evaluation capacity, among the educators with whom we work.


Tom and Jane will be offering an AEA Coffee Break Webinar on this topic on May 31st. If you are an AEA member, go here to learn more and register.


I’m Corinne Ranney-Philbrick. This year I have a new daughter, and my available funding for professional development has been cut. So I’m looking online for inexpensive training opportunities, and I thought I would share a few free webinar-based options that I have found:

Rad Resource – Equity-Focused Evaluations Webinar Series: Sponsored by a collaborative group that includes UNICEF, UN Women, the Rockefeller Foundation, Claremont Graduate University, and the International Organization for Cooperation in Evaluation (IOCE), the free Equity-Focused Evaluation series begins on September 6 and continues with one or two live hour-long webinars each month through June of 2012. Upcoming sessions include Human Rights and Gender Equality in Evaluations, Systems Approaches to Address Ethical Issues, Values-Engaged Evaluation, and Culturally Responsive Evaluation, among many others. See the list and sign up for the first few here.

Rad Resource – Evalua|t|e Webinar Series: These webinars are from the Evaluation Resource Center for Advanced Technological Education, operated by The Evaluation Center at Western Michigan University. Aimed specifically at Advanced Technological Education (ATE) National Science Foundation grantees, the series is very focused on this population, yet has some wider application. They have recorded webinars on Strong Evaluation Plans, Assessing Grant Outcomes, and Making Sense of Data. Recordings are available at

Rad Resource – CDC Coffee Breaks Evaluation Mini-Trainings: The recordings of this ongoing series of 20-minute webinars are available to the public. They’ve covered such topics as “What should I evaluate?” “How do I develop a logic model?” and “What counts as evidence?” Although examples are drawn from heart disease and stroke prevention programs and they focus on parts of the CDC Evaluation Framework Model, I have found the underlying content to be useful in contexts inside and outside of public health. See the recordings at

Rad Resource – AEA Coffee Break Webinars: If you are an AEA member, you probably know about this webinar series that is free for AEA members. Three or four times each month, on Thursdays from 2:00 to 2:20 Eastern Time, an AEA member makes a short, useful presentation on a topic related to some aspect of evaluation. The fall schedule is now up, with Coffee Break Webinars on Data Screening, Fuzzy Logic Models, Measuring Institutional Capacity, Social Network Analysis, Video in Evaluation, and more. They are a good way to introduce yourself to a new topic in 20 minutes or less. See

Hot Tip: The Coffee Break Webinars are recorded and the archive is free for AEA members. There are over 50 Coffee Break Webinar recordings available at



I’m Mario Lurig, the Technical Training Manager at SurveyGizmo, an online survey software tool. Translation: I help evaluators become survey rockstars for their clients.

Hot Tips: There are three questions that evaluators need to keep at the forefront of their minds when surveying respondents:

  • Is there an easier way to get my data?
  • How can I communicate the results effectively?
  • Is there an opportunity for further research?

Is there an easier way? Probably. When you first start learning a new skill, your brain thinks in tangents, trying to find various ways of accomplishing the same thing. Imagine building a fort in your home with just the items in the house. Kids thrive at this game because the ‘how’ is not already defined, while adults start thinking about the items that fit a particular picture they have already created. Just because something has always been done a particular way doesn’t mean there isn’t a faster or easier way; be open to the idea of doing it a little differently, especially if it saves you time or energy. I didn’t write this on a typewriter.

How can I get my results across? If you’ve never heard of or read the book Made to Stick by Chip and Dan Heath, I strongly recommend giving it a read. When it is time to present your data, you need to make the overall point a SUCCESs: Simplicity, Unexpectedness, Concreteness, Credibility, Emotions, and Stories. Get your client hooked on your results, then dig into the data. Too often, presenters justify their results before ever presenting the final findings, losing their audience instead of grabbing them from the beginning. Besides, in the end, you want to impress your clients with your work, which leads to the final question.

Is there an opportunity? Data is a funny thing: among all the results you set out to gather, you will sometimes stumble upon something interesting that is most likely outside the scope of your current goals. How is this information typically treated? “That’s interesting; moving on…” Do you hear that? It’s opportunity knocking. Put the data aside, and when the main project is completed, spend a little time seeing if there is an opportunity for further research for the client. An evaluator’s job is not only to gather and make sense of the information, but also to provide some direction for the future. It’s always better to be the driver than to be a passenger who can be removed from the car at any point in time.

Mario will be offering an AEA Coffee Break Demonstration Webinar this coming Thursday, September 16, from 2:00-2:20 PM EST. The webinar is free to AEA Members – learn more and register on the AEA website. SurveyGizmo will be exhibiting at Evaluation 2010 this November in San Antonio. Take a moment to stop by their table if you will be at the conference to see their software in action!


My name is Susan Kistler and I am AEA’s Executive Director. I contribute each Saturday’s post to the aea365 blog. This week, Lois Ritter and Tessa Robinette gave a great (free!) webinar for AEA comparing SurveyMonkey and Zoomerang for potential new users. Building on their presentation, and on my own experience working with both of these programs as well as two others over the past two years, I want to share a tip and a lesson learned about using online survey software.

Hot Tip: Take Advantage of the Free Trial – The majority of survey platforms offer a free trial, usually allowing for only a few questions and respondents. Create and try out a sample survey from beginning to end, including sending invitations, collecting data, and completing the analysis, usually by exporting the data into other software for further analysis. Walk through the entire process with data parallel to what you anticipate for your actual survey (a quick sketch of the export-checking step follows below).
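
As one concrete example of that last step, here is a minimal sketch of sanity-checking an exported trial survey with Python and pandas. The file and column names are assumptions about what your platform exports; adjust them to match your own download.

```python
# Load a hypothetical CSV export from a trial survey and run quick checks
# to confirm the data survived the round trip from platform to analysis tool.
import pandas as pd

df = pd.read_csv("trial_survey_export.csv")  # hypothetical export file

print(len(df), "responses collected")
print(df.isna().mean().round(2))             # share of missing answers per column
if "status" in df.columns:                   # many platforms export a status field
    print(df["status"].value_counts())       # e.g., Complete vs. Partial
```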

Lessons Learned – Permissions: Does your survey platform help you to comply with IRB expectations? With the CAN-SPAM Act? While using the built-in invitation functions of many online survey platforms can help you with sending, tracking, and compliance, it can also distort your sample and limit your access to potentially viable respondents, because opt-out treatment varies from platform to platform. Using the two platforms discussed in this week’s webinar as examples: when survey recipients opt out of a SurveyMonkey survey invitation, by clicking the opt-out button at the bottom, they opt out of ALL surveys sent through SurveyMonkey, not just a particular survey or those from a particular sender. Alternatively, when users opt out of a survey from Zoomerang, the default is that they opt out of surveys only from that particular sender. Notably, though, it is much easier to opt back in to SurveyMonkey than Zoomerang. Lesson learned? Research how your platform treats opt-outs and determine how this is likely to impact your respondent pool. A rough sketch of the difference follows below.
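
To see why opt-out scope matters, here is a back-of-the-envelope sketch. The contact count and opt-out rates are purely illustrative assumptions, not measured platform statistics.

```python
# Compare how many contacts remain reachable under a platform-wide opt-out
# versus a sender-specific opt-out. All numbers are illustrative assumptions.

def reachable(contacts: int, prior_optout_rate: float) -> int:
    """Contacts still eligible to receive a survey invitation."""
    return round(contacts * (1 - prior_optout_rate))

contacts = 5000
platform_wide_rate = 0.08    # assumed: opted out of ANY survey on the platform
sender_specific_rate = 0.02  # assumed: opted out of YOUR surveys only

print("Platform-wide opt-out:", reachable(contacts, platform_wide_rate))     # 4600
print("Sender-specific opt-out:", reachable(contacts, sender_specific_rate)) # 4900
```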

Hot Tip: If you are an AEA member, review the Ritter and Robinette webinar recording in the AEA webinars archive to gain a better understanding of other considerations when choosing a survey platform.

This contribution is from the aea365 Daily Tips blog, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest. The above comments reflect my own opinion and not necessarily that of the American Evaluation Association.


Hello, my name is Scott Cody and I’m Deputy Director of the What Works Clearinghouse (WWC). I’m also the Associate Director of Human Services Research at Mathematica Policy Research. I’d like to share with you an important resource for researchers, educators, and policy makers. Evaluators know how important study design is to the validity of study results. Valid research can educate the public and empower them to make better decisions about everything from healthcare to education.

Resource: The WWC, founded by the U.S. Department of Education’s Institute of Education Sciences, is a central source for comprehensive reviews of education research. Each WWC review takes a thorough look at the research on a particular topic or education program, product, or practice. Our goal is to identify well-designed studies and summarize those studies’ findings for decision-makers. To do this, we measure each study against the WWC research standards. These standards apply to the study methodology, the strength of the study’s data, and the adequacy of the study’s statistical procedures. We then summarize the findings of all studies that meet WWC standards, and develop an overall rating of effectiveness. In this way, WWC reports tell educators what the highest-quality research says about the effectiveness of individual education interventions. The WWC may be accessed online at

Want to learn more about the WWC?  Join Scott for an AEA Coffee Break Webinar, Thursday, May 13, 2:00-2:20 PM EST. He’ll demonstrate how to get the most out of the WWC website and locate important information for decision-making in education. Sign up at

Want to learn more about the WWC’s standards for research? Join Neil Seftor, Deputy Director of the WWC, for an AEA Coffee Break Webinar Thursday, May 20, 2:00-2:20 PM EST. He’ll cover the design and reporting requirements studies must meet to satisfy WWC standards. Sign up at


