AEA365 | A Tip-a-Day by and for Evaluators

July 28, 2015

Awab on How to Gauge Learning of a Training

My name is Awab, and I work as a Monitoring & Evaluation Specialist with the Higher Education Commission (HEC), Islamabad.

Gauging the learning from a training is always a challenge. We faced this challenge recently when the HEC conducted training for about 1,600 top managers of Pakistani universities. The trainings were delivered through several implementation partners (IPs). We asked the IPs to conduct pre- and post-training tests so that we would know how much the participants learned from the trainings. The IPs conducted the pre- and post-tests, analyzed the data, and reported the difference between the pre-test and post-test scores. Since post-test scores are always higher than pre-test scores (in some of our cases, by more than 100%), the analysis painted a rosy picture of the trainings and everything looked fine (as shown in Figure 1).

Figure 1: Comparison of Pre & Post-tests, shared by one of the IPs.


When the training reports were passed on to the M&E Unit, we rejected this analysis because it did not give us enough information to judge the quality of the training or to plan for the future.

Hot Tips: We started by asking the right questions. We told the IPs that, from the pre- and post-test analyses, we were really interested in the answers to three questions: (i) What was the pre-existing learning level of the participants? (ii) What is the net learning attributable to the training? (iii) What is the learning gap we need to bridge in future trainings?

Cool Tricks: The three questions could be answered by analyzing the pre- and post-test scores in a very simple manner and presenting the data in a stacked bar chart. We developed a model for the analysis and shared it with the IPs. The results were surprisingly interesting. The model gave a clear picture of the pre-existing learning, the net learning, and the learning gap. Thus, we were able not only to credit the IPs for the net learning attributable to them but also to hold them accountable for the learning gap and to plan for future trainings.
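The breakdown behind this model can be sketched in a few lines of code. This is a minimal illustration, not the HEC's actual Excel workbook; the function name and the sample scores are hypothetical, and it assumes scores are reported out of a maximum of 100.

```python
def learning_breakdown(pre, post, max_score=100):
    """Split one participant's result into the three quantities of interest:
    pre-existing learning, net learning attributable to the training,
    and the remaining learning gap to bridge in future trainings."""
    pre_existing = pre               # what the participant already knew
    net_learning = post - pre        # gain attributable to the training
    learning_gap = max_score - post  # what is still left to learn
    return pre_existing, net_learning, learning_gap

# Example: a hypothetical participant scoring 40 before and 70 after
pre_existing, net, gap = learning_breakdown(40, 70)
print(pre_existing, net, gap)  # 40 30 30
```

Because the three parts always sum to the maximum score, plotting them as segments of a single stacked bar (one bar per participant or cohort) reproduces the layout of Figure 2 and makes all three answers visible at a glance.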

Figure 2: Learning-based Model of Pre & Post-tests analysis.


Lessons Learned:

In evaluations, it is always good to ask yourself how you are going to use the data. Asking the right questions is half the solution.

For further details on how to gauge learning from a training, and to download the Excel sheets for data analysis based on this model, please click on the following links:

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


2 comments

  • Heather Moore · July 30, 2015 at 9:51 am

    Great topic and article, but I am disappointed about having to pay $8.99 to access the article. My agency will not allow these subscriptions. While I would like to present the idea to my supervisor, and possibly use the workbook for an upcoming cohort training series, access in this manner is a problem. I feel teased by a good idea.

    Reply

  • Jason Ravitz · July 29, 2015 at 7:16 am

This is a really great illustration of providing more meaningful findings by asking better questions. I also like how you visualize the results so someone can understand what happened without needing statistics (residual gain scores controlling for pre-tests). I think this means that people who liked the first, less useful presentation of the basic data will be able to appreciate this too. Thank you for sharing.

    Reply
