My name is Jonathan Margolin, and I am a senior researcher in the Education Program at American Institutes for Research, where I work primarily in the State and Local Evaluation Center. A common challenge in evaluating the implementation of educational programs is understanding how the program is interpreted and adapted by teachers and schools. This is particularly difficult when a program is implemented in dozens of sites across the country, where it is often not feasible to conduct in-depth case studies or collect other implementation data. One low-cost, highly efficient approach to capturing implementation data is to provide teachers with online logs on which to record classroom activities. We used this approach in our recent evaluation of The CryptoClub, an informal program involving cryptography and mathematics (more information about the program is available here).
Lessons Learned:
- Make the online log as simple and easy to use as possible. The activity leaders in the CryptoClub program were typically operating in an afterschool setting with little time to fill out a lengthy log. To keep the log short, identify the key elements of the program to track; for CryptoClub, these were the cryptography and mathematics topics addressed during each session. In choosing these elements, it helps to think forward to the evaluation report itself and ask whether each item on the form would add appreciably to the report’s usefulness.
- Track response rates and follow up with teachers. In an earlier phase of our evaluation, we used paper logs. By the time we collected them, we discovered that many, if not most, of the activity leaders had not completed their logs regularly. By putting the logs online, we were able to monitor each activity leader’s completion rate and follow up with non-responders via email with a polite reminder.
- Speaking of response rates, we realized it was important to communicate our expectations clearly to activity leaders. We told them that they would need to complete logs for at least 80 percent of their sessions for us to have an accurate picture of their experiences, and we used data only from the activity leaders who met this threshold (a sketch of this check appears after this list).
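To make the 80 percent threshold concrete, here is a minimal sketch of the kind of completion-rate check we have in mind, assuming the online log exports one row per completed session log with a leader identifier. The file name, column name, and expected-session count are hypothetical placeholders, not features of the CryptoClub system itself.

```python
# Minimal sketch: flag activity leaders whose log completion rate falls
# below the 80 percent threshold. "log_export.csv", the "leader_id" column,
# and EXPECTED_SESSIONS are hypothetical placeholders for your own export.
import csv
from collections import Counter

EXPECTED_SESSIONS = 20   # hypothetical: planned sessions per program site
THRESHOLD = 0.80         # minimum share of sessions that must be logged

logged = Counter()
with open("log_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        logged[row["leader_id"]] += 1   # one row per submitted session log

for leader, count in sorted(logged.items()):
    rate = count / EXPECTED_SESSIONS
    status = "OK" if rate >= THRESHOLD else "FOLLOW UP"
    print(f"{leader}: {count}/{EXPECTED_SESSIONS} logs ({rate:.0%}) {status}")
```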
Hot tips: For our log, we used a form in Microsoft SharePoint, but there are many accessible approaches to creating these logs. For example, SurveyMonkey and similar services can be set to “kiosk” mode, which allows respondents to complete the survey multiple times. By asking respondents to indicate the date and their program location on each entry, it is easy to generate compliance reports.
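As one illustration of such a compliance report, here is a short sketch that tallies submitted logs by program location and month, assuming a kiosk-mode export with one row per submitted log. The file and column names (survey_export.csv, session_date, program_location) are hypothetical, not specific to any one survey service.

```python
# Minimal sketch: build a compliance report (logs submitted per program
# location per month) from a kiosk-mode survey export. File and column
# names are hypothetical placeholders.
import pandas as pd

logs = pd.read_csv("survey_export.csv", parse_dates=["session_date"])
report = (
    logs.groupby(["program_location", logs["session_date"].dt.to_period("M")])
        .size()                  # count of submitted logs per group
        .unstack(fill_value=0)   # locations as rows, months as columns
)
print(report)
```

A table like this makes gaps visible at a glance, so reminder emails can be targeted at the specific sites and weeks where logging has lapsed.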
Rad Resource: See our handout from Evaluation 2013 for a step-by-step approach to developing logs; it includes a picture of the online log and a data summary table.
The American Evaluation Association is celebrating Chicagoland Evaluation Association (CEA) Affiliate Week. The contributions all this week to aea365 come from CEA members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.