Greetings from Toronto! I’m Sandra Nkusi, an Impact Measurement and Evaluation Analyst with the Ontario Trillium Foundation in Ontario, Canada.
Lessons Learned: As the field of evaluation has matured and program evaluations have become more commonplace, a major challenge that I, and I’m sure many of you, have encountered is interpreting results collectively across evaluation reports. Reports tend to be rich with both structured and unstructured data, and the latter is more time-consuming and difficult to analyze. Open-source platforms such as R are making it increasingly easy to navigate unstructured data for meaningful insights and findings.
Hot Tip: While each evaluation report may vary in format and presentation, reports often contain similarly labeled fields. Headings such as “major challenges,” “what worked,” and so on highlight areas containing text that can be analyzed across reports. Using the frequency of terms and phrases, we can surface trends in the common successes and challenges experienced by the evaluated programs. Further, clustering programs according to a “program description” field can refine these findings, allowing a more targeted analysis of the lessons learned through the evaluation.
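To make the term-frequency idea concrete, here is a minimal sketch. The post suggests R for this kind of work; the sketch below shows the same idea in Python using only the standard library, and the sample report excerpts, stopword list, and function names are all illustrative assumptions, not part of any real evaluation dataset.

```python
# Illustrative sketch: tally the most frequent terms in text pulled from a
# similarly labeled section (e.g., "what worked") across several reports.
# The report excerpts and stopword list below are made up for demonstration.
import re
from collections import Counter

# Toy stand-ins for "what worked" text extracted from three reports.
reports = [
    "Strong community partnerships and volunteer training worked well.",
    "Volunteer training and flexible funding worked; partnerships grew.",
    "Flexible funding helped; community partnerships were a challenge.",
]

# A tiny stopword list; a real analysis would use a fuller one.
STOPWORDS = {"and", "the", "a", "were", "was", "well", "in", "of"}

def term_frequencies(texts):
    """Count lowercase word tokens across all texts, dropping stopwords."""
    counts = Counter()
    for text in texts:
        tokens = re.findall(r"[a-z]+", text.lower())
        counts.update(t for t in tokens if t not in STOPWORDS)
    return counts

# The most common terms point to recurring successes or challenges.
print(term_frequencies(reports).most_common(3))
```

The same frequency table, built per report instead of pooled, also gives each program a simple term-count vector, which is the usual starting point for clustering programs by a “program description” field.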
The American Evaluation Association is celebrating International and Cross-Cultural (ICCE) TIG Week with our colleagues in the International and Cross-Cultural Topical Interest Group. The contributions all this week to aea365 come from our ICCE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.