Greetings! I’m Natalie Trisilla and I lead the International Republican Institute’s Office of Monitoring, Evaluation, and Learning. We lead dozens of internal evaluations annually. These occur throughout the project lifecycle – from needs assessments and baseline evaluations to mid-point and final evaluations. We always strive to involve our program staff colleagues, who actually implement the projects we evaluate, throughout the evaluation process. For example, their firsthand expertise is key to developing a feasible, culturally sensitive methodology. Further, their role as project designers and implementers makes them ideally suited to utilize the results of the evaluation. Here are two additional ways that we involve program staff in the data analysis and recommendations/conclusions development process. These practices might be useful for you, too!
- Daily After-Action Review: If our program staff colleagues are also in the field for data collection, we have a quick, daily after-action review session. These are informal, often happening during dinner or in the taxi back to the hotel, but have proven useful to elicit rapid, yet systematic, reflection. We use three basic prompts:
- What was the most surprising or unexpected aspect of today?
- What was the most common topic or issue you think you heard/observed today?
- From your perspective, did any other notable things happen today?
This daily debrief allows us to identify gaps or challenges with data collection tools or protocols early and adapt accordingly. It also gives program staff an important opportunity to practice critical thinking and analysis skills, which builds their overall evaluative thinking capacity.
- Data Validation & Analysis Session: After initial data analysis and before we develop recommendations, we convene a 60-minute data validation session for interested program staff. This includes program staff with relevant thematic experience or regional expertise who are not part of the project being evaluated. During this interactive session, we write key evaluation findings on flip chart paper that we post around a conference room. Next, staff walk around the room, read the findings, and note their questions, reflections, and reactions on the corresponding flip chart paper. We spend the remainder of the time exploring their reactions and reflections as a group. This analysis process is valuable to us as evaluators since our program staff colleagues’ insights put our data and resultant findings in the broader sociopolitical context, making them more specific and meaningful. Additionally, the sessions provide us with rich details and examples from other projects to craft more detailed, actionable recommendations. Program staff also appreciate these sessions since they are exposed to learning across projects and feel more invested in evaluations, especially the recommendations.
The American Evaluation Association is celebrating Democracy & Governance TIG Week with our colleagues in the Democracy & Governance Topical Interest Group. The contributions all this week to aea365 come from our DG TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.