Welcome to Internal Evaluation Week, hosted by the Internal Evaluation TIG! This week’s blogs focus on the theme of improvement: as internal evaluators, how are we ensuring that insights are being used to inform program and organizational improvement? Topics include improvement science, communities of practice, and participatory data interpretation.
Hi, my name is Ali Holstein and I’m a consultant and data translator working with youth programs. In my previous role as a program analyst at a non-profit, I would often bring people together to discuss our outcome data, reflect, and set goals for improvement. On paper, it sounds like a solid performance management plan. In reality, those meetings could fall flat. We once asked staff how they felt about data, and their responses included: “anxious,” “stressed,” and “annoyed.” This year, we’re trying a new approach, piloting a continuous improvement fellowship as a different way of learning.
In our Good Shepherd Services Improves Fellowship, we invited staff from Good Shepherd’s school programs to examine a shared problem they had identified as ongoing, hard, and frustrating: chronic absenteeism. Staff are pressed for time, but we still had five programs apply to participate in the 6-month fellowship. They were motivated to find better approaches to solving a problem that many had been dealing with their entire careers, and hoped to do so by sharing ideas among a network of peers and learning to solve problems in a disciplined way.
Lessons Learned:
The fellowship has led us to examine how Good Shepherd uses data to think about improvement:
Old Way: If we bring people together to look at data, it will spark change.
New Way: Invite staff to use improvement science methods to look at a specific problem. Spend time understanding the problem and researching solutions before creating any change ideas. Going slow is okay.
Old Way: If we present the data in different ways, it will lead to insights.
New Way: There are endless ways to use data to understand a problem. Start with insights and follow with the data to drill deeper. Staff already have instincts about what’s working, what hasn’t worked, and what might be worth trying.
Old Way: Staff are too busy for data stuff, so we should do the analysis and present our findings to them.
New Way: Invest time in building the data skills of your staff. They will feel more empowered to ask key improvement questions: What is the problem we’re trying to solve? What change should we introduce? How will we know if the change is an improvement?
Rad Resources:
- Don’t leave participants/constituents out of your improvement process. As homework, our fellows were asked to do empathy interviews with students to learn more about chronic absenteeism.
- While specific to the school context, Learning to Improve: How America’s Schools Can Get Better at Getting Better by Bryk, Gomez, Grunow, and LeMahieu is a great resource for anyone looking to learn about improvement science.
The American Evaluation Association is celebrating Internal Evaluation (IE) Topical Interest Group Week. The contributions all this week to aea365 come from our IE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Hello Ali,
I think it is important to take the time to do something properly, especially when it comes to organizing a program.
Hi Ali,
Just wanted to say thank you for the insightful post! What I found interesting was the connection between evaluation and your staff’s reactions: they often felt “anxious,” “stressed,” and “annoyed.” All in all, it feels like a performance management plan, as you mentioned! I think about my own experience with data collection in my school and the evaluative process that follows, and you hit the nail on the head! It does feel like every little thing is being scrutinized and analyzed. I can see how your shift in evaluation approach could greatly impact staff engagement; you’re invested in supporting your staff rather than just throwing a bunch of data at them and telling them what to do with it.
My question for you is: how did staff react to the shift from the “old way” of thinking to the “new way”? Did you see a shift in engagement and in how staff used the data that was collected? How did you begin the process of changing the conversation with your staff?
Thanks very much for sharing your blog,
Rachelle
Hi Rachelle,
Thanks for your message and questions! To answer your question, I started changing the conversation by listening and noticing how staff felt about the current “data meetings.” I taught a graduate class recently and realized I was terrified of the student evaluations I would receive. It’s a very vulnerable position, and it’s natural to be fearful. So building trust is key: being sensitive to which data and which meetings feel safe and which don’t. I acknowledge all of that with staff, as well as the limitations of “accountability” data that tells you how well you did but not how to do it better. It often creates more questions than answers. When staff hear me echo their concerns, their walls go down a bit.
For some, there was definitely a shift in thinking about data as a tool. What has been harder (and what I’m focusing on more now) is (1) building the DIY skills of developing and monitoring measures of success and (2) creating structured meetings or spaces to slow down (a luxury in many school settings) and plan/reflect.
Hope this is helpful!
Ali
Hi Ali, just a quick note to say that I appreciated reading your blog. The approach you described for engaging more productively with staff and other stakeholders around data, and how it might be useful, definitely fits with my experience of good practice.
One related question for you: When you bring folks together to talk about data, do you provide any thought questions or other prep to help them do some thinking ahead of time and show up more ready for the conversation?
Cheers, Josh
Thanks for the note, Josh. Good question. I guess it depends on the meeting and the audience. I always send an agenda ahead of time, which typically includes a goal. And if the data is not too onerous, I might include that too. For the fellowship I described, I did ask folks to do some homework. For one meeting it was finding a “positive variant” – a student who showed great gains in attendance since last year – and trying to figure out why. I also try to start meetings with a warm-up exercise that connects people to why we’re even looking at data (our mission, goals, etc.).