Hi, I’m Barbara Klugman. I offer strategy support and conduct evaluations with social justice funders, NGOs, networks and leadership training institutions in South Africa and internationally. This blog is about the value of establishing an Evaluation Advisory Group when your task requires more skills than you have!
I’m currently the ‘learning and assessment partner’ of Tekano, an organisation which runs the Atlantic Fellows Programme for Health Equity South Africa. I have found this experience challenging, not least because it’s my first experience of long-term developmental evaluation; because I have multiple accountabilities – to the funder, the Tekano board and staff, as well as to the public good, given the country’s desperate need for a strong fellowship programme on the social determinants of health; and because my primary expertise is in strategizing and evaluating social justice advocacy initiatives rather than leadership development. Finally, I am a white older woman playing the role of ‘critical friend’ to an initiative whose staff and fellows are mostly black and young, and whose advocacy experience is taking place in a vastly different historical moment from the one in which I built mine.
With these constraints in mind, when I developed my terms of reference, I motivated a budget to allow me to establish an Advisory Group. I sought members to fill three gaps in my expertise (that I was aware of) – an organisational psychologist whose expertise traverses organisational behaviour and monitoring and evaluation; a fellowship evaluation expert; and an expert in community-building and evaluation in the South African context.
I engaged the group to help me review the vast quantity of data and analysis I had produced, in order to home in on priorities for my final 18 months in this role, and to help me clarify how to manage my layers of accountability. It was fantastic!! The group brought totally independent eyes to bear on my questions. They helped me distinguish the nature and hierarchy of my various accountabilities. They confirmed my own conclusions on achievements and challenges thus far, while hauling me out of the data, insights and relationships I was buried in. This allowed me to home in on powerful questions, and ways of asking them, for the second half of my tenure in this role.
I loved every minute of the experience even while it was exhausting and challenging. I would encourage you to build in space for this kind of independent support if you’re conducting developmental evaluations.
Hot tips: You have to be a fairly confident person to open yourself up to criticism. Choose people who are ethical and don’t have oversized egos!
Lesson Learned:
- Building a formal advisory group into the evaluation budget allows you to identify exactly what skills might complement your own, and to compensate people appropriately for their time.
My huge thanks to Drs. Jane Reisman, Mark Abrahams and Suki Goodman for their efforts on my advisory group.
Hi Erin, thanks for your comments. I couldn’t possibly generalise to other evaluations. There are times when the situation or context and the type of work a group is doing are very familiar, and the evaluation questions are well within your scope of expertise; but there are times when, even though you may be an appropriate evaluator, you feel there are gaps.

I have had the experience of being invited to partner in an evaluation by a highly skilled evaluator – in this case Ricardo Wilson-Grau. He brought expertise in the method but, as a matter of principle, liked to work with a content expert, and the evaluand was a women’s rights network. So he asked them to identify someone with that experience, and they asked me. Ricardo and I then had a series of email conversations (him in Rio, me in Johannesburg) to see whether our philosophies on life and on evaluation were in sync, and then we did the evaluation together, which took us down a whole new road of mutual support. So this is another possible route.

I think your question raises another interesting issue, which is whether having strong training or experience in evaluation principles and methods positions an evaluator well to evaluate anything. In my case I have chosen to work in a very narrow field, evaluating advocacy, where I have deep expertise in both the theory and the practice of being an advocate and of funding advocacy, and from there I developed my thinking and practice on evaluating advocacy. I’m now working to build expertise in evaluating training that has social justice objectives. For me this works. I would not take on an evaluation that was entirely outside of this terrain. On the other hand, I’ve found that working on my particular area of expertise (sexual and reproductive justice) is a bigger challenge for me than working on other issues, because I have to be continually aware that I have my own strong views which I cannot inject into an evaluation; so working on other issues is easier!! But much of the current discussion about evaluation expertise focuses on building the requisite evaluation skills, and not as much on whether and how subject expertise, or expertise in evaluating a particular type of intervention, matters. Yours, Barbara
Hi Barbara,
I found your article inspiring and interesting on many levels. First of all, what an amazing experience to be in South Africa working on this type of project – good for you! Reading how you have established your evaluation ‘team’, complete with individuals who, as you put it, can ‘fill the gaps in areas of expertise’ where you feel it necessary, certainly speaks to your capability as an evaluator. I am currently enrolled in a graduate program at Queen’s University in Ontario, Canada, where the discussion of required skill sets for evaluators came up. One of the ‘dilemmas’ facing professional evaluators is that they are capable, competent and qualified to gather, analyze and report data, but may lack the skills and/or knowledge base to make recommendations for improvements or changes to programs. It’s great to see the way you have worked to combat this with the establishment of your advisory team. Do you think that establishing such teams on a regular basis when engaging in evaluative processes could be beneficial to the quality of the evaluations conducted?