AEA365 | A Tip-a-Day by and for Evaluators

TAG | social change

Introduction: I am Jennifer (Jenny) Cohen, a community organizer by training and a program evaluator in practice, working most recently in Israel on projects designed to advance advocacy and policy change. Wearing several different hats, including for the last six years with Sikkuy, I facilitate evaluation capacity-building activities as an organizational management tool. Academic and field research increasingly show that when non-profit organizations systematically evaluate their work, they improve not only their internal communications and planning, but also the outcomes and impact of their social change initiatives.

Lessons Learned: It is increasingly recognized that relatively simple efforts to understand the contribution organizations make to social progress can yield significantly more valuable findings than attempts to attribute change to a specific stakeholder or intervention, especially in complex settings. A compelling example is when Jewish and Palestinian citizens of Israel, working as joint advocates to advance equality and shared society, seek to quantify the specific impacts of their own initiatives, alongside or in isolation from the work of others. Innovative evaluation tools, implemented systematically, demonstrate that persistence pays off, even in the face of seemingly insurmountable barriers, including limited resources and fears of criticism from all sides.

Hot Tips (or Cool Tricks):

To help social change organizations use evaluation as a regular part of their work, I recently developed this “Contribution/Attribution Scale”, which clarifies the connections between different advocacy interventions, within and across projects, organizations, and sectors. It’s a simple tool that can be used proactively for planning and/or retroactively for mapping how different advocacy efforts influence a particular policy change.

  • The (horizontal) X-axis shows the extent to which an intervention is employed cooperatively (towards the left) or individually by one stakeholder (towards the right).
  • The (vertical) Y-axis refers to where an intervention takes place, either in the public eye or behind closed doors (an important consideration in complex policy change work).
  • The colors symbolize an intervention’s understood impact as positive (green), harmful (red) or both positive and negative (yellow).
  • The shapes represent different stakeholders (NGO X is round, Local Municipality Y is square), and the size of each shape indicates the scope of the intervention, relative to others.

Most simply, the tool can be used for self-reporting by team members posting stickers on a board in a planning or mapping/data analysis session. With a bit more technological investment, self-reporting by multiple players can occur simultaneously through front-end data loading into a centralized bank, allowing for challenges to or triangulation of data. Foundations have begun to utilize this tool for mapping their grantee contributions in a targeted policy area.
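For teams that take the technological route, the four dimensions of the scale can be captured as a simple data record, one per sticker. Here is a minimal sketch in Python; the field names, the 0–1 cooperation scale, and the example stakeholders are my own assumptions for illustration, not part of the published tool:

```python
from dataclasses import dataclass

@dataclass
class Intervention:
    """One sticker on the Contribution/Attribution Scale."""
    stakeholder: str    # who acted (the shape on the board)
    cooperation: float  # X-axis: 0.0 = fully cooperative, 1.0 = fully individual
    public: bool        # Y-axis: True = in the public eye, False = behind closed doors
    impact: str         # color: "positive", "negative", or "mixed"
    scope: int          # relative size of the intervention (1 = small, 5 = large)

# Hypothetical stickers from one mapping session
board = [
    Intervention("NGO X", 0.2, True, "positive", 3),
    Intervention("Local Municipality Y", 0.8, False, "mixed", 2),
]

# A simple retrospective query: which cooperative, public interventions
# were understood to have had positive impact?
wins = [i.stakeholder for i in board
        if i.cooperation < 0.5 and i.public and i.impact == "positive"]
print(wins)
```

Once interventions are encoded this way, self-reports from multiple players can be loaded into one shared bank and queried or challenged side by side, which is what makes the triangulation described above practical.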

Rad Resources: Mayne gets an earlier tip of my hat, and Garner & Brindis the latest and no less grateful nod, for their significant contributions to this critical conversation.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi, my name is Bikash Kumar Koirala. I work as a Monitoring and Evaluation Officer at the NGO Equal Access Nepal (EAN), based in Kathmandu, Nepal. I have been practicing monitoring and evaluation for over five years, focused on development communication programs. A research project that EAN collaborated on, Assessing Communication for Social Change (AC4SC), developed a participatory M&E toolkit based on our experiences. One of the modules in this toolkit is the Communication Module, summarized as follows.

As a result of AC4SC, the communication systems in our organization improved considerably and became more participatory. We began to understand that effective communication and continuous feedback are essential to the success of participatory M&E. Communication both inside and outside an organization can be quite challenging, because different people have different perspectives and experiences.

Lessons Learned

Community Involvement: After the AC4SC project, the M&E team's level of engagement with communities increased considerably. Communities' involvement in ongoing participatory research activities, and the critical feedback they provide, have proved very useful to our radio program development and have increased community ownership of our programs. Alongside the work undertaken by the M&E team, this research is conducted by a network of embedded community researchers (CRs). These activities produce research data, which are analyzed and triangulated with other sources of data (such as listeners' letters) to produce more rigorous results.

Internal Communication: The M&E team gives the content teams regular constructive feedback on program impact and improvement, which has increased dialogue and cooperation between M&E and content team members. Before the AC4SC project, content team members didn't usually take M&E findings into account, because they felt they already knew the value of the program content from positive feedback in listener letters. The content teams now recognize the value of M&E and ask for more in-depth data to generalize the feedback they receive. The M&E team addresses this through research and analysis using many different forms of data from varied sources.

Use of New Communication Technology: The M&E team has been analyzing SMS polls, text messages, and letter responses, and triangulating these with the CRs research data and short questionnaire responses to present more rigorous results to program team members, donors and other stakeholders.
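Triangulation of this kind can start very simply: code each piece of feedback by theme, then check which themes recur across independent sources. A minimal sketch, where the source names and themes are invented for illustration rather than drawn from EAN's actual data:

```python
from collections import Counter

# Hypothetical coded feedback from three independent channels
sources = {
    "sms_poll":        ["water", "health", "water"],
    "listener_letter": ["water", "education"],
    "cr_interview":    ["water", "health"],
}

# Count how many distinct sources mention each theme (duplicates
# within a single source are collapsed with set()).
appearances = Counter(theme for themes in sources.values()
                      for theme in set(themes))

# A theme is considered triangulated when it appears in two or more sources
triangulated = sorted(t for t, n in appearances.items() if n >= 2)
print(triangulated)
```

Themes confirmed by multiple independent channels are the ones safest to present to program teams, donors, and other stakeholders as rigorous findings.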

Some Challenges: In participatory M&E it is important to understand the roles of everyone involved in the process. Effectively presenting results for better communication and utilization of M&E findings among different stakeholders is an ongoing challenge, as is finding the time to undertake participatory M&E well.

