I’m Jewlya Lynn, CEO at Spark Policy Institute, where we combine policy work with real-time evaluations of advocacy, field building, collective impact, and systems building to achieve sustainable, meaningful change.
While advocacy evaluation as a field has developed tools and resources that are practical and appropriate for advocacy, it has done little to figure out the messy issue of evaluating actual changes in public will.
Most advocacy evaluation tools are too focused on the advocates and champions to learn about the impact on the public. Polling is one approach, but if you’re on the ground mobilizing volunteers to change the way the public is thinking about an issue, public polls are too far removed from the immediate impact of your work. So what do you evaluate?
Cool Trick: When evaluating a campaign to build public will for access to healthcare, polling results provided us with context on the issue, but didn’t help us understand the impact on the general public. Evaluating the immediate outcome of a strategy (e.g., how forum participants responded to the event) had value, but also didn’t tell us enough about the overall impact of the work on public will.
We decided to try a new approach, designing a “stakeholder fieldwork” technique that was a hybrid of polling and more traditional interviews and surveys:
- Similar to polling, the interviews took only 15 minutes, were conducted by phone, and were unscheduled and unexpected.
- Unlike typical polling, participants were identified by sampling the phone numbers of the actual audience members of the various grantee activities. Participants were called by researchers with community mobilizing experience, and the questions were open-ended, exploring audience members’ experiences with the activity they had been exposed to and how they engaged with other parts of the strategy. We also asked for the names and contact information of people they talked to about their experience, allowing us to call the people who represented the “ripple effect.”
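For evaluators who want to operationalize the “ripple effect” calling described above, the logic is essentially a breadth-first referral (snowball) sample: interview the original audience members, collect the names they mention, and add any new names to the calling queue. A minimal sketch in Python follows; the `interview_fn` callable and the `referrals` field are hypothetical stand-ins for however your team records each call, not part of the original method description.

```python
from collections import deque

def ripple_interviews(seed_contacts, interview_fn, max_calls=100):
    """Breadth-first 'ripple effect' calling.

    Starts from a sample of actual audience members (seed_contacts),
    interviews each one, and queues up anyone they name as having
    talked with them about the activity. interview_fn is a hypothetical
    callable that conducts one phone interview and returns a dict,
    including a 'referrals' list of new contacts mentioned.
    """
    queue = deque(seed_contacts)
    interviewed = set(seed_contacts)
    records = []
    while queue and len(records) < max_calls:
        contact = queue.popleft()
        record = interview_fn(contact)  # one open-ended phone interview
        records.append(record)
        for referral in record.get("referrals", []):
            if referral not in interviewed:  # avoid re-calling anyone
                interviewed.add(referral)
                queue.append(referral)
    return records
```

The `max_calls` cap keeps the fieldwork budget bounded, since referral chains can grow quickly once second- and third-degree contacts start naming people of their own.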
The outcome? We learned how more than 100 audience members benefited from multiple types of engagement, and we learned about the impact of the “ripple effect,” including the echo chamber that existed among audiences of the overall strategy.
Hot (Cheap) Tip: Polling companies use online software to manage high-volume outbound calling and to capture the data. Don’t have the money to purchase this type of capacity? We adapted a typical online survey program into our very own polling software!
Rad Resource: The Building Public Will 5-Phase Communication Approach from The Metropolitan Group is a great resource to guide your evaluation design and give you language to help communicate your results.
The American Evaluation Association is celebrating APC TIG Week with our colleagues in the Advocacy and Policy Change Topical Interest Group. The contributions all this week to aea365 come from our APC TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.