
Strategies for a High Social Network Analysis Survey Response Rate by Nubia Goodwin and Carolyn Fisher

Hello, AEA365 community! Liz DiLuzio here, Lead Curator of the blog. This week is Individuals Week, which means we take a break from our themed weeks and spotlight the Hot Tips, Cool Tricks, Rad Resources and Lessons Learned from any evaluator interested in sharing. Would you like to contribute to future individuals weeks? Email me at AEA365@eval.org with an idea or a draft and we will make it happen.


We are Nubia Goodwin (Research Associate) and Carolyn Fisher (Research and Evaluation Scientist) of the Institute for Community Health. We are both members of a project team that conducted a social network analysis as part of our evaluation of a multi-year, national funding project focused on community power building.

As part of a larger mixed-methods evaluation, our team conducted a social network analysis (SNA) to analyze the impact of the Robert Wood Johnson Foundation’s Voices for Health Justice (VHJ) funding program on networks of organizing and policy advocacy organizations. Using a closed-network longitudinal SNA, we were able to visually describe the overall relational network structure and to test hypotheses about how the power of individual organizations and the strength of relationships changed over time. One unexpected takeaway of this analysis involves our survey methodology: strategic, thoughtful survey planning helped us achieve response rates of 84% (first administration) and 86% (second administration). Here’s how we did it:

Hot Tips

Conduct a closed network SNA if possible. SNAs often use snowball sampling, where the focus is on identifying relationships as they emerge through data collection. A closed network analysis differs from traditional snowball sampling in that the “nodes,” or organizations being surveyed, belong to a predefined network. In our case, we defined our network as the 93 organizations that were partners in the grant program. The closed network design was ideal for our purposes: it helped us answer our research question about the strength of relationships within our funded network of grantees and helped us achieve a high response rate. At the same time, we were able to learn about other organizations that were important in the grantees’ networks by inviting respondents to list their other partners and describe those relationships. We did not, however, take the next step of reaching out to survey those outside partners.
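To make the closed-network idea concrete, here is a minimal sketch in Python using the networkx library. The roster, organization names, tie strengths, and the use of degree centrality are hypothetical illustrations under our assumptions, not our actual instrument or analysis code.

```python
# Minimal sketch of a closed-network SNA over a predefined roster.
# Organization names, ties, and strength values are hypothetical.
import networkx as nx

roster = ["Org A", "Org B", "Org C", "Org D"]  # stand-in for the 93 grantee partners

# Hypothetical survey responses: respondent -> partners they report working with.
reported = {
    "Org A": {"Org B": 2, "Statewide Coalition X": 1},
    "Org B": {"Org A": 3, "Org C": 2},
    "Org C": {"Org B": 2, "Local Funder Y": 1},
}

graph = nx.Graph()
graph.add_nodes_from(roster)   # every roster member is a node, even if isolated
outside_partners = set()       # partners named outside the closed network

for respondent, partners in reported.items():
    for partner, strength in partners.items():
        if partner in roster:
            graph.add_edge(respondent, partner, weight=strength)  # analyzed ties
        else:
            outside_partners.add(partner)  # described, but not surveyed in turn

# One way to gauge the "power" of individual organizations: degree centrality.
print(nx.degree_centrality(graph))
print("Named but not surveyed:", outside_partners)
```

Restricting edges to the predefined roster keeps the analysis within the closed network, while partners named outside it are simply recorded for description rather than surveyed in turn, mirroring the approach described above.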

Engage stakeholders to create a relevant and non-burdensome survey. The survey was co-developed, using questions adapted from the Wilder Collaboration Factors Inventory, and piloted with an evaluation advisory committee (EAC) composed of paid volunteer staff from grantee organizations. This committee provided insight into which questions felt most relevant and meaningful, and which data would be most useful, from the point of view of the community members and organizations most affected by the project and the evaluation. This engagement with the EAC helped us keep our “asks” purposeful and minimal.

Be strategic with survey design. Our data collection was conducted at two timepoints and included two complementary surveys. In the first survey, we asked about two different timepoints (“before the grant started” and “now”), using retrospective pre/post questions. In the second survey, we used a method called proactive dependent questioning, in which respondents are reminded of the answer they gave in the previous survey before being asked about their current status. This kept the survey brief, reproduced the “comparison” mindset of the first survey, and reduced survey fatigue and recall bias among grantees.
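Below is a minimal sketch, in Python, of what proactive dependent questioning can look like in practice: each second-wave item restates the respondent’s first-wave answer before asking about the current relationship. The relationship labels, field names, and wording are hypothetical, not the wording of our actual instrument.

```python
# Minimal sketch of proactive dependent questioning for a second survey wave.
# Relationship labels and prior responses below are hypothetical.
STRENGTH_LABELS = {
    1: "aware of each other",
    2: "cooperating",
    3: "coordinating",
    4: "collaborating closely",
}

def dependent_question(respondent: str, partner: str, prior_strength: int) -> str:
    """Build a wave-2 item that reminds the respondent of their wave-1 answer."""
    prior = STRENGTH_LABELS[prior_strength]
    return (
        f"In the first survey, {respondent} described its relationship with "
        f'{partner} as "{prior}". How would you describe that relationship now?'
    )

# Prefill personalized prompts from wave-1 responses before fielding wave 2.
wave1_responses = {("Org A", "Org B"): 2, ("Org A", "Org C"): 4}
for (respondent, partner), strength in wave1_responses.items():
    print(dependent_question(respondent, partner, strength))
```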

Rad Resources


Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.
