AEA365 | A Tip-a-Day by and for Evaluators

Category: Social Network Analysis

My name is Rebecca Woodland and I have had the pleasure of working to evaluate and cultivate organizational collaboration in a range of contexts and for many different purposes. In this post I’ll share tips that evaluators can use in the developmental, formative, and summative evaluation of inter-organizational and inter-personal collaboration. I’m excited to be delivering an AEA 2015 pre-conference workshop that goes into detail about these hot tips – maybe I’ll see you there!

Hot Tip #1 – Make collaboration less messy. Though ubiquitous, “collaboration” persists as an under-empiricized concept. One of the first things that evaluators looking to assess collaboration will need to do is operationalize the construct. Familiarize yourself with collaboration theory, and find specific suggestions for facilitating a shared understanding of collaboration in Utilizing Collaboration Theory to Evaluate Strategic Alliances and the Collaboration Evaluation and Improvement Framework.

Hot Tip #2 – More collaboration is not always better. Levels of integration between organizations matter, but the scope and scale of integration should match the purpose and goals of the alliance.

  • The Levels of Organizational Integration Rubric (LOIR) describes five possible levels of inter-organizational integration and the purposes, strategies/tasks, leadership/decision-making, and communication characteristics that tend to be present in each. Use the LOIR to measure and cultivate ideal levels of inter-organizational collaboration.

Hot Tip #3 – Avoid “co-blaboration.” The evaluation of inter-personal collaboration can help organizational stakeholders avoid “collaboration lite,” whereby mere congeniality and imprecise conversation are confused with the type of disciplined inquiry vital to the diffusion of innovation and attainment of essential outcomes.

  • The Team Collaboration Assessment Rubric (TCAR) describes four fundamental elements of high-quality inter-personal collaboration: dialogue, decision-making, action, and evaluation. Evaluators are encouraged to adapt and administer the TCAR in ways that are most feasible, useful, and appropriate for the context of their program evaluation.

Hot Tip #4 – Use Social Network Analysis (SNA) methods (if you don’t already). SNA is a sophisticated, yet accessible, means for assessing organizational collaboration. Evaluators can use SNA to mathematically describe and visually see how “ties” between organizations or people form, and how these “links” may influence program implementation and the attainment of desired outcomes.

Rad Resource:

Coalitions that Work® offers excellent tools for evaluating coalitions and partnerships, available in PDF format.

Want to learn more? Register for Evaluating and Improving Organizational Collaboration at Evaluation 2015 in Chicago, IL!

This week, we’re featuring posts by people who will be presenting Professional Development workshops at Evaluation 2015 in Chicago, IL. Click here for a complete listing of Professional Development workshops offered at Evaluation 2015. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Sophia Guevara, Program Co-Chair for the Social Network Analysis (SNA) TIG. This week, several evaluation professionals have shared their thoughts on social network analysis with this blog’s readers. With posts ranging from logic models to examples of applying social network analysis to a wide range of evaluation questions, you’ve hopefully gained a better understanding of the method.

Rad Resource: The SNA in Evaluation LinkedIn group. This group gives TIG members an opportunity to discuss topics of interest with others using or learning about social network analysis.

Rad Resource: Join the SNA TIG. As a member, be sure to take advantage of the eGroup discussion option.

Rad Resource: SNA TIG business meeting. If you are thinking of joining the TIG or have already joined and are looking to connect with other evaluation professionals making use of SNA, the business meeting is an excellent place to do just that. The SNA TIG business meeting is held at the annual American Evaluation Association conference.

Rad Resource: AEA public eLibrary and the Coffee Break Archive. There are a variety of resources that can help you learn more about the topic. For example, if you are looking to learn more about the use of SNA-related programs, check out Dr. Geletta’s coffee break webinar focused on importing spreadsheet data into Gephi.
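If you want to experiment before (or after) watching the webinar, here is a minimal sketch of how spreadsheet-style relational data is commonly shaped for Gephi’s spreadsheet import, which reads edge lists with Source, Target, and (optionally) Weight columns. The survey column names below are hypothetical, not from the webinar.

    # Minimal sketch: reshape a hypothetical survey export into the edge-list
    # format Gephi's spreadsheet import expects (Source, Target, Weight).
    import pandas as pd

    survey = pd.DataFrame({
        "respondent": ["Org A", "Org A", "Org B"],
        "works_with": ["Org B", "Org C", "Org C"],
        "contacts_per_month": [4, 1, 2],
    })

    edges = survey.rename(columns={
        "respondent": "Source",
        "works_with": "Target",
        "contacts_per_month": "Weight",
    })
    edges.to_csv("edges.csv", index=False)  # then import the CSV with Gephi's spreadsheet importer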

The American Evaluation Association is celebrating Social Network Analysis Week with our colleagues in the Social Network Analysis Topical Interest Group. The contributions all this week to aea365 come from our SNA TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 



Hello from Andres Lazaro Lopez and Mari Kemis from the Research Institute for Studies in Education at Iowa State University. As STEM education becomes more of a national priority, state governments and education professionals are increasingly collaborating with nonprofits and businesses to implement statewide STEM initiatives. Supported by National Science Foundation funding, we have been tasked with conducting a process evaluation of the Iowa statewide STEM initiative, both to assess Iowa’s initiative and to create a logic model that will help inform model STEM evaluation in other states.

While social network analysis (SNA) has become commonly used to examine STEM challenges and strategies for advancement (particularly for women faculty, racial minorities, young girls, and STEM teacher turnover), to our knowledge we are the first to use SNA specifically to understand a statewide STEM initiative’s collaboration, growth, potential, and bias. Our evaluation focuses specifically on the state’s six regional STEM networks, their growth and density over the initiative’s years (’07–’15), and the professional affiliations of their collaborators. How we translated that into actionable decision points for key stakeholders is the focus of this post.

Lessons Learned: With interest in both the boundaries of the statewide network and the ego networks of key STEM players, we decided to use both free and fixed recall approaches. Using data from an extensive document analysis, we identified 391 STEM professionals for our roster approach and asked respondents to categorize this list by the people they knew and worked with. Next, the free recall section allowed respondents to list the professionals they rely on most to accomplish their STEM work and their level of weekly communication, generating 483 additional names not identified through the roster approach. Together, the two strategies allowed us to measure potential and actual collaboration across both the known network of STEM professionals (roster) and individuals’ local networks (free recall).
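As a rough illustration of how the two name generators can feed a single network for analysis, here is a minimal sketch in Python with networkx; the actors and ties are invented, not our Iowa data.

    # Minimal sketch (invented data): combine roster-based and free-recall
    # nominations into one network, tagging each tie by how it was collected.
    import networkx as nx

    roster_ties = [("Alice", "Bob"), ("Alice", "Carla")]       # ties checked off the roster
    free_recall_ties = [("Alice", "Dan"), ("Bob", "Evelyn")]   # names respondents added themselves

    G = nx.Graph()
    G.add_edges_from(roster_ties, method="roster")
    G.add_edges_from(free_recall_ties, method="free_recall")

    print(G.number_of_nodes(), "actors;", G.number_of_edges(), "ties")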


Lessons Learned: The data offered compelling information for both regional and statewide use. Centrality measurements helped identify regional players who held important network positions but were underutilized. Network diameter and clique score measurements informed the executive council about overall network health and the specific areas that require initiative resources.
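For readers who have not computed these measures before, here is a generic sketch on a toy network (not the Iowa data) using networkx; note that diameter is only defined on a connected component.

    # Generic sketch (toy data): the kinds of measures described above.
    import networkx as nx

    G = nx.Graph([("A", "B"), ("B", "C"), ("C", "D"), ("B", "D"), ("D", "E")])

    betweenness = nx.betweenness_centrality(G)        # who bridges parts of the network
    largest_cc = G.subgraph(max(nx.connected_components(G), key=len))
    diameter = nx.diameter(largest_cc)                # longest shortest path in the connected part
    cliques = list(nx.find_cliques(G))                # maximal cliques

    print(betweenness, diameter, cliques)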

Lessons Learned: Most importantly, the SNA data allowed the initiative to see beyond the usual go-to stakeholders. With a variety of SNA measurements and our three variables, we have been able to identify a diverse list of stakeholders while offering suggestions for how to trim the networks’ size without creating single points of fracture. SNA has been an invaluable tool for formally classifying and evaluating the logistics of key STEM players. We recommend that other STEM initiatives interested in using SNA begin identifying a roster of collaborators early in the development of their initiative.


 

Hi, I’m Rebecca Woodland, an Associate Professor of Educational Leadership at UMass Amherst. If there is one thing that I know for certain it’s that relationships matter and how we are connected influences the quality and outcomes of our shared endeavors. Social Network Analysis (SNA) has had a profound influence on my evaluation work. I want to introduce and encourage evaluators (who may not know much about SNA) to consider integrating it into their own practice.

Simply put, SNA is all about telling the story of how “ties” between people or groups form, and how these “links” may influence important program objectives and outcomes. With SNA you can mathematically describe and visually see connections between people. You can use SNA to explain and predict how ties between “actors” influence the attainment of program goals.

Hot Tips: Evaluators can use SNA to address a wide range of pressing program evaluation questions, such as these:

  1. Want to know whether a program has the capacity to spread a new or novel intervention? SNA was used to evaluate school-level capacity to support or constrain instructional innovation.
  2. Want to know how large, inter-agency partnerships develop and how inter-agency collaboration correlates with intended program outcomes? Evaluators used SNA to track the development and impact of a Safe Schools/Healthy Students inter-agency community mental health network.
  3. Want to know who influences the budgeting and disbursement of funds for advocacy programs in fragile environments? SNA was used to map the flow of resources and funding patterns for newborn survival activities in northern Nigeria.

Lesson Learned: Possibly the biggest wow factor is that SNA enables the creation of illustrative visuals that display complex information, such as intra-organizational communication flow and the location of network “brokers,” “hubs,” “isolates” and “cliques”, in user-friendly ways.

[Image: network sociogram, used under a Creative Commons Attribution 3.0 License]
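As a quick, generic illustration (not tied to any particular evaluation), a few lines of Python with networkx and matplotlib can already draw a small sociogram and flag isolates and likely brokers; the names below are made up.

    # Generic sketch: draw a small sociogram and flag isolates and likely brokers.
    import networkx as nx
    import matplotlib.pyplot as plt

    G = nx.Graph([("Ana", "Ben"), ("Ben", "Cai"), ("Cai", "Ana"), ("Cai", "Dee")])
    G.add_node("Edo")  # an isolate: connected to no one

    isolates = list(nx.isolates(G))
    brokers = nx.betweenness_centrality(G)  # high values suggest brokerage positions

    nx.draw_networkx(G, node_color=["tomato" if n in isolates else "skyblue" for n in G])
    plt.axis("off")
    plt.show()
    print("Isolates:", isolates, "| Betweenness:", brokers)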

Rad Resources

  • Visualyzer® is an easy-to-use program (with a 30-day free trial) that enables you to create sociograms of any network of interest to you.


Hi, I’m John Burrett, of Haiku Analytics Inc., Ottawa. One serious problem with logic models is that they usually leave out external influences and feedback effects, even when these may be important, because including them would make the model “too complex”. It is good to simplify, but ignoring important influences on program success when planning an evaluation may lead evaluators to fail to collect important data and to misinterpret results.

Trying to embrace complexity by simply drawing a web of boxes and arrows is not helpful either: it is too complex to use and explain and will drive your audience away. Such a web will also probably come only from the mind of the evaluator or program manager, and so can easily miss important external influences and other complexities.

Hot Tip: I recently stumbled onto an alternative approach during a mapping of cause-and-effect factors related to a complex policy problem. Data were obtained from an expert panel, which developed a matrix linking a number of factors with estimates of the strength and direction of the relationships between them. Mapping this matrix with network analysis software helped the panel visualize what they had created.

It followed that this form of data could generate outcome chains and logic models. Here’s a simple example: a program supporting trades training by providing grants to students and developing state-of-the-art teaching materials in collaboration with trade schools drives the immediate outcomes of…

  • Students gaining the ability to take training and
  • Currency and quality of the training being improved, in order to achieve
  • The ultimate outcome of increased employment.

Exogenous effects influencing these results include cost of living, demand for skills and technical changes affecting the training’s currency. The size of the nodes indicates betweenness centrality, identifying those factors that connect many influences, thus propagating certain effects. The width of the edges indicates the hypothesized strength of influence. Possible unintended effects and a feedback loop are also shown.

[Image: network-style logic model for the trades-training example]
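To make the mechanics concrete, here is a minimal sketch (with invented factors and weights, not the panel’s actual data) of turning a cause-effect matrix into a weighted directed graph whose node sizes reflect betweenness centrality and whose edge widths reflect hypothesized strength of influence.

    # Minimal sketch (invented factors/weights): draw an expert panel's
    # cause-effect estimates as a weighted directed graph.
    import networkx as nx
    import matplotlib.pyplot as plt

    influence = [  # (cause, effect, hypothesized strength)
        ("Grants", "Ability to take training", 0.8),
        ("Teaching materials", "Training quality", 0.7),
        ("Ability to take training", "Employment", 0.6),
        ("Training quality", "Employment", 0.5),
        ("Cost of living", "Ability to take training", -0.4),  # exogenous influence
    ]

    G = nx.DiGraph()
    G.add_weighted_edges_from(influence)

    bc = nx.betweenness_centrality(G)                          # node size ~ betweenness
    sizes = [300 + 3000 * bc[n] for n in G]
    widths = [abs(G[u][v]["weight"]) * 4 for u, v in G.edges]  # edge width ~ strength

    nx.draw_networkx(G, node_size=sizes, width=widths, font_size=8)
    plt.axis("off")
    plt.show()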

Lesson Learned: A key advantage of this approach is that it creates a logic model using expert knowledge, rather than simply an evaluator’s or manager’s understanding of a program. It could also draw on other sources of information, such as findings from the literature and program stakeholders’ experiences. Importantly, you could do this without imposing any prior idea of the logic model on those providing the cause-effect data, other than including the program/outputs/activities and specifying the immediate/intermediate and ultimate intended outcomes.

A second major advantage is that the logic model is built from network metrics generated from the data, so the expected relationships among the program and its influences can themselves be analyzed. For instance, factors thought to play an important role in propagating effects across the system would show high betweenness or eigenvector centralities.


I’m Bethany Laursen, an independent consultant and evaluation specialist for several units at the University of Wisconsin. I fell in love with social network analysis (SNA) as a graduate student because SNA gave me words and pictures to describe how I think. Many evaluators can relate to my story, but one of the challenges to using SNA in evaluation is identifying what counts as a “good” network structure.

Hot Tip: Identify keywords in your evaluation question(s) that tell you what counts as a “good” outcome or goal for your evaluand. An example evaluation question might be: “How can we improve the collaboration capacity of our coalition?” The stated goal is “collaboration capacity.”

Hot Tip: Use published literature to specify which network structure(s) promote that goal. Social networks are complex, and rigorous research is required to understand which functions emerge from different structures; it would be unethical for an evaluator simply to guess. Fortunately, a lot has been published in journals and in gray and white papers.

Continuing our example, we need to research what kinds of networks foster “collaboration capacity” so we can compare our coalition’s network to this standard and find areas for improvement. You may find a robust definition of “collaboration capacity” in the literature, but if you don’t, you will have to specify what “collaboration capacity” looks like in your coalition. Perhaps you settle on “timely exchange of resources.” Now, what does the literature say about which kinds of networks promote “timely exchange of resources”? Although many social network theories cut across subject domains, it’s best to start with your subject domain (e.g. coalitions) to help ensure assumptions and definitions mesh with your evaluand. Review papers are an excellent resource.
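Once the literature gives you a target structure, the comparison itself can be straightforward. Here is a generic sketch on toy data; the benchmark values are purely illustrative placeholders, not published standards.

    # Generic sketch (toy data; benchmark values are illustrative only).
    import networkx as nx

    G = nx.DiGraph([("CBO", "Funder"), ("Funder", "CBO"), ("CBO", "Clinic"), ("Clinic", "School")])

    observed = {
        "density": nx.density(G),
        "reciprocity": nx.reciprocity(G),
        "avg_path_length": nx.average_shortest_path_length(G.to_undirected()),
    }
    benchmark = {"density": 0.3, "reciprocity": 0.5, "avg_path_length": 2.0}  # from your literature review

    for metric, value in observed.items():
        print(metric, round(value, 2), "vs. target", benchmark[metric])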

Lesson Learned: Although a lot has been published, many gaps remain. Sometimes the SNA literature may not be clear about which kinds of networks promote the goals you’ve identified for your evaluand. In this case, you can either 1) do some scholarship to synthesize the literature and argue for such a standard, or 2) go back to your evaluation question and redefine the goal in narrower terms that are described in the literature.

In our example, the literature on coalition networks may not have reached consensus about which types of networks promote timely exchange of resources. But perhaps reviews have been published on which types of brokers foster diversity in coalitions. You can either 1) synthesize the coalition literature to create a rigorous standard for “timely exchange of resources,” or 2) reframe the overall evaluation question as, “How can brokers improve diversity in our coalition’s network?”

Rad Resources:

This white paper clearly describes network structures that promote different types of conversations in social media.

This short webinar reports a meta-synthesis of which networks promote adaptive co-management capacity at different stages of the adaptive cycle.

Different network structures promote different system functions. This is the take home slide from the Rad Resource on ACM capacity. In this case, the evaluand’s network goal is timely social learning, collective action, and resilience.


Hi, I’m Maryann Durland, an independent evaluator and Social Network Analysis (SNA) practitioner. In this post I will address the requirements for doing an SNA application, particularly in evaluation, which we could also call the standard for an application. I will use the early literature that formed and grounded SNA thinking and that continues to be relevant.

Early in the history of defining SNA, Linton C. Freeman described four requirements for completing a social network analysis in his book, The Development of Social Network Analysis: A Study in the Sociology of Science:

  1. A structural perspective
  2. Empirical data
  3. Graphics
  4. Mathematical models with analysis

I believe and promote, particularly in evaluation applications, that these are still the requirements for meeting the standard for doing SNA. Evaluations using SNA are distinct from research on SNA theories and measures, which may have different requirements.

In evaluation applications, a structural perspective means that we can define relationships within the program, and that these relationships create a structure through which information flows, resources are found, barriers are identified, spaces that need connections are located, and so on. Data is the existence or non-existence of a relationship between two elements. Empirical data refers to verifiable data collected on the relationship between any two elements, also called the members of a set.

Just as with traditional data collection, we collect relational data through a variety of methods, from observations to surveys about experiences. The data we collect populate matrix cells, indicating the presence or degree of a relationship between two members. Graphics means that we can visualize the network, results, and/or analysis in graphs such as sociograms. Mathematical models with analysis allow us to calculate SNA measures, which are measures of the network rather than attributes assigned to individuals. The models, called algorithms or sets of procedures, are as much a description of the relationship as they are a recipe for how to calculate a measure.
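A minimal sketch of those four requirements in code form, using made-up data: relational data fills a matrix, the matrix becomes a graph, the graph is drawn, and a mathematical model produces network-level measures.

    # Minimal sketch (made-up data): matrix -> graph -> graphic -> measures.
    import numpy as np
    import networkx as nx
    import matplotlib.pyplot as plt

    actors = ["P1", "P2", "P3", "P4"]
    # Cell [i][j] = 1 if a relationship from actor i to actor j was observed.
    matrix = np.array([
        [0, 1, 1, 0],
        [1, 0, 0, 0],
        [0, 1, 0, 1],
        [0, 0, 0, 0],
    ])

    G = nx.from_numpy_array(matrix, create_using=nx.DiGraph)
    G = nx.relabel_nodes(G, dict(enumerate(actors)))

    print(nx.in_degree_centrality(G))  # a measure of the network, not a personal attribute
    nx.draw_networkx(G)                # the sociogram
    plt.show()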

Lesson Learned: Clearly, these four requirements delineate a specific methodological basis that is different from traditional quantitative and qualitative analysis. These requirements mean evaluators must think differently, ask questions for a different purpose, and conceptualize an evaluation differently.

Rad Resource: Early literature on SNA sought to develop what we could call standards for applications, and one of the most important resources is the work of Linton C. Freeman. Freeman’s work continues to set the standard for SNA applications and remains the reference for these requirements.



Hello, my name is Jayne Corso and I am the community manager for the American Evaluation Association and the voice behind the AEA Facebook page.

If you manage a company Facebook page, you might have noticed a drop off of “likes” recently. Facebook has begun removing memorialized and voluntarily deactivated accounts from Pages’ like counts. This change ensures that data on Facebook is consistent and up-to-date—but could mean a drop for your analytics. Although some Pages might lose “likes,” they could also gain a more accurate way to track their followers. I have compiled a few tips for tracking your analytics and gaining more visibility for your page.

Rad Resource: Take advantage of Facebook “Insights”

Facebook offers Page Insights once at least 30 people have liked your Page. Use this tool to understand how people are engaging with your Page. With it, you can see your Page’s growth, learn which posts have the most engagement, find demographic information about your audience, and identify when your audience is using Facebook. This data is available for free, can easily be customized by time frame, and can be downloaded to Excel.

Rad Resource:  Use Google Analytics to track effectiveness

Tracking your analytics through Google allows you to see how many people are coming to your site from social networks, understand which website pages they are most interested in, and gain a better understanding of how your audience is engaging with your web content. To find this information, sign in to your Google Analytics account and go to “Acquisitions”. From there you can look at the performance of your social networks as an overview or look more specifically at referrals, activity, and user flow. All of this data allows you to gauge the effectiveness of your social campaigns.

Hot Tips: Increase your Facebook likes

Finally, here are a few simple tips for increasing the likes on your Facebook Page; hopefully you can make up for any followers you lost when Facebook made its changes.

  • Add the Facebook icon to your website, so visitors know you have a presence on the social network (Place the icon high on the website page, near your navigation)
  • Add the Facebook icon to your email communication or blog to reiterate your presence on Facebook to your subscribers
  • Cross promote your Facebook page on your other social media sites. You may have followers on Twitter that have not liked your Facebook page or didn’t know you had a Page



Hello from Florent and Margaret in Sydney! We are two seasoned evaluators from ARTD Consultants, an Australian public policy consultancy firm providing services in evaluation, research and strategy. As more Australian government services and programs are delivered through partnerships, evaluators need to find better partnership evaluation methods. Faced with the challenge of evaluating partnerships, we quickly realised that there are a number of methods out there: partnership assessment surveys of varying types, social network analysis, collaboration assessment, integration measure, etc.

But which one should we choose? Having looked at a number of these we felt that choosing one would not enable us to see what was really happening at all levels of the partnership.

So, in our most recent partnership evaluation, we combined some of these methods to get a more complete picture of the partnership. The three we chose were: a partnership survey (adapted from the Nuffield Partnership Assessment Tool), an integration measure (based on the Human Service Integration Measure developed by Brown and colleagues in Canada) and Social Network Analysis (using UCINET). The diagram below represents our conceptual framework, with each method looking at the partnership at a different level: overall, between organisations and departments, and between individuals.

[Diagram: conceptual framework showing the three methods applied at the overall, inter-organisational, and individual levels]
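As a rough illustration of how the individual-level SNA layer can be rolled up to sit alongside an organisation-level integration measure, here is a generic sketch with invented data (not our UCINET files): it counts cross-organisation ties from an individual-level network.

    # Generic sketch (invented data): aggregate individual-level ties into
    # organisation-to-organisation tie counts.
    import networkx as nx
    from collections import Counter

    G = nx.Graph()
    G.add_edges_from([("alice", "bob"), ("alice", "chen"), ("dara", "chen")])
    org = {"alice": "Health", "bob": "Housing", "chen": "Housing", "dara": "Education"}

    between_org_ties = Counter(
        tuple(sorted((org[u], org[v]))) for u, v in G.edges if org[u] != org[v]
    )
    print(between_org_ties)  # e.g. {('Health', 'Housing'): 2, ('Education', 'Housing'): 1}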

Lesson learned #1: A key benefit of combining partnership assessment methods is that it enables you to look at the partnership at different levels. Adding in-depth interviews or other qualitative methods to the mix will allow you to explore further and drill down into underlying mechanisms, perceptions of what works for whom, experiences of difficulties and suggestions for improvement.

Lesson learned #2: Partnerships are abstract/intangible evaluation objects, and evaluations of partnerships often lack data about what is happening on the ground. Adding methods to quantify and substantiate partnership activities and outcomes will make your evaluation more robust and the findings easier to explain to stakeholders.

Lesson learned #3: Combining methods sits within the good old mixed-methods tradition. Various metaphors are used to describe the benefits of integrated analysis in mixed-methods research (see Bazeley, 2010). In this case, the selected methods are combined ‘for completion’, ‘for enhancement’ and as ‘pointers to a more significant whole’.



Hello, my name is Jayne Corso. I am a Community Manager for the American Evaluation Association (AEA) and I manage AEA’s social media presence (@aeaweb). This past week at Evaluation 2014, our event hashtag (#eval14) took on a life of its own and racked up a total of 5,601 tweets thanks to your postings, retweets, and replies. In this post, I’d like to share a few exciting trends and stats that we noticed over the course of the week.

The following data points reflect Tweets sent with the #eval14 hashtag from Monday, October 13 through Sunday, October 19.

Impact: 3,707,835. This is the potential number of times someone could have seen the #eval14 hashtag in their Twitter stream.

506 Contributors. This is the number of Twitter users who sent tweets or retweets using the #eval14 hashtag.

Our Twitter community was very active throughout the conference, especially on Friday, October 17.

[Image: graph of #eval14 Twitter activity by day]

Thank you to all 506 of our Twitter contributors! You helped AEA keep the conference relevant on Twitter, and we loved seeing your original tweets.

Most Active Contributors:

  1. @StrongRoots_SK
  2. @KatHaugh
  3. @Broadleafc
  4. @InnoNet_Eval
  5. @KimFLeonard

Most Popular Contributors:

  1. @Education_AIR
  2. @TechChange
  3. @NPCthinks
  4. @BillNigh
  5. @FSGtweets

[Image: most active and most popular #eval14 contributors]

 

Here are a few great tweets we collected from this week’s festivities. Thank you for helping AEA take over Twitter for Evaluation 2014!

