AEA365 | A Tip-a-Day by and for Evaluators

Greetings aea365 readers AND authors! I’m Sheila B Robinson, Lead Curator and sometimes Saturday contributor. Recently, I wrote this post asking what you would like to read on aea365 in 2018.

Thanks to those who offered ideas! Now, I’m writing to ask YOU to consider contributing on one of these, or any other evaluation-related topics.

Lesson Learned: Readers' suggestions include:

  • How evaluators collaborate with stakeholders to apply evaluation findings to improving their initiatives.
  • Presenting evaluation findings that are negative. How do evaluators have this conversation with their clients?
  • Strategies to help bridge evaluation with policy at various levels of implementation (e.g., organizational, community, state, federal, etc.).
  • Approaches to encourage deeper and more useful conversations about evaluation findings.
  • How to make the most of evaluator/client conversations about findings and recommendations: ways to prepare for them, organize the time, and facilitate them more effectively. Insights to help folks come away from these meetings with a genuine sense of “time well spent.”
  • Research on evaluation
  • Participatory methodologies, equitable evaluation, evaluation and evaluative thinking capacity-building (particularly in nonprofits), language access/justice in evaluation settings, and culturally-responsive evaluation
  • Accessible/universally-designed data collection, especially for mixed-audience groups. Not how to collect data from specific audiences, but how to make data collection more inclusive overall.
  • Resources for “beginner” evaluators, overviews of guiding theories of evaluation, history of the evaluation field, subfields/types of evaluation, and important terms defined.
  • Developmental evaluation and evaluating across sectors and across agencies, with a focus on bringing together diverse interests and goals!
  • How evaluation can be used to support collective impact work.
  • Specific practices foundations are implementing to further evaluation and learning with their grantees. Commentary on power differentials as they relate to these kinds of relationships.
  • Specific techniques and practices to help give evaluation away through building evaluation capacity, instructional design, best practices in teaching of evaluation, etc.
  • The SDGs’ M&E systems, with sharing of different countries’ experiences or perspectives from public- and private-sector engagement.
  • A series on managing evaluation using best practices for management such as project management principles, or management practices derived from latest developments in organizational theory and behavior.

Hot Tip: We’d love to hear from YOU!  Please send a draft for consideration.

Hot Tip: You don’t have to be an expert to contribute! Readers want to hear from everyday evaluators. You don’t need to be doing something unusual or cutting edge. Share how a strategy has worked for you. Share what you’re learning about and experimenting with. Share a lesson you’ve learned. Tell about a book you’ve read, a course you took, or an experience that gave you new insight.

Cool Trick: Follow contribution guidelines! See the link right up there…near the top of your screen? We can only publish posts that adhere to these guidelines.

Get Involved: It’s time to share YOUR insights with aea365 readers! We rely on hundreds of generous authors who have contributed over the past 8 years (!) to keep this blog going. As you can imagine, collecting 365 articles each year is no small task.

Is this the year YOU decide to contribute?

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings! My name is Allison Titcomb and I work at the United Way of Tucson & Southern Arizona in Community Development and Collective Impact, I teach as adjunct faculty at The University of Arizona, and I own my own consulting business, ALTA Consulting, LLC. I have been a member of AEA for almost 20 years, co-chair the Local Affiliate Collaborative, and have served on the board of the Arizona Evaluation Network for over 17 years.

The new year reminds me of cycles—of time, of work, of life. I use this season to reflect on and renew my commitments and values. A useful concept that grounds me in the normality of ups and downs, growth and diminishment, is panarchy: “Panarchy is a conceptual framework to account for the dual, and seemingly contradictory, characteristics of all complex systems – stability and change.” I’ve included a simple hand-drawn rendering of how I think about this cyclical model in relation to my work and ideas over time.

As you reflect on your journey, or the journey of your organization, you might pose these questions:

  • Where do you find new energy and ideas to keep you going?
  • What might be some new opportunities to connect or partner with others?
  • Who could you invite to share the learning journey?

Lesson Learned:

My own interest and journey in Evaluation has been fueled and renewed by my membership and active participation in AEA and AZENET.

Hot Tips:

  • An infusion of new board members can bring energy and exciting new ideas. Their enthusiasm can be contagious.
  • Intentionally seeking and inviting diverse membership and perspectives creates many opportunities for innovative approaches to our work as evaluators and in AEA affiliates.

Rad Resources:

  • Milwaukee Evaluation! offers inspiring clarity around their pipeline goals.
  • If you’re not already a member of a local affiliate of AEA, you can get connected via the AEA website list.
  • The Tamarack Institute has a wonderful two-page tool that describes “The Phases of Community Change Eco-Cycle Mapping Tool.” They also describe transitional traps that can stall the flow such as scarcity, charisma, rigidity, and chronic disaster. Useful for evaluators and facilitators to help contextualize change processes.
  • If you’d like to read more about panarchy, try the book Panarchy: Understanding Transformation in Human and Natural Systems (2002). These cycles also come to mind when I hear evaluators talk about their local affiliates. Some have been trying to get going, but have lacked enough buy-in and commitment to get started. Some have been around for years and seem to have stagnated. Some have grown suddenly and feel overwhelmed by the larger scale. All of these tie back to this cycle of growth and renewal.
  • Ready to connect with Arizona evaluators? Check out the Arizona Evaluation Network’s 2018 From Learning to Practice: Using Evaluation to Address Social Equity conference taking place this April in Tucson, AZ.

The American Evaluation Association is celebrating Arizona Evaluation Network (AZENet) Affiliate Week. The contributions all this week to aea365 come from our AZENet members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Scott Swagerty

Hello! I am Scott Swagerty, PhD, Vice President of Budget and Finance for the Arizona Evaluation Network and the Methodologist for the Office of the Arizona Auditor General. I work in the Performance Audit Division which conducts audits to assess the effectiveness and efficiency of Arizona state agencies and programs. I wanted to share some Hot Tips for applying evaluation principles to assess and enhance the effectiveness of state government.

These tips are all based on principles of evaluation that I have adapted to my work in performance auditing of state agencies. However, they can be applied in most evaluation contexts and in my experience help to create a more collaborative and functional relationship between evaluator and client.

Hot Tips:

  • Convince the client that evaluation is useful—unlike in traditional relationships between client and evaluator, when I work with state agencies it is typically not by invitation and our presence can be intrusive. Being prepared to tell and show the client how an evaluation can help them helps to cultivate a strong working relationship with the agencies we audit.
  • Rely on the experts—my expertise is in quantitative methodology and research design. Performance auditors’ expertise varies, but does not always coincide with the subject matter we are evaluating. Relying on the agency staff and management to help us understand the subject matter is essential in producing a useful evaluation because they are the ones who understand their processes best and know whether our suggestions for improvement will lead to meaningful change.
  • Focus on what can be changed—it is true that in many state agencies there is a shortage of resources that potentially limits the agency’s ability to effectively achieve their mission. However, an evaluation focused on the lack of resources is not useful or actionable because statewide resource allocation is not an agency-level decision. By focusing on evaluating processes or programs as they presently exist, we can suggest changes that improve service delivery to citizens without requiring additional resources.
  • Make flexible recommendations for improvement—generally problems or bottlenecks in a process are easily identifiable, but the solution(s) to fix those problems are not so straightforward. Harkening back to the principle of “rely on the experts,” I believe that rather than prescribing a specific solution, it is best when possible to make recommendations that allow the client to design an appropriate solution in conjunction with their management and relevant staff considering the resources available to them. This approach allows for creativity and innovation beyond the program/process being evaluated and invites the client to be more invested in the outcome.

Rad Resources:

The American Evaluation Association is celebrating Arizona Evaluation Network (AZENet) Affiliate Week. The contributions all this week to aea365 come from our AZENet members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi! I’m Deven Wisner of Wisner Analytics, LLC, and Vice President of Communication for the Arizona Evaluation Network board! I have a background in evaluation and industrial-organizational psychology, both of which fuel my passion for bringing data-driven decision making into organizations.

I’m going to chat with you about making evaluation a private (i.e. for profit) organization best practice. I’ll give you an idea of what those projects look like, tips for involving yourself in for-profit evaluation, and my lessons learned. Ready? Here we go!

Lessons Learned: Needs Assessment & Program Evaluation

One client I worked with offered a new set of services derived from perceived client needs. Great, right? Right…kind of. The problem is that their needs weren’t formally identified. In other words, the data to back the decision was non-existent. Luckily, as an evaluator, I’m familiar with being the “afterthought” (and that’s okay, we’re good at saving the day!). So, I developed an after-the-fact needs assessment. We utilized this information, along with other data, to evaluate the programs and, eventually, revise them to meet the needs of their clients.

Image: wooden sign with “Awesome” (right arrow) and “Less Awesome” (left arrow). Image credit – Jon Tyson on Unsplash

Lessons Learned: Program & Process Evaluation

As you might know, training programs are often very costly to organizations. For one organization, I became heavily involved with training programs to evaluate their intended outcomes. Additionally, I evaluated the delivery of the training materials to determine whether it was being done as expected! Both components I evaluated are important for developing impactful training programs.

Hot Tips:

  1. Just like you spin your skills for jobs you’re applying for, you need an elevator speech tailored to the audience you’re delivering it to. This means dropping the evaluator-specific jargon, taking the time to figure out what language is shared, and finding synonymous replacements!
  2. Now that you can talk to a variety of audiences, join networking groups you ordinarily wouldn’t. In addition to your Local AEA Affiliate, find breakfast clubs, professional organizations, and leadership development groups to plug into. Know someone who’s part of these groups? Even better: they can make a warm introduction.
  3. Some of my best connections have been made on Twitter and LinkedIn. Individuals in analytics, organizational development, and market research usually share my love for data-driven decision making. Hint: find a few great people in industries where you want to work as an evaluator and mine their follow/connection lists.
  4. Finally, the low hanging fruit: pitch evaluation to your current for-profit connections. They can tell you where they see value (or don’t) and maybe they’ll even introduce you to their colleagues!

    Image: open book, “Adventures and Lessons Learned.” Image credit – Ryan Graybill on Unsplash

Lessons Learned:

  1. Evaluation is important across disciplines, and to me that means a variety of organizational settings and types, too.
  2. Evaluators are no strangers to capacity building, and making the case for it in the business world is the same challenge; it just looks a little different.

Interested in connecting with fellow evaluators in sunny Arizona? Check out the Arizona Evaluation Network’s 2018 From Learning to Practice: Using Evaluation to Address Social Equity conference taking place this April in Tucson, AZ.

The American Evaluation Association is celebrating Arizona Evaluation Network (AZENet) Affiliate Week. The contributions all this week to aea365 come from our AZENet members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m Jenny McCullough Cosgrove, Managing Director of Noesis Consulting, self-proclaimed #EvalNerd, and currently serving as President-Elect of the Arizona Evaluation Network. I came to evaluation via social work and neuropsychology (a heady mix, to be sure). My most successful evaluation projects have been characterized by strong working relationships with program staff. Adam Horvath wouldn’t be surprised by that fact. He recognized how critical relationships were to the success of a project and even developed a working alliance inventory for use in psychotherapy. These working relationships are equally important for an evaluator to develop with program staff. Being mindful and planful in developing a strong working alliance will set an evaluation (and evaluator) up for success.

Hot Tip: Start with role clarity

A key component of a strong working alliance is role clarity. Use the task of clarifying roles and tasks as an opportunity to connect and create a shared vision for your project. Review roles and tasks regularly with your client to increase attention and commitment.

Cool Trick: Using a decision-making matrix to build the relationship and develop mutual project-related tasks

Co-create a decision-making matrix! The Evaluation Nerd Blog recently shared an easy tool to use to develop and sustain good working relationships by clarifying the roles and responsibilities of an evaluation. This tool is most useful when it is completed by the evaluator and program manager together (either in person or over video chat). Start by brainstorming key tasks in the project. Then discuss responsibilities and communication.  Read more about the steps to completing the matrix in the blog post.
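
To make this concrete, here is a minimal sketch of what a co-created matrix might start to look like; the tasks and role assignments below are invented for illustration and are not taken from the blog post:

  • Draft evaluation questions: evaluator leads; program manager reviews and approves
  • Recruit interview participants: program manager leads; evaluator advises on sampling
  • Analyze and interpret data: evaluator leads; findings are discussed jointly before reporting
  • Share results with staff: decided jointly; program manager handles internal communication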

Decision-making matrix

Hot Tip: Check In Regularly

At the end of each meeting, review the meeting content and any decisions that were made. Restate the actions and activities that are your responsibility. Check in on how your client is feeling and ask if there is anything that wasn’t addressed, or that they need from you, that is still on their mind. If there are lingering questions or issues, explicitly address how they will be attended to (e.g., put at the top of the agenda for the next evaluation meeting, a follow-up email, an ad hoc meeting, etc.). This doesn’t necessarily mean more work for you; it just provides an opportunity to check in on your client’s expectations for further relationship management.

Rad Resources:

Ready to build your capacity for relationship building? Check out the Arizona Evaluation Network’s 2018 From Learning to Practice: Using Evaluation to Address Social Equity conference taking place this April in Tucson, AZ.

The Partnership Handbook by Flo Frank and Anne Smith is a fantastic resource to support internal reflection on your working partnerships.

The Harvard Business Review provides a step-by-step guide to building collaborative alliances.

Get Involved: Let’s continue the conversation! Tweet how you develop strong #eval working relationships using the hashtag #EvalAlliance

The American Evaluation Association is celebrating Arizona Evaluation Network (AZENet) Affiliate Week. The contributions all this week to aea365 come from our AZENet members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Wendy Wolfersteig

I’m Wendy Wolfersteig, Director of the Office of Evaluation and Partner Contracts for the Southwest Interdisciplinary Research Center (SIRC), Research Associate Professor at Arizona State University, and President of the Arizona Evaluation Network. I focus on evaluating effective prevention programs, and so I discuss evidence-based practice (EBP) and how to use it in community and government settings. I have explained EBP so many times in so many ways, and lately it is a hot topic.

It has taken years to bring the term “evidence-based” into the vocabulary of Arizona state and local government officials. The push for accountability over the past 8-10 years from federal, private-business, and foundation funding sources has slowly but surely led officials to use words like evidence-based, or at least evidence-informed, in selecting programs to be funded.

Yet, the fate of evidence-based decision-making was not clear as the year came to an end. When is evidence – evidence? What is the evidence that it is a fact? How are science and evidence to be considered in practice and policy making?

Even the use of EBP terminology was being questioned, with reports that government staff were encouraged not to use certain words, including “science-based.” Further, the National Registry of Evidence-Based Programs & Practices (NREPP), a database of prevention and treatment programs with evidence-based ratings, had its funding ended prematurely.

I gain hope from my graduate students when we discuss evidence-based practice – in practice. We talk about the research, when are data facts, when do programs account for participants’ cultural and other differences, and how to make these judgments. This focuses us on what research and evidence can and cannot determine, and how we each make personal and professional decisions every day. We are left to ponder the outcome when the NREPP website says that “H.H.S. will continue to use the best scientific evidence available to improve the health of all Americans.”

Hot Tips:

  1. Relate and avoid jargon. Put the reasoning for evidence-based evaluation and practice into the terms used by your client or potential client.
  2. Talk about desired outcomes, and about how the assessments, practices, programs, strategies, and activities that were selected would affect what happened.
  3. Ask questions before giving answers: Why do they want a specific strategy? How do they know if it would work? Were they willing to keep on doing “what we’ve always done” without some evaluation or data to know they were spending money and time in the best interest of their clients? Do they need data to show success? Who decides?

Rad Resources:

I learned a lot about professional efforts to enhance evidence-based decision-making by participating in the EvalAction 2017 visit to my local Congressperson’s office during the AEA Conference. Here are a few resources that came to my attention.

The American Evaluation Association is celebrating Arizona Evaluation Network (AZENet) Affiliate Week. The contributions all this week to aea365 come from our AZENet members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


We are Jenny McCullough Cosgrove, Nicole Huggett, and Deven Wisner, the 2017-2018 conference planning committee for the AEA Arizona Local Affiliate, the Arizona Evaluation Network (AZENet). This year, we built on the momentum from the inspiring #Eval17 AEA conference to bring focused and meaningful attention to inclusion and equity in evaluative practice at our annual Arizona Evaluation conference.

Cool Tricks: Provide a Space for Evaluators to Practice

We know participatory evaluation can be a powerful tool in advancing equity by explicitly including underrepresented stakeholder voice. Given this, the conference planning committee has worked with our keynote speaker Dr. Mia Luluquisen, Deputy Director of Community Assessment Planning and Education at Alameda County Public Health Department, to build an evaluation event that incorporates an active experience in participatory evaluation. Specifically, an evaluation of the conference will be used as an introduction to this topic.

Hot Tips: Purposefully Build Inclusion and Safety into the Event

  • Choose an event location that will be accessible to all abilities.
  • Design event products and communications so they are as usable by as many participants as possible.
  • Define and use an inclusive and just vocabulary in promotion of the event and during the event.
  • Add activities that focus on experiencing deep empathy.
  • Establish ground rules for active listening; encourage all participants to engage and listen.
  • Support critical reasoning and safety in participants by asking for quiet reflection before sharing ideas.
  • Do not assume that marginalized people have the responsibility to educate evaluators on equity issues. Be mindful of asking underrepresented peoples to teach or explain their needs or experience at your event. Marginalized people are often burdened with the expectation to be the teachers in matters of justice and equity issues.

Rad Resources:

Intrigued and want to learn (or experience) more? Check out the Arizona Evaluation Network’s 2018 From Learning to Practice: Using Evaluation to Address Social Equity conference taking place this April in Tucson, AZ.

The Annie E. Casey Foundation provides seven steps to embed equity and inclusion in a program or organization in the Race Equity and Inclusion Action Guide.

Racial Equity Tools provide some wonderful resources for evaluators to learn more about the fundamentals of racial inequity, as well as useful tools and guides to support learning.

Learn more about disability inclusion strategies from the Centers for Disease Control and Prevention.

Reflect on your strategies for gender inclusion with this guide from the University of Pittsburgh.

The American Evaluation Association is celebrating Arizona Evaluation Network (AZENet) Affiliate Week. The contributions all this week to aea365 come from our AZENet members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Jeneen R. Garcia

My name is Jeneen R. Garcia. I’ve been a full-time evaluator at the Independent Evaluation Office of the Global Environment Facility (GEF IEO) for the last seven years. The GEF is the largest multilateral funder of environmental programs worldwide. Because the programs we evaluate almost all take place in complex social-ecological systems, we constantly need to seek out new methods for dealing with complexity.

One of the methods we’ve used is Social Network Analysis (SNA). In one evaluation, we wanted to assess the role of the GEF in increasing interactions among environmental actors at the regional level. Two things made this system complex:

1) the work of these many actors intersected, but they had no hierarchical structure, and

2) interventions took place at multiple scales, which ultimately shaped interactions at the regional scale.

It’s hard to keep track of what everyone says they’re doing and who they’re doing it with. By mapping the relationships among actors, SNA allowed us to see how well connected the actors in the region were, and which ones were key to keeping the network well connected.

Because it was an impact evaluation, we also needed some sort of counterfactual to compare our observations with. The big problem was that it is practically impossible to “randomly select” a region that is comparable to any other, much less find enough of them to ensure statistical robustness. In this case, we were looking at the South China Sea, a region with several territorial conflicts that the GEF has funded for more than 20 years. How could we find a region to compare with that?

Hot Tips:

  • Instead of looking outward, we created a scenario of the same region without GEF’s presence. We did this by redoing the SNA with the same set of actors except the GEF (see the sketch after this list). The result was that, without GEF support, some actors that were important at the country level became disconnected from the regional discussions.
SNA diagram with and without GEF

  • We did not rely on this analysis alone to assess the impact of GEF funding in the region. We triangulated it with field visits, interviews at multiple scales, document reviews, environmental monitoring data, global databases, and satellite images, among others. A wide range of evidence sources and methods for analysis is your best defense against data gaps in complex systems!
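
For readers who would like to try this “same network, minus one actor” comparison themselves, here is a minimal sketch in Python using the networkx library. The actors and ties are invented for illustration and are not the actual GEF data; the real analysis drew on the evaluation’s own evidence base.

```python
# Minimal sketch of a "remove one actor" scenario with networkx.
# Actors and ties below are illustrative only, not actual GEF data.
import networkx as nx

ties = [
    ("GEF", "Country A Ministry"), ("GEF", "Country B Ministry"),
    ("GEF", "Regional Commission"), ("Country A Ministry", "Regional Commission"),
    ("Country B NGO", "GEF"), ("Regional Commission", "Country C Ministry"),
]
observed = nx.Graph(ties)

# Counterfactual scenario: same actors and ties, minus the GEF node and its edges.
scenario = observed.copy()
scenario.remove_node("GEF")

for label, g in [("With GEF", observed), ("Without GEF", scenario)]:
    isolates = list(nx.isolates(g))
    print(label, "- components:", nx.number_connected_components(g),
          "- disconnected actors:", isolates)
```

Comparing component counts and isolated actors across the two versions is one simple way to show which actors would drop out of the regional conversation without the GEF’s convening role.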

Rad Resources:

To find out more about which SNA measures were used to come up with our findings, you can check out this paper that I wrote up on the analysis.

You can also see how this analysis fits in with the larger impact evaluation by reading the full report.

For more on the basics of SNA and how it can be used in evaluation, you can explore this Prezi I made. It includes links to evaluations, software, and other resources related to SNA. (CAVEAT: I delivered that presentation to a Spanish-speaking audience and haven’t translated it yet. My apologies to the non-Spanish speakers!)

The American Evaluation Association is celebrating Social Network Analysis TIG Week with our colleagues in the Social Network Analysis Topical Interest Group. The contributions all this week to aea365 come from our SNA TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m Rebecca Swann-Jackson, a Senior Research Associate at the Center for Research and Evaluation on Education and Human Services at Montclair State University. I currently manage evaluation projects focused on teacher preparation and development, and on educational programs and community-based initiatives serving urban children and families. In the evaluation of an urban teacher residency program, I recently used social network analysis (SNA) to examine the relationship between support for novice teachers and retention (i.e., staying in their schools and/or the profession).

Social network analysis is an innovative method used to understand relationships. As relational models, networks show both structure (who and what) and process (how and why) at the same time. Further, you can obtain a more complete picture by combining quantitative (outsider view) and qualitative measures (insider view) of the structure and process.

SNA diagram

Hot Tips: These tips are especially relevant for those who want to try out mixing quantitative SNA with qualitative methods.

The network survey will help to construct the ‘who and what’ relational network. To use the network for evaluation purposes, you also might consider using qualitative methods to investigate ‘how and why’ wonderings. Interrogate, or question, your models; what do you want to know? Ask questions of the relationships and connections you see (and don’t see!).

In the case of the evaluation of the urban teacher residency program, I was curious about:

  1. How does each supporter do their job?
  2. Why do novice teachers reach out to these people for support?
  3. Why do novice teachers reach out to these people for these specific types of support?

Tip 1: Have a data party to engage respondents in interpretation and questioning: Reconvene your survey respondents. Distribute copies of the network model with the identifiers removed. Have them identify the questions they have about the model. Ask participants which node they think represents them and ask them to explain their decision-making.

Tip 2: Investigate questions through qualitative inquiry with key nodes. In my evaluation, I used focus groups to further understand the nature of key nodes’ roles. I interviewed the key nodes to learn more about their day-to-day operations.
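
If you are working in Python, here is a minimal sketch of one way to identify those key nodes from the network survey before the qualitative follow-up; the teachers, supporters, and ties are hypothetical, not data from this evaluation.

```python
# Minimal sketch: find "key nodes" in a support network built from survey data.
# Names and ties are hypothetical, not data from the evaluation described above.
import networkx as nx

# Each pair is (novice teacher, person they reported turning to for support).
survey_ties = [
    ("Teacher 1", "Mentor A"), ("Teacher 2", "Mentor A"), ("Teacher 4", "Mentor A"),
    ("Teacher 1", "Coach B"), ("Teacher 3", "Coach B"), ("Teacher 2", "Principal C"),
]
support_network = nx.DiGraph(survey_ties)

# Supporters named by many novices have high in-degree centrality; they are
# natural candidates for the focus groups and interviews described above.
centrality = nx.in_degree_centrality(support_network)
key_nodes = sorted(centrality, key=centrality.get, reverse=True)[:3]
print("Candidates for qualitative follow-up:", key_nodes)
```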

Lessons Learned: Combining SNA with qualitative methods provided a more holistic understanding of the relationship between support and retention. Learning how people perceived the network and the content and meaning of ties between individuals was essential to understanding network patterns as well as evaluating program implementation and outcomes.

Rad Resources:

Nvivo by QSR – Enabling Qualitative Social Network Analysis https://youtu.be/8cUBQSWgGqg

Robert Wood Johnson Foundation – Using Social Network Analysis in Evaluation https://www.rwjf.org/en/library/research/2013/12/using-social-network-analysis-in-evaluation.html 

The American Evaluation Association is celebrating Social Network Analysis TIG Week with our colleagues in the Social Network Analysis Topical Interest Group. The contributions all this week to aea365 come from our SNA TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Bethany Laursen

Hello, everyone! I’m Bethany Laursen, principal consultant at Laursen Evaluation & Design, LLC and doctoral student at Michigan State University. I love sense making tools, don’t you? I need help untangling complex data into meaningful findings! Social network analysis (SNA) is one of those tools, and it can do a lot more than its name indicates if you know how to hack it.

SNA is fundamentally network analysis, and you can study almost anything as a network. In fact, if you’re a systems thinker like I am, you probably do this already!

Hot Tip: All you need to hack SNA is at least one set of nodes and one set of edges. Stuck? A few inspiring questions include: What is flowing in my network? What do I care about? What is easy to measure?

Here are some basic examples:

  • Nodes: Bus stops; Edges: Bus routes
  • Nodes: Grants; Edges: Shared objectives
  • Nodes: Land preserves; Edges: Wildlife migrations
  • Nodes: Accounts; Edges: Fund transfers
  • Nodes: Activities; Edges: Causes

Level 2 hacking adds more edges to make a multiplex graph. For example, we might track shared personnel as well as shared objectives among grants. Level 3 hacks add another set of nodes to create 2-mode networks, such as bus stops with ATMs within one block. Combining levels 2 and 3 gets you to level 4—a multiplex, two-mode network (!). There are more secret levels to discover if you create new nodes and edges out of your original ones using the analytic transformations available in SNA software.

For example, I once turned a simple information-exchange network into a two-mode expert-expertise network, and then—through a co-affiliation transformation in UCINET—I ended up with an awesome group concept map of everyone’s shared expertise, where the nodes were expertise types and the edges were people recognized as those experts. How cool is that?
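
Here is a minimal sketch of that kind of co-affiliation (two-mode to one-mode) transformation using networkx’s bipartite projection rather than UCINET; the people and areas of expertise are invented for illustration.

```python
# Minimal sketch of a co-affiliation transformation with networkx (not UCINET).
# The people and expertise areas are invented for illustration.
import networkx as nx
from networkx.algorithms import bipartite

people = ["Ana", "Ben", "Cam"]
areas = ["hydrology", "GIS", "policy"]

# Two-mode (expert-expertise) network: edges link people to their areas of expertise.
two_mode = nx.Graph()
two_mode.add_nodes_from(people, bipartite=0)
two_mode.add_nodes_from(areas, bipartite=1)
two_mode.add_edges_from([
    ("Ana", "hydrology"), ("Ana", "GIS"),
    ("Ben", "GIS"), ("Ben", "policy"),
    ("Cam", "hydrology"), ("Cam", "policy"),
])

# Project onto the expertise nodes: two areas are linked whenever at least one
# person holds both, and the edge weight counts those shared people. The result
# is a one-mode "concept map" of the group's shared expertise.
expertise_map = bipartite.weighted_projected_graph(two_mode, areas)
print(list(expertise_map.edges(data=True)))
```

The same projection idea applies to any two-mode network you build at level 3 of the hacks above.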

Figure 1: An expertise network made of areas of expertise connected by people who have those expertises. From Laursen 2013.

Lesson Learned: You can make intangible, complex constructs visible and interpretable by re-purposing SNA.

Lesson Learned: It’s fun to play with the possibilities of SNA, but in the end, you need to have a purpose for the information you generate. Having a good question to answer is half the secret of sense making tools.

Rad Resources: Here are some methods and tools that re-purpose SNA:

The American Evaluation Association is celebrating Social Network Analysis TIG Week with our colleagues in the Social Network Analysis Topical Interest Group. The contributions all this week to aea365 come from our SNA TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

