AEA365 | A Tip-a-Day by and for Evaluators

TAG | stakeholder engagement

My name is Ryan Evans and I’m a research associate at Wilder Research, a nonprofit research firm based in Minnesota. At Wilder Research, I work primarily with small- and medium-sized nonprofits in the Twin Cities. When working with smaller clients, it is paramount to deeply involve them in planning and doing the evaluation to ensure that the results are as useful as possible for them.

Lesson Learned: When I started my career as an evaluation consultant, I designed cookie-cutter evaluations: a survey, some focus groups – or both in a mixed-methods design – culminating in a report. I’ve learned that cookie-cutter evaluations are often not responsive enough to the context and changing circumstances of small nonprofits to provide useful results. I have evolved my consulting style to deeply involve my clients in my evaluation work, which increases the likelihood that they can use the results to strategically guide their organizations.

Hot Tip: Use an iterative approach. When working on evaluation projects, I will modify my project plan to respond to new ideas that arise from planning and doing the evaluation. I repeatedly ask myself and my client, “Is this work meeting our learning goals? Will this work be useful for improving the program and increasing its reach and sustainability? What might be more useful?” For one of my projects, I had completed half of the planned interviews. When talking with my client about the findings so far and how they related to the project’s learning goals, we decided I should also observe their programming – so we canceled the remaining interviews and I observed the program instead.

Cool Trick: To expedite the iteration process, give clients something concrete and fairly detailed to respond to – a draft infographic, for example – as early as possible. I spend a relatively small amount of time developing initial drafts so that I receive feedback from my clients quickly. This speeds up the process immensely (compared to waiting until I feel I have developed something “just right”).

Hot Tip: Build on the expertise of your clients. I am working with a theater organization and recently proposed doing a student perception survey. They didn’t like the idea of doing a written survey because it wouldn’t utilize their expertise or preferred approach as theater artists. Instead, we designed a “talking survey” that they facilitated with their students. I designed the survey and took notes as they talked through the questions with their students and interactively obtained the data we wanted.

Rad Resources: From my informal research, I’ve found that the consulting field calls this consultation style “process consulting” or “emergent consulting.” Here’s a link to a research-based blog post about consulting styles, including process and emergent styles.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello! We are Kellie Hall, from the National Association of County & City Health Officials (NACCHO), and Emmalou Norland, from Cedarloch Research, LLC. We have worked in concert as internal and external evaluators on public health programs. During our time together, evaluation has been growing in popularity within the nonprofit sector—and with that, so has the need to engage stakeholder groups from multiple levels (i.e., top executives, program managers, and front-line staff).

Lessons Learned: The importance of stakeholder engagement during evaluation—particularly as a critical component in ensuring the evaluation meets the utility standard—is well known in the field. As familiar as the concept is, however, the complex nature of engaging stakeholders in appropriate ways can be a perplexing challenge. For example, when federal funding dictates not only that a program evaluation must be done but also specifies its design, engaging stakeholders in the planning phase can seem superfluous. Furthermore, stakeholder engagement sessions typically focus on the why behind engagement, rather than the how of engaging stakeholders with varying levels of authority, divergent priorities, and competing needs. Understanding these contextual factors is crucial to engaging various levels of stakeholders.

Hot Tip: Engage stakeholders in the process of determining how to engage stakeholders!
Many evaluators begin their stakeholder engagement by creating a Stakeholder Engagement Plan. Instead, start one step earlier.

One way to do this is to gather your stakeholders together for a “hack-a-thon,” a process that comes from the technology field and is focused on collaborative problem solving. This highly interactive meeting starts with your stakeholders and ends with solutions tailored to address their needs. During a “hack-a-thon,” each stakeholder group works through the following stages together:

  1. Empathizing with another stakeholder group
  2. Defining a focused need for that other stakeholder group
  3. Ideating solutions to address that need
  4. Deciding on the most effective solution

(Check out an example hack-a-thon setup, including handouts, here.)

Then, you can use the results developed by the stakeholders themselves to create a “Stakeholder Profile” for each group, documenting their power, values, priorities, and engagement needs. This is now the beginning of your Stakeholder Engagement Plan!
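If it helps to picture it, here is a minimal, hypothetical sketch of what one Stakeholder Profile might look like as a simple data structure. The field names and example values are illustrative only, not a template prescribed by the authors:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class StakeholderProfile:
    """Illustrative profile for one stakeholder group (fields are examples only)."""
    group_name: str                      # e.g., "Front-line staff"
    power: str                           # influence over program decisions, e.g., "low", "medium", "high"
    values: List[str] = field(default_factory=list)            # what the group cares about most
    priorities: List[str] = field(default_factory=list)        # needs surfaced during the hack-a-thon
    engagement_needs: List[str] = field(default_factory=list)  # how they prefer to be involved

# Example: one profile that could seed a Stakeholder Engagement Plan
frontline = StakeholderProfile(
    group_name="Front-line staff",
    power="medium",
    values=["client relationships", "realistic workloads"],
    priorities=["short data-collection forms"],
    engagement_needs=["brief monthly check-ins", "plain-language summaries"],
)
print(frontline)
```

However you record it (a document, a table, or a structure like the one above), the point is the same: the profile captures what the stakeholders themselves surfaced, so the Engagement Plan starts from their needs rather than the evaluator’s assumptions.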

Rad Resources: Some great stakeholder planning resources that we’ve referenced in our work include:

If you have a useful stakeholder engagement resource, please share in the comments below.

The American Evaluation Association is celebrating Labor Day Week in Evaluation: Honoring the WORK of evaluation. The contributions this week are tributes to the behind-the-scenes and often underappreciated work evaluators do.

Happy Saturday, folks!  I’m Liz Zadnik, aea365’s Outreach Coordinator.  I live in the Mid-Atlantic region of the country and was snowed in a few weeks ago.  The storm wasn’t as bad as it could have been (for us…thankfully), but I had a chance to spend some time catching up on my reading resolution.  

Rad Resource: First off, I need to again express my appreciation for AEA’s member access to journals and publications from the field. I love New Directions for Evaluation and was excited to see “Planning and Facilitating Working Sessions with Evaluation Stakeholders.”  Part of my “day job” is engaging stakeholders in conversations about nuanced topics and complex issues.  The inclusion of a case example helped me operationalize concepts and gave me some great ideas for my own practice.


Lessons Learned: A big factor in a successful group project is navigating potential issues or influences within the group of stakeholders.  This includes investigating both the attitudes and dynamics of group members and your own biases as the facilitator.  The article encourages evaluators to learn about possible political, historical, and/or social contexts that may prevent or hinder group cohesiveness and trust.  Is it (in)appropriate to bring everyone together initially?  Or do distinct groups need to be engaged before a collective can be established?

There’s also a great table with skills and questions for facilitators; each topic has examples and items to explore.  What caught my eye – most likely because it’s something that has tripped me up personally in the past – was a set of questions about previous group facilitation experience.  It’s challenging not to bring past experiences with you to the present, but a lack of patience, or a quickness to make assumptions about dynamics and process, can really impede creativity, innovation, and thoughtful problem-solving.

I also loved how the author outlines thoughtful considerations and steps for facilitating and operationalizes those considerations with a case example.  I particularly enjoyed the description of the debrief – I am a huge fan of self-reflection and really appreciated its inclusion within the facilitation process.

I would definitely recommend the article to anyone who wants to up their facilitation game and is looking for guidance on how best to engage project stakeholders!   



Hi! We are Laura Beals, Director, and Barbara Perry, Evaluation Manager, of the Department of Evaluation and Learning at Jewish Family and Children’s Service Boston, a multi-service nonprofit in Massachusetts. At Eval 2015, we learned about “Data Placemats” from Veena Pankaj of the Innovation Network. Recently, we held several placemat-focused “Learning Conversations” with one of our multi-program divisions. We created seven placemats for these meetings:

  1. An overview of the Learning Conversation and placemat process.
  2. Client census—new and active—over the past four years for each program.
  3. Client demographics by program.
  4. Client geographic distribution heat map. This placemat was interactive, using Tableau. We wanted not only to show the geographic distribution of clients in Massachusetts, but also to provide an opportunity to explore the data further, through the use of filters for program and key demographics.
  5. A network analysis showing referral sources (see the illustrative sketch after this list).
  6. A network analysis showing how clients were served by multiple programs at the agency.


  7. A learning and dissemination plan. This placemat encouraged meeting participants to use the data and allowed our team to create specific follow-up documents and undertake follow-up analysis.
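As an illustration only (this is not the authors’ actual workflow, and the records below are made up), a referral-source network like the one behind placemat 5 could be assembled from client records with a few lines of Python using the networkx library:

```python
import networkx as nx

# Hypothetical client records: (referral_source, program) pairs from a case-management export
referrals = [
    ("Local school", "Youth Counseling"),
    ("Local school", "Family Services"),
    ("Community clinic", "Youth Counseling"),
    ("Self-referral", "Family Services"),
    ("Community clinic", "Family Services"),
]

# Build a directed graph: referral source -> program, weighted by number of referrals
G = nx.DiGraph()
for source, program in referrals:
    if G.has_edge(source, program):
        G[source][program]["weight"] += 1
    else:
        G.add_edge(source, program, weight=1)

# Edge weights can then drive line thickness in a placemat-ready network diagram
for source, program, data in G.edges(data=True):
    print(f"{source} -> {program}: {data['weight']} referral(s)")
```

The same pattern (nodes for clients or programs, weighted edges for shared relationships) could underlie a placemat like number 6 as well; the graph would simply be drawn and styled for print afterward.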

Lessons Learned:

  • During the planning stages, check in with stakeholders from around the organization. We asked the program director, division director, grant writers, and development associates what they wanted to learn about the division. Their responses allowed us to tailor the placemats to be as useful to as many people as possible.
  • Don’t forget to include the staff! To share the placemats and get feedback from direct-service staff, we held a shorter placemat discussion at an all-staff meeting, focusing on two placemats; the others were provided for later review. We also hung up the placemats near the staff offices and provided sticky notes for feedback and observations.
  • Be ready to “go on the road” with your placemats. We found that word spread about our placemats and there was interest from various stakeholders who had not been able to be part of the original few meetings. By continuing the conversations, we were able to increase learning and generate new ideas.
  • Bring data chocolates! We had been waiting for an opportunity to create data chocolates, after being inspired by Susan Kistler. We wrapped shrunken versions of several of the graphs around chocolates. They put everyone in a good mood to talk data—the lightheartedness of our gesture helped break down barriers and was a great conversation starter.

Rad Resources:

The American Evaluation Association is celebrating Internal Evaluation (IE) Topical Interest Group Week. The contributions all this week to aea365 come from our IE TIG members.

My name is Dr. Moya Alfonso, MSPH, and I’m an Associate Professor at the Jiann-Ping Hsu College of Public Health at Georgia Southern University, as well as University Sector Representative and Board Member for the Southeast Evaluation Association (SEA). I would like to offer you a few tips on engaging stakeholders in participatory evaluation based on my 16 years of experience engaging stakeholders in community health research and evaluation.

Participatory evaluation is an approach that engages stakeholders in each step of the process.  Rather than the trained evaluator solely directing the evaluation, participatory evaluation requires a collaborative approach.  Evaluators work alongside stakeholders in developing research questions, deciding upon an evaluation design, designing instruments, selecting methods, gathering and analyzing data, and disseminating results.  Participatory evaluation results in stronger evaluation designs and greater external validity because community members have a high level of input in the entire process.  It also strengthens buy-in to the results and increases use of the evaluation products.

Rad Resource: Explore the University of Kansas Community Tool Box for introductory information on participatory evaluation.

Hot Tips: Here are a few tips for engaging stakeholders:

  • Establish a diverse stakeholder advisory group: Community stakeholders have a range of skills that can contribute to the evaluation process. For example, I worked with 8th grade youth on a participatory research project and assumed that I would need to conduct the statistical analysis of survey data.  To my surprise, one of the youths had considerable expertise and was able to conduct the analysis with little assistance. With training and support, community stakeholders can contribute and exceed your expectations.
  • Keep stakeholders busy: A common problem in working with advisory groups is attrition. Keep community stakeholders engaged with evaluation tasks that use their unique skill sets. Matching assignments to existing skill sets empowers community stakeholders and results in increased buy-in and engagement.
  • Celebrate successes: Celebrating successes over the course of the evaluation is a proven strategy for keeping stakeholders engaged. Rather than waiting until the end of the evaluation, reward stakeholders regularly for the completion of evaluation steps.
  • Keep your ego in check: Some highly trained evaluators might find handing over the reins to community stakeholders challenging because they’re used to running the show. Participatory evaluation requires evaluators to share control and collaborate with community stakeholders. Try to keep an open mind and trust in the abilities of community stakeholders to participate in the evaluation process with your support and guidance.  You’ll be amazed at what you can achieve when stakeholders are fully engaged in evaluation research! 

The American Evaluation Association is celebrating Southeast Evaluation Association (SEA) Affiliate Week with our colleagues in the SEA Affiliate. The contributions all this week to aea365 come from SEA Affiliate members.


Hi, we’re Southeast Evaluation Association (SEA) members Taylor Ellis, a doctoral student and lead evaluator, and Dr. Debra Nelson-Gardell, an Associate Professor, providing consultation at the School of Social Work at The University of Alabama. We form a team tasked with evaluating a program providing community-based, family-inclusive intervention for youth with sexual behavior problems (youngsters whom lay people might call juvenile sex offenders). This post focuses on our lessons learned regarding our approach to resistance in program evaluation.

Taut and Alkin (2002) reported people stereotypically view program evaluation as “being judged…that the evaluation is used to ‘get me’, that it is not going to be used to assist me but is perceived to be negative and punitive in its nature” (p. 43). Our program evaluation faced derailment, perhaps because the program had never been evaluated before, or perhaps because of the inevitability of resistance to evaluation.  Accepting the resistance as normal, we tried addressing it.  But our efforts didn’t work as we had hoped. Below are the hard lessons learned through “hard knocks.”

Lessons Learned:

  • The Importance of Stakeholder Input: Stakeholders need to believe evaluators will listen to them.  Early in the evaluation process, stakeholders were interviewed and asked about their ideas for program improvement to promote engagement in the process. What the interviews lacked was an emphasis on showing stakeholders how what they said affected the evaluation.
  • Remember and (Emphatically) Remind Stakeholders of the Evaluation’s Purpose/Goals: During the evaluation, the purpose got lost because stakeholders were not reminded of it. Project updates to stakeholders should have been more intentional about showing movement towards the purpose. We lost sight of the forest as we negotiated the trees. This lack of constant visioning led many stakeholders to view the evaluation implementation as an unnecessary hassle.
  • The Illusion of Control: Easily said, not easily done: Don’t (always) take it personally. Despite our efforts, a great deal of resistance, pushback, and dissatisfaction remained. After weeks of feeling at fault, we found out that things were happening behind the scenes over which we had no control, but that directly affected the evaluation.

Knowing these lessons earlier could have made a difference, and we intend to find out.  Our biggest lesson learned:  Resist being discouraged by (likely inevitable) resistance, try to learn from it, and know that you are not alone.

The American Evaluation Association is celebrating Southeast Evaluation Association (SEA) Affiliate Week with our colleagues in the SEA Affiliate. The contributions all this week to aea365 come from SEA Affiliate members.

 

Hello!  I am Smita Shukla Mehta, a 2016 Fellow of AEA’s Minority Serving Institutions program.  I am a Professor of Special Education in the Department of Educational Psychology at the University of North Texas.  I teach doctoral courses in program evaluation and feel passionate about providing graduate students opportunities to develop technical competencies in conceptualizing and conducting evaluations of specific programs.  I have been fortunate to have several funded grant projects through the U.S. Department of Education that require systematic evaluation; thus I get to practice what I teach!

Blog Topic:

The topic of this blog is Asking the Right Questions: First Step toward Rigor in Program Evaluation.  I chose this topic because rigor in program evaluation necessitates directly connecting the [right] questions to the program’s outcomes and to the utilization of the findings.  Whether the questions are about the process, outcomes, or both, they should address the purpose for which we evaluate programs (e.g., accountability, improvement, knowledge development, and oversight).

Hot Tips:

  1. Asking the right questions is not just an intellectual exercise to showcase our technical skills. It necessitates a deep and clear understanding of the program being evaluated, its institutional culture, its stage in the life-cycle, and expected impact for stakeholders.  Reflect on the importance of a comprehensive framework for evaluation.
  2. Understand the program from the perspectives of the stakeholders – those who administer and/or implement the program, and those who benefit from the services. Engage stakeholders in the planning and evaluation process for credibility and accuracy.  Know the purpose of the evaluation.  Is it to improve services, generate new knowledge, comply with professional standards, and/or demonstrate accountability?  The purpose will help you select an evaluation framework that directly addresses the program’s needs to produce the expected outcomes for stakeholders.

Cool Tricks:

  1. Use a Logic Model as a roadmap for understanding the program and the process for achieving the intended outcomes. The logic model is based on the Theory of Change, which requires an evaluator to know the essentials of program theory and implementation to understand how a program works.
  2. Link evaluation questions to the program outcomes as suggested by Rama Radhakrishna and Rhemilyn Relado in their article published in the Journal of Extension.

Lessons Learned:

My experience as an MSI Fellow has taught me about the ways and extent of rigor needed in program evaluation.  Conceptualization of rigor actually starts with asking the right questions.  Rigor is not just about using highly technical data collection and analysis methods but about selecting the best methods for the most critical questions regarding the process and outcomes of the program being evaluated.

The American Evaluation Association is celebrating AEA Minority Serving Institution (MSI) Fellowship Experience week. The contributions all this week to aea365 come from AEA’s MSI Fellows. For more information on the MSI fellowship, see this webpage: http://www.eval.org/p/cm/ld/fid=230

Hello, we are Dani Rae Gorman and Angela Nancy Mendoza, former scholars of the AEA Graduate Education Diversity Internship (GEDI) program’s 2015-2016 cohort. We’d like to share some of the lessons we learned throughout the program about engaging stakeholders from the inception of an evaluation and throughout the process.

Lessons Learned:

Consistent Engagement

Evaluation phases change over a project’s life, and it is important to include stakeholders at each step. In many cases, stakeholders help to plan what they want but are less involved with tasks such as helping to understand what the data mean and assisting in creating an effective way to communicate the findings. Having the right people involved from the beginning and keeping them involved throughout the evaluation are critical to the process. Doing so increases the evaluation’s accuracy, appropriateness, and utility. For example, involving evaluation stakeholders in the interpretation of results helps ensure the evaluators are getting the right message and are aware of important nuances.

Creating Value and Utility

In conducting relevant and accurate evaluations, it is important to understand the cultural context and communities in which the evaluation is to be carried out. Consideration and responsiveness to these factors help to ensure that an evaluation captures nuances and specific needs to help create an evaluation product that is accurate and useful to stakeholders.

Identifying and Engaging a Diversity of Stakeholders

Engaging stakeholders requires the identification of those whom the evaluation will impact. This includes program staff, managers, project leaders, clients, community members, and other stakeholders who may be affected by the evaluation findings. Engaging a diversity of stakeholders aids in creating an understanding of the entity being evaluated, its members, and its culture. This in turn helps to ensure that informative questions are asked in the right way and that the outcomes are meaningful and useful to stakeholders.

Hot Tip:

Be patient and flexible in working to engage stakeholders throughout the evaluation process. It can be a challenge to facilitate engagement across the stages of an evaluation, and individuals may have different experiences, perspectives, and responsibilities, but consistent engagement can add value and increase the utility of evaluation findings.

Rad Resources:

The American Evaluation Association is celebrating Graduate Education Diversity Internship (GEDI) Program week. The contributions all this week to aea365 come from AEA’s GEDI Program and its interns. For more information on GEDI, see their webpage here: http://www.eval.org/GEDI


Hi, I’m Pat Christian, founder of Caleb Missionary Relief Services, an international nonprofit that evolved from evaluating the needs of vulnerable children in Haiti.  My organization has implemented interventions to improve the quality of their education and has made a difference for thousands of Haitian students.  I’ve conducted project-level evaluations with input from stakeholders for decision making and accountability. Domestically, I’ve served as a seasonal grant reviewer for the Georgia Department of Education and evaluated educational programs from schools and nonprofits applying for federal funds.

I recently attended the 2016 Summer Evaluation Institute in Atlanta. At one workshop, a registrant from Africa inquired how to engage stakeholders in planning a program evaluation. In his country, he’d been asked to evaluate the success of the program at its end without prior involvement. Working internationally, I understood his concern and knew of situations where the overseer of the program did not engage the community and the program was a failure.

Rad Resource: “It’s not the Plan, It’s the Planning: Strategies for Evaluation Plans and Planning,” a workshop by Dr. Sheila Robinson at the 2016 Summer Evaluation Institute, incorporated a participatory approach for engaging stakeholders in a comprehensive evaluation plan. Dr. Robinson did an awesome job expounding upon five steps for planning a program evaluation while engaging stakeholders in the entire process. Based on her steps, I’ve suggested how international stakeholders can be involved in each step to maximize the evaluation.

Hot Tip: Engage stakeholders. Identify stakeholders who should be involved in planning a program evaluation and develop a plan for how to engage them. Stakeholders can provide information on why the program evaluation is needed for the community; they can be program staff, community members and leaders, participants, collaborative NGO partners, the national government, others from similar programs, etc.

Hot Tip: Focus the evaluation. Stakeholders can give input on the logic model and share how they will use the evaluation. Stakeholders, as an advisory group, can be a strong asset in getting the community to understand the significance of the evaluation, getting information disseminated more widely, and getting more participants to respond.

Hot Tip: Collect data. To ensure cultural competence in evaluation, stakeholders can give the evaluator an understanding of the cultural dynamics and can recommend which data collection methods are best for the culture.

Hot Tip: Analyze & interpret the information. As an American, I may interpret data through one set of lenses, but involving stakeholders in discussing the analysis and lessons learned brings a different lens to interpreting the data.

Hot Tip: Use the information. Discuss with stakeholders how the information can best be communicated to the community and determine the next steps.


 
