AEA365 | A Tip-a-Day by and for Evaluators

Tag: stakeholder engagement

Grisel M. Robles-Schrader

Keith A. Herzog

We are Grisel M. Robles-Schrader and Keith A. Herzog of the Northwestern University Clinical and Translational Sciences (NUCATS) Institute. The Center for Community Health (CCH), one of 10 centers and programs within NUCATS, is specifically charged with offering support and resources to catalyze meaningful community and academic engagement across the research spectrum, with the aim of improving health and health equity. Community engagement centers across the nationwide Clinical and Translational Science Award (CTSA) consortium offer a similar range of programs and services.

We facilitated efforts within our institution to develop an evaluation infrastructure to better understand, improve, promote, and evaluate the community engagement support and services that CCH offers to investigators. By engaging key stakeholders within CCH and NUCATS more broadly, we concentrated our efforts on metrics and data collection tools relevant to our team’s work.

As part of our comprehensive evaluation plan, the CCH developed community engagement metrics covering six domains aimed at measuring engagement support and outcomes beyond publications and funding:

  • consultation services,
  • capacity building & education,
  • fiscal support,
  • partnership development,
  • institutional-level changes, and
  • community-level changes.

Rad Resources:

  • REDCap (Research Electronic Data Capture) is a secure web application for building and managing online surveys and databases.

Phases of Development & Feedback Loops

Using REDCap as our data collection tool enabled us to refine and adapt our “dream list” of metrics, based on our comprehensive logic model. In close collaboration with key internal stakeholders, we implemented the CCH Engagement and Tracking project focused on consultation services. At the end of the 12-month pilot period, we reconnected with these stakeholders to assess what was working well, what was not, and what needed to be revised. For example, we focused our review on categories that consistently had high rates of missing data and discussed whether those were still relevant questions to track. We also used the 12-month review as an opportunity to assess revisions to our logic model, based on evidence-based insights from the consultation tracking form.
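One way to run that kind of missing-data review, if your project lives in REDCap, is to export the records through the REDCap API and rank fields by their missing-data rate. The sketch below is a minimal illustration, not the CCH team's actual tooling; the endpoint URL and API token are placeholders you would replace with your project's own.

```python
# Minimal sketch: export records from a REDCap project and rank fields by
# missing-data rate. The URL and token are placeholders, not real credentials.
import requests
import pandas as pd

REDCAP_URL = "https://redcap.example.edu/api/"  # hypothetical endpoint
API_TOKEN = "YOUR_PROJECT_API_TOKEN"            # project-specific token

response = requests.post(REDCAP_URL, data={
    "token": API_TOKEN,
    "content": "record",   # standard REDCap record export
    "format": "json",
    "type": "flat",
})
response.raise_for_status()

records = pd.DataFrame(response.json())
# REDCap exports blank fields as empty strings; count those as missing.
missing_rate = records.replace("", pd.NA).isna().mean().sort_values(ascending=False)

# Fields near the top of this list are candidates to revise, clarify, or retire.
print(missing_rate.head(10))
```

Fields that stay near the top of this ranking review after review are exactly the ones worth bringing back to stakeholders, as described above.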

Hot Tips:

  • Engage key stakeholders throughout the evaluation development and implementation processes. This ensures you collect relevant data, utilizing strategies that are meaningful for your team.
  • Utilize the 80/20 rule to avoid data collection creep (i.e., trying to collect everything, all the time). Ask yourselves: “What do we consistently encounter, do, collect, and share 80% of the time?”
  • Pilot data collection tools using real-world data. Refine the tool. Revise and repeat (as necessary).
  • Establish strong project management skills to keep the group on task and to secure buy-in from key stakeholders.
  • Support standardization by developing manuals with succinct definitions and concrete examples. Include instructions and contextual links within REDCap so they are available when your team enters data.

 

The American Evaluation Association is celebrating Chicagoland Evaluation Association (CEA) Affiliate Week. The contributions all this week to aea365 come from CEA members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Hello aea365 readers! I’m Sheila B Robinson, aea365 Lead Curator and sometimes Saturday contributor. This past week, I taught courses at AEA’s Summer Evaluation Institute in Atlanta, GA. One of my courses, It’s Not the Plan, It’s the Planning: Strategies for Evaluation Plans and Planning, fills up every year. In this course, participants learn:

  • How to identify types of evaluation activities (e.g. questions to ask of potential clients) that comprise evaluation planning
  • Potential components of a comprehensive evaluation plan
  • How to identify key considerations for evaluation planning (e.g. client needs, collaboration, procedures, agreements, etc.)

What isn’t in the curriculum for this course is the answer to a few questions participants frequently ask at the end:

  • How do I engage stakeholders in the evaluation?
  • How do I get buy-in from stakeholders?
  • How can I get stakeholders to value the evaluation?

Lesson Learned:

My best advice for stakeholder engagement and obtaining buy-in for evaluation is to engage early and often. Meet with people and share information about the evaluation and how it’s going. Offer interim reports, even if you have only a little to report. Most importantly, meet people where they are: if stakeholders are concerned with bottom-line dollars and cents, talk with them about that. If they’re concerned about the impact on the target population, share what beneficiaries are doing in the program and how they are faring. In other words, tailor your interactions and presentations to stakeholders’ specific areas of concern and interest, and connect on both an emotional and intellectual level. Evaluation is not just about data. It’s about people. Successful stakeholder engagement cannot be achieved with a one-size-fits-all approach! For more specifics, here are a few helpful resources.

Rad Resources:

Get Involved:

I’m certain our readers would appreciate more on this topic. What are your best strategies for stakeholder engagement? Please offer them in the comments, or better yet, contribute a blog article on the topic!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

Hello AEA members! My name is Miki Tsukamoto and I am a Monitoring and Evaluation Coordinator at the International Federation of Red Cross and Red Crescent Societies (IFRC).

Video has proved to be a useful data collection tool for engaging communities to share their feedback on the impact of IFRC projects and programmes.[1] In an effort to develop more efficient and inclusive approaches to monitoring projects, IFRC’s Planning, Monitoring, Evaluation and Reporting (PMER) Unit in Geneva, in cooperation with Newcastle University’s Open Lab and in coordination with the Indonesian Red Cross Society (Palang Merah Indonesia-PMI) and IFRC Jakarta, piloted an initiative in the community of Tumbit Melayu in 2017 using the Most Significant Change approach, facilitated by a mobile video application (app) called “Our Story,” adapted from the Bootlegger app. Stories were planned, collected, directed, and edited by women, men, youth, and elderly of the community through this “one stop shop” mobile application. The aim was to gather feedback on a water, sanitation and hygiene promotion (WASH) project being implemented by PMI with the support of IFRC in the district of Berau, East Kalimantan province. Costs of this pilot project were minimal, as the app allows video data collection without continuous reliance on external expertise or expensive equipment.

The resulting videos include:

  • Our Story: Women’s feedback on a WASH project in Berau, Indonesia
  • Our Story: Elderly’s feedback on a WASH project in Berau, Indonesia
  • Our Story: Youth’s feedback on a WASH project in Berau
  • Our Story: Men’s feedback on a WASH project in Berau
  • Our Story: Community’s feedback on a WASH project in Berau, Indonesia

Lessons Learned:

  • Data collection: When collecting disaggregated data, it is important that facilitators be flexible and respect the rhythm of each community group, including their schedules and availability.
  • Community needs: Collecting stories from representative groups in the community gives organizations an opportunity to dive deeper into the community’s wishes and, therefore, to better understand and address their varying specific needs.
  • Our Story app: The community welcomed this new tool as it was an app that facilitated the planning, capturing and creation of their story on a mobile device. This process can be empowering for an individual and/or group, and serve to increase their interest and future participation in IFRC and/or National Society-led projects.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

[1] Recent participatory video initiatives produced by communities receiving assistance from IFRC and/or National Society projects can be found at: https://www.youtube.com/playlist?list=PLrI6tpZ6pQmQZsKuQl6n4ELEdiVqHGFw2


Hi, I am Jennifer Johnson. I am the Director of the Division of Public Health Statistics and Performance Management for the Florida Department of Health. I want to discuss how improving stakeholder relationships can improve data collection.

In most evaluations, collection of quantitative and qualitative data forms a critical aspect of stakeholder engagement and relationships. Methods for collecting both types of data can include structured interviews, surveys, and file reviews. Evaluators also analyze data sets that vary in number and types of variables and formats.

Ultimately, however, key stakeholders provide the data. Thus, effective relationships with key stakeholders can be the lifeline to the data upon which a strong evaluation depends.

Whether participation is voluntary or contractually required, evaluators can adopt practices throughout evaluations that enhance stakeholder engagement specific to data collection. These practices foster effective and clear communication and help evaluators to establish trust.

Hot Tips:

  1. Communicate with Leadership. Initiate engagement with the executive leadership of stakeholder organizations, unless the evaluator has identified specific individuals. Give stakeholder leadership the opportunity to establish parameters and requests for communication throughout the evaluation. These parameters should identify those individuals or groups to always keep informed. Follow up by clarifying what the rules of engagement will be. Ensure that members of the evaluation team follow this agreement.
  2. Communicate Early. Be forthcoming and transparent from the beginning. Clearly communicate the evaluation scope at initial meetings. Specify the data and data collection methods that the evaluator may request from stakeholders. Inform stakeholders at this stage whether they will have an opportunity to review and discuss preliminary results and conclusions based on their data.
  3. Communicate Specifics. Develop clear and thorough processes for collecting data. Develop and submit data requests that clearly articulate and specify the requested data and information (see the sketch after this list). Include specific variables when requesting databases. Include specific and clear instructions for submitting data. Provide an easy and convenient method for feedback and questions. Set reasonable deadlines and consider stakeholder organizational factors, such as crunch times, staffing, and workload issues. If possible, modify data requests based on extenuating circumstances or to ease the burden on the stakeholder.
  4. Communicate Strategically. Data exchange goes in both directions. Identify opportunities to answer stakeholder questions or provide information. Share results and information that could benefit stakeholders, but only if that sharing does not compromise the evaluation or consume additional resources. This could include information that helps stakeholders address organizational problems or improve performance.
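To make “Communicate Specifics” concrete, a data request can be written as a structured spec rather than free-form prose, so stakeholders can see exactly which variables, formats, and deadlines are being asked for. The sketch below is purely illustrative; every variable name, format, contact, and deadline is a hypothetical placeholder.

```python
# Hypothetical structured data request: named variables, file format,
# submission instructions, and a deadline, all in one reviewable document.
data_request = {
    "evaluation": "Community Health Program Evaluation",  # placeholder name
    "requested_variables": [
        {"name": "client_id", "type": "string", "notes": "de-identified"},
        {"name": "enrollment_date", "type": "date", "format": "YYYY-MM-DD"},
        {"name": "service_type", "type": "category"},
    ],
    "file_format": "CSV, UTF-8, one row per client",
    "submission": "secure file transfer; questions to the contact below",
    "contact": "evaluator@example.org",  # placeholder
    "deadline": "2025-03-31",            # set with stakeholder workload in mind
}

# Render a quick summary to paste into the request email.
for var in data_request["requested_variables"]:
    print(f'- {var["name"]} ({var["type"]})')
```

Even when the request ultimately goes out as a memo, drafting it as a spec first makes it easy to verify that every variable has a name, a type, and clear submission instructions.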

 

The American Evaluation Association is celebrating Southeast Evaluation Association (SEA) Affiliate Professional Development Week with our colleagues in the SEA Affiliate. The contributions all this week to aea365 come from SEA Affiliate members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings, AEA365 Readers! I am Dr. Nancy Bridier, Senior Doctoral Adjunct at Grand Canyon University, Public Sector Representative, and Board Member for the Southeast Evaluation Association (SEA). I am also an Independent Consultant based in the Florida panhandle. Communication with our clients is part of our practice, but are we communicating effectively? I would like to share tips for effective stakeholder communication.

Rad Resource: Stakeholders are not just those who contract our services, but may also include those affected by the program. This may depend on their relationship to and interest in the program. Explore the University of Kansas Community Toolbox checklist for identifying stakeholders.

Hot Tips:

  • What to communicate and why: Effective communication is not just about the technology we use, but about its purpose. I have emailed written reports and presented PowerPoint slides to communicate findings. While these are commonly used tools, they are not always effective for every stakeholder. Understand the type of information stakeholders want and how they prefer to receive it. It may be text, numbers, graphics (charts, tables), visuals, or a combination. If your stakeholders are in a different area, a web conferencing tool such as Zoom or WebEx is a great interactive way to communicate. These tools also allow stakeholders to ask questions and receive immediate answers, and give you the opportunity to observe stakeholder reactions.
  • When to communicate: Effective communication begins with the initial meeting. Establish a clear outline of the stakeholders’ purpose, questions, timelines, and communication processes. Communicate throughout the project to ensure nothing has changed. Engage stakeholders in decision-making. Inform the stakeholders of progress. BetterEvaluation.org offers some great tips, tools, and methods for communicating findings to stakeholders after the evaluation is completed.
  • Considerations: Some evaluators invite stakeholders to review a draft report as part of their communicating and reporting strategy. Before engaging in this practice, consider the costs and ethical implications of accepting a stakeholder’s revisions to a draft evaluation report.
  • Communicating findings: Share the procedures and lessons learned. Know your stakeholders to convey information effectively. Define terminology. Avoid using jargon. Demonstrate results and accountability. Focus on success and improvement. Outline changes to the program to improve outcomes.

Lessons Learned:

On my first program evaluation, I failed to establish communication guidelines with the primary stakeholder. During an eight-week parent education program, the stakeholder changed the assessment instrument based on responses to the pretest. Needless to say, we had to complete more than one cycle of the program to establish a baseline for comparison. Let your stakeholders know communication is a collaborative process. Inform them about the type of information you need and the steps of the evaluation process.

The American Evaluation Association is celebrating Southeast Evaluation Association (SEA) Affiliate Professional Development Week with our colleagues in the SEA Affiliate. The contributions all this week to aea365 come from SEA Affiliate members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Ryan Evans and I’m a research associate at Wilder Research, a nonprofit research firm based in Minnesota. At Wilder Research, I work primarily with small- and medium-sized nonprofits in the Twin Cities. When working with smaller clients, it is paramount to deeply involve them in planning and doing the evaluation to ensure that the results are as useful as possible for them.

Lesson Learned: When I started my career as an evaluation consultant, I designed cookie-cutter evaluations: a survey, some focus groups, or both in a mixed-methods design, culminating in a report. I’ve learned that cookie-cutter evaluations are often not responsive enough to the context and changing circumstances of small nonprofits to provide useful results. I have evolved my consulting style to deeply involve my clients in my evaluation work, increasing the likelihood that they can use the results to strategically guide their organization.

Hot Tip: Use an iterative approach. When working on evaluation projects, I will modify my project plan to respond to new ideas that arise from planning and doing the evaluation. I repeatedly ask myself and my client, “Is this work meeting our learning goals? Will this work be useful for improving the program and increasing its reach and sustainability? What might be more useful?” For one of my projects, I had completed half of the planned interviews. When talking with my client about the findings so far and how they related to the project’s learning goals, we decided I should also observe their programming – so we canceled the remaining interviews and I observed the program instead.

Cool Trick: To expedite the iteration process, give clients something concrete and fairly detailed to respond to – a draft infographic, for example – as early as possible. I spend a relatively small amount of time developing initial drafts so that I receive feedback from my clients quickly. This speeds up the process immensely (compared to waiting until I feel I have developed something “just right”).

Hot Tip: Build on the expertise of your clients. I am working with a theater organization and recently proposed doing a student perception survey. They didn’t like the idea of doing a written survey because it wouldn’t utilize their expertise or preferred approach as theater artists. Instead, we designed a “talking survey” that they facilitated with their students. I designed the survey and took notes as they talked through the questions with their students and interactively obtained the data we wanted.

Rad Resources: From my informal research, the consulting field calls this consulting style “process consulting” or “emergent consulting.” Here’s a link to a research-based blog post about consulting styles, including process and emergent styles.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello! We are Kellie Hall, from the National Association of County & City Health Officials (NACCHO), and Emmalou Norland, from Cedarloch Research, LLC. We have worked in concert as internal and external evaluators on public health programs. During our time together, evaluation has been growing in popularity within the non-profit sector—and with that, so has the need to engage stakeholder groups from multiple levels (i.e., top executives, program managers, and front-line staff).

Lessons Learned: The importance of stakeholder engagement during evaluation—particularly as a critical component in ensuring the evaluation meets the utility standard—is well known in the field. As familiar as the concept is, however, the complex nature of engaging stakeholders in appropriate ways can be a perplexing challenge. For example, when federal funding dictates not only that a program evaluation must be done but also specifies its design, engaging stakeholders in the planning phase can seem superfluous. Furthermore, stakeholder engagement sessions typically focus on the why behind engagement, rather than the how of engagement with those of varying authoritative powers, divergent priorities, and competing needs. Understanding these contextual factors is crucial to engaging various levels of stakeholders.

Hot Tip: Engage stakeholders in the process of determining how to engage stakeholders!
Many evaluators begin their stakeholder engagement by creating a Stakeholder Engagement Plan. Instead, start one step earlier.

One way to do this is to gather your stakeholders together for a “hack-a-thon,” a process that comes from the technology field and is focused on collaborative problem solving. This highly interactive meeting starts with your stakeholders and ends with solutions tailored to address their needs. During a “hack-a-thon,” each stakeholder group works through the following stages together:

  1. Empathizing with another stakeholder group
  2. Defining a focused need for that other stakeholder group
  3. Ideating solutions to address that need
  4. Deciding on the most effective solution

(Check out an example hack-a-thon setup, including handouts, here.)

Then, you can use the results developed by the stakeholders themselves to create a “Stakeholder Profile” for each group, documenting their power, values, priorities, and engagement needs. This is now the beginning of your Stakeholder Engagement Plan!
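As a rough illustration of what such a profile might capture, here is a minimal sketch; the fields mirror the post’s list (power, values, priorities, engagement needs), and all example values are invented rather than drawn from an actual hack-a-thon.

```python
# Hypothetical "Stakeholder Profile" record, one per stakeholder group.
# Field names mirror the post's list; all example values are invented.
from dataclasses import dataclass, field

@dataclass
class StakeholderProfile:
    group: str                                    # e.g., "front-line staff"
    power: str                                    # authority over the program
    values: list[str] = field(default_factory=list)
    priorities: list[str] = field(default_factory=list)
    engagement_needs: list[str] = field(default_factory=list)

profiles = [
    StakeholderProfile(
        group="program managers",
        power="approves changes to data collection",
        values=["staff workload", "program fidelity"],
        priorities=["timely interim findings"],
        engagement_needs=["brief monthly check-ins"],
    ),
]

for p in profiles:
    print(f"{p.group}: engage via {', '.join(p.engagement_needs)}")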

Rad Resources: Some great stakeholder planning resources that I’ve referenced in my work include:

If you have a useful stakeholder engagement resource, please share in the comments below.

The American Evaluation Association is celebrating Labor Day Week in Evaluation: Honoring the WORK of evaluation. The contributions this week are tributes to the behind the scenes and often underappreciated work evaluators do. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Happy Saturday, folks!  I’m Liz Zadnik, aea365’s Outreach Coordinator.  I live in the Mid-Atlantic region of the country and was snowed in a few weeks ago.  The storm wasn’t as bad as it could have been (for us…thankfully), but I had a chance to spend some time catching up on my reading resolution.  

Rad Resource: First off, I need to again express my appreciation for AEA’s member access to journals and publications from the field. I love New Directions for Evaluation and was excited to see “Planning and Facilitating Working Sessions with Evaluation Stakeholders.” Part of my “day job” is engaging stakeholders in conversations about nuanced topics and complex issues. The inclusion of a case example helped me operationalize concepts and gave me some great ideas for my own practice.


Lessons Learned: A big factor in a successful group project is navigating potential issues or influences within the group of stakeholders. This includes both investigating the attitudes and dynamics of group members and examining your own biases as the facilitator. The article encourages evaluators to learn about possible political, historical, and/or social contexts that may prevent or hinder group cohesiveness and trust. Is it (in)appropriate to bring everyone together initially? Or do distinct groups need to be engaged before a collective can be established?

There’s also a great table of skills and questions for facilitators; each topic has examples and items to explore. What caught my eye, most likely because it’s something that has tripped me up personally in the past, was a set of questions about previous group facilitation experience. It’s challenging not to bring past experiences with you to the present, but a lack of patience or a quickness to make assumptions about dynamics and process can really impede creativity, innovation, and thoughtful problem-solving.

I also loved how the author outlines thoughtful considerations and steps for facilitating and operationalizes those considerations with a case example, particularly in the description of the debrief. I am a huge fan of self-reflection and really appreciated its inclusion within the facilitation process.

I would definitely recommend the article to anyone who wants to up their facilitation game and is looking for guidance on how best to engage project stakeholders!   

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi! We are Laura Beals, Director, and Barbara Perry, Evaluation Manager, of the Department of Evaluation and Learning at Jewish Family and Children’s Service Boston, a multi-service nonprofit in Massachusetts. At Eval 2015, we learned about “Data Placemats” from Veena Pankaj of the Innovation Network. Recently, we held several placemat-focused “Learning Conversations” with one of our multi-program divisions. We created seven placemats for these meetings:

  1. An overview of the Learning Conversation and placemat process.
  2. Client census—new and active—over the past four years for each program.
  3. Client demographics by program.
  4. Client geographic distribution heat map. This placemat was interactive, using Tableau. We wanted not only to show the geographic distribution of clients in Massachusetts, but also to provide an opportunity to explore the data further, through the use of filters for program and key demographics.
  5. A network analysis showing referral sources (see the sketch after this list).
  6. A network analysis showing how clients were served by multiple programs at the agency.
  7. A learning and dissemination plan. This placemat encouraged meeting participants to use the data and allowed our team to create specific follow-up documents and undertake follow-up analysis.
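For readers curious about the network-analysis placemats (items 5 and 6), here is a minimal sketch of the underlying idea using the networkx library; the referral sources, programs, and counts are invented for illustration.

```python
# Minimal sketch of a referral-source network like placemat 5: a directed
# graph from referral sources to programs. All names and counts are invented.
import networkx as nx

G = nx.DiGraph()
referrals = [  # (referral source, program, number of referrals)
    ("school district", "family counseling", 24),
    ("primary care clinic", "family counseling", 11),
    ("primary care clinic", "older-adult services", 17),
    ("self-referral", "older-adult services", 8),
]
for source, program, count in referrals:
    G.add_edge(source, program, weight=count)

# Which sources drive the most referrals overall?
totals = {
    n: sum(d["weight"] for _, _, d in G.out_edges(n, data=True))
    for n in G if G.out_degree(n) > 0
}
for source, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{source}: {total} referrals")
```

The same weighted edge list can be handed to a graph-drawing tool to produce the placemat visual itself.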

Lessons Learned:

  • During the planning stages, check-in with stakeholders from around the organization. We asked the program director, division director, grant writers, and development associates what they wanted to learn about the division. Their responses allowed us to tailor the placemats to be as useful to as many people as possible.
  • Don’t forget to include the staff! To share the placemats and get feedback from direct-service staff, we held a shorter placemat discussion at an all-staff meeting, focusing on two placemats and providing the others for later review. We also hung the placemats near the staff offices and provided sticky notes for feedback and observations.
  • Be ready to “go on the road” with your placemats. We found that word spread about our placemats, and there was interest from various stakeholders who had not been able to be part of the original few meetings. By continuing the conversations, we were able to increase learning and generate new ideas.
  • Bring data chocolates! We had been waiting for an opportunity to create data chocolates, after being inspired by Susan Kistler. We wrapped shrunken versions of several of the graphs around chocolates. They put everyone in a good mood to talk data; the lightheartedness of our gesture helped break down barriers and was a great conversation starter.

Rad Resources:

The American Evaluation Association is celebrating Internal Evaluation (IE) Topical Interest Group Week. The contributions all this week to aea365 come from our IE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Dr. Moya Alfonso, MSPH, and I’m an Associate Professor at the Jiann-Ping Hsu College of Public Health at Georgia Southern University, as well as University Sector Representative and Board Member for the Southeast Evaluation Association (SEA). I would like to offer a few tips on engaging stakeholders in participatory evaluation, based on my 16 years of experience engaging stakeholders in community health research and evaluation.

Participatory evaluation is an approach that engages stakeholders in each step of the process. Rather than the trained evaluator solely directing the evaluation, participatory evaluation requires a collaborative approach. Evaluators work alongside stakeholders in developing research questions, deciding upon an evaluation design, designing instruments, selecting methods, gathering and analyzing data, and disseminating results. Participatory evaluation results in stronger evaluation designs and greater external validity because community members have a high level of input throughout the entire process. It also strengthens buy-in to the results and increases use of the evaluation products.

Rad Resource: Explore the University of Kansas Community Tool Box for introductory information on participatory evaluation.

Hot Tips: Here are a few tips for engaging stakeholders:

  • Establish a diverse stakeholder advisory group: Community stakeholders have a range of skills that can contribute to the evaluation process. For example, I worked with 8th grade youth on a participatory research project and assumed that I would need to conduct the statistical analysis of survey data.  To my surprise, one of the youths had considerable expertise and was able to conduct the analysis with little assistance. With training and support, community stakeholders can contribute and exceed your expectations.
  • Keep stakeholders busy: A common problem in working with advisory groups is attrition. Keep community stakeholders engaged with evaluation tasks that use their unique skill sets. Matching assignments to existing skill sets empowers community stakeholders and results in increased buy-in and engagement.
  • Celebrate successes: Celebrating successes over the course of the evaluation is a proven strategy for keeping stakeholders engaged. Rather than waiting until the end of the evaluation, reward stakeholders regularly for the completion of evaluation steps.
  • Keep your ego in check: Some highly trained evaluators might find handing over the reins to community stakeholders challenging because they’re used to running the show. Participatory evaluation requires evaluators to share control and collaborate with community stakeholders. Try to keep an open mind and trust in the abilities of community stakeholders to participate in the evaluation process with your support and guidance.  You’ll be amazed at what you can achieve when stakeholders are fully engaged in evaluation research! 

The American Evaluation Association is celebrating Southeast Evaluation Association (SEA) Affiliate Week with our colleagues in the SEA Affiliate. The contributions all this week to aea365 come from SEA Affiliate members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
