AEA365 | A Tip-a-Day by and for Evaluators

TAG | participatory evaluation

Greetings AEA365 bloggers! Corey Newhouse and Stephanie Kong here from Public Profit – a consulting firm helping mission-driven organizations measure and manage what matters. If you are like us, you often need to facilitate meaning-making sessions with stakeholders at all levels. Programs are being asked to use more and more data, but it can be overwhelming and confusing to know where to start – or where you should start if you are the one who is supposed to be supporting them.

Luckily we’ve taken some of the legwork out of this process, and have put together a set of field-tested activities that work in a variety of programs and contexts. Our guide, Dabbling in the Data, provides step-by-step guidance on 15 different approaches to participatory activities that cover distribution, change over time, contribution, categories, and communicating findings. Our hope is that this guide will help you clear the hurdle of participatory methods, allowing you to engage others in meaningful, fun conversations about data.

Lessons Learned:

  • Participatory data methods are difficult! Simply knowing best practices doesn’t guarantee success. In putting together this guide, we started from common facilitation methods, and then adapted them for participatory data sessions.
  • Programs often have the data they need to make well-informed decisions, but don’t know what to do with it. Likewise, teams want to interact with their data, but don’t have the time to figure out where to begin – so they stop using data effectively.
  • Without a structured approach to interpreting the data, often the first person to speak up will set the agenda for the entire group. Meaningful insights are easily overlooked when this happens. The approaches in this guide have been set up to help drive engagement from multiple stakeholders.

Rad Resources:

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello, we are Greg Lestikow, CEO and Fatima Frank, Project Manager of evalû, a small consulting firm that focuses exclusively on rigorous evaluations of social and economic development initiatives.  We champion impact evaluation that maintains academic rigor but is based entirely on our clients’ need to improve strategic and operational effectiveness and increase profitability.

In a recent project, we were tasked with designing a qualitative instrument to complement quantitative data around the sensitive topic of gender-based violence.

Rad Resource: We approached this challenge by designing a focus group discussion (FGD) protocol informed by an article on the “Participatory Ranking Method” (PRM), in which participants rank potential indicators from most to least important. PRM acknowledges project beneficiaries as experts and recognizes the local community as capable of identifying and measuring their progress towards positive change. As such, PRM incorporates local perspectives in the construction of research instruments. By using PRM, we were able to select indicators that are meaningful to the project’s local beneficiaries (in our case adolescent girls affected by violence) and reflective of the concepts they find useful when tracking their own progress. PRM is an ideal evaluation methodology for measuring awareness of sensitive topics and tracking outcomes over time, particularly for projects that may not see any kind of impact in the short or medium term.

Hot Tips:

  • Start with a participatory activity to gauge local perspectives and to understand which social practices are considered more or less acceptable in the community. In our case, we asked participants what gender-based violence meant to them.
  • To facilitate ranking, show a series of cards labeled with different kinds of social practices (in our case: Shout, Insult, Threaten, Push, Hit, Beat, Kill) and have participants order them from the most to the least acceptable, asking them to explain their decisions. Alternatively, participants can free-list social practices that are common in their communities and then rank-order them.
  • Include an open-ended discussion to understand which social practices are acceptable in different relational and social contexts.
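After the session, the card rankings can be aggregated across participants with a simple mean-rank tally. Here is a minimal sketch in Python; the rankings are hypothetical, and PRM itself does not prescribe a particular scoring rule, so mean rank is just one reasonable choice:

```python
from statistics import mean

# Each list is one participant's ordering, from most to least acceptable.
# (Hypothetical example data for illustration only.)
rankings = [
    ["Shout", "Insult", "Threaten", "Push", "Hit", "Beat", "Kill"],
    ["Insult", "Shout", "Threaten", "Push", "Hit", "Beat", "Kill"],
    ["Shout", "Threaten", "Insult", "Push", "Hit", "Beat", "Kill"],
]

practices = rankings[0]

# Mean rank per practice: a lower value means the group, on average,
# placed that practice closer to the "most acceptable" end.
mean_rank = {
    p: mean(r.index(p) + 1 for r in rankings) for p in practices
}

for p in sorted(mean_rank, key=mean_rank.get):
    print(f"{p}: {mean_rank[p]:.2f}")
```

The qualitative explanations participants give for their orderings remain the heart of the method; a tally like this only helps compare patterns across groups.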

Lessons Learned:

  • Make sure the moderator and note-taker are gender-appropriate for the group.
  • If you want to obtain a broad range of perspectives but anticipate potential problems with mixing certain community members in the same FGD, create a few FGD groups and separate participants.
  • Ask local evaluation or project teams about any other cultural practices to consider before an FGD. For example, in Sierra Leone we started each FGD with a prayer, as this is a standard practice when people meet.

Please share your stories on challenges, solutions, and experiences in dealing with sensitive topics by leaving a comment here or contacting us.

The American Evaluation Association is celebrating Best of aea365, an occasional series. The contributions for Best of aea365 are reposts of great blog articles from our earlier years.

Hello! My name is Miki Tsukamoto and I am a Monitoring and Evaluation Coordinator at the International Federation of Red Cross and Red Crescent Societies (IFRC).

What if video could be used as the “spark” to increase the engagement and interest of communities in your programmes?

Recently, I had an opportunity to be part of a participatory video evaluation (PVE) team for the Global Framework for Climate Services’ programme, which aimed to deliver and apply “…salient, credible and actionable climate services towards improved health and food security in Malawi and Tanzania.” To ensure better use and acceptance of this PVE for future programming, IFRC piloted the Most Significant Change (MSC) technique[1], using the OECD/DAC criteria of relevance/appropriateness, effectiveness, coverage, sustainability and impact as themes for group discussions. Here are some of the lessons learnt:

Rad Resources: PVE videos were made at the community level, the country level and the multi-regional level.

Country level PVEs:

(https://www.youtube.com/watch?v=fSXj0IllfvQ&index=3&list=PLrI6tpZ6pQmQZsKuQl6n4ELEdiVqHGFw2)

(https://www.youtube.com/watch?v=mFWCOyIb9mU&index=4&list=PLrI6tpZ6pQmQZsKuQl6n4ELEdiVqHGFw2)

Multi-country PVE:

(https://www.youtube.com/watch?v=HzbcIZbQYbs&index=2&list=PLrI6tpZ6pQmQZsKuQl6n4ELEdiVqHGFw2)

A Red Cross Red Crescent Guide to Community Engagement and Accountability (CEA)

Guide to the “Most Significant Change” Technique by Rick Davies and Jess Dart

[1] http://www.mande.co.uk/docs/MSCGuide.pdf

We are Akashi Kaul, third-year graduate student at George Mason University, and Rodney Hopson, former AEA president and professor at George Mason University. Our reflection for this Memorial Day series is on what “participation” means. We highlight three things: (1) the ambiguity around ‘participation,’ since it exists in evaluation as both theory and method; (2) the need for discussing power when talking about participation in evaluation; and (3) the need to refer to intersectional literature when referring to these concepts.

Participation is the latest buzzword in evaluation – from impact assessment to democratic evaluation, there has been a growing focus on this word. Cousins and Whitmore (1998) distinguished “transformative participatory evaluation” from “practical participatory evaluation.” Yet, there remains ambiguity about the ‘why,’ ‘who,’ ‘how,’ ‘what’ and ‘for whom’ of ‘participation’. For starters, the fact that participation is used in evaluation as both a method and a theory renders the division between the ‘transformative’ and ‘practical’ paradigms a little perfunctory, since not all evaluation processes that employ ‘participation’ use ‘participatory evaluation’ theory. Further, the primary distinction attributed to practical participatory evaluation – that it ‘aims to increase the use of evaluation results through the involvement of intended users’ (Smits & Champagne, 2008) – describes a goal that is necessary for the transformative paradigm too. Finally, there is much to be said about whether participation is a means or an end in and of itself, and how that impacts evaluation.

Then there is the finding that participation is still an evaluator-driven process (Cullen, Coryn & Rugh, 2011), sometimes excluding the spirit of ‘participation’ entirely. Recent writings on culturally responsive evaluation (Hood, Hopson & Frierson, 2005), a process that innately includes the participation of all stakeholders, raise questions about the role of culture in understanding variations in participation (also see Chouinard and Hopson (2016) on how ‘participation’ is used as a proxy for culture).

The larger questions with respect to participation in evaluation are around power, voice, and the identification of ‘stakeholders’. That evaluation is a political process, conducted in political environs with political ramifications, is articulated often enough. However, such discussions around power are both general and sparse. Evaluation can learn from other disciplines about power and participation.

Rad Resource: Planning studies, for example, use Arnstein’s ladder of citizen participation, which could easily span the range from ‘practical’ to ‘transformative’ – and which would place much of the ‘practical’ end at non-participation or tokenism.

Rad Resources: Power is discussed and argued about in literature from Marx to Gramsci to Foucault to Fanon to Bourdieu – thinkers we rarely draw on in evaluation.

Tough questions: Is power limited to capital, i.e. donors, or is it ubiquitous à la Foucault? Is it cultural capital that counts, or the pervasiveness of colonial/postcolonial/neocolonial thought? These are tough questions that evaluation, in the United States and abroad, needs to consider going forward.

The American Evaluation Association is celebrating Memorial Week in Evaluation. The contributions this week are remembrances of evaluation concepts, terms, or approaches.


My name is Dr. Moya Alfonso, MSPH, and I am an Associate Professor at the Jiann-Ping Hsu College of Public Health at Georgia Southern University, as well as University Sector Representative and Board Member for the Southeast Evaluation Association (SEA). I would like to offer a few tips on engaging stakeholders in participatory evaluation, based on my 16 years of experience in community health research and evaluation.

Participatory evaluation is an approach that engages stakeholders in each step of the process. Rather than the trained evaluator solely directing the evaluation, participatory evaluation requires a collaborative approach. Evaluators work alongside stakeholders in developing research questions, deciding upon an evaluation design, designing instruments, selecting methods, gathering and analyzing data, and disseminating results. Participatory evaluation results in stronger evaluation designs and greater external validity because community members have a high level of input throughout the entire process. It also strengthens buy-in to the results and promotes greater use of the evaluation products.

Rad Resource: Explore the University of Kansas Community Tool Box for introductory information on participatory evaluation.

Hot Tips: Here are a few tips for engaging stakeholders:

  • Establish a diverse stakeholder advisory group: Community stakeholders have a range of skills that can contribute to the evaluation process. For example, I worked with 8th grade youth on a participatory research project and assumed that I would need to conduct the statistical analysis of survey data.  To my surprise, one of the youths had considerable expertise and was able to conduct the analysis with little assistance. With training and support, community stakeholders can contribute and exceed your expectations.
  • Keep stakeholders busy: A common problem in working with advisory groups is attrition. Keep community stakeholders engaged with evaluation tasks that use their unique skill sets. Matching assignments to existing skill sets empowers community stakeholders and results in increased buy-in and engagement.
  • Celebrate successes: Celebrating successes over the course of the evaluation is a proven strategy for keeping stakeholders engaged. Rather than waiting until the end of the evaluation, reward stakeholders regularly for the completion of evaluation steps.
  • Keep your ego in check: Some highly trained evaluators might find handing over the reins to community stakeholders challenging because they’re used to running the show. Participatory evaluation requires evaluators to share control and collaborate with community stakeholders. Try to keep an open mind and trust in the abilities of community stakeholders to participate in the evaluation process with your support and guidance.  You’ll be amazed at what you can achieve when stakeholders are fully engaged in evaluation research! 

The American Evaluation Association is celebrating Southeast Evaluation Association (SEA) Affiliate Week with our colleagues in the SEA Affiliate. The contributions all this week to aea365 come from SEA Affiliate members.

I caught up with Laurie Stevahn after her full-day workshop, Strategies for Interactive Evaluation, which she presented with Jean King to an engaged and eager group of evaluators from a diverse range of cultures and work settings.

Why did you choose this topic for Evaluation 2016?

Laurie has spent 20-25 years working in participatory evaluation for capacity building and thinks evaluatively about bringing people together. This workshop draws on the strategies and methods covered in Interactive Evaluation Practice: Mastering the Interpersonal Dynamics of Program Evaluation, which she co-authored with Jean King. No matter your approach to evaluation, you have to engage and interact with people. This workshop helps attendees identify the approaches best suited to their organizations or programs, and attendees think through the strategies so that when they walk out the door, they have a plan in place.

Did you learn anything from others’ experiences?

The attendees in Laurie’s workshop were engaged and eager to share experiences and examples from their own practice. Sometimes an experience was a variation on the strategies presented, and sometimes attendees brought their own challenges. In today’s workshop, Jean and Laurie spent time discussing the challenge of dominant personalities and how to handle these situations while keeping the conversation inclusive.

What are you looking forward to at the conference?

While in Atlanta, Laurie is excited to attend sessions that deal with participatory evaluation and offer lots of engagement. She enjoys sessions that share the most recent research on evaluation, and she is particularly looking forward to seeing Mel Mark and Michael Quinn Patton, great theorists who stay on top of research on evaluation.

How do you benefit from AEA?

Laurie has been an AEA Member since 1995. She always finds AEA members friendly and ready to share. The key leaders in evaluations make themselves readily available and accessible to all attendees. She finds it humbling to be a part of an organization and community that focuses so much on the greater good for humanity and community.


My name is Julie Poncelet and I am co-founder of Action Evaluation Collaborative (AEC), a partnership of independent consultants who use evaluation to strengthen social change. We want to share some of the participatory methods we use to engage youth in evaluations.

By participatory methods, we mean ways of working together to gather insights, such that everyone, including program participants, is involved and plays an active and influential part in decisions that affect their lives. The methods include a variety of tools and exercises to engage youth in meaningful, age-appropriate ways.

Last year, AEC blogged about the use of Participatory Video (PV) with a youth group in Mexico. Although we are not always able to engage youth in PV for evaluation, we do video-document (with permission) all of our participatory engagement work so that the voices and experiences of youth can be brought into the collective analysis and sensemaking process. We also encourage youth to participate directly in these processes so that insights can be contextualized and the dialogue enriched. However, the power dynamics between youth and adults should be considered, especially when working internationally.

Hot Tip: In our evaluation work, we use a range of visioning tools, adapting activities like force field analysis, fish and boulders (which we called butterflies and stones) or road journey to explore girls’ individual and collective aspirations. These tools help us, as well as the girls and program implementers, to better understand the current status of girls in the community and conditions or forces that support them to achieve their vision or hold them back. A little bit of paper, some cut out butterflies, and scented markers can create a fun, energizing space for girls to safely explore and share their insights.

  • One key lesson learned from doing this type of visioning work with girls from marginalized communities is that many have not had the opportunity to explore their aspirations and so it may be challenging for them at first to engage with the activity. We have found that conversation, especially around what girls like (or don’t like), what they are curious about, and who they look up to, can help girls to reflect or think more deeply about their dreams.

Rad Resources: We have used Liberating Structures in some of our evaluations with older youth and have found the following to provide an energizing structure for quickly generating and sharing ideas: 1-2-4-All and Min Specs. We have also used storyboarding (the Liberating Structures version and the one from Dr. Kim Sabo Flores’ book) and other drawing activities to engage youth in sharing experiences, feelings and attitudes.

The American Evaluation Association is celebrating Youth Focused Evaluation (YFE) TIG Week with our colleagues in the YFE AEA Topical Interest Group. The contributions all this week to aea365 come from our YFE TIG members.

My name is Ann Zukoski and I am a Senior Research Associate at Rainbow Research, Inc. in Minneapolis, Minnesota.

Founded as a nonprofit organization in 1974, Rainbow Research’s mission is to improve the effectiveness of socially-concerned organizations through capacity building, research, and evaluation. Rainbow Research is known for its focus on using participatory evaluation approaches.

Through my work, I am always looking for new tools and approaches to engage stakeholders throughout the evaluation process. Today, I am sharing two methods that I have found helpful.

Rad Resource:

Ripple Effect Mapping (REM) is an effective method for having a large group of stakeholders identify the intended and unintended impacts of projects. In REM, stakeholders use elements of Appreciative Inquiry, mind mapping, and qualitative data analysis to reflect upon and visually map the intended and unintended changes produced by a complex program or collaboration. It is a powerful technique for documenting impacts and engaging stakeholders. Rainbow Research is currently collaborating with Scott Chazdon at the University of Minnesota to use this method to evaluate the impact of a community health program by conducting REM at two points in time: at the beginning and end of a three-year project. Want to learn more? See http://evaluation.umn.edu/wp-content/uploads/Ripple-Effect-Mapping-MESI13-spring-training-march-2013_KA-20130305.pdf

Hot Tip:

The Art of Hosting (AoH) is a set of facilitation methods evaluators can use to engage stakeholders and create discussions that count. AoH helps groups of any size harness their collective wisdom and self-organizing capacity, using conversational processes to invite people to step in and become fully engaged in the task at hand. This working practice can help groups make decisions, build their capacity, and find new ways to respond to opportunities, challenges, and change. For more information see http://www.artofhosting.org/what-is-aoh/

Have you used these tools? Let us all know your thoughts!

Hi Eval Friends! We are Kerry Zaleski and Mary Crave of the University of Wisconsin-Extension and Tererai Trent of Tinogona Foundation and Drexel University. Over the past few years we have co-facilitated workshops on participatory M&E methods for centering vulnerable voices at AEA conferences and eStudies.

This year, we are pleased to introduce participatory processes for engaging young people in evaluation during a half day professional development workshop, borrowing from Child-to-Child approaches. Young people can be active change agents when involved in processes to identify needs, develop solutions and monitor and evaluate changes in attitudes and behaviors for improved health and well-being.

Child-to-Child approaches help center evaluation criteria around the values and perspectives of young people, creating environments for continual learning among peers and families. Children learn new academic skills and evaluative thinking while having fun solving community problems!

Child-to-Child approaches help young people lead their communities to:

  • Investigate, plan, monitor and evaluate community programs by centering the values and perspective of people affected most by poverty and inequality.
  • Overcome stigma and discrimination by intentionally engaging marginalized people in evaluation processes.

We are excited to introduce Abdul Thoronka, a community health specialist from Sierra Leone, as a new member of our team. Abdul has extensive experience using participatory methods and Child-to-Child approaches in conflict- and trauma-affected communities in Africa and the US.

Lessons Learned:

  • Adult community members tend to be less skeptical and more engaged when ‘investigation’ types of exercises are led by children in their community rather than external ‘experts’. The exercises make learning about positive behavior change fun and entertaining for the entire community.
  • Young people are not afraid to ‘tell the truth’ about what they observe.
  • Exercises to monitor behaviors often turn into a healthy competition between young people and their families.

Hot Tips:

  • Child-to-child approaches can be used to engage young people at all stages of an intervention. Tools can include various forms of community mapping, ranking, prioritizing, values-based criteria-setting and establishing a baseline to measure change before and after an intervention.
  • Build in educational curricula by having the children draw a matrix, calculate percentages or develop a bar chart to compare amounts or frequency by different characteristics.
  • Explain the importance of disaggregating data to understand health and other disparities by different attributes (e.g., gender, age, ability, race, ethnicity).
  • Ask children to think of evaluation questions that would help them better understand their situation.
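The tally-and-percentage step that children work through by hand on a matrix can also be mirrored in a few lines of code once the tallies are collected. A minimal sketch, using made-up hand-washing records (the variable names and data are illustrative only, not from an actual Child-to-Child exercise):

```python
from collections import Counter

# Hypothetical records collected by child investigators:
# (group attribute, observed the behavior today?)
records = [
    ("girl", True), ("girl", True), ("girl", False),
    ("boy", True), ("boy", False), ("boy", False), ("boy", True),
]

# Disaggregate: total observations and "yes" counts per group.
totals = Counter(group for group, _ in records)
yes = Counter(group for group, observed in records if observed)

for group in sorted(totals):
    pct = 100 * yes[group] / totals[group]
    print(f"{group}: {yes[group]}/{totals[group]} = {pct:.0f}%")
```

Comparing the per-group percentages is exactly the disparity discussion the bullet above suggests, just computed rather than charted by hand.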

Rad Resources:

Child-to-Child Trust

The Barefoot Guide Connection

AEA Coffee Break Webinar 166: Pocket-Chart Voting-Engaging vulnerable voices in program evaluation with Kerry Zaleski, December 12, 2013 (recording available free to AEA members).

Robert Chambers’ 2002 book, Participatory Workshops: A Sourcebook of 21 Sets of Ideas and Activities.

Want to learn more? Register for Whose Judgment Matters Most: Using Child-to-Child approaches to evaluate vulnerability-centered programs at Evaluation 2014.

We’re featuring posts by people who will be presenting Professional Development workshops at Evaluation 2014 in Denver, CO.

Greetings from Mary Crave and Kerry Zaleski of the University of Wisconsin-Extension and Tererai Trent of Tinogona Foundation and Drexel University. For the past few years we’ve teamed up to teach participatory methods for engaging vulnerable and historically under-represented persons in monitoring and evaluation. We’ve taught hands-on professional development workshops at AEA conferences, eStudies, and Coffee Breaks. “Visionary Evaluation for a Sustainable, Equitable Future” is not only the theme for Evaluation 2014; it is a succinct description of why we believe so strongly in what we teach.

Lessons Learned: We’ve noticed during our trainings around the world that there is a continuum of what an evaluator might consider to be “participatory.” Being aware of our own position and philosophy of participatory methods is especially critical when working with persons who traditionally may have been excluded from participation due to income, location, gender, ethnicity, or disability. Trent suggests these lenses or levels, from low to high: Spectator Participation > Tokenism Participation > Incentive Participation > Functional Participation > Ownership Participation. The greater the ownership, or the higher the level of participation, the more impact a program will have on social justice issues and sustainable, equitable futures for people. Those who want their methods lens to focus on “ownership participation” sometimes have trouble reaching that aim because they have a small tool box or get stuck using the wrong tool at a particular point in the program cycle. Rubrics for success often leave out the voices of the vulnerable, though those voices can also be included using participatory tools.

Hot Tips:

  • There are M & E tools especially suited for working with vulnerable persons that allow all voices to be heard, that do not depend on literacy skills, that consider cultural practices and power relationships in decision making and discussion, and that engage program beneficiaries in determining rubrics for success. These tools can be used in the planning, monitoring, data collection, analysis, and reporting stages of the program cycle.
  • You can expand your tool box of methods, and widen your lens on participatory methods, at our 2-day workshop at AEA 2014, Reality Counts (Workshop #6). We’ll be joined by Abdul Thoronka, an international community health specialist and manager of a community organization that works with persons with disabilities.

Rad Resources:

  • Robert Chambers’ 2002 book, Participatory Workshops: A Sourcebook of 21 Sets of Ideas and Activities.
  • Food and Agriculture Organization (FAO) of the UN: click on publications and type PLA in the search menu.
  • AEA Coffee Break Webinar 166: Pocket-Chart Voting – Engaging vulnerable voices in program evaluation with Kerry Zaleski, December 12, 2013 (recording available free to AEA members).

Want to learn more? Register for Reality Counts: Participatory methods for engaging marginalized and under-represented persons in M&E at Evaluation 2014.
