AEA365 | A Tip-a-Day by and for Evaluators

TAG | participatory evaluation

Hello! My name is Miki Tsukamoto and I am a Monitoring and Evaluation Coordinator at the International Federation of Red Cross and Red Crescent Societies (IFRC).

What if video could be used as the “spark” to increase the engagement and interest of communities in your programmes?

Recently, I had an opportunity to be part of a participatory video evaluation (PVE) team for the Global Framework for Climate Services’ programme, which aimed to deliver and apply “…salient, credible and actionable climate services towards improved health and food security in Malawi and Tanzania.” To ensure better use and acceptance of this PVE for future programming, IFRC piloted the Most Significant Change (MSC) technique[1], using the OECD/DAC criteria of relevance/appropriateness, effectiveness, coverage, sustainability and impact as themes for group discussions.

Lessons learned:

Rad Resources: PVE videos were made at the community level, the country level and the multi-regional level.

Country level PVEs:

(https://www.youtube.com/watch?v=fSXj0IllfvQ&index=3&list=PLrI6tpZ6pQmQZsKuQl6n4ELEdiVqHGFw2)

(https://www.youtube.com/watch?v=mFWCOyIb9mU&index=4&list=PLrI6tpZ6pQmQZsKuQl6n4ELEdiVqHGFw2)

Multi-country PVE:

(https://www.youtube.com/watch?v=HzbcIZbQYbs&index=2&list=PLrI6tpZ6pQmQZsKuQl6n4ELEdiVqHGFw2)

A Red Cross Red Crescent Guide to Community Engagement and Accountability (CEA)

Guide to the “Most Significant Change” Technique by Rick Davies and Jess Dart

[1] http://www.mande.co.uk/docs/MSCGuide.pdf

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


We are Akashi Kaul, a third-year graduate student at George Mason University, and Rodney Hopson, former AEA president and professor at George Mason University. Our reflection for this Memorial Day series is on what “participation” means. We highlight three things: (1) the ambiguity around ‘participation,’ since it exists in evaluation as both theory and method; (2) the need for discussing power when talking about participation in evaluation; and (3) the need to refer to intersectional literature when referring to these concepts.

Participation is the latest buzzword in evaluation – from impact assessment to democratic evaluation, there has been a growing focus on this word. Cousins and Whitmore (1998) distinguished “transformative participatory evaluation” from “practical participatory evaluation.” Yet, there remains ambiguity about the ‘why,’ ‘who,’ ‘how,’ ‘what’ and ‘for whom’ of ‘participation’. For starters, the fact that participation is used in evaluation as both a method and a theory renders the division between the ‘transformative’ and ‘practical’ paradigms a little perfunctory, since not all evaluation processes that employ ‘participation’ use ‘participatory evaluation’ theory. Further, the primary distinction between transformative and practical participatory evaluation – that the latter ‘aims to increase the use of evaluation results through the involvement of intended users’ (Smits & Champagne, 2008) – is one that is necessary for the former too. Finally, there is much to be said about whether participation is a means or an end in and of itself, and how that impacts evaluation.

Then there is the finding that participation is still an evaluator-driven process (Cullen, Coryn & Rugh, 2011), sometimes excluding the spirit of ‘participation’ entirely. Recent writings on culturally responsive evaluation (Hood, Hopson & Frierson, 2005), a process that innately includes the participation of all stakeholders, raise questions about the role of culture in understanding variations in participation (also see Chouinard and Hopson (2016) for how ‘participation’ is used as a proxy for culture).

The larger questions with respect to participation in evaluation are around power, voice, and the identification of ‘stakeholders’. That evaluation is a political process, conducted in political environs with political ramifications, is articulated often enough. However, such discussions around power are both general and sparse. Evaluation can learn from other disciplines about power and participation.

Rad Resource: Planning studies, for example, use Arnstein’s ladder of citizen participation, which could easily span the realm from ‘practical’ to ‘transformative’ – positioning the ‘practical’ end at non-participation or tokenism.

Rad Resources: Power is discussed and argued about in literature from Marx to Gramsci to Foucault to Fanon to Bourdieu – thinkers we rarely use in evaluation.

Tough questions: Is power limited to capital (i.e., donors), or is it ubiquitous à la Foucault? Is it cultural capital that counts, or the pervasiveness of colonial/postcolonial/neocolonial thought? These are tough questions that evaluation, in the United States and abroad, needs to consider going forward.

The American Evaluation Association is celebrating Memorial Week in Evaluation. The contributions this week are remembrances of evaluation concepts, terms, or approaches.


My name is Dr. Moya Alfonso, MSPH, and I’m an Associate Professor at the Jiann-Ping Hsu College of Public Health at Georgia Southern University, and I am University Sector Representative and Board Member for the Southeast Evaluation Association (SEA). I would like to offer you a few tips on engaging stakeholders in participatory evaluation based on my 16 years of experience engaging stakeholders in community health research and evaluation.

Participatory evaluation is an approach that engages stakeholders in each step of the process.  Rather than the trained evaluator solely directing the evaluation, participatory evaluation requires a collaborative approach.  Evaluators work alongside stakeholders in developing research questions, deciding upon an evaluation design, designing instruments, selecting methods, gathering and analyzing data, and disseminating results.  Participatory evaluation results in stronger evaluation designs and greater external validity because community members have a high level of input in the entire process.  It also strengthens buy-in to the results and promotes greater use of the evaluation products.

Rad Resource: Explore the University of Kansas Community Tool Box for introductory information on participatory evaluation.

Hot Tips: Here are a few tips for engaging stakeholders:

  • Establish a diverse stakeholder advisory group: Community stakeholders have a range of skills that can contribute to the evaluation process. For example, I worked with 8th grade youth on a participatory research project and assumed that I would need to conduct the statistical analysis of survey data.  To my surprise, one of the youths had considerable expertise and was able to conduct the analysis with little assistance. With training and support, community stakeholders can contribute and exceed your expectations.
  • Keep stakeholders busy: A common problem in working with advisory groups is attrition. Keep community stakeholders engaged with evaluation tasks that use their unique skill sets. Matching assignments to existing skill sets empowers community stakeholders and results in increased buy-in and engagement.
  • Celebrate successes: Celebrating successes over the course of the evaluation is a proven strategy for keeping stakeholders engaged. Rather than waiting until the end of the evaluation, reward stakeholders regularly for the completion of evaluation steps.
  • Keep your ego in check: Some highly trained evaluators might find handing over the reins to community stakeholders challenging because they’re used to running the show. Participatory evaluation requires evaluators to share control and collaborate with community stakeholders. Try to keep an open mind and trust in the abilities of community stakeholders to participate in the evaluation process with your support and guidance.  You’ll be amazed at what you can achieve when stakeholders are fully engaged in evaluation research! 

The American Evaluation Association is celebrating Southeast Evaluation Association (SEA) Affiliate Week with our colleagues in the SEA Affiliate. The contributions all this week to aea365 come from SEA Affiliate members.


I caught up with Laurie Stevahn after her full-day workshop, Strategies for Interactive Evaluation, which she presented with Jean King to an engaged and eager group of evaluators who came from a diverse range of cultures and work settings.

Why did you choose this topic for Evaluation 2016?

Laurie has spent 20-25 years working in participatory evaluation for capacity building and thinks evaluatively about bringing people together. This workshop draws on the strategies and methods covered in Interactive Evaluation Practice: Mastering the Interpersonal Dynamics of Program Evaluation, which she co-authored with Jean King. No matter your approach to evaluation, you have to engage and interact with people. This workshop helps attendees identify the best approaches to apply to their organizations or programs. Attendees think through the strategies so that when they walk out of the doors, they have a plan in place.

Did you learn anything from others’ experiences?

The attendees in Laurie’s workshop were very engaged and eager to share their experiences and examples from their practices. Sometimes the experience was a variation on the strategies presented, and sometimes the attendees brought their own challenges. In today’s workshop, Jean and Laurie spent time discussing the challenge of dominant personalities and how to handle these situations while keeping the conversation inclusive.

What are you looking forward to at the conference?

While in Atlanta, Laurie is excited to attend sessions that deal with participatory evaluation and have lots of engagement. She enjoys sessions that share the most recent research on evaluation. She is particularly looking forward to seeing Mel Mark and Michael Quinn Patton, who are great theorists and stay on top of research on evaluation.

How do you benefit from AEA?

Laurie has been an AEA member since 1995. She always finds AEA members friendly and ready to share. The key leaders in evaluation make themselves readily available and accessible to all attendees. She finds it humbling to be part of an organization and community that focuses so much on the greater good for humanity and community.



My name is Julie Poncelet and I am co-founder of Action Evaluation Collaborative (AEC), a partnership of independent consultants who use evaluation to strengthen social change. We want to share some of the participatory methods we use to engage youth in evaluations.

By participatory methods, we mean ways of working together to gather insights, such that everyone, including program participants, is involved and plays an active and influential part in decisions that affect their lives. The methods include a variety of tools and exercises to engage youth in meaningful, age-appropriate ways.

Last year, AEC blogged about the use of Participatory Video (PV) with a youth group in Mexico. Although we are not always able to engage youth in PV for evaluation, we do document all of our participatory engagement work on video (with permission) so that the voices and experiences of youth can be brought into the collective analysis and sensemaking process. In addition, we encourage the active participation of youth in such processes directly, so that insights can be contextualized and the dialogue enriched. However, the power dynamics between youth and adults should be considered, especially when working internationally.

Hot Tip: In our evaluation work, we use a range of visioning tools, adapting activities like force field analysis, fish and boulders (which we called butterflies and stones) or road journey to explore girls’ individual and collective aspirations. These tools help us, as well as the girls and program implementers, to better understand the current status of girls in the community and conditions or forces that support them to achieve their vision or hold them back. A little bit of paper, some cut out butterflies, and scented markers can create a fun, energizing space for girls to safely explore and share their insights.

  • One key lesson learned from doing this type of visioning work with girls from marginalized communities is that many have not had the opportunity to explore their aspirations and so it may be challenging for them at first to engage with the activity. We have found that conversation, especially around what girls like (or don’t like), what they are curious about, and who they look up to, can help girls to reflect or think more deeply about their dreams.

Rad Resources: We have used Liberating Structures in some of our evaluations with older youth and have found the following to provide an energizing structure for the quick generation and sharing of ideas: 1-2-4-All and Min Specs. We have also used storyboarding (the Liberating Structures version and the one from Dr. Kim Sabo Flores’ book) and other drawing activities to engage youth in sharing experiences, feelings and attitudes.

The American Evaluation Association is celebrating Youth Focused Evaluation (YFE) TIG Week with our colleagues in the YFE AEA Topical Interest Group. The contributions all this week to aea365 come from our YFE TIG members.


My name is Ann Zukoski and I am a Senior Research Associate at Rainbow Research, Inc. in Minneapolis, Minnesota.

Founded as a nonprofit organization in 1974, Rainbow Research’s mission is to improve the effectiveness of socially-concerned organizations through capacity building, research, and evaluation. Rainbow Research is known for its focus on using participatory evaluation approaches.

Through my work, I am always looking for new tools and approaches to engage stakeholders throughout the evaluation process. Today, I am sharing two methods that I have found helpful.

Rad Resource:

Ripple Effect Mapping (REM) is an effective method for having a large group of stakeholders identify the intended and unintended impacts of projects. In REM, stakeholders use elements of Appreciative Inquiry, mind mapping, and qualitative data analysis to reflect upon and visually map the intended and unintended changes produced by a complex program or collaboration. It is a powerful technique to document impacts and engage stakeholders. Rainbow Research is currently collaborating with Scott Chazdon at the University of Minnesota to use this method to evaluate the impact of a community health program by conducting REM at two points in time – at the beginning and end of a three-year project. Want to learn more? See http://evaluation.umn.edu/wp-content/uploads/Ripple-Effect-Mapping-MESI13-spring-training-march-2013_KA-20130305.pdf

Hot Tip:

The Art of Hosting (AoH) is a set of facilitation tools evaluators can use to engage stakeholders and create discussions that count. AoH comprises methods for working with groups to harness the collective wisdom and self-organizing capacity of groups of any size.  The Art of Hosting uses a set of conversational processes to invite people to step in and become fully engaged in the task at hand. This working practice can help groups make decisions, build their capacity and find new ways to respond to opportunities, challenges and change. For more information see http://www.artofhosting.org/what-is-aoh/

Have you used these tools? Let us all know your thoughts!


Hi Eval Friends! We are Kerry Zaleski and Mary Crave of the University of Wisconsin-Extension and Tererai Trent of Tinogona Foundation and Drexel University. Over the past few years we have co-facilitated workshops on participatory M&E methods for centering vulnerable voices at AEA conferences and eStudies.

This year, we are pleased to introduce participatory processes for engaging young people in evaluation during a half-day professional development workshop, borrowing from Child-to-Child approaches. Young people can be active change agents when involved in processes to identify needs, develop solutions, and monitor and evaluate changes in attitudes and behaviors for improved health and well-being.

Child-to-Child approaches help center evaluation criteria around the values and perspectives of young people, creating environments for continual learning among peers and families. Children learn new academic skills and evaluative thinking while having fun solving community problems!

Child-to-Child approaches help young people lead their communities to:

  • Investigate, plan, monitor and evaluate community programs by centering the values and perspective of people affected most by poverty and inequality.
  • Overcome stigma and discrimination by intentionally engaging marginalized people in evaluation processes.

We are excited to introduce Abdul Thoronka, a community health specialist from Sierra Leone, as a new member of our team. Abdul has extensive experience using participatory methods and Child-to-Child approaches in conflict- and trauma-affected communities in Africa and the US.

Lessons Learned:

  • Adult community members tend to be less skeptical and more engaged when ‘investigation’ types of exercises are led by children in their community rather than external ‘experts’. The exercises make learning about positive behavior change fun and entertaining for the entire community.
  • Young people are not afraid to ‘tell the truth’ about what they observe.
  • Exercises to monitor behaviors often turn into a healthy competition between young people and their families.

Hot Tips:

  • Child-to-child approaches can be used to engage young people at all stages of an intervention. Tools can include various forms of community mapping, ranking, prioritizing, values-based criteria-setting and establishing a baseline to measure change before and after an intervention.
  • Build in educational curricula by having the children draw a matrix, calculate percentages or develop a bar chart to compare amounts or frequency by different characteristics.
  • Explain the importance of disaggregating data to understand health and other disparities by different attributes (e.g. gender, age, ability, race, ethnicity).
  • Ask children to think of evaluation questions that would help them better understand their situation.
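The disaggregation tip above can be shown with a minimal sketch (the attendance records and group labels below are invented for illustration): computing the same indicator separately by attribute makes a disparity visible that the overall average hides.

```python
# Hypothetical attendance records (group, attended?) invented for this sketch.
from collections import defaultdict

records = [
    ("girl", 1), ("girl", 0), ("girl", 0),
    ("boy", 1), ("boy", 1), ("boy", 1),
]

# The overall rate is one number for the whole group...
overall = sum(a for _, a in records) / len(records)

# ...while disaggregating computes the same indicator per attribute value.
totals = defaultdict(int)
attended = defaultdict(int)
for group, a in records:
    totals[group] += 1
    attended[group] += a
rates = {g: attended[g] / totals[g] for g in totals}

print(f"overall: {overall:.0%}")   # the average hides the gap
for g, r in sorted(rates.items()):
    print(f"{g}: {r:.0%}")         # the disparity becomes visible
```

The same exercise works on paper with tally marks and a matrix, which is how children in a Child-to-Child activity would typically do it.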

Rad Resources:

Child-to-Child Trust

The Barefoot Guide Connection

AEA Coffee Break Webinar 166: Pocket-Chart Voting-Engaging vulnerable voices in program evaluation with Kerry Zaleski, December 12, 2013 (recording available free to AEA members).

Robert Chambers’ 2002 book, Participatory Workshops: A Sourcebook of 21 Sets of Ideas and Activities.

Want to learn more? Register for Whose Judgment Matters Most: Using Child-to-Child approaches to evaluate vulnerability-centered programs at Evaluation 2014.

We’re featuring posts by people who will be presenting Professional Development workshops at Evaluation 2014 in Denver, CO.


Greetings from Mary Crave and Kerry Zaleski of the University of Wisconsin – Extension and Tererai Trent of Tinogona Foundation and Drexel University. For the past few years we’ve teamed up to teach participatory methods for engaging vulnerable and historically under-represented persons in monitoring and evaluation. We’ve taught hands-on professional development workshops at AEA conferences, eStudies, and Coffee Breaks. “Visionary Evaluation for a Sustainable, Equitable Future” is not only the theme for Evaluation 2014; it is a succinct description of why we believe so strongly in what we teach.

Lessons Learned: We’ve noticed during our trainings around the world that there is a continuum or range of what an evaluator might consider to be “participatory”. Being aware of our own position and philosophy of participatory methods is especially critical when working with persons who traditionally may have been excluded from participation due to income, location, gender, ethnicity or disability. Trent suggests these lenses or levels, from low to high: Spectator Participation > Tokenism Participation > Incentive Participation > Functional Participation > Ownership Participation. The more ownership, or the higher the level of participation, the more impact a program will have on social justice issues and sustainable, equitable futures for people. Those who want their methods lens to focus on “ownership participation” sometimes have trouble reaching that aim because they have a small tool box or get stuck using the wrong tool at a particular time in the program cycle. Rubrics for success often leave out the voice of the vulnerable, though those voices can also be included using participatory tools.

Hot Tips:

  • There are M & E tools especially suited for working with vulnerable persons that allow all voices to be heard, that do not depend on literacy skills, that consider cultural practices and power relationships in decision making and discussion, and that engage program beneficiaries in determining rubrics for success. These tools can be used in the planning, monitoring, data collection, analysis, and reporting stages of the program cycle.
  • You can expand your tool box of methods, and widen your lens on participatory methods, at our 2-day workshop at AEA 2014, Reality Counts (Workshop #6). We’ll be joined by Abdul Thoronka, an international community health specialist and manager of a community organization that works with persons with disabilities.

Rad Resources: Robert Chambers’ 2002 book, Participatory Workshops: A Sourcebook of 21 Sets of Ideas and Activities.

Food and Agriculture Organization (FAO) of the UN: click on publications; type PLA in the search menu.

AEA Coffee Break Webinar 166: Pocket-Chart Voting – Engaging vulnerable voices in program evaluation with Kerry Zaleski, December 12, 2013 (recording available free to AEA members).

Want to learn more? Register for Reality Counts: Participatory methods for engaging marginalized and under-represented persons in M&E at Evaluation 2014.

This week, we’re featuring posts by people who will be presenting Professional Development workshops at Evaluation 2014 in Denver, CO.

Greetings!  We are Tom McQuiston (USW Tony Mazzocchi Center) and Tobi Mae Lippin and Kristin Bradley-Bull (New Perspectives Consulting Group).  We have collaborated for over a decade on participatory evaluation and assessment projects for the United Steelworkers (labor union).  And we have grappled mightily with how to complete high-quality data analysis and interpretation in participatory ways.

Hot Tip: Carefully determine up front what degree of full evaluation team participation there will be in data analysis.  Some practical considerations include:  the amount of team time, energy, interest, and analysis expertise that is available; the levels of data analysis being completed; the degree of project focus on team capacity-building; and the project budget and timeline.  How these and other considerations get weighed is, of course, also a product of the values undergirding your work and the project.

Hot Tip: Consider preparing an intermediate data report (a.k.a. “half-baked” report) that streamlines the analysis process for the full team.  Before the full team dives in, we:  review the raw quantitative data; run preliminary cross-tabs and statistical tests; refine the data report content to include only the — to us — most noteworthy data; remove extraneous columns spit out of SPSS; and assemble the tables that should be analyzed together — along with relevant qualitative data — into reasonably-sized thematic chunks for the team.
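As an illustration only (the survey items, labels, and tooling below are invented; the authors describe working in SPSS), the “half-baked” step of running a preliminary cross-tab and trimming it down to a readable table might look like this sketch in Python with pandas:

```python
import pandas as pd

# Invented survey responses standing in for the raw quantitative data.
responses = pd.DataFrame({
    "site":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "trained": ["yes", "yes", "no", "no", "yes", "yes", "no", "no"],
})

# Preliminary cross-tab: raw counts by site.
counts = pd.crosstab(responses["site"], responses["trained"])

# Row percentages are often easier for a mixed team to interpret than the
# full statistical output, so the "half-baked" report keeps only this table.
percents = (pd.crosstab(responses["site"], responses["trained"],
                        normalize="index") * 100).round(1)

print(counts)
print(percents)
```

The point is the workflow, not the tool: the evaluators do the mechanical pre-processing so the full team spends its limited time interpreting a small, thematically grouped set of tables.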

Hot Tip: Team time is a precious commodity, so well-planned analysis/ interpretation meetings are essential.  Some keys to success include:

  1. Invest in building the capacity of all team members.  We do this through a reciprocal process of us training other team members in, say, reading a frequency or cross-tab table or coding qualitative data and of them training us in the realities of what we are all studying.
  2. Determine time- and complexity-equivalent analyses that sub-teams can work on simultaneously.  Plan to have the full team thoughtfully review sub-team work.
  3. Stay open to shifting in response to the team’s expertise and needs.  An empowered team will guide the process in ever-evolving ways.

Some examples of tools we have developed — yes, you, too, can use Legos™ in your work — can be found at: http://newperspectivesinc.org/resources.

We never fail to have many moments of “a-ha,” “what now” and “wow” in each participatory process.  We wish the same for you.

The American Evaluation Association is celebrating Best of aea365 week. The contributions all this week are reposts of great aea365 blogs from our earlier years.


Hello from Mary Crave and Kerry Zaleski, of the University of Wisconsin – Extension and Tererai Trent of Tinogona Foundation and Drexel University.  For the past few years we’ve teamed up to teach hands-on professional development workshops at AEA conferences on participatory methods for engaging vulnerable and historically under-represented persons in monitoring and evaluation. Our workshops are based on:

  • More than 65 years of collective community-based experience in the US and more than 55 countries
  • Our philosophy that special efforts should be made to engage people who have often been left out of the community decision-making process (including program assessment and evaluation)
  • The thoughtful work of such theorists and practitioners as Robert Chambers, a pioneer in Participatory Rural Appraisal.

Lessons Learned: While many evaluators espouse the benefits of participatory methods, engaging under-represented persons often calls for particular tools, methods and approaches. Here’s the difference:

  1. Vulnerability: Poverty, cultural traditions, natural disasters, illness and disease, disabilities, human rights abuses, a lack of access to resources or services, and other factors can make people vulnerable in some contexts. This can lead to marginalization or oppression by those with power, and critical voices are left out of the evaluation process.
  2. Methods and tools have many benefits: They can be used throughout the program cycle; are adaptable to fit any context; promote inclusion, diversity and equality; spark collective action; and, support community ownership of results – among others.
  3. Evaluators are really facilitators, and participants become the evaluators of their own realities.

Hot Tip:  Join us to learn more about the foundations of and some specific “how-to” methods on this topic at an upcoming AEA eStudy, February 5 and February 12, 1-2:30 PM EST. Click here to register.

We’ll talk about the foundations of participatory methods and walk through several tools such as community mapping, daily calendars, pair-wise ranking, and pocket-chart voting.

Rad Resources: Robert Chambers’ 2002 book, Participatory Workshops: A Sourcebook of 21 Sets of Ideas and Activities.

Food and Agriculture Organization (FAO) of the UN: http://www.fao.org/docrep/006/ad424e/ad424e03.htm (click on publications, type PLA in the search menu)

AEA Coffee Break Webinar 166: Pocket-Chart Voting-Engaging vulnerable voices in program evaluation with Kerry Zaleski, December 12, 2013 (recording available free to AEA members).

June Gothberg on Involving Vulnerable Populations in Evaluation and Research, August 23, 2013

