AEA365 | A Tip-a-Day by and for Evaluators

Hello AEA members! My name is Miki Tsukamoto and I am a Monitoring and Evaluation Coordinator at the International Federation of Red Cross and Red Crescent Societies (IFRC).

Video has proved to be a useful data collection tool for engaging communities in sharing feedback on the impact of IFRC projects and programmes.[1] In an effort to develop more efficient and inclusive approaches to monitoring projects, IFRC’s Planning, Monitoring, Evaluation and Reporting (PMER) Unit in Geneva, in cooperation with Newcastle University’s Open Lab and in coordination with the Indonesian Red Cross Society (Palang Merah Indonesia, PMI) and IFRC Jakarta, piloted an initiative in the community of Tumbit Melayu in 2017 using the Most Significant Change approach, facilitated by a mobile video application (app) called “Our Story,” adapted from the Bootlegger app. Stories were planned, collected, directed and edited by women, men, youth and elderly of the community through this “one stop shop” mobile application. The aim was to gather feedback on a water, sanitation and hygiene promotion (WASH) project being implemented by PMI, with IFRC support, in the district of Berau, East Kalimantan province. Costs of this pilot were minimal, as the app allows video data collection without continuous reliance on external expertise or expensive equipment.

Our Story: Women’s feedback on a WASH project in Berau, Indonesia


Our Story: Elderly’s feedback on a WASH project in Berau, Indonesia

Our Story: Youth’s feedback on a WASH project in Berau

Our Story: Men’s feedback on a WASH project in Berau

Our Story: Community’s feedback on a WASH project in Berau, Indonesia

Lessons Learned:

  • Data collection: When collecting disaggregated data, it is important that facilitators be flexible and respect the rhythm of each community group, including their schedules and availability.
  • Community needs: By collecting stories from representative groups from the community, it provides an opportunity for organizations to dive deeper into the wishes of the community and therefore better understand and address their varying specific needs.
  • Our Story app: The community welcomed this new tool as it was an app that facilitated the planning, capturing and creation of their story on a mobile device. This process can be empowering for an individual and/or group, and serve to increase their interest and future participation in IFRC and/or National Society-led projects.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

[1] Recent participatory video initiatives produced by communities receiving assistance from IFRC and/or National Society projects can be found at: https://www.youtube.com/playlist?list=PLrI6tpZ6pQmQZsKuQl6n4ELEdiVqHGFw2


Beverly Peters

Greetings! I am Beverly Peters, an assistant professor of Measurement and Evaluation at American University. I have over 25 years of experience teaching, researching, and designing, implementing, and evaluating community development and governance projects, mainly in southern Africa.

This year’s AEA Conference theme, Speaking Truth to Power, addresses the heart of a concern that I have considered for years. I have found from my work in Africa that, as an evaluator, I inherently carry a certain amount of unwelcome power in my interactions with stakeholders. I have spent more than two decades asking how I can better understand that power and mitigate it so that I can hear the truth from stakeholders.

I realized this power of the evaluator first when I was conducting my PhD dissertation research in two villages in South Africa, and later as I continued microcredit work in the region. Issues of racial and economic privilege permeated my work in an environment emerging from more than four decades of apartheid. How could I ensure that stakeholders would not be silenced by that power? How could I ensure that the messages stakeholders gave me were not distorted? While working on microcredit projects, I used ethnographic research methods and intercultural communication skills to break down power relationships. Although it was time consuming, ethnographic storytelling helped to give my work perspective, and rural villagers voice.

The position of power and privilege has a host of facets to consider, some of which are not easily addressed. Many of these are related to the nature of the evaluator/stakeholder relationship, as I saw in my early work in South Africa. For years since then, I have also recognized that who I am as a person and an evaluator—my gender, age, nationality, and race, just to name a few attributes—impacts the data that I collect and the data to which I have access. This position of privilege, together with the attributes from above, can prevent evaluators from speaking truth to power.

Hot Tips:

How can I begin to break down this unwelcome position of privilege and address these inherent challenges, so that I can find ways to speak truth to power?

  • Keep a personal journal during every project. This will help you be self-reflective about who you are as a person and an evaluator, and help you identify how the data might be affected.
  • Craft a strong Evaluation Statement of Work that guides the evaluation and anticipates power relationships in the evaluand.
  • Secure a diverse evaluation team that includes local experts who will contribute to data collection, dialogue, and understanding.
  • Develop intercultural communication skills and use qualitative data collection techniques to uncover the emic, or insider, values of the stakeholder population.

My experiences have shown that being self-reflective, having a strong evaluation plan and a diverse evaluation team, and collecting emic data can go a long way in identifying, understanding, and presenting insider values that can challenge the bonds of power over time.


My name is Martha Brown, president of RJAE Consulting. This blog sheds light on the need to Speak Truth to Power (STTP) in AEA face-to-face and virtual spaces when racism, male supremacy, and other oppressive forces act to silence others. How do AEA members silence others? Here are two examples.

First, soon after subscribing to EVALTalk in 2016, I noticed sexism, misogyny and racism frequently present in the discussion threads. For instance, an African evaluator commented that requests for assistance and information made by African evaluators are often ignored. Many people were upset and sought to remedy the situation in various ways. A few men entered the conversation, exercising white male privilege in full force. First, they denied that racism was the problem. Worse yet, one man blamed the African evaluator for not doing more to be heard. According to Jones and Okun, a symptom of white supremacy culture is “to blame the person for raising the issue rather than to look at the issue which is actually causing the problem.” Yet so many of us stood by and said nothing.

At Evaluation 2017, I attended what was supposed to be a panel presentation by three women. However, for the first 10 minutes, all we heard was the lone voice of a man in the front row who seemed to think that what he had to say was far more important than what the three female panelists had to say. Privilege normalizes silencing tactics, as “those with power assume they have the best interests of the organization at heart and assume those wanting change are ill-informed (stupid), emotional, inexperienced” (Jones & Okun, 2001). Yet not one person – not even the session moderator – intervened and returned the session to the presenters.

If others have similar stories, please share in the comments. No longer can we permit anyone to degrade, diminish or dismiss someone else’s work in AEA spaces. When it happens, we must lean into the discomfort and shine light onto the dark veil of sexism, racism, elitism, etc. right then and there. If we don’t, then we are complicit in allowing the abuse of power to continue.

Personally, I can no longer carry the burden of guilt and shame for allowing myself or my fellow evaluators to be silenced while I say nothing. Enough is enough. A new day is dawning, and it is time to speak truth to power in the moment when power is attempting to silence someone. Will you join me?

Rad Resources:

Virginia Stead’s RIP Jim Crow: Fighting racism through higher education policy, curriculum, and cultural interventions

Jones & Okun’s “White supremacy culture,” from Dismantling racism: A workbook for social change groups

Gary Howard’s We can’t teach what we don’t know

Ali Michael’s How Can I Have a Positive Racial Identity? I’m White!


My name is Scott Chaplowe and I currently work as the Director of Evidence, Measurement and Evaluation for climate at the Children’s Investment Fund Foundation (CIFF). As an evaluation professional, much of my work is not simply doing evaluation, but building the capacity of others to practice, manage, support and/or use evaluation.

Hot Tips: Beyond the 5 considerations shared in part 1 of this post, here are the other 5, based on an expert lecture I gave on this topic at AEA2017:

  6. Ensure your ECB strategy is practical and realistic given organizational capacities. ECB should be realistic given the available time, budget, expertise and other resources. This underscores the importance of initial analysis and local stakeholder engagement to set ECB up for success.
  7. Identify and capitalize on existing resources for ECB. There is a multiplicity of resources for and approaches to ECB, ranging from face-to-face delivery to webinars, communities of practice, discussion boards, self-paced reading, and blogs like this. These resources can be used on their own or blended as part of a capacity building program that suits different learning styles and needs. Indeed, it is important not to ‘reinvent the wheel’ if it can be ‘recycled.’ However, do not fall into the trap of adopting a resource just because it is available; ensure that ECB resources are relevant to the desired capacity building objectives, or can be modified accordingly.
  8. Design and deliver learning grounded in adult learning principles. Adults are self-directed learners who bring to training past experiences, values, opinions, expectations and priorities that shape why and how they learn. Principles for adult learning stress a learner-centered approach that is applied, experiential, participatory and builds upon prior experience. You can read more about this here.
  9. Uphold professional standards, principles and ethics. An essential aspect of capacity building is to instill an understanding of and appreciation for ethical conduct and other standards of good practice. Specific guidelines and principles will vary according to context: sometimes specific to the organization itself, other times adopted from industry standards, such as the AEA’s Guiding Principles for Evaluators and Statement on Cultural Competence in Evaluation, and the JCSEE’s Program Evaluation Standards.
  10. Monitor and evaluate your ECB efforts to learn and adapt. Practice what we preach: track and assess ECB efforts to adapt, improve and be accountable to ECB objectives. This begins at the design stage, when identifying the capacities that will be assessed.

Ancillary Consideration. The above top 10 list is far from exhaustive, and as it is about human organizations and behavior, it is not absolute.

Rad Resources – Read more about this top 10 list here, and you can view the AEA365 presentation. Also, check out the book Monitoring and Evaluation Training: A Systematic Approach; this webpage has an assortment of resources to support evaluation learning and capacity building.



My name is Scott Chaplowe and I currently work as the Director of Evidence, Measurement and Evaluation for climate at the Children’s Investment Fund Foundation (CIFF). As an evaluation professional, much of my work is not simply doing evaluation, but building the capacity of others to practice, manage, support and/or use evaluation. I’ve discovered I am not alone, as other evaluation colleagues have echoed similar experiences with evaluation capacity building (ECB).

Hot Tips: Based on an expert lecture I gave on this topic at AEA2017, here are 5 considerations for building evaluation capacity:

  1. Adopt a systemic (systems) approach to organizational evaluation capacity building (ECB). ECB does not happen in isolation, but is embedded in complex social systems. Each organization will be distinct in time and place, and ECB interventions should be tailored to the unique configuration of factors and actors that shape the supply of and demand for ECB. Supply refers to the presence of evaluation capacity (human and material), and demand refers to the incentives and motivations for evaluation use. The conceptual diagram below illustrates key considerations in an organizational ECB system.

  2. Plan, deliver and follow up ECB with attention to transfer. If organizational ECB is to make a difference, it is not enough to ensure learning occurs; targeted learners need to apply their learning. As Hallie Preskill and Shanelle Boyle aptly express, “Unless people are willing and able to apply their evaluation knowledge, skills, and attitudes [“KSA”] toward effective evaluation practice, there is little chance for evaluation practice to be sustained.”
  3. Meaningfully engage stakeholders in the ECB process. ECB will be more effective when it is done with rather than to organizational stakeholders. Meaningful engagement helps build ownership to sustain ECB implementation and use. It is especially important to identify and capitalize on ECB champions, and to mitigate ECB adversaries who can block ECB and its uptake.
  4. Systematically approach organizational ECB, but remain flexible and adaptable to changing needs. ECB is intentional, and is therefore best planned in an orderly way: gather information and analyze demand, needs and resources; identify objectives; and design a realistic strategy to achieve (and evaluate) ECB objectives.

However, a systematic approach does not mean a rigid blueprint that is blindly followed, which can inhibit experimentation in response to changing capacity needs. ECB should remain flexible to adapt to the dynamic nature of the ECB system, which will vary and change over time and place.

  5. Align and pursue ECB with other organizational objectives. ECB should not be “silo-ed,” but ideally planned with careful attention to other organizational objectives and capacity building interventions. Consider how ECB activities complement, duplicate or compete with other capacity building activities.

Rad Resources – Read more about this top 10 list here, and you can view the AEA365 presentation. Also, check out the book Monitoring and Evaluation Training: A Systematic Approach; this webpage has an assortment of resources to support evaluation learning and capacity building.



Hello, my name is Jayne Corso and I am the community manager for the American Evaluation Association.

Social media offers a great way to have conversations with like-minded individuals. But what if those like-minded individuals don’t know you have a Facebook, Twitter, or LinkedIn page? Here are a few easy tips for getting the word out about your social media channels.

Hot Tip: Have Social Media Prominently Displayed on Your Website

A great way to show that you are on social media channels is to display social media icons at the top of your website. Some organizations put these at the bottom of their website where they usually get lost—when was the last time you scrolled all the way to the bottom of a website?

Moving your icons to the top of your website also helps on mobile devices. More and more people are using their cell phones instead of desktops to browse websites. With the icons above the “fold,” or at the top of your page, they are easy to find no matter what device you are using.

Hot Tip: Reference Social Media in Emails

You are already sending emails to your followers or database, so why not tell them about your social media channels? You can do this in a very simple way, by adding the icons to your email template, or you can call out your social channels in your emails. Try doing a dedicated email promoting your social channels. Social media is the most direct way to communicate with your followers or database, so showcase this benefit to your fans!

Hot Tip: Continue the Conversation on Social Media

Moving conversations to your social media pages can add longevity to your discussion and invite more people to participate. If you have written an email about an interesting topic, invite your database to continue the conversation on Twitter. You can create a hashtag for your topic so all posts can be easily searched. You can also do this on Facebook by encouraging a conversation in the comments of a post.

I hope these tips were helpful. Follow AEA on Facebook and Twitter!



Ramesh Tuladhar

Greetings, I am Ramesh Tuladhar, focal point and coordinator for the Disaster Risk Reduction (DRR) Thematic Committee of the Community of Evaluators in Nepal (COE-Nepal). I am a professional geologist with experience in disaster risk management, monitoring, and evaluation. I am currently engaged as the monitoring and evaluation consultant for the Pilot Project on Climate Resilience (PPCR), implemented by the Department of Hydrology and Meteorology, Government of Nepal.

Lessons Learned: Eighty-seven out of 192 (45%) United Nations member states responded to the Sendai Framework Data Readiness Review in 2017. This proportion suggests that more stakeholders from member states, and also non-member states, may consider learning about and contributing to the Sendai Framework, which includes four priorities for action, to help improve effectiveness and sustainability of DRR interventions.

Hot Tip:

Rad Resources: To learn about the progress of DRR in Nepal, please visit:

The American Evaluation Association is celebrating Disaster and Emergency Management Evaluation (DEME) Topical Interest Group (TIG) Week. The contributions all this week to aea365 come from our DEME TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings! My name is Nnenia Campbell, and I am a postdoctoral research associate at the University of Colorado Natural Hazards Center as well as an independent evaluation consultant specializing in disaster-related programming. As an alumna of AEA’s Graduate Education Diversity Internship (GEDI) program, I frequently consider how I can engage in culturally responsive evaluation or call attention to the role of cultural context in my work. These concepts are particularly important in disaster and emergency management evaluation because extreme events can affect diverse populations with vastly disparate impacts.

Scholars and practitioners alike have observed that initiatives designed to alleviate the burden of disaster losses often fail to meet their goals, particularly within underserved communities. Moreover, although the concept of social vulnerability has begun to feature prominently in emergency management discourse, common issues and oversights can inadvertently reinforce inequality and undermine the interests of those who suffer the most crippling disaster impacts. Opaque or exclusionary decision-making practices, discounting of local knowledge, and imposition of locally inappropriate “solutions” are common complaints about programs intended to help communities prepare for or respond to hazard events.

In evaluating disaster resilience and recovery initiatives, it is important to pay attention to which stakeholders are at the table and how that compares to the broader populations they serve. Which interests are being represented? What histories may inform how a program is perceived? Alternatively, what factors may influence how program implementers engage clients and characterize their needs? Culturally responsive evaluation provides a powerful lens for answering such questions and for clarifying why they are important to ask in the first place.

Hot Tips:

  • Do your homework. Culturally responsive evaluation literature emphasizes the importance of capturing the cultural context of the program under study. Ignoring factors such as the history of a program and its stakeholders, the relationships and power dynamics among them, or the values and assumptions that shape their actions can lead to grave errors in interpretation.
  • Seek out cultural brokers. In order to adequately address the concerns of diverse stakeholders, evaluators must establish trust and respect. Working with cultural brokers, or trusted liaisons who can help to communicate concerns and advocate on behalf of a group, can foster greater understanding and encourage meaningful engagement.

Rad Resources:



My name is Alicia Stachowski and I am an Associate Professor of Psychology at the University of Wisconsin-Stout. I am working with Sue Ann Corell Sarpy on a Citizen Science Program sponsored by the National Academies of Sciences. We would like to share some preliminary findings from this research.

Lessons Learned:

Why Use Citizen Scientists?

In the aftermath of a disaster, communities often lack information about environmental contamination that could be used to guide prevention and recovery activities.    Community-led citizen science, where lay individuals or non-experts lead or participate in data collection and research activities, offers great promise for promoting equitable, cross-boundary collaborations, fostering scientific literacy, and empowering community-based actions around environmental risks.

Building a Network of Citizen Scientists

The Citizen Science Training Program was designed to build organizational capacity and enhance community health and well-being through promotion of citizen science in coastal Louisiana communities.  The training program was aimed at developing a network of citizen scientists for environmental contamination monitoring, creating avenues for communication and dissemination of project activities to stakeholders, and strengthening collaborative partnerships to enable sustainable networks for knowledge, skills, and resources.  Our evaluation includes a social network analysis of the existing and developing relationships among participants.

How Do Citizen Scientist Networks Develop?

The project is designed to create and support citizen science networks. We used Social Network Analysis to examine the emergence of these networks. Our project is ongoing, but the following figures show an example of our preliminary findings:

We asked participants to indicate who they know, who they share information and resources with, who they discuss community issues with, who they go to for advice, and who they collaborate with. Our preliminary results illustrate an increase in ties, or connections, among participants (i.e., network density). For example, respondents indicated which other participants they discussed community issues with before and after training. Before the training, network density was 4% (see figure below).

Pre-training ties among participants, coded by parish, indicating with whom they discussed community issues.

Following the training, network density increased to 45%.

Post-training ties among participants, coded by parish, indicating with whom they discussed community issues.
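The density figures above follow from a simple count: in a directed network with n actors, density is the number of observed ties divided by the n*(n-1) possible ties. A minimal sketch of that calculation, using hypothetical participants and ties rather than the study's actual data:

```python
def network_density(n_actors, ties):
    """Directed network density: observed ties divided by the
    n*(n-1) possible ordered pairs of distinct actors."""
    possible = n_actors * (n_actors - 1)
    return len(set(ties)) / possible

# Hypothetical example: 10 participants and 4 directed
# "discussed community issues with" ties out of 90 possible.
pre_training = [(1, 2), (3, 4), (5, 6), (7, 8)]
print(f"{network_density(10, pre_training):.0%}")  # prints 4%
```

The same calculation over the post-training tie list is what yields a figure like the 45% reported above; the jump reflects the many new connections formed during training.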

 

Lessons Learned:

Although our project is still in progress, we have found critical factors that lead to success in building and enhancing Citizen Scientists’ Networks:

Diversity Among Trainees.  We included a diverse group of participants.  They varied in age, gender, race/ethnicity, and occupations.

Small Group Activities.  The training included small group activities that encouraged information and resource sharing among participants.

Hands-On Activities/Exercises.  The training included hands-on activities and exercises using the monitoring and testing equipment.  These activities and exercises encouraged active participation and interaction among trainees.

Large and Small Group Discussion.  The small group activities and hands-on exercises were followed by discussion among participants that allowed for exchange of different points of view.

Follow-up Field Research.  The training culminated with participants identifying a community-based need that they are currently addressing using the knowledge, resources, and community capacity that were enhanced by the training.


 


Hello!  We are Phung Pham, doctoral student at Claremont Graduate University, and Jordan Freeman, new alumna of The George Washington University.  As novices of evaluation and research in the contexts of disasters and emergencies, we would like to share what we have found helpful in getting acquainted with disaster and emergency management evaluation and research.

Rad Resources:

  • Brief overview of the disaster and emergency management cycle, which includes mitigation, preparedness, response, and recovery.
  • Issue 126 (2010) of New Directions for Evaluation is a collection of chapters illustrating evaluation, research, policy, and practice in disaster and emergency management.
  • World Association for Disaster and Emergency Medicine (WADEM) offers frameworks for disaster research and evaluation.
  • United Nations Children’s Emergency Fund (UNICEF) has a database of evaluation reports, including ones focused on emergencies.
  • Active Learning Network for Accountability and Performance (ALNAP) is a global network of organizations dedicated to improving the knowledge and use of evidence in humanitarian responses, and has an extensive library of resources.

Get Involved!  Here are some trainings and events for your consideration:

We hope these resources are helpful to those of you who are new to or curious about evaluation and research in the contexts of disasters and emergencies.  There is a lot to learn and great work to be continued!


