AEA365 | A Tip-a-Day by and for Evaluators

Disaster and Emergency Management Evaluation

Ramesh Tuladhar

Greetings, I am Ramesh Tuladhar, focal point and coordinator for the Disaster Risk Reduction (DRR) Thematic Committee of the Community of Evaluators in Nepal (COE-Nepal). I am a professional geologist with experience in disaster risk management, monitoring, and evaluation. I am currently engaged as the monitoring and evaluation consultant for the Pilot Project on Climate Resilience (PPCR), implemented by the Department of Hydrology and Meteorology, Government of Nepal.

Lessons Learned: Eighty-seven out of 192 (45%) United Nations member states responded to the Sendai Framework Data Readiness Review in 2017. This proportion suggests that more stakeholders from member states, and also non-member states, may consider learning about and contributing to the Sendai Framework, which includes four priorities for action, to help improve effectiveness and sustainability of DRR interventions.

Hot Tip:

Rad Resources: To learn about the progress of DRR in Nepal, please visit:

The American Evaluation Association is celebrating Disaster and Emergency Management Evaluation (DEME) Topical Interest Group (TIG) Week. All contributions to aea365 this week come from our DEME TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings! My name is Nnenia Campbell, and I am a postdoctoral research associate at the University of Colorado Natural Hazards Center as well as an independent evaluation consultant specializing in disaster-related programming. As an alumna of AEA’s Graduate Education Diversity Internship (GEDI) program, I frequently consider how I can engage in culturally responsive evaluation or call attention to the role of cultural context in my work. These concepts are particularly important in disaster and emergency management evaluation because extreme events can affect diverse populations with vastly disparate impacts.

Scholars and practitioners alike have observed that initiatives designed to alleviate the burden of disaster losses often fail to meet their goals, particularly within underserved communities. Moreover, although the concept of social vulnerability has begun to feature prominently in emergency management discourse, common issues and oversights can inadvertently reinforce inequality and undermine the interests of those who suffer the most crippling disaster impacts. Opaque or exclusionary decision-making practices, discounting of local knowledge, and imposition of locally inappropriate “solutions” are common complaints about programs intended to help communities prepare for or respond to hazard events.

In evaluating disaster resilience and recovery initiatives, it is important to pay attention to which stakeholders are at the table and how that compares to the broader populations they serve. Which interests are being represented? What histories may inform how a program is perceived? Alternatively, what factors may influence how program implementers engage clients and characterize their needs? Culturally responsive evaluation provides a powerful lens for answering such questions and for clarifying why they are important to ask in the first place.

Hot Tip:

  • Do your homework. Culturally responsive evaluation literature emphasizes the importance of capturing the cultural context of the program under study. Ignoring factors such as the history of a program and its stakeholders, the relationships and power dynamics among them, or the values and assumptions that shape their actions can lead to grave errors in interpretation.
  • Seek out cultural brokers. In order to adequately address the concerns of diverse stakeholders, evaluators must establish trust and respect. Working with cultural brokers, or trusted liaisons who can help to communicate concerns and advocate on behalf of a group, can foster greater understanding and encourage meaningful engagement.

Rad Resources:


My name is Alicia Stachowski, and I am an Associate Professor of Psychology at the University of Wisconsin-Stout. I am working with Sue Ann Corell Sarpy on a Citizen Science Program sponsored by the National Academies of Sciences. We would like to share some preliminary findings from this research.

Lessons Learned:

Why Use Citizen Scientists?

In the aftermath of a disaster, communities often lack information about environmental contamination that could be used to guide prevention and recovery activities. Community-led citizen science, in which lay individuals or non-experts lead or participate in data collection and research activities, offers great promise for promoting equitable, cross-boundary collaborations, fostering scientific literacy, and empowering community-based actions around environmental risks.

Building a Network of Citizen Scientists

The Citizen Science Training Program was designed to build organizational capacity and enhance community health and well-being through promotion of citizen science in coastal Louisiana communities.  The training program was aimed at developing a network of citizen scientists for environmental contamination monitoring, creating avenues for communication and dissemination of project activities to stakeholders, and strengthening collaborative partnerships to enable sustainable networks for knowledge, skills, and resources.  Our evaluation includes a social network analysis of the existing and developing relationships among participants.

How Does a Citizen Scientist Network Develop?

The project is designed to create and support Citizen Science networks. We used Social Network Analysis to examine the emergence of these networks. Our project is ongoing, but the following figures show an example of our preliminary findings:

We asked participants to indicate who they know, who they share information and resources with, who they discuss community issues with, who they go to for advice, and who they collaborate with. Our preliminary results illustrate an increase in ties, or connections, among participants (i.e., network density). For example, respondents indicated which other participants they discussed community issues with before and after training. Before the training, network density was 4% (see the figure below).

Pre-training ties among participants, coded by parish, regarding with whom they discussed community issues.

Following the training, network density increased to 45%.

Post-training ties among participants, coded by parish, regarding with whom they discussed community issues.
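To make the density figures above concrete, here is a minimal sketch of the calculation in Python with NetworkX: density is simply the number of observed ties divided by the number of possible ties among the same participants. The participant labels and tie lists below are hypothetical placeholders, not the study's data, and this is not the project's actual analysis code.

```python
# Minimal sketch of the density calculation: density = observed ties / possible ties.
# The participants and tie lists below are hypothetical, not the study's data.
import networkx as nx

participants = ["A", "B", "C", "D", "E", "F", "G", "H"]

# Hypothetical "discusses community issues with" ties reported before training
pre_ties = [("A", "B")]

# Hypothetical ties reported after training
post_ties = [
    ("A", "B"), ("A", "C"), ("A", "H"), ("B", "C"), ("B", "E"),
    ("C", "D"), ("C", "F"), ("D", "E"), ("D", "G"), ("E", "F"),
    ("F", "G"), ("G", "H"),
]

def network_density(nodes, ties):
    """Build an undirected graph and return its density (0 = no ties, 1 = fully connected)."""
    g = nx.Graph()
    g.add_nodes_from(nodes)
    g.add_edges_from(ties)
    return nx.density(g)

print(f"Pre-training density:  {network_density(participants, pre_ties):.0%}")   # ~4%
print(f"Post-training density: {network_density(participants, post_ties):.0%}")  # ~43%
```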

 

Lessons Learned:

Although our project is still in progress, we have found critical factors that lead to success in building and enhancing Citizen Scientists’ Networks:

Diversity Among Trainees.  We included a diverse group of participants.  They varied in age, gender, race/ethnicity, and occupations.

Small Group Activities.  The training included small group activities that encouraged information and resource sharing among participants.

Hands-On Activities/Exercises. The training included hands-on activities and exercises in using the monitoring and testing equipment. These activities and exercises encouraged active participation and interaction among trainees.

Large and Small Group Discussion.  The small group activities and hands-on exercises were followed by discussion among participants that allowed for exchange of different points of view.

Follow-up Field Research.  The training culminated with participants identifying a community-based need that they are currently addressing using the knowledge, resources, and community capacity that was enhanced by the training.


Hello! We are Phung Pham, a doctoral student at Claremont Graduate University, and Jordan Freeman, a recent alumna of The George Washington University. As newcomers to evaluation and research in the contexts of disasters and emergencies, we would like to share what we have found helpful in getting acquainted with disaster and emergency management evaluation and research.

Rad Resources:

  • Brief overview of the disaster and emergency management cycle, which includes mitigation, preparedness, response, and recovery.
  • Issue 126 (2010) of New Directions for Evaluation is a collection of chapters illustrating evaluation, research, policy, and practice in disaster and emergency management.
  • World Association for Disaster and Emergency Medicine (WADEM) offers frameworks for disaster research and evaluation.
  • United Nations Children’s Emergency Fund (UNICEF) has a database of evaluation reports, including ones focused on emergencies.
  • Active Learning Network for Accountability and Performance (ALNAP) is a global network of organizations dedicated to improving the knowledge and use of evidence in humanitarian responses, and has an extensive library of resources.

Get Involved!  Here are some trainings and events for your consideration:

We hope these resources are helpful to those of you who are new to or curious about evaluation and research in the contexts of disasters and emergencies.  There is a lot to learn and great work to be continued!


My name is Sue Ann Corell Sarpy, Principal of Sarpy and Associates, LLC, and Program Chair of the Disaster and Emergency Management Evaluation (DEME) TIG. This week's contributions represent DEME research and practice of both national and international scope. The week starts with an evaluation study I conducted concerning resiliency training for workers and volunteers responding to large-scale disasters.

The National Institute of Environmental Health Sciences (NIEHS) Worker Training Program, with the Substance Abuse and Mental Health Services Administration (SAMHSA), identified a need to create trainings that promote mental health and resiliency for workers and volunteers in disaster-impacted communities. We used a developmental evaluation approach that spanned two distinct communities (Gulf South region; New York/New Jersey region), disasters (Gulf of Mexico Oil Spill; Hurricane Sandy), and worker populations (disaster workers/volunteers; disaster supervisors) to identify effective principles and address challenges associated with these dynamic and complex training needs. We used an iterative evaluation process to enhance development and delivery of the training: project stakeholders actively supported and participated in the evaluation and the discussion of findings, and the effective principles (best practices/lessons learned) were incorporated into the next iteration of training.

Evaluation results supported the usefulness of this type of developmental evaluation approach for designing and delivering disaster worker trainings. Training effectiveness was demonstrated in different geographic regions responding to different disaster events with different target audiences. This graphic depicts the percentage of ratings agreeing that the training met the needs of the various target audiences. We found that ratings increased as we continued to integrate the best principles into the training, from the initial disaster worker training in the Gulf South region to the final phase of the project, the disaster supervisor training in the New York/New Jersey region.

Note: DWRT = Disaster Worker Resiliency Training; DSRT = Disaster Supervisor Resiliency Training; Responses ranged from “SOMEWHAT Agree” to “STRONGLY Agree”
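For readers who want to produce this kind of summary themselves, here is a minimal, hypothetical sketch in Python/pandas that computes the share of "Somewhat Agree" or "Strongly Agree" responses by training phase. The phase labels echo the note above, but the response data are invented for illustration, not the study's results.

```python
# Hypothetical sketch: percentage of agreement ("Somewhat Agree" or "Strongly Agree")
# by training phase. The responses below are invented, not the study's data.
import pandas as pd

responses = pd.DataFrame({
    "phase": ["DWRT Gulf South"] * 4 + ["DSRT New York/New Jersey"] * 4,
    "rating": [
        "Strongly Agree", "Somewhat Agree", "Neutral", "Strongly Agree",
        "Strongly Agree", "Strongly Agree", "Somewhat Agree", "Strongly Agree",
    ],
})

# Flag agreeing responses, then compute the percent agreement within each phase.
agrees = responses["rating"].isin(["Somewhat Agree", "Strongly Agree"])
percent_agreement = (
    responses.assign(agrees=agrees)
             .groupby("phase")["agrees"]
             .mean()
             .mul(100)
             .round(1)
)
print(percent_agreement)
```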

Lessons Learned: Several key factors were critical for success in evaluating resiliency training for workers and volunteers responding to large-scale disasters:

Major stakeholders actively involved in development, implementation, and evaluation of trainings.  We included information from workers, supervisors, community-based organizations, and subject matter experts in the evaluation data and discussion of findings.

Evaluation conducted early in the training design, with feedback of effective principles used in each iteration. Evaluators were brought in as key stakeholders early in the process and were integral in revising and refining training products as the project progressed.

Build relationships and trust with various stakeholders to gather information and refine curriculum.  The inclusion of stakeholder feedback, so that everyone gets a voice in the system, built trust and buy-in in the evaluation process and was key to its success.

Creating balance between standardized training and the flexibility to tailor training to meet needs (adaptability). The best principles that emerged across worker populations, communities, and disasters/emergencies provided a framework that gave the trainings structure while affording the flexibility and adaptability needed to meet specific training needs.

Rad Resources: Visit the NIEHS Responder and Community Resiliency website for training resources related to this project.    


Greetings AEA365! My name is Miki Tsukamoto and I am a Senior Monitoring and Evaluation Officer at the Planning and Evaluation Department in the International Federation of Red Cross and Red Crescent Societies (IFRC).

What if you had an opportunity to add a human face to baseline surveys and reflect numbers in a more appealing way?

In a joint initiative with the Uganda Red Cross Society (URCS) and the Swedish Red Cross, I recently had such an opportunity. We piloted video as a tool to complement a baseline survey that had been carried out for URCS's community resilience programme. The video aimed to capture stories from communities according to selected objectives/indicators of the programme, with the idea that in three years' time the tool could be used again to measure and demonstrate change or highlight gaps in programming.

Lessons Learned: Baseline data is important for planning, monitoring, and evaluating a project's performance. In many organizations, the end product of such a survey is a report filled with numbers which, although useful for some purposes, is not always understood by all stakeholders, including some of the communities we aim to assist. Taking this into consideration, video seemed to be an ideal medium for what the IFRC needed since it:

  • Offers visual imagery and can transcend language barriers if needed;
  • Allows communities to participate and directly express their views during the interviews; and
  • Provides a more appealing way to capture and report on the baseline.

Here are 3 lessons that I took away from this experience:

Gatekeepers: It is important to identify your gatekeeper(s), since they are essential for meeting community members on the ground, obtaining permission to film, and securing acceptance of the film crew's presence in the community(ies) and in the randomly selected individual households.


Independent Interpreter: If interpretation is necessary, an independent interpreter is key, since s/he serves as the voice of both the interviewee and the interviewer. S/he plays an important role in reducing bias and providing a comfortable environment for an honest dialogue during the interview process.

Community buy-in: The filming process, and the community's better understanding of the aims of the video project, can help build stronger buy-in from the community(ies) for your programme overall.

Rad Resources: We have two versions of the baseline video (if you are reading this via email that does not support embedded video, please click back through to the online post):

Short Version: 

Long Version: 

Hot Tip: For those interested in innovations in the field of humanitarian technology and its practical impact, the Humanitarian Technology: Science, Systems and Global Impact 2014 conference is coming up soon in Boston, MA, from 13 to 15 May 2014.

Rad Resources:

