Hello, I am Alexis V. Marbach, MPH. As the Empowerment Evaluator for the Rhode Island Coalition Against Domestic Violence, I support the evaluation activities of the Centers for Disease Control and Prevention's DELTA FOCUS (Domestic Violence Prevention Enhancements and Leadership Through Alliances, Focusing on Outcomes for Communities United with States) grant. The DELTA FOCUS grant, awarded to 10 domestic violence coalitions throughout the country, challenges programs to evaluate their primary prevention programs in a rigorous and intentional way. One step on the road to an evaluation plan is to conduct an evaluability assessment (EA). In Rhode Island, we conducted an EA at the state level and for two local subgrantee sites. Here are some tips and tools that helped us along the way.
Template tip: While all EAs are unique in that they reflect the agency, community, and project values, there are core components that helped to guide our process. Those core components included:
Key Findings
1) What are the program or strategy goals (scope and purpose of the program)?
2) How does the program intend to achieve program goals?
3) What resources are needed to implement the program?
Description of existing data collection methods and process
1) Describe the data collection methods and instruments.
2) Who is the intended audience of data collection instruments?
3) Who collects the data?
4) How often is the data collected?
Evaluation Plan Recommendation
1) How will this assessment inform the evaluation plan?
i. What can be evaluated?
ii. What evaluation questions can feasibly be answered?
Hot Tip: Timing matters. Remember that your EA is a step between creating an action plan (including a logic model) and your evaluation plan. When planning the timing of both, be sure to budget enough time to meet with key stakeholders and constituents, conduct literature reviews, and potentially conduct an internal assessment to determine your capacity to carry out evaluation activities. We conducted our EAs in a little more than a month, and this felt incredibly rushed even though we had full-time staff working on the project.
Lesson learned: It’s okay to learn that your strategy is not ready to be evaluated. It’s fair to say that we put a great deal of pressure on ourselves to perfectly align our strategies with evaluation activities, even when it felt like cramming a square peg into a round hole. One of the great lessons of an EA is that you may have to go back to your initial plan and rethink your strategy.
Rad Resource: The National Center on Domestic and Sexual Violence has compiled evaluation resources that are a blend of general tools and ones specific to violence against women strategies.
http://www.ncdsv.org/publications_programeval.html
The American Evaluation Association is celebrating CP TIG Week with our colleagues in the Community Psychology Topical Interest Group. The contributions all week come from CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
Hello Alexis,
My name is Anuska, and I am a graduate student working towards a Professional Master of Education at Queen’s University. We have been asked to consult the aea365 blog as part of a course on Program Inquiry and Evaluation, and to contact the author of an article of interest to us.
As a counsellor at a women’s shelter for survivors of intimate partner violence, I found your article to be very insightful. As I developed my program evaluation plan for a class project, the Evaluability Assessment guiding questions you outline here helped me take a step back and examine whether my initial ideas for an evaluation plan coincided with the program’s goals and needs. I also became mindful of whether the data collection and analysis methods could realistically be implemented, and the results used, by program implementers.
From the resource list you mention, I was most interested in “Building Data Systems for Monitoring and Responding to Violence Against Women”. I agree with the point that clarity in defining terms related to violence against women is needed (I have noticed inconsistencies in the use of, for example, “intimate partner violence”, “domestic violence”, and “conjugal violence” in my practice as well as when reviewing research). I also appreciate the argument that evaluations related to violence against women need multiple data collection strategies, and that personal interviews provide more insight into the extent of violence experienced than medical or criminal records do. Many incidents of violence go unreported, and some forms of violence (verbal, psychological) may not be considered criminal acts. Based on my experience accompanying and supporting clients through legal proceedings, I would add that those conducting interviews with survivors should be informed and adequately trained to support survivors through the possible trauma of reliving the violence they suffered during the interview process. I appreciated the emphasis on ensuring the safety and confidentiality of survivors as well as of those collecting data, and the importance placed on involving stakeholders in evaluations.
I also enjoyed reading “Case Study: Culturally Relevant Evaluation of Prevention Efforts”. As someone who hopes to build a career in prevention education programming, with a focus on violence prevention and sexual health, it was fascinating to read about a curriculum that approaches the topic of violence in a culturally sensitive manner, as well as a concrete example of how such a program can be evaluated.
One challenge I face in proposing my evaluation plan to my team is how to motivate my colleagues to allocate the time and effort that evaluation processes require. Although I think evaluation tools can have multiple benefits for stakeholders and participants, I wonder how we can realistically implement such strategies amid the immediate crises that demand our attention on a daily basis. Do you have any suggestions based on your experience evaluating similar programs and services in the field?
Thank you again for your article and resources; they gave me a lot to think about and reflect on!
Warm regards,
Anuska Martins