Hi all, we’re blogging today from the National Resource Center on Domestic Violence (NRCDV). Cris Sullivan is NRCDV’s Senior Research Advisor, and Annika Gifford is Senior Director of Policy and Research. Together with CEO Anne Menard, we have been working on a project to help domestic violence organizations evaluate how their services impact domestic violence survivors and their children.
Domestic violence (DV) programs are under growing scrutiny to demonstrate that they are making a significant difference in the lives of those using their services. Increasingly, funders expect them to show that their efforts are resulting in positive outcomes for survivors.
In addition to the challenges facing all nonprofits trying to evaluate their impact (e.g., little to no money, time, or expertise), DV programs have the following additional factors to consider:
- They are often working with people in crisis who may not be in a position to engage in program evaluation.
- They must consider the safety and confidentiality of the people with whom they work (so, for example, they cannot contact clients later through the mail).
- Some funders expect DV programs to have unrealistic or even victim-blaming outcomes (e.g., “victims will leave the relationship”).
- DV programs recognize that each survivor seeking help has their own individual needs, life experiences, and concerns. Because services are tailored to each person, program evaluation is that much more difficult.
Rad Resource: To help domestic violence programs evaluate their work on their own terms — and with no extra money or time — we have created an online resource center that houses a great deal of free and accessible resources.
Among other things, the DV Evidence Project houses a theory of change that programs can use to demonstrate the process through which their services result in long-term benefits for survivors and their children. The site also provides brief summaries of the evidence behind shelters, advocacy, support groups, and counseling (demonstrating that programs are engaged in “evidence-based practice”). Finally, evaluation tools are provided so that programs don’t need to reinvent the wheel. These tools include client surveys, tips for engaging staff in evaluation, strategies for gathering data in sensitive ways, and protocols for interpreting and using the findings. We hope these resources are helpful to those in the field doing this incredibly important work!
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.