I am Alice Hausman, a professor of Public Health at Temple University. For many years I have worked as a community-based participatory research (CBPR) evaluator of youth violence prevention initiatives in urban environments.
- Involve the community in identifying measures and data. As part of the participatory evaluation planning process, I always ask community participants to define their vision of program success. But I take it one step further by looking for data that might actually measure these community-defined outcomes. Working with community partners to identify measures and data has proven as rewarding as the initial conversation about what success would look like.
- Use available data sources in partnership with the community. One community collaborative I worked with identified existing data sets and survey opportunities they could use to evaluate their programs. In another project, a randomized community trial of a multi-level violence prevention program, we found that the standardized psychometric instruments already in use for the trial could measure community-defined constructs, such as “showing kids love,” once the items were reconfigured through a participatory review process.
- Remind yourself of the value of community-evaluator partnerships. In our case, the indicator itself was insightful about the community’s perception of the social and relationship factors involved in preventing youth violence. But the actual process of discussing the instruments and constructs was rewarding for all parties: the academic researchers learned more about the lived experience of their community partners, who in turn learned more about measurement development and psychometric research.
- Don’t hesitate to collaboratively develop new measures. Another important outcome of the process of identifying existing data to measure community ideas was the realization that new measures and data might be needed to accurately capture the constructs defined by the community. While our community partners were initially concerned about the burden of adding new questionnaires, their views shifted once they saw that the benefit of actually measuring community-defined constructs would outweigh the costs of additional surveys.
Get Involved: I would love to hear from others who have done work in this area. We can compare notes on indicators and measures and possibly find ways to make measuring community-defined outcomes as routine as measuring outcomes defined by funders.
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to firstname.lastname@example.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.