Hi, I’m Pat Campbell, president of Campbell-Kibler Associates, Inc. A couple of years ago, Eric Jolly, president of the Science Museum of Minnesota, and I were wondering why, with so many evaluations of so many programs, we didn’t know more about “what works, for whom, in what context.” Our conclusion: rigor is not enough. Even the most rigorous evaluations can be incomplete, or even wrong, when they don’t take into account the needs, values, issues, and goals of different subgroups. Our solution, with the help of the National Science Foundation, was to develop BeyondRigor.org to provide folks with lots of easy-to-use tips for improving evaluation with diverse populations (and let’s face it, all populations are diverse).
Hot Tips from BeyondRigor.org:
- Whenever possible, use measures that have been tested and validated with groups similar to the groups who will be given the measures.
- Have members of the target population review affective and psychosocial measures for clarity. Ask them what concepts they think are being measured. If what is being measured is obvious, consider using a less obvious measure.
- Prior to the data collection, provide the observer or interviewer with as little demographic information as possible about the participants.
- When interpreting demographic differences, consider such conceptually relevant, and possibly confounding, factors as socioeconomic status, individual and family educational backgrounds, immigrant status, and place of residence. Where possible include statistical controls.
- Make comparisons across more and less effective projects/programs. Factors common across effective projects/programs may also be common across ineffective ones.
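The tip about confounding factors above can be illustrated with a toy example. The sketch below uses a simple stratified comparison (one basic form of statistical control) on made-up post-test scores; the group labels, SES strata, and numbers are all hypothetical, chosen only to show how a raw group difference can shrink or even reverse once a confounder like socioeconomic status is held constant.

```python
from collections import defaultdict

# Hypothetical post-test scores for two participant groups, each record
# tagged with a possible confounder (socioeconomic status stratum).
# All names and numbers are illustrative, not real evaluation data.
records = [
    {"group": "A", "ses": "low",  "score": 62},
    {"group": "A", "ses": "low",  "score": 64},
    {"group": "A", "ses": "high", "score": 80},
    {"group": "B", "ses": "low",  "score": 60},
    {"group": "B", "ses": "high", "score": 78},
    {"group": "B", "ses": "high", "score": 82},
]

def mean(xs):
    return sum(xs) / len(xs)

# Naive comparison: overall group means, ignoring the confounder.
naive = {g: mean([r["score"] for r in records if r["group"] == g])
         for g in ("A", "B")}
naive_diff = naive["A"] - naive["B"]

# Stratified comparison: compare the groups within each SES stratum,
# then average the within-stratum differences.
by_stratum = defaultdict(lambda: defaultdict(list))
for r in records:
    by_stratum[r["ses"]][r["group"]].append(r["score"])

diffs = [mean(gs["A"]) - mean(gs["B"])
         for gs in by_stratum.values() if "A" in gs and "B" in gs]
adjusted_diff = mean(diffs)

print(round(naive_diff, 2))     # raw gap, confounded by the SES mix
print(round(adjusted_diff, 2))  # gap after stratifying on SES
```

In this toy data, the raw comparison makes group A look worse simply because it has more low-SES participants; within strata the gap disappears. Real evaluations would use regression or other model-based controls, but the logic is the same.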
Lesson Learned: Context is key. Contextual factors include individual factors (e.g., age, race, ethnicity) and group factors (e.g., the local economy, available resources, location, changes in project/program leadership). Evaluators need to:
- have knowledge and understanding of the contextual factors that are important to the evaluation;
- be aware of their own world view and the assumptions they make, especially in data analysis and interpretation;
- build relationships and trust with project/program staff and participants.
Rad Resource: BeyondRigor.org, of course. Its tips can help evaluators collect more accurate data, do more appropriate analysis, and be more likely to collect the right data in the first place. It can also help evaluators better understand the role of context and explore some contextual factors that are important to consider in evaluations.
Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.