AEA365 | A Tip-a-Day by and for Evaluators

Greetings! My name is Catherine Cooper, and I am the Faculty Director of the Educational Partnership Center and Professor of Psychology at the University of California, Santa Cruz. I invite you to explore and use the resources from the Bridging Multiple Worlds Alliance (BMWA).

The BMWA is a growing network of researchers, educators, and policymakers – including evaluators – in the U.S. and other nations who work with P-20 (preschool through graduate school) partnerships to support low-income, immigrant, and ethnic minority youth. These partnerships support youth in building pathways from childhood to college and careers without giving up ties to their families and cultural communities. We work in collaboration with alliance partners, including youth themselves and evaluators of programs and partnerships.

Rad Resource: In the BMWA, we offer three resources that evaluators tell us are especially useful:

  • Aligning models and measures to build a common language among partners.
  • Tools for research, policies, and practice, including formative and summative evaluation.
  • Longitudinal data tools for qualitative and quantitative evaluation and research.

The Bridging Multiple Worlds (BMW) Model (shown below) taps five dimensions for opening pathways:

  • Demographics—students’ age, gender, national origins, race/ethnicities, languages, and parents’ education and occupation
  • Students’ aspirations and identity pathways in college, careers, and cultural domains
  • Students’ math and language academic pathways through school
  • Resources and challenges across students’ cultural worlds of families, peers, schools, community programs, sports, and religious activities, among others
  • Partnerships that reach across nations, ethnicities, social class, and gender to open pathways from preschool through graduate school (P-20)

[Figure: the Bridging Multiple Worlds Model – Cooper, 21 April 2015]

Rad Resource: Bridging Multiple Worlds Tools include:

  • Survey measures of these five dimensions for middle/high school and college students
  • Activities for middle and high school students for building pathways to college and careers, with pre- and post-activity surveys (in English and Spanish)
  • Logic model template for programs and alliances among programs
  • Longitudinal case study templates

Rad Resource: I invite you to join BMWA partners – students, families, schools, community programs, and universities – in using these tools to ask your own questions and build common ground among evaluators, researchers, educators, and policymakers. The tools and other resources are available at www.bridgingworlds.org.

Rad Resource: Bridging Multiple Worlds: Cultures, Identities, and Pathways to College (Cooper, 2011) describes BMW and related models, supporting evidence, tools, and applications in P-20 research, practice, and policy work.

Hot Tip: Healthy partnerships are learning communities where “everyone gets to be smart.” Focus on questions and indicators partners are interested in, and display data in clear and meaningful formats. This increases enthusiasm, engagement, and cooperation. Examples of such questions, indicators, and formats are on our website.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· ·

Hi all, we’re blogging today from the National Resource Center on Domestic Violence. Cris Sullivan is NRCDV’s Senior Research Advisor, and Annika Gifford is Senior Director of Policy and Research. Together with CEO Anne Menard, we have been working on a project focused on helping domestic violence organizations evaluate how their services affect domestic violence survivors and their children.

Domestic violence (DV) programs are under growing scrutiny to demonstrate that they are making a significant difference in the lives of those using their services. Increasingly, funders are expecting them to demonstrate that their efforts are resulting in positive outcomes for survivors.

In addition to the issues facing all nonprofits trying to evaluate their impact (e.g., little to no money, time, or expertise), DV programs have the following additional factors to consider:

  • They are often working with people in crisis who may not be in a space to engage in program evaluation.
  • They have to consider the safety and confidentiality of the people with whom they work (so, for example, they cannot contact people later through the mail).
  • Some funders expect DV programs to have unrealistic or even victim-blaming outcomes (e.g., “victims will leave the relationship”).
  • DV programs recognize that each survivor seeking help has individual needs, life experiences, and concerns. Services are tailored to each person, making program evaluation that much more difficult.

Rad Resource: To help domestic violence programs evaluate their work on their own terms – and with no extra money or time – we have created an online resource center that houses a great deal of free and accessible resources.

Among other things, the DV Evidence Project houses a theory of change that programs can use to demonstrate the process through which their services result in long-term benefits for survivors and their children. The site also provides brief summaries of the evidence behind shelters, advocacy, support groups, and counseling (demonstrating that programs are engaged in “evidence-based practice”). Finally, evaluation tools are provided so that programs don’t need to reinvent the wheel. These evaluation tools include client surveys, tips for engaging staff in evaluation, strategies for gathering the data in sensitive ways, and protocols for interpreting and using the findings. We hope these resources are helpful to those in the field doing this incredibly important work!


· ·

Hello! I’m Sheila B. Robinson, aea365’s Lead Volunteer Curator. I teach Program Evaluation Methods at the University of Rochester’s Warner School of Education, and am a grant coordinator for Greece Central School District in Rochester, NY. In my spare time, I read, learn, and blog about evaluation, and it’s no exaggeration to say I never come away from my computer, a book, or an article on evaluation without learning something. “Ancora imparo” (“I am still learning”) is attributed to Michelangelo in his late 80s!

As I’m once again preparing my syllabus, I’m reflecting on a wealth of free and low-cost evaluation resources. Since I always want to improve the course for my students, I’m looking for new readings and activities to ensure my course is up to date, and that my students learn about the “big names” and big issues in the evaluation community today.

Lesson Learned: I’m convinced evaluators are the most collegial, collaborative, and generous people ever, and I’m always impressed with how many of them are willing to share their knowledge and resources with everyone.

Hot Tips:

1.) Fill your toolbox! Susan Kistler, AEA’s Executive Director Emeritus, has contributed numerous aea365 posts on free or low-cost technology tools. Search her name, or glance through the aea365 archive for links and descriptions.

2.) Join the conversations! Mentioned before, but definitely worth another look: AEA’s LinkedIn discussion group and EvalTalk – two places where I’ve learned about the multitude of websites, textbooks, and articles on evaluation, many of which have made their way into my course. Here’s a link to a discussion on “Comprehensive set of websites on evaluation and research methods.” I recently asked EvalTalk for some “must-read journal articles for program evaluation students” and got great responses; some people even sent me their syllabi! Cool trick: I’ve copied rich EvalTalk and LinkedIn discussions on a topic of interest (e.g., pre- and post-testing) to share with students as examples of the types of discussions evaluators have in “the real world” of evaluation work.

3.) Cull from collections! Who doesn’t love one-stop shopping? My favorite place for great collections is AEA’s site: check out everything under the Reading, Learning, and Community tabs, all the links on the main page, the evaluator and evaluation blogs, and evaluators on Twitter. Chris Lysy maintains a large collection of evaluation-related blogs at EvalCentral. Gene Shackman has amassed what is probably the largest collection of Free Resources for Program Evaluation and Social Research Methods.

4.) “Extend” your learning! Google “evaluation” + “extension” and find a universe of free tools and resources from university extension programs. Here are just a few: University of Wisconsin-Extension, Penn State Extension, NC Cooperative Extension, and K-State Research and Extension. I stumbled upon this collection at the University of Kentucky’s Program Development and Evaluation Resources.

Apprendimento felice! (Happy learning!)


 
