AEA365 | A Tip-a-Day by and for Evaluators

TAG | social justice

Hello! My name is Libby Smith, and I work at the University of Wisconsin-Stout, where I am fortunate enough to work in evaluation in multiple capacities. Today I’d like to share with you some of the lessons I’ve learned from being a member of ¡Milwaukee Evaluation! First, Nicole Robinson and the rest of the executive board are tireless promoters of our mission both in and outside of our state. They have truly taught me the value of building connections across long distances and being part of a network that shares a common goal. Second, I have learned that infusing social justice into my work is not optional or occasional.

Lessons Learned:

From participating in webinars on using racial equity in evaluation to attending last spring’s Social Justice and Evaluation Conference, the professional development I have received as a member has been consistent, effective, and incredibly valuable to my growth as an evaluator. I was honored to be asked to present an Eval 101 session at the spring conference, and the lessons I learned from listening to the people who attended were just as valuable. The organization’s commitment to promoting social justice within evaluation sets it apart from the other groups that I belong to.

My connection to ¡Milwaukee Evaluation! led to the most professionally satisfying work of my career. In 2015, I began collaborating with the Annie E. Casey Foundation as they established the Leaders in Equitable Evaluation and Diversity (LEEAD) program, an effort to increase the number of underrepresented evaluators of color, a mission directly aligned with our goals. Through our Graduate Certificate in Evaluation Studies, we provide evaluation training to the early-career scholars in the LEEAD program. I am so proud of the work that we are doing, knowing that we are exponentially expanding our ability to bring change to the field of evaluation and to the communities we work in as evaluators.

The American Evaluation Association is celebrating ¡Milwaukee Evaluation! Week with our colleagues in the Wisconsin statewide AEA Affiliate. The contributions all this week to aea365 come from our ¡Milwaukee Evaluation! members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

I’m Maya Pilgrim, the Texas Association Against Sexual Assault’s Evaluation Manager. My work has focused on the primary prevention of sexual violence – addressing the root causes to prevent sexual violence from happening in the first place. This means addressing systemic attitudes and inequities that, while often similar, can look different according to context: a classic wicked problem.

In early September, at the National [Anti-]Sexual Assault Conference, I posed the question, “What if we saw evaluation AS social justice work?”  As not just a means to an end, but a means AND an end.  I shared a few of the Eight Key Feminist Evaluation Principles to frame our conversation:

  1. Evaluation is a political activity.
  2. Knowledge is culturally, socially, and temporally contingent.
  3. Knowledge should be a resource of and for the people who create, hold, and share it.
  4. There are multiple ways of knowing (and some ways are privileged over others).

Lesson Learned: Running through the various conversations about what we need to change in our current evaluation practices were the concepts of reciprocity, shared power, and centering those most affected. After returning home, I found myself sitting in a meeting feeling frustrated. Once again, things were looking less equitable and reciprocal than I’d hoped. I realized that in many professional contexts, a major barrier to making evaluation an extension of social justice work and centering those most affected by systemic inequities was “the deadline.”

For many mainstream non-profit organizations trying to improve their evaluation practices, including the perspectives and knowledge of those who are marginalized often requires going beyond well-worn networks and established work plans and finding ways to compensate those whose knowledge isn’t normally compensated. All of this takes more time than is often budgeted in program planning. If we want evaluation to be an extension of social justice work, funders, organizations, and programs have to be creative and innovative in re-envisioning the assumed processes and timelines for designing and evaluating programs. Have you or your organization developed a creative or innovative way to do this? Don’t keep it to yourself!

Rad Resources:

Brisolara, Seigart and SenGupta’s Feminist Evaluation and Research: Theory and Practice is a great place to start.

Community at the Center highlights themes that overlapped substantially with our conversations around evaluation as social justice in terms of reciprocity, sharing power, and centering those most affected.

The National Latino Network’s Building Evidence Toolkit utilizes many of the key feminist evaluation principles.

Developmental Evaluation Exemplars highlights how the responsive and flexible nature of developmental evaluation (DE) puts these principles into action.

AEA365 Blog: searching “Social Justice” gives you access to all the previously shared wisdom and resources by Bessa Whitmore, Nora Murphy, MQP, Liz Zadnik and more.


Hi, I’m Nora F. Murphy, a developmental evaluator deeply committed to social justice. I recently attended the Minnesota Evaluation Studies Institute (MESI) Spring Training and Donna Mertens’ workshop on Weaving Social Justice and Evaluation Together. Stimulated by the concepts and conversations, I have been reflecting on how social justice appears in my practice and came away with the following insights:

Lessons Learned:

#1: I actively choose to evaluate projects related to systems change that aim to increase social justice and equity, and I assumed this was enough. Mertens challenged us to go a step further by placing human rights and social justice at the center. While these elements are always present in my evaluations, they are not always at the center.

#2: Where people are working towards social justice and equity, there is trauma—individual and community, past and present. Evaluators can ignore this and, I suspect, often do. I realized that my most meaningful evaluations did not ignore this but rather recognized and honored this aspect of people’s experiences.

#3: AEA’s Guiding Principles for Evaluators (2004) state that evaluators bear responsibility for general and public welfare. When designing an evaluation, I can choose to ignore the trauma, or I can create the space to recognize the trauma and promote healing as a way to benefit both individuals and society as a whole.

I will ask myself these questions and commit to the following as I explore the intersection of evaluation, social justice, trauma, and healing:

  • What gets placed at the center? Mertens suggests we place human rights and social justice at the center. I will be more intentional about doing so.
  • How do I attend to what’s in the center? I will consider methods that promote healing through deep listening, bearing witness, and creating opportunities for people to connect to their inner selves and to each other.
  • For what purpose and to what ends do we evaluate? Bob Williams suggested recently in an EvalTalk post (4.4.15) titled “Evaluation’s Warrant” that one possible purpose is to serve humanity. I will deepen my thinking about this idea.
  • Who is evaluating? Educator Parker Palmer (2009) asks himself: “How does the quality of my selfhood form—or deform—the way I relate to my students, my subject, my colleagues, my world?” In a similar vein, I will ask this question of myself as an evaluator and do the inner work needed to bring my best self to my work.


My name is Donna M. Mertens and I am an independent consultant based in Washington DC; my work is both domestic and international. I had the honor of being the keynote speaker at the Minnesota Evaluation Studies Institute (MESI) in March 2015. The MESI theme was Social Justice amidst Standards and Accountability: The Challenge for Evaluation. The concept of social justice in the context of evaluation implies that evaluators can play a role in addressing those wicked problems that persist in society, such as violence, lack of access to quality education for all, poverty, substance abuse, and environmental pollution.

Lesson Learned: Wicked Problems and Social Justice. Evaluators are concerned about wicked problems and involved in contributing to their solutions. They also recognize the importance of bringing a social justice lens to this work. Michael Harnar conducted a survey of 1,187 evaluators and reported that 69% (n = 819) either strongly or somewhat agreed with this statement: Evaluation should focus on bringing about social justice.
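
The reported n follows directly from the percentage and the sample size (rounded to the nearest whole respondent):

  $0.69 \times 1{,}187 \approx 819$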

Rad Resource: Mertens, D. M. (2015). Mixed methods and wicked problems [Editorial]. Journal of Mixed Methods Research, 9(1), 3-6. Abstract: http://mmr.sagepub.com/content/9/1/3.extract

Harnar, M. (2014). Developing criteria to identify transformative participatory evaluators. Journal of MultiDisciplinary Evaluation. http://journals.sfu.ca/jmde/index.php/jmde_1/article/view/383

Lesson Learned: A Social Justice Lens Leads to Different Evaluation Questions. Evaluators who work with a social justice lens are concerned with program effectiveness and with answering the impact question: Did “it” work? They are also interested in asking other types of questions:

  • Was “it” the right thing?
  • Was “it” chosen and/or developed and implemented in culturally responsive ways?
  • Were contextual issues addressed, such as culture, race/ethnicity, gender, disability, deafness, religion, language, immigrant or refugee status, age, or other dimensions of diversity that are used as a basis for discrimination and oppression?
  • How were issues of power addressed?
  • Do we want to continue to spend money on things that don’t work?

Rad Resource: The Native American Center for Excellence published Steps for Conducting Research and Evaluation in Native Communities, which provides a specific context in which a social justice lens is applied in evaluation.

Lessons Learned: Social Justice Criteria for Evaluators. Evaluators who work with a social justice lens consider the following criteria to be indicators of the quality of the evaluation:

  • Emphasizes human rights and social justice
  • Analyzes asymmetric power relations
  • Advocates culturally competent relations between the evaluator and community members
  • Employs culturally appropriate mixed methods tied to social action
  • Applies critical theory, queer theory, disability and deafness rights theories, feminist theory, critical race theory, and/or postcolonial and indigenous theories

Rad Resource: Reyes, J., Kelcey, J., & Diaz Varela, A. (2014). Transformative resilience guide: Gender, violence and education. Washington, DC: World Bank.

The American Evaluation Association is celebrating MESI Spring Training Week. The contributions all this week to aea365 come from evaluators who presented at or attended the Minnesota Evaluation Studies Institute Spring Training.


I’m Chad Green, Program Analyst at Loudoun County Public Schools in Northern Virginia, and Chair of the PreK-12 Educational Evaluation TIG. Welcome to our sponsored AEA365 week in conjunction with Teacher Appreciation Week, which is celebrated from May 5-9 in the U.S.

Since 2011, I have championed our TIG’s core values, quoted below from our website.

“The PreK-12 Educational Evaluation TIG values relevant, responsive, high quality educational evaluation that reflects our beliefs in social justice, equity, and the importance of educating the whole child.”

How so? When my colleagues’ language and actions, and my own, align with these core values, I acknowledge and celebrate them publicly. Barbara Taylor called this form of strategic dialogue “metasensemaking” in her study of elementary school principals involved in the process of organizational change. She defined it as “a form of organizational enactment used to further the potential for organizational momentum and individual motivation.” The “Walking the Talk” section of the monthly AEA Newsletter is another example of this leadership practice.

The beauty of our TIG’s values is that, as essentially contested concepts, anyone can take an active role in their interpretation and enactment. While I have had the opportunity to celebrate instances of equity and whole child education in action over the years, social justice has eluded me. I attribute this shortcoming primarily to my own lack of understanding of this fuzzy concept.

Lesson Learned: I write this now because I recently learned, during a persuasive presentation by Paul Carr of Lakehead University in Canada, that many people understand the role of democracy in education only in a “thin” way.

Last month a few of us from the TIG’s leadership team attended the Annual Meeting of the American Educational Research Association. In his panel presentation sponsored by the Dewey Studies SIG, Carr introduced the following framework of teacher/student engagement in democracy, which builds on the seminal work of Benjamin Barber.

[Image: Carr’s notion of Thin vs. Thick Democracy]

According to Carr, a thin connection to democracy in education is exemplified by weak linkages between the school experiences of teachers and students and the broader experiences of society in general. He argued that social justice can only be expressed on the thick end of the spectrum, in which freer forms of democracy influence all aspects of how education is organized (e.g., academic curriculum, assemblies, extra-curricular events, and staff meetings).

Carr’s presentation struck a chord because it reminded me of Sandra Mathison’s panel presentation last year on her efforts to bridge the evaluation culture gap in school systems throughout British Columbia (see below).  I look forward to celebrating other educational evaluators who have integrated social justice into their practice.

[Image: Mathison’s notion of the Evaluation Culture Gap]

The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PK12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members.


Greetings from Music City! I am Kathryn Bowen, Vice President for Research & Evaluation at Centerstone Research Institute. I am involved with the planning, implementation, and reporting of multiple program evaluations. Universal professional values like honesty, integrity, accountability, tolerance, and respect for people influence all my work. My feminist values are aligned with fairness, social justice, equity, and empathy regardless of political, social, economic, geographic, gender, ethnic, and age differences. I intentionally design evaluations to increase the likelihood that the data collected, analyzed, and reported help me understand the multiple realities and lived experiences of women and sensitize me to social structures that perpetuate inequity, oppression, social injustice, and powerlessness of women. This is done most commonly by including multiple and mixed methods in my evaluation/analysis plans. My ultimate aim is to generate knowledge to create change that makes a difference in the lives of women. A disclosure and conscious decision on my part is that in some program contexts I need to conduct these feminist evaluations without framing them as feminist.

Lessons Learned:

  • I have found that using the term feminist can undermine the intent of the evaluation work. Instead, I strive to be clear about the values and objectives of the evaluation.
  • Depending upon the program context and the culture and perspective of program participants, framing my approach as feminist might exclude rather than include the women most significantly impacted by the program.
  • It is important to recognize that my concerns as an educated, white, middle class woman may not be normative for traumatized women enrolled in co-occurring mental health and substance abuse treatment programs.
  • Systematic oppression can be totally invisible to women who have internalized it from the cradle.
  • To integrate feminist guidelines that help frame evaluation planning and implementation, it is best to be transparent about what I mean by “feminist evaluation” rather than being strident about using the term “feminist” or labeling myself a “feminist evaluator.”

Hot Tip:

  • Feminist evaluation principles need to be reflected clearly, in words marked by authenticity rather than by a label.
  • As a feminist evaluator, clearly frame your values, seek to understand the values of program stakeholders, and establish ways to communicate shared and divergent values in the process. This can help you understand lived experiences and identify structural inequality that exists in organizations, institutions, governments, or social networks where embedded bias provides advantages for some members and marginalizes or produces disadvantages for others.



My name is Bonnie Stabile and I teach Program Evaluation in the MPA and MPP programs at George Mason University. I served as a session scribe at Evaluation 2010 and attended session number 535, Teaching About Specific Aspects of Evaluation. I chose this session because I am interested in learning about innovative facets of the field to share with my students and am always hoping to hear of ways to enhance classroom instruction.

Lessons Learned: Topics covered in this valuable session addressed the enduring question of how to incorporate qualitative considerations into a quantitatively focused evaluation class, the compelling question of how to incorporate social justice considerations into the teaching of evaluation, and the practical, yet critically important, consideration of how to develop evaluation reports that are “useful, user-friendly and used.”

Session chair John Stevenson of the University of Rhode Island shared his means for inculcating an appreciation for “qualitative knowing” in his psychology students as they approach their evaluation endeavors, by including exercises using a “cultural review,” a participatory research role play, and an ethical reflection, all inspired by the work of Ian Shaw of the University of York.

Veronica Thomas of Howard University and Anna Madison of the University of Massachusetts, Boston urged the infusion of a social justice perspective into graduate evaluation training.  They argue that faculty and students should not focus solely on theory, methods and practice in evaluation classes, but should also consider the social inequalities that shape both problem identification and programmatic efforts to resolve social problems.  Thomas and Madison advocate a rejection of the position that knowledge is neutral and untainted by human interests, and the inclusion of course work and experiences that help students recognize social inequities and privilege.

Tamara Walser of the University of North Carolina at Wilmington outlined a four-step process for improving evaluation reports that she requires of her students. The first step is to create charts, graphs, and tables from the data; the second is to outline the story the data tells through the effective use of headings and subheadings; the third is to write the narrative using storytelling (but not creative writing!) techniques; and the fourth is to choose the very best illustrations of the main findings from among the charts, graphs, and tables for inclusion in the report, while relegating supporting documentation to the appendix.
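
For evaluators who script their analyses, here is a minimal sketch of that first, charts-and-tables step. The data file and column names (site, satisfaction) are hypothetical placeholders of my own, not material from Walser’s session; the point is simply to generate every candidate table and figure before any narrative is written.

  # A sketch of "step one": build candidate tables and charts before writing.
  # The CSV file and column names below are illustrative assumptions.
  import pandas as pd
  import matplotlib.pyplot as plt

  responses = pd.read_csv("survey_responses.csv")  # hypothetical evaluation data

  # Candidate table: mean satisfaction and respondent count by site.
  summary = responses.groupby("site")["satisfaction"].agg(["mean", "count"])
  summary.to_csv("appendix_satisfaction_by_site.csv")  # full detail goes to the appendix

  # Candidate figure: one chart per potential finding, to be culled in step four.
  ax = summary["mean"].plot(kind="bar", title="Mean satisfaction by site")
  ax.set_ylabel("Mean rating")
  plt.tight_layout()
  plt.savefig("figure_satisfaction_by_site.png")

The same idea applies whatever tool you use: draft every candidate exhibit first, then write the story around the strongest ones and move the rest to the appendix.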

Great Resources: The following items are recommended for further exploration of this session’s topics:

  • Ian Shaw, Qualitative Evaluation, Sage Publications, 1999.
  • Veronica Thomas and Anna Madison, “Integration of Social Justice into the Teaching of Evaluation,” American Journal of Evaluation 31 (4), 2010.*

*AEA members have free access to all articles from the American Journal of Evaluation, via the AEA website.


At AEA’s 2010 Annual Conference, session scribes took notes at over 30 sessions and we’ll be sharing their work throughout the winter on aea365.

