AEA365 | A Tip-a-Day by and for Evaluators


Hi, we are Olya Glantsman, a visiting professor of Psychology at DePaul University, and Judah Viola, dean of the College of Professional Studies and Advancement at National Louis University in Chicago. Last month we celebrated the release of our book, Diverse Careers in Community Psychology.

The idea for Diverse Careers in Community Psychology was born in response to a question our students often pose: “What can I do with a degree in Community Psychology?” Each time we heard this question, we became more convinced that a text like this is long overdue for students (undergraduate or graduate), for professionals looking to expand, shift, or change their careers, and for anyone mentoring or advising community-minded students or employees.

Below are some hot tips culled from an extensive career survey of more than 400 participants and from 23 chapters written by over 30 community psychologists (CPs) with various backgrounds.

Hot Tips: When looking for a job

  • Keep in mind that those with community-oriented degrees do not have trouble finding a job.
  • Many professionals successfully market their job skills and competencies rather than their degree, finding a “niche” for the unique interests and talents that an organization or team needs.
  • Many survey participants cited Practice Competencies as helping them secure a job and reported using those same competencies throughout their work.
    • All respondents reported using between five and fourteen competencies.
  • When searching for a practice job, start preparing while still in training, cast a broad net, and search across multiple disciplines.
  • Obtaining mentorship and networking are two of the most important activities you can pursue.
    • More than half (59%) of survey respondents found out about their current job through networking.

Rad Resources: Job Search

  • Participants use multiple search techniques when looking for employment (e.g., job postings, networking, listservs)
  • For practice-related job searches, try:
    • The AEA, APA, and American Public Health Association (APHA) websites, Indeed.com, Idealist.org, npo.net, simplyhired.com, and careerbuilder.com

Rad Resources: Job Training

Whether you are beginning your career or trying to expand or shift into a new arena, there are many options and opportunities. Whatever your journey, we hope you will find more helpful tips and hints in our book.

The American Evaluation Association is celebrating Community Psychology TIG Week with our colleagues in the CP AEA Topical Interest Group. The contributions all this week to aea365 come from our CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

·

Our names are Susan M. Wolfe and Kyrah K. Brown, and we are consultants at CNM Connect, where we provide evaluation and capacity building services to nonprofit organizations. Our work also includes evaluating community collaborations and coalitions. Effectively addressing most health, education, and other social issues at a systems level requires that communities address inequity and injustice.

RAD RESOURCE: In January 2017, an article titled “Collaborating for Equity and Justice: Moving Beyond Collective Impact” was published in the Nonprofit Quarterly.

The authors presented the following six principles to promote equity and justice; each has implications for how coalitions and community collaboratives are evaluated.

Principle 1: Explicitly address issues of social and economic injustice and structural racism.

  • HOT TIP: Nearly all human problems, especially where there are disparities in outcomes, can be traced to social and economic injustice and/or structural racism. As an evaluator, examine whether these issues are being discussed and directly addressed.

Principle 2: Employ a community development approach in which residents have equal power in determining the coalition’s or collaborative’s agenda and resource allocation.

  • HOT TIP: Ask who has the actual power to make decisions and set agendas for the collaborative.

Principle 3: Employ community organizing as an intentional strategy and as part of the process. Work to build resident leadership and power.

  • HOT TIP: Closely examine the membership and leadership to determine the extent to which residents, or those who are directly affected by the issue at hand, are members and leaders.

Principle 4: Focus on policy, systems, and structural change.

  • HOT TIP: Review the agendas and activities to determine whether they are promoting more programs, or facilitating change in policies, systems, and structures.

Principle 5: Build on the extensive community-engaged scholarship and research over the last four decades that show what works, that acknowledge the complexities, and that evaluate appropriately.

Principle 6: Construct core functions for the collaborative based on equity and justice that provide basic facilitating structures and build member ownership and leadership.

RAD RESOURCE: The Collaborating for Equity and Justice Toolkit provided by The Community Tool Box can be accessed at: https://www.myctb.org/wst/CEJ/Pages/home.aspx

HOT TIP: Many nonprofits and health agencies are engaged in collaborative work and are often looking for effective frameworks to model their work on. Evaluators can use the Collaborating for Equity and Justice Toolkit to facilitate discussions as well as coalition development and planning efforts. When you introduce nonprofits and collaboratives to the framework, it may help to give brief presentations or facilitate interactive planning sessions, and to prepare guided questions that help nonprofits think through how the six principles apply to their work. Applying the framework can be useful both for refining or developing intentional coalition goals and for evaluating a coalition’s efficiency and effectiveness in meeting those goals.


 

·

Hi! I’m Tara Gregory, Director of the Center for Applied Research and Evaluation (CARE) at Wichita State University. Like many evaluators, CARE staff are frequently tasked with figuring out what difference programs are making for those they serve. So we tend to be very focused on outcomes and see outputs as the relatively easy part of evaluating programs. However, a recent experience reminded me not to overlook the importance of outputs when designing and, especially, communicating about evaluations.

In this instance, my team and I had designed what we thought was a really great evaluation that covered all the bases in a particularly artful manner – and I’m only being partially facetious. We thought we’d done a great job. But the response from program staff was “I just don’t think you’re measuring anything.” It finally occurred to us that our focus on outcomes in describing the evaluation had left out a piece of the picture that was particularly relevant for this client – the outputs or accountability measures that indicated programs were actually doing something. It wasn’t that we didn’t identify or plan to collect outputs. We just didn’t highlight how they fit in the overall evaluation.

Lesson Learned: While the toughest part of an evaluation is often figuring out how to measure outcomes, clients still need to know that their efforts are worth something in terms of the things that are easy to count (e.g., number of people served, number of referrals, number of resources distributed). Although just delivering a service doesn’t necessarily mean it was effective, it’s still important to document and communicate the products of their efforts. Funders typically require outputs for accountability, and programs value this tangible evidence of their work.
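As a minimal sketch of the “easy to count” side, here is one way to tally common outputs from service records. All data and field names below are invented for illustration:

```python
# Illustrative sketch (hypothetical data): tallying simple outputs from
# service records so they can be reported alongside outcome measures.
from collections import Counter

# Each record notes one service delivered; names and values are made up.
service_records = [
    {"client_id": 1, "service": "referral"},
    {"client_id": 2, "service": "counseling"},
    {"client_id": 1, "service": "counseling"},
    {"client_id": 3, "service": "referral"},
]

outputs = {
    # Count unique clients, not rows, for "people served".
    "people_served": len({r["client_id"] for r in service_records}),
    # Count services delivered, broken down by type.
    "services_by_type": Counter(r["service"] for r in service_records),
}
print(outputs["people_served"])                 # → 3
print(outputs["services_by_type"]["referral"])  # → 2
```

Simple as it is, a tally like this answers the client’s “are we actually doing something?” question directly, without claiming anything about effectiveness.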

Cool Trick: In returning to the drawing board for a better way to communicate our evaluation plan, we created a graphic that focuses on the path to achieving outcomes, with the outputs offset to show that they’re important but not the end result of the program. In an actual logic model or evaluation plan, we’d name the activities, outputs, and outcomes more specifically based on the program. But this graphic helps keep the elements in perspective.

[Example graphic of outputs and outcomes]

·

Aloha! I am Anna Smith Pruitt, a community and cultural psychology doctoral candidate at the University of Hawai‘i at Mānoa. I currently work with Dr. Jack Barile and his Ecological Determinants Lab evaluating community mental health and housing programs that serve people experiencing severe mental illness and homelessness on O‘ahu. These program participants are medically and socially vulnerable due to their compromised mental and physical health and marginalized social identities. Community psychology (CP) emphasizes that any interaction – including evaluation – is an intervention, and we strive to make our evaluations empowering experiences for program participants. I have found CP methods and values – particularly collaboration, empowerment, citizen participation, and social justice – instructive in evaluating programs designed to serve marginalized groups. Relying on CP values and using a community-based participatory approach, Dr. Barile and I engaged participants in a Housing First (HF) program as co-researchers and together created an ongoing evaluation partnership. Below are tips and resources we found useful:

HOT TIP: Collaborate!

One way community psychologists work to empower marginalized groups is by including members as partners in the research process. Such participatory approaches can be difficult when working with vulnerable individuals who may be actively experiencing physical and mental health complications (e.g., psychosis), and often evaluators (myself included!) assume these individuals are incapable of meaningfully participating in research. In my experience, even the most vulnerable member can engage in the evaluation process!

RAD RESOURCE: For info on participatory approaches, see the Community Toolbox’s chapters on CBPR research and participatory evaluation.

HOT TIP: Intentionally build advocacy into the evaluation design.

Once empowered, marginalized groups want to use their power to change the social conditions that contribute to their marginalization. In other words, empowerment leads to increased citizen participation and advocacy! At the conclusion of data analysis, HF participants chose to advocate for themselves using program evaluation data.

RAD RESOURCE: Check out the Community Toolbox’s Advocating for Change Toolkit.

Photovoice exhibit

HF participants advocated for themselves through a Photovoice exhibit (above).

HOT TIP: Practice self-reflexivity/critical awareness.

When evaluating programs with marginalized groups, constantly examine your privilege and be aware of the power dynamics and historical context that shape your relationship with participants and their experiences. Given Hawai’i’s colonial history and the fact that a disproportionate number of people experiencing homelessness are Native Hawaiian, my status as a member of the colonizer group necessarily affected my relationships with program participants.

RAD RESOURCE: Assess your privilege by calculating your privilege capital. Identify and purge your biases with the Bias Cleanse.

HOT TIP: Be creative and flexible in choosing methods.

Participants will have varying skills and capabilities, and you will need to creatively strategize ways to involve program participants in the evaluation process. The Photovoice method was useful for the HF participants because it allowed for both visual and verbal contributions to knowledge building.

RAD RESOURCES: Need help brainstorming methods? See Action Catalogue – an interactive tool for choosing methods and Participatory Methods.org.


·

Greetings! Welcome to Community Psychology TIG Week! I, Carissa Coleman, a Community Psychologist from James Bell Associates, along with the other members of the TIG Leadership Team, welcome you to a week of Community Psychology and our influence on evaluation work.

Our Community Psychology ideals span many disciplines, including psychology, social work, education, medicine, and all types of prevention work.

We invite you to visit our website at http://comm.eval.org/communitypsychology/home/ to learn more.

Moving our Field: Toward Theory, Systems, and Dynamic Methods

As a Community Psychologist, I, Leonard A. Jason from DePaul University, would like to offer three ideas that have the potential to energize and transform our field. They involve theoretical perspectives, appreciation of the complexities of the natural world, and dynamic methodological tools that can be used to capture these complex processes.

Many of us work in the field of evaluation to better understand the relationship between people and their contexts in ways that might alleviate human suffering. Yet, as argued in a recent special issue on Theories in the Field of Community Psychology, the ideological nature of our work, which prioritizes efforts to improve people’s lives, can result in less willingness to consider the possible contribution of theory. I am not arguing that our work will coalesce around only one theory, but I believe there has been an unfortunate reluctance to attempt to develop predictive theory, in part because it is seen as a distraction from taking action. However, there is no obvious reason why sound theory cannot be developed that increases the effectiveness of our social action efforts and accomplishes our goal to better understand the complexities of people and groups living within multifaceted ecosystems.

Theory must contend with a natural world that is endlessly beautiful and elegant, but also one that often feels mysterious, unpredictable, and filled with contradictions. Dynamic feedback loops are the norm within this organic stew, and as a consequence, our work would be more contextually rich if it transcended reductionistic and simplistic linear cause-and-effect methods. Theories can help us capture a systems point of view, in which the reality of the ever-changing world is made up of mutual interdependencies regarding how people adapt to and become effective in diverse social environments.
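A toy simulation can make the feedback-loop point concrete. This is a purely illustrative sketch, with invented variables and parameters, of two quantities (say, program reach and community trust) that reinforce each other but are damped as they near a ceiling:

```python
# Purely illustrative: a mutual feedback loop with a carrying capacity.
# "reach" and "trust" and all parameters here are invented for the sketch.
def simulate(steps=50, growth=0.3, ceiling=100.0):
    reach, trust = 1.0, 1.0
    for _ in range(steps):
        # Each variable grows in proportion to the other, but the
        # (1 - x / ceiling) factor damps growth near the ceiling.
        reach += growth * trust * (1 - reach / ceiling)
        trust += growth * reach * (1 - trust / ceiling)
    return reach, trust

reach, trust = simulate()
# Both variables rise slowly, then steeply, then level off near the
# ceiling - a nonlinear, saturating trajectory, not a straight line.
```

Because growth is damped near the ceiling, the trajectory is S-shaped rather than linear, which is exactly the kind of pattern a simple linear cause-and-effect model misses.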

Rad Resource:  Are there methods that help us conceptualize and empirically describe these transactional dynamics? There are, such as those contained within the Handbook of Methodological Approaches to Community-Based Research, which profiles a new generation of quantitative and qualitative research methods that are holistic, culturally valid, and support contextually- and theoretically-grounded community interventions. Mixing qualitative and quantitative research methods can provide deeper exploration of causal mechanisms, interpretation of variables, and contextual factors that may mediate or moderate the topic of study. Theories and sophisticated statistical methods can help us address questions of importance for the communities in which and with whom we work by capturing the dynamics of complex systems and providing us the potential to transform our communities in fresh and innovative ways.


·

We are Natalie Wilkins and Shakiyla Smith from the National Center for Injury Prevention and Control, Centers for Disease Control and Prevention.

As public health scientists and evaluators, we are charged with achieving and measuring community and population level impact in injury and violence prevention. The public health model includes: (1) defining the problem, (2) identifying risk and protective factors, (3) developing and testing prevention strategies, and (4) ensuring widespread adoption. Steps 3 and 4 have proven to be particularly difficult to actualize in “real world” contexts. Interventions most likely to result in community level impact are often difficult to evaluate, replicate, and scale up in other communities and populations.[i]

A systems framework for injury and violence prevention supplements the public health model by framing injury within the community/societal context in which it occurs.[ii] Communities are complex systems: constantly changing, self-organizing, adaptive, and evolving. Thus, public health approaches to injury and violence prevention must focus more on changing systems than on developing and testing isolated programs and interventions, and must build the capacity of communities to implement, evaluate, and sustain these changes.[iii] However, scientists and evaluators face challenges when trying to encourage, apply, and evaluate such approaches, particularly in collaboration with stakeholders who may hold conflicting perspectives. A systems framework requires new methods of discovery, collaboration, and facilitation that effectively support this type of work.

Lessons Learned:

  • Evaluators can use engagement and facilitation skills to help stakeholders identify their ultimate goals/outcomes and identify the systems within which these outcomes are nested (Goodman and Karash’s Six Steps to Thinking Systemically provides an overview for facilitating systems thinking processes).
  • Evaluators must also address and communicate about the high-stakes, contentious issues that often undergird intractable community problems. “Conversational capacity”[iv] is an example of a skillset that enables stakeholders to be both candid and receptive in their interactions around challenging systems issues.

Rad Resources:

  • Finding Leverage: This video by Chris Soderquist provides an introduction to systems thinking and how it can be applied to solve complex problems.
  • The Systems Thinker: Includes articles, case studies, guides, blogs, webinars and quick reference “pocket guides” on systems thinking.

[i] Schorr, L., & Farrow, F. (2014, November). An evidence framework to improve results. Harold Richman Public Policy Symposium, Washington, DC: Center for the Study of Social Policy.

[ii] McClure, R. J., Mack, K., Wilkins, N., & Davey, T. M. (2015). Injury prevention as social change. Injury Prevention, injuryprev-2015.

[iii] Schorr, L., & Farrow, F. (2014, November). An evidence framework to improve results. Harold Richman Public Policy Symposium, Washington, DC: Center for the Study of Social Policy.

[iv] Weber, C. (2013). Conversational capacity: The secret to building successful teams that perform when the pressure is on. New York, NY: McGraw-Hill Education.


·

Hi! I am Cathy Lesesne, and I work at ICF International doing public health related evaluation and research. My passion is doing work that affects the lives of adolescents, particularly those with the most need and the least voice in how to meet those needs. I do a lot of work in and with schools and school districts focused on optimal sexual health for teens and on ensuring youth have the skills and ability to make healthy choices no matter when they decide to engage in sexual activity.

I often see well-intentioned school or school district staff creating solutions for youth and testing them, rather than involving youth in identifying solutions and evaluating their success. It is clearly easier to retain the power to determine the solutions and then to see through evaluation whether they worked. However, in my own work I have seen the power of youth engagement and involvement both in developing programs and services and in helping to evaluate and improve those resources.

Rad Resources: As evaluators, we often have the ability to make recommendations to our clients and partners working with youth AND we have the power to approach our evaluation work with youth in empowering and engaging ways. But we don’t always know how. I highly recommend that you dig into the Youth-Adult Partnerships in Evaluation (Y-AP/E): A Resource Guide for Translating Research into Practice and find your own ways to apply the wide range of ideas, tip sheets, and examples for engaging youth as partners in evaluation. Many of these examples may also help your clients or partners think of ways to better engage youth in the development of programs and services that reflect them and their real interests and needs. If youth are empowered to be partners in developing and testing solutions, they become allies instead of subjects; sources of solutions instead of sources of data.


·

I am Carrie Lippy, an independent evaluator working with community-based and culturally specific domestic violence agencies. For the last two years, I have worked closely with the NW Network, an agency providing intervention and prevention services to lesbian, gay, bisexual, transgender, and queer (LGBTQ) survivors of abuse.

Thinking about community-level impacts often brings to mind regional or place-based notions of community: interventions targeting neighborhoods, cities, or counties, for example. However, many culturally specific programs aim to impact identity-based, rather than place-based, communities. Identity-based communities are those developed among people with shared identities, such as sexual orientation or gender identities.

Below are some lessons learned & rad resources for evaluating community-level impacts for identity-based communities.

Lessons Learned:

  • Be clear on who the community is. Defining identity-based communities can be tricky. For example, when looking at impacts on LGBTQ communities, evaluators need to be mindful of the impressive diversity of LGBTQ people, recognizing that even the terminology used to identify members of these communities may differ widely. Terminology can differ by factors such as age, race/ethnicity, and region. Some members of LGBTQ communities may even identify as heterosexual (e.g., some transgender people).
  • Recognize the importance of online community spaces. Since identity-based communities typically have less connection to geographic areas, online spaces hold particular importance for connecting community members. In fact, even online spaces that are not culturally specific can still reach many identity-based communities. Recently, the National LGBTQ Domestic Violence Capacity Building Learning Center partnered with the National Domestic Violence Hotline to examine the experiences of LGBTQ survivors of domestic violence. Although the online survey was posted on the non-LGBTQ-specific Hotline website, nearly 600 LGBTQ survivors completed the survey, illustrating the reach of even non-culturally specific online spaces.
  • Consider alternative sampling strategies. Some identity-based communities can be especially challenging to reach, making community-level effects difficult to measure. In my work with the NW Network, we’ve found snowball sampling strategies particularly effective for reaching marginalized members of LGBTQ communities, including some transgender communities of color and LGBTQ immigrants. Snowball sampling techniques use existing connections in communities to recruit research participants.
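The snowball sampling idea can be sketched in a few lines of code. The referral network and names below are entirely made up for illustration:

```python
# Illustrative sketch (hypothetical data): snowball sampling that starts
# from a few known community contacts and follows their referrals.

# Made-up referral network: who each person can refer us to.
referrals = {
    "ana": ["ben", "caro"],
    "ben": ["dee"],
    "caro": ["dee", "eli"],
    "dee": [],
    "eli": ["ana"],
}

def snowball(seeds, waves=2):
    """Collect a sample by following referrals for a fixed number of waves."""
    sample, frontier = set(seeds), list(seeds)
    for _ in range(waves):
        next_frontier = []
        for person in frontier:
            for contact in referrals.get(person, []):
                if contact not in sample:
                    sample.add(contact)
                    next_frontier.append(contact)
        frontier = next_frontier
    return sample

print(sorted(snowball(["ana"])))  # → ['ana', 'ben', 'caro', 'dee', 'eli']
```

Starting from one trusted contact, two waves of referrals reach the whole (toy) network, which is why the method works well for communities with strong internal ties but little visibility to outsiders.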

Rad Resources:

For those interested in learning more about culturally-specific research and practice in LGBTQ communities, check out:

  • A free, online library with resources on domestic violence in LGBTQ communities. The library was created by the National LGBTQ DV Capacity Building Learning Center, a joint project of the NW Network and the National Coalition of Anti-Violence Projects.


·

Hi! My name is Amy Hilgendorf and I am the Associate Director for Engaged Research at the University of Wisconsin-Madison Center for Community and Nonprofit Studies (the CommNS). We specialize in community-based action research and evaluation partnerships with grassroots and nonprofit groups and offer support to others who do this work.

In recent years, we have partnered with county-based and statewide coalitions that are seeking to address childhood obesity by applying a model of collective impact. John Kania and Mark Kramer first characterized collective impact as consisting of five key conditions that can help unite multi-sector collaborative efforts towards greater community impact than what isolated efforts can achieve. Those five conditions are: a common agenda, mutually reinforcing activities, continuous communication, shared measurement systems, and backbone support. The coalitions we work with have found the collective impact model offers valuable guidance for the kinds of processes that will set them up for achieving impact, but questions remain about how to actually evaluate the impacts of collective impact.

Rad Resource:

The Collective Impact Forum is an online hub of information, resources, and peer networking related to collective impact. The searchable resources section includes a host of “Evaluation” resources. One tool is the Guide to Evaluating Collective Impact by Hallie Preskill, Marcie Parkhurst, and Jennifer Splansky Juster. While much of this guide focuses on evaluating the process of collective impact, the third part lists suggested behavior changes and systems changes that may result from collective impact initiatives and provides ideas of indicators and approaches for evaluating these changes.

Lessons Learned:

We have found it critical to remember that collective impact is not necessarily a new concept, but rather one that has emerged from a long tradition of collaborative and coalition practice and thinking. Literature on this topic stretches back more than 30 years, especially in the community psychology field, and includes theory and practical tools for assessing the process and impact of collaborative work.

In particular, the Community Coalition Action Theory developed by Fran Butterfoss and Michelle Kegler synthesizes much of this research to suggest how coalition practices can lead to different kinds of community impacts. These theorized impacts include community change outcomes, such as policy achievement and program expansions; community capacity outcomes, like new skill development and new partnerships; and, over time, the health and social outcomes that are the target of the coalition’s work. Additionally, we have found that Michelle Kegler and Deanne Swan’s efforts to empirically test the relationships in this theory offer especially useful guidance for “connecting the dots” between evaluation of coalition processes, including implementation of collective impact practices, and evaluation of community impacts.


·

My name is Courtney Barnard and I am a social worker, coalition coordinator, and program evaluator for a children’s health care system in Fort Worth, Texas.

Lessons Learned: I recently attended a workshop on local children’s health data and racial equity. I quickly realized that, though well-intentioned, my colleagues and I often conflate race and ethnicity with other factors, such as poverty, education, and health status. In my community these factors strongly correlate with one another. Not only was I making assumptions about people and data based on these correlations, I was not intentionally analyzing outcomes by race. I was left wondering about the consequences for my data analysis, for the strategies I recommended and implemented, and ultimately for the community affected by these decisions.

The Race Matters Institute of Just Partners, Inc. (RMI) seeks to ensure that ALL children, families, and communities thrive. When evaluating the effects of our efforts on the community, we must purposefully do so through a racial equity lens. Using this lens allows us to identify how various factors may affect racial groups with different resources and needs, thus shaping our analysis, recommendations, and strategies.

According to RMI, racial equity results when you cannot predict an outcome by race and when data reveals closing gaps in outcomes. It is quantifiable and measurable. When evaluating community impact through a racial equity lens, we must note:

  1. Attention to racial equity is key to advancing the mission for all community members, unless the stated mission explicitly targets a certain race or ethnicity.
  2. Applying a racial equity lens requires that data be disaggregated by race. We must collect data by race and then systematically disaggregate it in a consistent manner.
  3. Racial equity is about our shared fate as a community. It requires all voices and perspectives from the community be represented, valued, and included in policies and practices.
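Point 2 above can be made concrete with a small sketch (all numbers invented): disaggregate an outcome by race and track whether the gap between groups is closing over time, which is RMI's measurable test of racial equity.

```python
# Illustrative sketch (invented numbers): disaggregating an outcome by
# racial/ethnic group and checking whether the gap is closing over time.
records = [
    {"year": 2015, "group": "A", "met_outcome": 62, "total": 100},
    {"year": 2015, "group": "B", "met_outcome": 44, "total": 100},
    {"year": 2017, "group": "A", "met_outcome": 66, "total": 100},
    {"year": 2017, "group": "B", "met_outcome": 58, "total": 100},
]

def rates_by_group(year):
    """Outcome rate for each group in a given year."""
    return {r["group"]: r["met_outcome"] / r["total"]
            for r in records if r["year"] == year}

def gap(year):
    """Difference between the best- and worst-faring groups."""
    rates = rates_by_group(year)
    return round(max(rates.values()) - min(rates.values()), 2)

print(gap(2015), gap(2017))  # → 0.18 0.08
```

In this made-up example the gap narrows from 18 to 8 percentage points: outcomes are improving for both groups, and the data reveal a closing gap, which is the direction a racial equity lens looks for.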

Rad Resource: The Annie E. Casey Foundation developed the Racial Equity Impact Analysis (REIA), a five-question tool that can be used to assess any existing or proposed policy, program, or practice. The tool can help identify the changes needed to achieve better results for all community members.

  1. Are all racial/ethnic groups who are affected by the policy/practice/decision at the table?
  2. How will the proposed policy/practice/decision affect each group?
  3. How will the proposed policy/practice/decision be perceived by each group?
  4. Does the policy/practice/decision worsen or ignore existing disparities?
  5. Based on the above responses, what revisions are needed in the policy/practice/decision under discussion?

