AEA365 | A Tip-a-Day by and for Evaluators

TAG | cultural responsiveness

My name is Chandra Story. I was a 2016 AEA Minority Serving Institution Fellow and am a faculty member at Oklahoma State University in Health Education and Promotion/Public Health. My background includes project management and evaluation for federally funded projects and non-profit organizations. I have had the opportunity to partner with amazing community members across the country to describe and define what evidence means. The purpose of this blog is to explore and share a few tips on evidence-based practice and practice-based evidence.

A few definitions and thoughts:

Evidence-based practice (EBP) is considered the foundation of public health practice. As a scholar, I am aware of the importance of evidence as a framework. However, as a culturally responsive evaluator, I need to allow community members to compare current EBP with their culture and definitions of health. By engaging community members, we add to the evidence base.

Practice-based evidence (PBE) is the result of meaningful partnerships between academia and communities to identify and develop appropriate evaluation strategies. Because of cultural nuances, evidence may be defined in different ways. For example, increases in self-esteem among youth who participate in cultural practices can be considered evidence of program success in some communities.

Hot Tip:

In closing, I feel that both EBP and PBE are needed for effective and culturally responsive evaluation. As an evaluator, I am responsible for investigating how success is defined by the community. With the right conversations, evaluators and communities can partner for better health outcomes.

The American Evaluation Association is celebrating AEA Minority Serving Institution (MSI) Fellowship Experience week. The contributions all this week to aea365 come from AEA’s MSI Fellows. For more information on the MSI fellowship, see this webpage: http://www.eval.org/p/cm/ld/fid=230 Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Art Hernandez and I am a Visiting Professor at University of the Incarnate Word in San Antonio, Texas.

I participated in the second yearlong experience of the AEA Minority Serving Institutions Fellowship program and have served as the Director for several cohorts, most recently this past year. I teach and practice Evaluation and am very interested in operationalizing cultural responsiveness and developing metrics related to it in practice. I am a member of the Indigenous, Multicultural, and La Red TIGs (Topical Interest Groups).

Lesson Learned:

Evaluation theory and practice (including issues and ideas related to cultural responsiveness) is constantly evolving and developing. A unique academic and professional discipline, Evaluation is informed by advances in and elaborations of the “state of the science and art” of inquiry constructed from a variety of scientific and social scientific disciplines. As a result, its “expert” practitioners and theorists must be familiar with and consider diverse literature from an ever-widening range of disciplines, and be prepared to challenge and revise their thinking and practice. This year’s cohort recognized the need for multi- and interdisciplinary thinking in Evaluation, realizing its benefits and sharing their discoveries.

Hot Tip:  

“Expert” Evaluators recognize and endeavor to learn as much as possible about the great variety of modes of inquiry, recognizing its connection to the multiple ways of knowing and its importance as one way to assure cultural responsiveness.

Rad Resources:

Kenny, Aidan (2006). Evaluation: Emergence, Mode of Inquiry, Theory & Practice. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=946402 (Retrieved 12-14-16)

Balasubramanian, Bijal A.; Cohen, Deborah J.; Davis, Melinda M.; Gunn, Rose; Dickinson, L. Miriam; Miller, William L.; Crabtree, Benjamin F.; and Stange, Kurt C. (2015). Implementation Science 10:31. DOI: 10.1186/s13012-015-0219-z. http://implementationscience.biomedcentral.com/articles/10.1186/s13012-015-0219-z (Retrieved 12-14-16)



Hello from D.C.! My name is Kristin Mendoza, an alumna of the AEA Graduate Education Diversity Internship (GEDI) program’s 2014-15 cohort. I am a recent MPH graduate of George Washington University, and I am excited to share with you my experience as a GEDI at the Office of Science Planning and Assessment (OSPA) at the National Cancer Institute. OSPA serves as the evaluation, assessment, and strategic planning consultant across the institute.

As a GEDI scholar, I worked in the Program Assessment Branch on a number of projects. I also had the unique opportunity to work at NCI with another GEDI. It was an incredibly enriching experience, and the training provided by AEA continues to drive my professional development today.

Helpful Hint #1: Always, always engage your stakeholders. In some cases, the evaluator’s role also includes sitting down with stakeholders/clients to identify the priority focus areas. Evaluators will be constrained by budget, resources, time, etc., and it is important to maintain the quality of the work as best they can. Communicating with your stakeholders will help mitigate any setbacks due to resource constraints.

Helpful Hint #2: Culturally responsive evaluation is defined in many ways and it is important to assess how your work environment, office or organization practices it.

Helpful Hint #3: Flexibility is key. Conducting an evaluation may yield desired or undesired findings. It is important for the work plan to be flexible, to communicate with the client when things come up, and to identify ways to further explore surprising findings.

Rad Resource for New Evaluators: The evaluation theory tree, originally presented by Dr. Christina Christie and Dr. Marvin Alkin, currently at the University of California, Los Angeles. Many versions of the tree have appeared since their original publication. This resource helps new evaluators understand and visualize the different evaluation theories in existence, as well as their practitioners. It definitely helped put things in perspective for me.

The American Evaluation Association is celebrating Graduate Education Diversity Internship (GEDI) Program week. The contributions all this week to aea365 come from AEA’s GEDI Program and its interns. For more information on GEDI, see their webpage here: http://www.eval.org/GEDI Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Natalia Woolley. I am a GEDI scholar from the 2014-15 cohort, and a graduate student in the Community Health Sciences department at UCLA. As part of the GEDI program, I interned at Kaiser Permanente (KP), in the Community Benefit Department’s evaluation unit.

At KP, I provided support for the Community Health Needs Assessment (CHNA), a federally mandated process all non-profit hospitals must conduct every three years. During the internship I focused on the methods used to collect and analyze primary and secondary data. I also contributed to the department’s efforts to ensure primary data collection methods were systematically responsive to the cultural diversity of the communities served by KP.

Lessons Learned:

Operationalizing culturally responsive practices is a challenge. Although many scholars have defined culture and articulated its importance when conducting evaluations, it is still a challenge to operationalize some cultural concepts. Nevertheless, I believe acknowledging the challenge is an important step toward making needs assessments more culturally responsive.

Successful primary data collection should be culturally responsive. Hospitals must collect primary data as part of the CHNA. This process allows hospitals to better understand the communities’ main health issues, priorities and resources. To successfully connect with community members, hospitals should ensure their outreach and engagement are culturally responsive.

Hot Tips:

Secondary data can inform culturally responsive primary data collection. Secondary data provide a great deal of information about the groups living in each community. For example, secondary analysis results can provide a snapshot of community demographics, including the percentage of the population with limited English proficiency. Evaluators can use this information to include language-appropriate resources in the data collection process.

However, secondary data might miss some marginalized groups. To go beyond the secondary data, it is helpful to identify and contact organizations working with marginalized groups. For instance, Los Angeles County has an extensive database of organizations providing services to groups in need (https://www.211la.org/). Another possible option is to solicit input from community health workers servicing these groups.


My name is Danielle Cummings. I am an alumna of the Graduate Education Diversity Internship (GEDI) program and NYU’s Master of Public Administration program. As a GEDI, I attended countless workshops, webinars, and discussion groups about culturally responsive evaluation. I noticed that, perhaps unsurprisingly, the people who attended these events were either evaluators already concerned about cultural responsiveness or new evaluators whose supervisors required their participation. Here are two roadblocks that prevent evaluators from engaging in professional development opportunities and conversations on cultural responsiveness, a truth and a myth:

Lessons Learned:

Truth: It’s an art, not a science. This is, perhaps, the aspect of culturally responsive evaluation that is most irksome to evaluators who tend to be trained in a social science discipline. There is no 10-step cultural responsiveness process; it’s honed through trial and error. No two situations will employ an identical culturally responsive design, so a culturally responsive practice is difficult to prescribe and nearly impossible to measure. 

Myth: If you use good research methods, you don’t need cultural responsiveness. Several recommendations that cultural responsiveness advocates promote look a lot like steps taken by a conscientious evaluator. Integrate qualitative and quantitative methods to get a more complete picture. Consider the social and political context when developing an evaluation plan and interpreting data. It’s easy to believe that a strong evaluation is, by nature, a culturally responsive evaluation.

It’s hard to view the elusive art of cultural responsiveness as a practical skill, but it can improve the quality and accuracy of evaluation findings as much as other more tangible skills. Here are three tips for turning a skeptic’s attention to cultural responsiveness.

Hot Tips:

Emphasize validity. Framing the need for cultural responsiveness in terms of validity will perk up most evaluators’ ears. Karen Kirkhart’s work on multicultural validity provides a strong case for making culture a priority.

Start the conversation. One often doesn’t think about culture until forced. It’s possible to get compelling evaluation results without cultural responsiveness, so the pressure to acknowledge culture rarely exists. Evaluators who have seen the culture light, so to speak, might consider speaking with their colleagues about the difference cultural responsiveness has made in their relationships with clients, the authenticity of their outcomes, and the utility and implementation of their recommendations.

Make it an organizational priority. Support from organizational leadership is key to making cultural responsiveness a priority among staff members. If leadership supports cultural responsiveness, there may be increased professional development opportunities on cultural responsiveness, and the time required for culturally responsive approaches is more likely to be considered a budgetary priority when designing an evaluation.

Rad Resource:  Kirkhart’s presentation at the inaugural Culturally Responsive Evaluation and Assessment conference is a good place to explore her work.



My name is Art Hernandez and I am a Professor and Dean at Texas A&M University Corpus Christi.

I participated in one of the very early yearlong experiences as an AEA MSI Fellow and have served as the Director for several cohorts, most recently this past year. I serve, and have served, as an evaluator and teacher of evaluation, and am very interested in the processes of cultural responsiveness in practice, especially in regard to measurement and assessment.

Lesson Learned: The negative feelings associated with “difference” and the desire to live in a “normal” world with “normal” people often limit our desire to be in contact with, much less significantly interact with, members of different cultural groups. Among other things, the lack of opportunity for significant experience and interaction, and the feelings associated with it, results in stereotyping as a means of coping and explaining.

Hot Tip: It is essential to have a significant “relationship” with the people involved in the activity being evaluated. This means developing and establishing significant relationships, and doing so for their own sake rather than merely as a device to establish “cultural responsiveness.” To have any type of meaningful relationship, it is important first to have a good sense of self: knowing your values, biases, and “world view,” and being open to any differences in those attitudes and beliefs you might encounter in others. Finally, it is imperative that you reserve judgment and risk making “respectful mistakes.” Respectful mistakes are misunderstandings based in honest interest and founded in honest positive regard for the other person(s).

Rad Resource: Cultural Competence and Community Studies: Concepts and Practices for Cultural Competence

The Stranger’s Eyes describes a community project and the differences in perspective between the “benefactors” and those who were to benefit. A link provides access to a reflection guide with questions to guide consideration of the presented case study. Provided by SIL International.



My name is Art Hernandez and I am a Professor and Dean at Texas A&M University Corpus Christi.

I participated in one of the very early yearlong experiences as an AEA MSI Fellow and have served as the Director for several cohorts, most recently this past year. I serve, and have served, as an evaluator and teacher of evaluation, and am very interested in the processes of cultural responsiveness in practice, especially in regard to measurement and assessment. I am a member of the Indigenous, Multicultural, and LA RED TIGs.

Lesson Learned: Cultural responsiveness is important for many of the reasons well articulated in the AEA Statement and in numerous articles and presentations. Beyond the reasons that have been promulgated, however, I have discovered that evaluation efforts are sometimes perceived by participants as carrying some degree of risk attendant to the process, the outcomes, the implications, or some combination of the three. Latina/o evaluators who come from similar cultural backgrounds can actually exacerbate this perceived risk, resulting in the psychological response known as “fight or flight,” characterized by resistance or non-engagement.

Hot Tip: Cultural knowledge, respect, and real relationship are important to minimize the sense of risk and maximize the nature and quality of cooperation with the evaluation effort. Latina/o evaluators should never treat cultural responsiveness as merely a matter of cultural familiarity, cultural heritage, or facility with the language; instead, they should understand and practice cultural responsiveness as a predisposition and a relational action.

Rad Resources:

Applying Culturally-Responsive Communication in Hispanic/Latino Communities – Education Toolkit. Susan G. Komen (2014).

The 10 Largest Hispanic Origin Groups: Characteristics, Rankings, Top Counties – Pew Research Center: Pew Hispanic Center (2010).

The American Evaluation Association is celebrating Latina/o Responsive Evaluation Discourse TIG Week. The contributions all this week to aea365 come from LA RED Topical Interest Group members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Greetings from Washington, D.C.! My name is Kwamé A. McIntosh, a member of the American Evaluation Association’s Graduate Education Diversity Internship (GEDI) 2012-2013 cohort. Today, I am eager to share with you the experience I had as an Assistant Education Evaluator for the National Oceanic and Atmospheric Administration’s (NOAA) Office of Education (OED).

As a scholar, I was given the opportunity to conduct a process and outcome evaluation of five years of statistical data on the effectiveness of two scholarship programs for undergraduate students seeking to expand their knowledge of the atmospheric, oceanic, and/or environmental sciences. Since the students participated in their internships during the summer, I would never have the opportunity to interact or engage with the clients. This meant developing a relationship with staff members who worked directly with the population to gain a viable understanding of the program and its participants, even though the staff members for this program were located at another NOAA site. Though each site is under the umbrella of the OED, I was still the evaluator responsible for unveiling the effectiveness of the program. Though it took time, through responsive proactivity rather than reactivity, I gained the insight needed to provide practical deliverables while helping to solidify the bridge between both facilities.

Lesson #1: Trust is earned, not given. The process of gaining trust may be the difference between empowering your client and becoming the enemy.

Lesson #2: It is the evaluator’s duty to be purposeful in engagement, not the client’s. In Hazel Symonette’s “Walking Pathways Toward Becoming a Culturally Competent Evaluator: Boundaries, Borderlands, and Border Crossings,” she highlights the need for multilateral self-awareness by asking, “Who do those that one is seeking to communicate with and engage perceive the evaluator as being?” Many times, due to social misunderstandings of the role of the evaluator, clients become distrustful of us, which ultimately results in difficulty conducting the evaluation and/or a failure to gain the insight needed to truly benefit the client. This can be addressed by constantly taking “self” out of its box and being willing to completely reform “self” for the sake of usable evaluation outcomes.



Greetings, I am D. Pearl Barnett, MPA, graduate student at the University of Oklahoma, and recent graduate of AEA’s Graduate Education Diversity Internship (GEDI) program.  During my work at the Urban League of Greater Oklahoma City, we completed an evaluation of the agency’s Strategic Plan. Here, I present a few salient lessons about cultural responsiveness from my experience.

When faced with the matter of “diversity” among evaluation stakeholders and participants, these questions surfaced: Is diversity an issue when organization staff reflect their community and service recipients (clients)? Does it matter in a monoracial organization or community? The theory of representative bureaucracy holds that shared demographics among stakeholders, staff, and clients mean shared values. By this logic, organizational programs, policies, and activities should inherently address clients’ diverse needs and desires. The truth is that organizations always have diversity concerns, no matter their demographic make-up. It is our duty to address these concerns in our evaluations.

Strategies:

  • Improve evaluation quality – and the relationship between organization and evaluation staff – by discussing the value of client input and the importance of allowing the data to speak for itself.

Representative bureaucracy assumes a link between demographic similarity and cultural awareness within an organization. Cultural responsiveness is the realization that though staff and clients may be demographically similar, there are still many differences (e.g., in needs and values) that must be addressed among clients. Ensuring clients have a voice is the first step.

  • Develop needs assessments, making sure to include the staff and community in every area of the process.

The cultural knowledge of the staff is important, and ensuring all necessary information is gathered is equally important. The needs assessments serve as tools to obtain information directly from clients. Staff and community member interviews throughout survey development serve to improve respondents’ understanding and maximize the response rate, while generating staff buy-in to the evaluation and its outcomes.

 

Rad Resource: Effectively Managing Nonprofit Organizations. Edited by Richard L. Edwards and John A. Yankey. Washington, DC: NASW Press, 2006. This book addresses diversity management and new approaches to program evaluation within the context of nonprofit organizations.



Hello, I’m Blake Pearson, a doctoral student at the University of the Incarnate Word in San Antonio, Texas and a member of the AEA Public Statement on Cultural Competence in Evaluation Dissemination Working Group. I am a part of a subgroup compiling supplemental materials (annotated bibliography, links, favorites) for the AEA website.

Today, I’d like to share my reflections on the panel session at Evaluation 2013 titled Just Good Evaluation: Conversations with Senior Evaluators about Culturally Responsive Evaluation. This session, chaired by Tamara Bertrand Jones, Linda Schrader, and Dominica McBride, brought together evaluators to engage in conversations about culturally responsive evaluation. For a neophyte evaluator like me, this session was a dream come true. In layman’s terms, I felt like a high school basketball player getting the opportunity to play a pickup basketball game with Michael Jordan, Larry Bird, and Magic Johnson.

The organization of the session created an environment that stimulated meaningful dialogue. Within each subject area, participants had the opportunity to join five table discussions led by senior evaluators about their experiences conducting culturally responsive evaluations. After 30 minutes, participants were given the opportunity to change tables. My plan at the beginning of the session was to attend all five tables, but ultimately I found myself glued to my chair, engrossed in rich conversation, and able to attend only two discussions.

The all-star cast of senior evaluators facilitated conversations and shared their perspectives at the tables on:

  • Culturally Responsive Evaluation – Stafford Hood and Joan LaFrance
  • Infusing Culture in Your Evaluation Practice – Katrina Bledsoe
  • Ancestral Knowledge in Evaluation – Melvin Hall and Katherine Tibbetts
  • Validity and Validation – Karen Kirkhart
  • Teaching Evaluation: The Role of Faculty – Rodney Hopson
  • Careers in Evaluation: Foundations & Philanthropy – Ricardo Millett
  • Publication Opportunities in Evaluation – Henry Frierson
  • The Politics of Federal Program Evaluation – Katherine Dawes
  • Designing and Using Dashboards in Tracking and Evaluation Efforts – Veronica Thomas

Lessons Learned:

– Cultural responsiveness is a continuous process that includes reflection and reflexivity.

– There is no perfect scale or tool to measure cultural responsiveness. In other words, an evaluator can perform a ‘textbook-good evaluation’ and still not ensure cultural responsiveness.

– Publish your evaluations! There is value in sharing your experience. Take advantage of publication opportunities. Visit the AEA link, Guiding Principles for Evaluators, to explore the specific principles that evaluators are expected to uphold.

Rad Resources:

– Multiethnic Issues in Evaluation TIG: Learn more about the TIG and how to get plugged in.

– AEA Graduate Education Diversity Internship Program (GEDI): An excellent program that works to engage and support students from groups traditionally underrepresented in the field of evaluation.

This week, we’re diving into issues of Cultural Competence in Evaluation with AEA’s Statement on Cultural Competence in Evaluation Dissemination Working Group. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

