AEA365 | A Tip-a-Day by and for Evaluators


“We don’t see the world as it is. We see it as we are.” – Anais Nin, poet

Hello! We are Amy Sena and Jenifer “Lucy” Rogers. Cultural awareness of ourselves and of others is a critical step toward fair, meaningful evaluations. As 2016-2017 GEDI scholars, we had the opportunity to learn about culturally responsive evaluation. It was an honor to learn from key practitioners, including Dr. Leona Ba, whose passion for teaching culture and evaluation was evident during her webinar and summer institute course. We would like to share two tips about culturally responsive evaluation that were particularly relevant during our internship years.

Hot Tip 1: Be aware of your culture.

As GEDI scholars we started our journey by learning how to be self-aware and to understand our own culture and biases – a lesson championed, honed, and taught by Dr. Hazel Symonette at the beginning of each GEDI program year. Dr. Ba reinforced this practice during her teachings by sharing her own cultural journey working in various settings. We learned that culture belongs to the evaluators, the program participants, and the broader context of the evaluation. Be careful of common misunderstandings, such as the notion that culture is static or that race and ethnicity are its main foci. Culture is, instead, dynamic, with multiple layers to consider (e.g., national, economic, organizational, generational, gender, religious).

Hot Tip 2: Attempt to understand all the values and assumptions at play before engaging stakeholders or implementing the evaluation.

Listen to the values of relevant stakeholders; these values shed light on what is believed to be good, important, or valuable in a culture and influence the assumptions present in an evaluation. It is critical for culturally responsive evaluators to clarify values and assumptions in a systematic way. By discussing values and understanding stakeholders’ assumptions, the evaluation will be more successful in asking the “right” questions.

Rad Resource:  We suggest taking a look at this guide by Dr. Hazel Symonette to calibrate and check the self as an evaluator prior to engaging others.

The American Evaluation Association is celebrating Graduate Education Diversity Internship (GEDI) Program week. The contributions all this week to aea365 come from AEA’s GEDI Program and its interns. For more information on GEDI, see their webpage here: http://www.eval.org/GEDI. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi! We are Nicole Robinson, Emily Connors, Kate Westaby, Tiffine Cobb, Regina Lowery, and Elise Ahn and we’re board members of ¡Milwaukee Evaluation! Inc., the Wisconsin statewide AEA affiliate. As an affiliate and professional development collaborative of Wisconsin-based evaluators, we have three goals:

  • To promote the science of evaluation
  • To provide networking and capacity-building opportunities
  • To develop a pipeline of evaluators from underrepresented groups

In the past few years we have focused on field-building initiatives centered on building the capacity of evaluators to incorporate culturally responsive evaluation (CRE) and social justice into their practice. We see this goal as paramount to creating a thriving field ready to respond to the evaluation needs of a multicultural world. This past year, we promoted this work during our flagship event, the Social Justice & Evaluation conference. We provided CRE 101 sessions as well as sessions that helped evaluators address “isms” during the evaluation process from start to finish, assess how the current political climate can impact evaluation and the people we serve, and infuse social justice into practices such as results-based accountability.

Lessons Learned:

We recently administered a survey to Wisconsin evaluators and asked how much they use CRE. The full results will be shared in the future, but we can share a couple of points for discussion. For example, 57% of evaluators who responded to the survey had never reviewed AEA’s statement on cultural competence, and 37% had no formal training on cultural competence. The open-ended responses provided a richer picture of evaluation in Wisconsin. While we are still analyzing these data, we wanted to share one quote that captures the complexity of this discussion, linking the absence of CRE to stagnant outcomes, among other concerns:

“It is all about power and money. The same folks are getting the same grants or contracts and conduct evaluations in the same way. It isn’t rocket science why some of the same chronic outcomes and poor quality of life has not changed. Evaluation and research studies need to be built differently by different people. If we keep producing basically the same monolithic group of academics how will things ever change? This is embedded in the systems and institutions of education, policy, procurement, political, and monetary practices. People who educate the next generation of academics and award contracts, grants, keynotes, or presidential sessions MUST be held accountable for structurally ensuring and requiring diversity in curricular content, human resources, funding priorities, contract/grant awards, keynotes, publications, etc. or things won’t change.”

Stay tuned for more this week from our Wisconsin evaluators!

The American Evaluation Association is celebrating ¡Milwaukee Evaluation! Week with our colleagues in the Wisconsin statewide AEA Affiliate. The contributions all this week to aea365 come from our ¡Milwaukee Evaluation! members.


Hi all! My name is Rachel Vinciguerra and I am a master’s student of international development and social work at the University of Pittsburgh. This summer I worked on two program evaluations in Haiti: one a mid-point evaluation for a girls’ empowerment program, the other a preliminary design of M&E protocols for an emerging foster care agency. Coming from a social work background, and as an American evaluator working in Haiti, it was especially important to me that the studies were culturally responsive and took marginalized groups into consideration as major stakeholders.

Ultimately, it came down to sharing power with these groups throughout the evaluation process. I found that, when we put them at the center of design, implementation, and presentation, results were richer.

Hot Tip #1: Identify marginalized groups.

  • There are two pieces to this. First, you have to begin with considerable knowledge of the culture and community in which you are working in order to understand specific and often complex hierarchies of power. Second, you have to allow that structural knowledge to contextualize your early conversations with stakeholders in order to identify those groups in the program whose voices are not often heard.

Hot Tip #2: Engage marginalized groups on the same level as your organizational client.

  • Consider how you engage your organizational client as you plan for evaluation. Are they telling you what questions they want answered? Are you working with them to develop a theory of change model? Are you collaborating on the timeline of the evaluation? Now consider the marginalized groups in your evaluation and share power in the same way with them. They may be beneficiaries of the program, but they may also be groups within the organization that hired you.

Hot Tip #3: Ensure evaluation results can be understood by all involved.

  • It is research 101. Human subjects deserve access to the knowledge and research they help generate, and you can make sure they get it. In the evaluations I worked on, this meant translating all reporting into Haitian Creole and communicating the results in the same diverse modalities I had used for my client.

Lessons Learned:

  • Be patient. Be flexible. Be humble. Make and maintain space in your design to be responsive to marginalized groups and be ready to adapt quickly and with humility as needed.


The American Evaluation Association is celebrating SW TIG Week with our colleagues in the Social Work Topical Interest Group. The contributions all this week to aea365 come from our SW TIG members.

Koolamalsi njoos (Hello Colleagues/Friends).  I’m Nicole Bowman (Mohican/Lunaape) a culturally responsive (CR) and Indigenous Evaluator (CRIE) at the WI Center for Education Research (WEC and LEAD Center) and President/Evaluator at Bowman Performance Consulting, all located in Wisconsin.

In 1905, the President of UW, Charles Van Hise, provided the foundation for what has become fundamental to how I practice evaluation – The Wisconsin Idea:

“The university is an institution devoted to the advancement and dissemination of knowledge…in service and the improvement of the social and economic conditions of the masses…until the beneficent influence of the University reaches every family of the state” (p.1 and p.5).

My work as an Indigenous and culturally responsive evaluator exemplifies the WI Idea in action. Through valuing, supporting, and resourcing culturally responsive and Indigenous theories, methods, and activities, I’m able not only to build organizational and UW’s capacity to “keep pace” (p. 3) in these areas but am empowered to be “in service” to others and not “in the interest of or for the professors” (i.e., self-serving) but rather as a “tool in service to the state…so the university is better fit to serve the state and nation” (p. 4 and p. 5). My particular culturally responsive and Indigenous evaluation, policy, and governance expertise has brought university and Tribal governments together through contracted training and technical assistance evaluation work; has developed new partnerships with state, national, and Tribal agencies (public, private, and nonprofit) who are subject matter leaders in CR research and evaluation; and has extended our collaborative CR and CRIE work through AJE and NDE publications, AEA and CREA pre-conference trainings and in-conference presentations, and representation nationally and internationally via EvalPartners (EvalIndigenous). We’re not only living the WI Idea…we are extending it beyond mental, philosophical, and geographic borders to include the original Indigenous community members as we work at the community level by and for some of the most underrepresented voices on the planet.

During this week, you will read about how others practice the WI Idea. As evaluators, we play an integral role in working within and throughout local communities and statewide agencies. Daily, we influence policies, programs, and practices that can impact the most vulnerable populations and communities. Practicing the WI Idea demands responsibility, humility, and humanity. We need to be constant and vigilant teachers and learners.

The American Evaluation Association is celebrating The Wisconsin Idea in Action Week coordinated by the LEAD Center. The LEAD (Learning through Evaluation, Adaptation, and Dissemination) Center is housed within the Wisconsin Center for Education Research (WCER) at the School of Education, University of Wisconsin-Madison, and advances the quality of teaching and learning by evaluating the effectiveness and impact of educational innovations, policies, and practices within higher education. The contributions all this week to aea365 come from student and adult evaluators living in and practicing evaluation from the state of WI.

Hi! We are Morgan J Curtis (independent consultant) and Strong Oak Lafebvre (executive director of Visioning BEAR Circle Intertribal Coalition).  Along with Patrick Lemmon (independent consultant), we have the good fortune of serving as the evaluation team for the Walking in Balance (WIB) initiative.

WIB is an innovative approach to violence prevention that focuses on 12 important indigenous values that encourage better harmony with other people and the land. The primary component of WIB is a 13-session curriculum that is built on a Circle Process and that, with some adaptations, can be focused on different populations. The Circle Process involves storytelling and sharing by all participants, including the Circle Keeper who serves to move the conversation forward. A teaching team of four, seated in the four directions, diminishes the role of a single expert and promotes Circle members talking with each other rather than to the Circle Keeper.

Lessons Learned: This program presents many exciting evaluation opportunities and challenges. One of the challenges is ensuring that the evaluation is both culturally responsive and methodologically sound. Adding to this challenge, all members of the evaluation team are located in different cities, and the evaluation consultants have all been white folks. This process has included much trial and error, both in our collaborative process and in the evaluation methodologies themselves. The team wanted to design an evaluation that aligned with the program’s principles and also integrated into the Circle Process as seamlessly as possible. We currently have a pre- and post-question for each session; participants write their answers on notecards and share aloud with the circle, which flows well with the storytelling focus of the circles. Additional questions at the beginning and end of the Circle invite participants to share aloud how each session transformed them and ways continued engagement in the Circle impacts their lives. We capture responses from all parties to track how the Circle Process transforms both the teaching team and participants. The VBCIC teaching team loves the seamless nature of the evaluation process and finds that checking in about what happens between sessions captures changes in behavior based on learning directly linked to Circle teachings.

Hot Tip: Listening plays a key role in both the Circle Process itself and in developing the evaluation. We have established a process of following the lead of the Visioning BEAR team both by listening intently to their struggles and hopes and also by offering options for how to tweak the evaluation. They move forward with what feels right to them and report back to us. Then, we keep tweaking. We are working to make the data analysis and interpretation processes more collaborative as we move forward, too.


We are the 2016 AEA Minority Serving Institutions (MSI) Fellows: Cirecie West-Olatunji (counselor education-Xavier University of Louisiana), Jieru Bai (social work-University of Nebraska at Omaha), Kate Cartwright (health administration-University of New Mexico), Smita Shukla Mehta (special education-University of North Texas), and Chandra Story (public health-Oklahoma State University).

It seems that it was only a few weeks ago that we shared our biographical and personal goal statements and listened to our evaluation mentor, Art Hernandez, share a (long!) list of reading resources. Hailing from diverse academic disciplines, we wondered how we would integrate seemingly disparate ideas and philosophies to jointly construct a presentation for the annual conference. After 12 months of biweekly telephone conference calls, the week-long AEA Summer Institute, a joint AEA conference presentation, and life changes (e.g., Smita was promoted to Full Professor rank and Jieru had a beautiful 7 lb, 8 oz. baby boy), we now share key lessons learned from our multidisciplinary thinking.


Lessons Learned:

#1: Set Aside Time to Read

We are often too busy to set aside time for reading, reflection, and dialogue with others. Being involved in this fellowship, I found it critical to schedule time to acquire knowledge that I could integrate into my existing skill set.

#2: Evaluation can be Creative

Prior to this fellowship, I thought that data collection methods for culturally responsive evaluation were limited. My learning experiences through the AEA conference and the summer institute have changed my paradigm! There are many creative approaches to evaluation, including ripple effects mapping.  These approaches provide proper context for evaluation while honoring communities.

#3: Transcend Disciplinary Boundaries

As a relatively new evaluator, I learned to always remember that evaluation theory and practice transcend disciplinary boundaries. When planning an evaluation, I now look beyond practices in any one discipline. A good starting place is the AEA website!

#4: Distinguish Research Methods from Program Evaluation

While I acknowledged a difference between research methods and program evaluation, the distinction became clearer after the summer institute and AEA conference. Evaluation design requires far more technical skill in mixed-methods data collection and analysis. Conducting an evaluation also requires social skills (e.g., trust, compassion, connection, communication, facilitation) to connect with stakeholders.

We are grateful to the AEA community for creating the MSI Fellowship program. Thanks to you, we can continue crystallizing our evaluation identity and competence.

The American Evaluation Association is celebrating AEA Minority Serving Institution (MSI) Fellowship Experience week. The contributions all this week to aea365 come from AEA’s MSI Fellows. For more information on the MSI fellowship, see this webpage: http://www.eval.org/p/cm/ld/fid=230.

We are alumni of the AEA’s Graduate Education Diversity Internship (GEDI) program’s 13th cohort. Today’s hot tips are reflections on the importance of increasing an organization’s capacity to conduct equitable evaluations across all stages of an evaluation. Here, we will share three tips that we learned while working at our GEDI sites.

Hot Tip #1 (Leah, Doctoral Student at University of Illinois Urbana-Champaign): Develop and utilize critical consciousness to fill the gaps. At my GEDI site, diversity, inclusion, and engagement are prioritized in the organization. In developing a measure to capture progress toward this goal, we realized that peer-reviewed research supporting equitable and culturally responsive measures is limited. Most research focuses on staff diversity and work culture but does not account for the various ways spaces can be meaningfully diverse or how people can be included and engaged. One way to address the gap is by increasing critical consciousness, explained by Paulo Freire as “the ability to perceive social, political, and economic oppression and to take action against the oppressive elements of society.” We can then critically analyze the cultural validity of our instruments.

Hot Tip #2 (Monique, Doctoral Candidate at University of Wisconsin-Milwaukee): Make sure reflection is at the center of culturally responsive evaluation. During my GEDI experience, I worked with organizations addressing population health outcomes in historically marginalized communities. Following trainings and debriefings with program leadership, we concluded that program staff and leadership needed a better understanding of how important reflection is to the culturally responsive evaluation framework. My site supervisor and I conducted a presentation called Tools You Can Use: Program Evaluation for a state foundation’s annual grantee forum. I revised the framework presented by the Centers for Disease Control and Dr. Rodney Hopson on aea365 to develop The Reflective Flower. This graphic, shown below, centers reflection on the part of the evaluator and key stakeholders. Print this graphic as a reminder to your team and stakeholders of HOW TO BLOOM USING CULTURALLY RESPONSIVE EVALUATION.

[Image: The Reflective Flower graphic]

Hot Tip #3 (Ibukun, Doctoral Student at Cornell University): Leaders should explicitly commit to culturally responsive evaluation. At my GEDI site, health equity is the organization’s main mission. To assess the organization’s impact on health equity in the community, it is crucial that leaders stay reflective and knowledgeable about health-specific culturally responsive evaluation. The foundation can influence health equity by setting grant project requirements. It is not enough for staff members to be committed to CRE; the organization’s leaders must also support these efforts. Ultimately, foundations’ leaders have the unique ability to tackle these issues through their grant-making, as they hold positions of power and have the potential to influence systemic change.


Hello, my name is Kenneth Pass. I am a doctoral student in the Department of Sociology at Northwestern University and a recent Graduate Education Diversity Internship (GEDI) program alumnus. During my GEDI internship at Growth Capital Network in Ann Arbor, Michigan, I engaged in various health, evaluation, and philanthropic projects with state and local community organizations. Throughout this internship I learned important lessons on community-centered frameworks, diverse health programs throughout the state, proposal and grant evaluation, and the development of metrics and other measures.

Lessons Learned:

  1. Know who is speaking and who is contributing to that voice. When working with state and local community organizations that are submitting grant proposals to philanthropic and other funding organizations, it is important to know about the applicant, the community they serve, and what major partners and stakeholders are involved. This information ensures that you understand the organization, what role it plays, and whether it is community centered in that role.
  2. Take stock of evaluation capacity and investment. I often observed that applicants either lacked the capacity to develop and implement an evaluation or did not prioritize program evaluation. This was an important moment for me – and the applicants. Their limited evaluation capacity or investment constrained how they approached and understood the benefits of program evaluation. Being able to assess an applicant’s capacity and investment in evaluation and provide feedback on the meaning, significance, and benefits of evaluation is essential to improving community health, as well as working relationships with philanthropic and other funding organizations.
  3. Encourage potential grantees to think about disparities within communities. While evaluating applicant proposals, I considered Lessons 1 and 2. I thought more critically about how minority groups would benefit from proposed health programs and initiatives and how communities were being engaged throughout the development, implementation, and evaluation of these health programs. Applicants’ programs often involved marginalized or underserved sections of their communities, so understanding how proposals addressed gender and racial/ethnic health disparities was key. Given the health burdens that women and people of color carry throughout Michigan and the United States, encouraging state and local community organizations to pay attention to the health disparities present in their communities is crucial to increasing the benefit and scope of any health program.

Through the GEDI internship, I learned more not only about health, evaluation, and philanthropy but also about the importance of discovering, valuing, and centering community voices in program evaluation.

Rad Resources:

  1. Template for Analyzing Philanthropic Programs Through a Culturally Responsive and Racial Equity Lens
  2. Advancing Evaluation Practices in Philanthropy by Aspen Institute Program on Philanthropy and Social Innovation


Greetings from Los Angeles! We are Drs. Ashaki M. Jackson and Stewart Donaldson – the AEA Graduate Education Diversity Internship (GEDI) program leadership team. This week, we share with you reflections from our most recent alumni, fondly called District 13 in honor of our 13th cohort.

During last year’s annual conference, GEDI scholars were charged with learning how attendees defined culturally responsive evaluation (CRE) and how it emerged in practitioners’ work, if at all. When asked to what extent CRE was important to practitioners’ work, one respondent answered, “don’t care.” This prompted scholars to reflect on where culture exists and on the possibility that evaluation can be conducted without attention to culture. Their reflections yielded a new understanding of culture’s omnipresence in our selves (beliefs, values, religious practices, race and ethnicity), work (colleagues, stakeholders, funders and the politics that shape our grant opportunities), environments (where we live, where we work) and even our funders’ missions. Culture is inextricable, and that matters.

The GEDI program sharpens scholars’ attention to culture and its impact on evaluation quality, validity, and meaningfulness. It is a pipeline to the evaluation field. We train master’s and doctoral students of color who reflect many of the new communities in which evaluations occur. The program introduces evaluative thinking to scholars from a variety of disciplines and is centered on the principles of culturally responsive evaluation. Scholars participate in a yearlong internship while receiving mentoring from program leadership, AEA theorists and practitioners, and internship site supervisors. Scholars also complete professional development courses, including those offered through the Claremont Evaluation Center’s (CEC) annual workshops and the association’s annual conference; complete monthly webinars with established theorists and practitioners; and fulfill written group and individual deliverables to practice conveying findings to different audiences.

Hot Tip: Evaluative thinking and evaluation skills are widely practical. We invite graduate students across all fields who are interested in exploring and practicing cultural competence in evaluation, are from historically under-represented communities, and are at an institution where evaluation coursework is limited or absent to apply. AEA distributes a call each spring. Please share this opportunity widely.

Hot Tip: The program is fueled by partnerships with organizations that can provide evaluation opportunities for our scholars. Site supervisors provide our scholars practical professional experience and space in which to apply their program learning. We enthusiastically welcome applications from organizations. Previous host sites have missions centered on education, health, policy, environment, micro-loans, volunteerism and social services. If you are interested in working with the program and helping shape the next generation of evaluators, please contact us at gedi@eval.org, or watch for the Call for Applications that we distribute in early spring (prior to our call for scholar applications).

Rad Resources
CEC Professional Development Workshop Series


This is part of a two-week series honoring our living evaluation pioneers in conjunction with Labor Day in the USA (September 5).

Greetings, I am Melvin Hall, a current AEA Board Member and program evaluation specialist for over forty years. I have had many excellent mentors throughout my career including Tom Hastings, Bob Stake, Terry Denny, and Ernie House.

Why I chose to honor this evaluator:

In this series honoring living evaluators, I wish to recognize Karen Kirkhart, both a leading scholar and a person who has demonstrated a commitment to social justice, making the field more engaged with and respectful of human cultural and values diversity.

Pioneering and enduring contributions:

As a scholar, Karen is a tenaciously brilliant thinker who has permanently altered the evaluation literature with her introduction of multicultural validity as a central concern for quality practice. Under the banner of evaluation influence, she has additionally woven together a practical understanding of how evaluation functions as a tool of society and, in that regard, argued effectively for turning the spotlight on the power and privilege that generate and maintain inequity across social institutions and interactions.

An early failure of evaluation as a profession was its unease with matters of context. While context was known to be central to the functioning of the programs and services evaluated, the field was not equipped to think well about how to handle it in practice. Karen’s work has centered cultural context in discussions of quality practice. Working through these issues with Indigenous communities and others less well served by evaluation, Karen’s legacy affirms the ethical imperative to be responsive to all stakeholders in an evaluation…not just the privileged and powerful.

As a former AEA President and thought leader in the field, Karen has provided pivotal guidance and influence to important AEA initiatives. This includes the cultural reading of the Program Evaluation Standards that informed the most recent revision; development of the AEA Statement on Cultural Competence; and co-developing significant published scholarship with evaluators of color, bringing new and important voices into focus for the profession.

Whenever there is acknowledgement of the present and improved state of the profession, it is easy for me to see, woven into the past several years of progress, the steady hand of influence provided by Karen Kirkhart. I am one whose career trajectory was elevated by her friendship and mentoring, and I thus feel honored to prompt this recognition by others.

Resources:

Kirkhart, Karen E. “Seeking Multicultural Validity: A Postcard from the Road.” Evaluation Practice, Vol. 16, No. 1, 1995, pp. 1-12.

Hood, S., Hopson, R., and Kirkhart, K. (2015). Culturally responsive evaluation: Theory, practice, and future implications. In Newcomer, K. and Hatry, H. (Eds.), Handbook of Practical Program Evaluation (4th ed., pp. 281-317). San Francisco, CA: Jossey-Bass.

 

The American Evaluation Association is celebrating Labor Day Week in Evaluation: Honoring Evaluation’s Living Pioneers. The contributions this week are tributes to our living evaluation pioneers who have made important contributions to our field and even positive impacts on our careers as evaluators.

 
