AEA365 | A Tip-a-Day by and for Evaluators


Hello! My name is Amelia Ruerup. I am Tlingit, originally from Hoonah, Alaska, although I currently reside in Fairbanks, Alaska. I have been working part-time in evaluation for over a year at Evaluation Research Associates and have spent approximately five years developing my understanding of Indigenous Evaluation through the mentorship and guidance of Sandy Kerr, a Maori evaluator from New Zealand. I consider myself a developing evaluator and am still deepening my understanding of what Indigenous Evaluation means in an Alaska Native context.

I have come to appreciate that Alaska Natives are historic and contemporary social innovators who have always evaluated to determine the best ways of not only living, but thriving in some of the most dynamic and, at times, harshest conditions in the world. We have honed our skills and crafted strict protocols while cultivating rich, guiding values. The quality of our programs, projects, businesses and organizations is shaped by our traditions, wisdom, knowledge and values. It is with this lens that Indigenous Evaluation makes sense in an Alaska Native context, as a way to establish the value, worth and merit of our work where Alaska Native values and knowledge both frame and guide the evaluation process.

Amidst the great diversity within Alaska Native cultures, we share certain collective traditions and values. As Alaska Native peoples, we share a historical richness in the use of oral narratives. Integral information, necessary for thriving societies and for passing on cultural intelligence, has long been handed down to the next generation through storytelling. It is also one commonality that connects us to the heart of Indigenous Evaluation. In the Indigenous Evaluation Framework book, the authors explain that, “Telling the program’s story is the primary function of Indigenous evaluation…Evaluation, as story telling, becomes a way of understanding the content of our program as well as the methodology to learn from our story.” To tell a story is an honor. In modern Alaska Native gatherings, we still practice the tradition of only certain people being allowed to speak or tell stories. This raises the question: Who do you want to tell your story, and do they understand the values that are the foundation and framework for your program?

Hot Tip: Context before methods. It is essential to understand the Alaska Native values and traditions that are at the core of Alaska Native-serving programs, institutions and organizations. Indigenous Evaluation is an excellent approach to telling our stories.

Rad Resource: The Alaskool website hosts a wealth of information on Alaska Native cultures and values. This link will take you to a map of “Indigenous Peoples and Languages of Alaska.”

The American Evaluation Association is celebrating Alaska Evaluation Network Week. The contributions all this week to aea365 come from AKEN members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m Corrie Whitmore, one of the new At Large Board Members for AEA. I live in Anchorage, Alaska where I am president of the Alaska Evaluation Network and an internal evaluator for Southcentral Foundation, an Alaska Native owned and operated health care organization serving approximately 60,000 Alaska Native and American Indian people each year.

In my current role, I evaluate everything from space utilization to nurse home visiting programs, and provide results to both internal operations staff and external funders. I enjoy the diverse work and the opportunity to teach evaluation principles as part of our organization’s focus on capacity building. My experience in Indigenous organizations and rural environments has deeply enriched my practice, and I look forward to sharing my understanding of these important contexts during my service on the AEA Board.

Lesson Learned: Discussing the aim and audience of evaluation work is a great way to help people understand what evaluation is and why it is important. As an internal evaluator, the audience for my work is usually program funders, operations staff, and decision makers. Working with the intended audience (aka “stakeholders”) to agree on the aim of our work together early in the project gets us all on the same page, saving time and building understanding.

Rad Resource: The CDC Framework for Program Evaluation is a simple, appealing framework that can anchor conversations about the evaluation process with an audience. I use the circle graphic showing the steps of program evaluation with operations folks to outline our project and help explain the process we will work through together.

2015 is the International Year of Evaluation and an exciting time to join the board. I look forward to learning about the infrastructure that keeps our complex organization and conference functioning and helping AEA build relationships with policymakers and organizations. I’m proud to be part of our socially responsible organization dedicated to supporting “effective and humane organizations and ultimately to the enhancement of the public good!”

See you in Denver!


Greetings, my name is Maurice Samuels and I’m a Lead Evaluation and Research Associate at Outlier Research and Evaluation, CEMSE|University of Chicago. Our group recently hosted an American Evaluation Association Graduate Education Diversity Internship (GEDI) intern. This was a wonderful opportunity for me and my colleagues to influence the development of a new member of the field. She gained experience conducting an evaluation; more importantly, we supported her thinking about and practice of cultural competence in evaluation. Below are several helpful tips to introduce evaluators to cultural competence in evaluation:

Hot Tips:

  1. Immerse yourself in the literature – It is important to have an understanding of evaluation frameworks and approaches (e.g., culturally responsive evaluation, contextually responsive evaluation, cross-cultural evaluation) that are sensitized to culture and context in order to stimulate thinking about the role of culture in evaluation. Equally important is to have a comprehensive understanding of how culture has been characterized in other fields such as anthropology, health, and social work. This is particularly helpful due to the various ways in which culture can be understood. For articles on the role of culture in evaluation check out http://education.illinois.edu/crea/publications.
  2. Use the resources available through the American Evaluation Association (AEA) – The AEA has several Topical Interest Groups (TIGs) that have an explicit commitment to culture and diversity (e.g., Multiethnic Issues in Evaluation (MIE) TIG; Disabilities and Other Vulnerable Populations (DOVP) TIG; Lesbian, Gay, Bisexual, and Transgender Issues (LGBT) TIG; Indigenous Peoples in Evaluation TIG; Feminist Issues in Evaluation TIG; International and Cross Cultural Evaluation (ICCE) TIG). In addition, commit yourself to AEA’s Cultural Competence in Evaluation and review their Introduction to the Cultural Readings of The Program Evaluation Standards and the Guiding Principles for Evaluators.
  3. Create opportunities to engage in dialogue about cultural competence – Networking with people in the field who have similar interests, and with those who are enacting cultural competence, is important to making the practice concrete. Further, this encourages open conversations about culture, which helps to refine one’s notions of cultural competence and provides multiple perspectives to draw upon.
  4. Encourage strong field work practices and self-reflection – When in the field, it is important that the evaluator builds relationships with clients and stakeholders, understands the context of the program and the surrounding community, and gives back to the community in tangible ways, such as volunteering at the program or attending program-sponsored events that are not related to the evaluation. As for self-reflection, it is important to document and share the decisions and assumptions made in the field through journaling and debriefing with a colleague.

The American Evaluation Association is celebrating Multiethnic Issues in Evaluation (MIE) Week with our colleagues in the MIE Topical Interest Group. The contributions all this week to aea365 come from MIE TIG members.

Greetings AEA and evaluation family, we’re Stafford Hood, professor, University of Illinois-Urbana Champaign and Director, Center for Culturally Responsive Evaluation and Assessment (CREA) and Rodney Hopson, professor, George Mason University and Senior Research Fellow, Center for Education Policy and Evaluation.

We are members of the AEA Multiethnic Issues in Evaluation TIG, having been long-time members and having seen the TIG grow over twenty (20) years. Additionally, we promote the historical and contemporary development of Culturally Responsive Evaluation (CRE). Grounded in the tradition of Robert Stake’s Responsive Evaluation of the 1970s, and influenced by the work of Gloria Ladson-Billings, Jackie Jordan Irvine, and Carol Lee, who developed Culturally Responsive Pedagogy twenty years later, CRE marries these perspectives into a holistic evaluation framework that centers culture throughout evaluation. With particular attention to historically marginalized groups, CRE seeks to bring their interests and matters of equity into the evaluation process.

Hot Tip: Refer to the CRE framework in the 2010 NSF User-Friendly Guide (especially the chapter by Henry Frierson, Stafford Hood, Gerunda Hughes and Veronica Thomas) and the previous Hot Tip to see how CRE can be applied in evaluation practice.

Lesson Learned: There is recognizable growth in what some may now call our culturally responsive evaluation community, particularly in the presence of a younger and more diverse cadre of evaluators. A recent scholar.google.com search for the terms culturally responsive evaluation (CRE) and culturally competent evaluation (CCE), appearing anywhere in an article, chapter or title between 1990 and 2013, shows a major increase in this discourse over a little more than a decade.

[Table: Google Scholar counts of CRE and CCE terms, 1990–2013; the table appeared as an image in the original post.]

Rad Resources:

  • CREA is an international and interdisciplinary evaluation center grounded in the need to design and conduct evaluations and assessments that embody cognitive, cultural, and interdisciplinary diversity and are actively responsive to culturally diverse communities and their academic performance goals;
  • CREA’s second conference is upcoming!: “Forging Alliances For Action:  Culturally Responsive Evaluation Across Fields of Practice” will be held September 18-20, 2014 at the Oak Brook Hills Resort, Chicago – Oak Brook, IL and feature seasoned and emerging scholars and practitioners in the field;
  • AEA Statement on Cultural Competence in Evaluation is the 2011 membership-approved document that resulted from the Building Diversity Initiative (co-sponsored by AEA and the W.K. Kellogg Foundation in 1999);
  • Indigenous Framework for Evaluation, which synthesizes Indigenous ways of knowing and Western evaluation practice, is summarized in a Canadian Journal of Program Evaluation 2010 paper by Joan LaFrance and Richard Nichols.

The American Evaluation Association is celebrating Multiethnic Issues in Evaluation (MIE) Week with our colleagues in the MIE Topical Interest Group. The contributions all this week to aea365 come from MIE TIG members.


Hello! I’m Nicole Clark, a licensed social worker and independent evaluator for Nicole Clark Consulting. I specialize in working with organizations and agencies to design, implement, and evaluate programs and services specifically for women and young women of color.

Young women of color (YWOC) face many intersecting issues, including racism, sexism, and ageism, as well as marginalization based on immigrant status, socioeconomic status, and sexuality. How can evaluators make sure the programs we design and evaluate are affirming, inclusive, and raise the voices of YWOC?

To help you be more effective at engaging young Black, Latina, Asian/Pacific Islander, and Native/Indigenous women in your evaluation work, here are my lessons learned and a rad resource on engaging YWOC:

Lessons Learned: Not all YWOC are the same. YWOC are not a monolithic group. Within communities of color, there are a variety of cultures, customs, and regional differences to consider.

Meet YWOC where they are. What are the priorities of the YWOC involved in the program or service? When an organization is developing a program on HIV prevention while the YWOC they’re targeting are more concerned with the violence happening in their community, there’s a disconnect. What the organization (and even you as the evaluator) considers a high priority may not be a priority to the YWOC involved.

Be mindful of slang and unnecessary jargon. Make your evaluation questions easy to understand and free from jargon. Be equally mindful of using slang words with YWOC. Given cultural and regional considerations (along with the potentially stark difference in age between you as the evaluator and the YWOC), slang words may not go over well.

Start broad, then get specific. Let’s use the example of creating evaluation questions on reproductive rights for YWOC. Creating evaluation questions around “reproductive rights” may not be as effective with YWOC as creating evaluation questions on “taking care of yourself.” While both can mean the same thing, “taking care of yourself” evokes an overall feeling of wellness and can get YWOC thinking of specific ways in which they want to take care of themselves. This can be narrowed down to the aspects of their health they want to feel more empowered about, and you can help organizations home in on these needs to develop a program or service that YWOC would be interested in.

Rad Resource: A great example of a YWOC-led program is the Young Women of Color Leadership Council (YWOCLC), a youth initiative through Advocates For Youth. Through thoughtful engagement of young people in their work, the YWOCLC cultivates a message of empowerment for young women of color, and it serves as a great example of a true youth-organization partnership framework. Pass this resource along to the youth-focused organizations you work with!

The American Evaluation Association is celebrating Multiethnic Issues in Evaluation (MIE) Week with our colleagues in the MIE Topical Interest Group. The contributions all this week to aea365 come from MIE TIG members.


Hi! We are Silvia Salinas Mulder and Fabiola Amariles, co-authors of Chapter 9 of “Feminist Evaluation and Research: Theory and Practice.” Our chapter examines the fact that in our region, understanding and acceptance of gender mainstreaming as an international mandate is still slow, and even declining in some political and cultural contexts where the indigenous agenda and other internal and geopolitical issues are gaining prominence. Feminist evaluation can play an important role in generating evidence for policies that improve the lives of women, but it is necessary to make feminist principles operational in the context of the multicultural Latin American countries.

Lesson Learned: We should reconsider and reflect on concepts and practices usually taken for granted, like “participation.” In evaluations, members of the target population are usually treated as information resources rather than as key audiences, owners and users of the findings and recommendations of the evaluation. Interactions with excluded groups usually reproduce hierarchical power relations and paternalistic communication patterns between the evaluator and the people interviewed, which may shape participation patterns, as well as the honesty and reliability of responses.

Hot Tip: Emphasize that everyone should have a real opportunity to participate and also to decline to participate (e.g., informed consent), and should not fear any implications of that decision (e.g., formal or informal exclusion from future program activities). Having people decide about their own participation is a good indicator of ethical observance in the process.

Lesson Learned: Sensitivity and respect for the local culture often lead evaluators to misread rural communities as homogeneous entities, paying little attention to the internal diversity, inequality and power dynamics that influence, and are influenced by, the micro-political atmosphere of an evaluation, oftentimes reproducing patterns of exclusion.

Hot Tip: Pay attention and listen to formal leaders and representatives, but also search actively for the marginalized and most excluded people, enabling a secure and confidential environment for them to speak. Cultural brokers knowledgeable about the local culture are key to achieving an inclusive, context-sensitive approach to evaluation.

Lesson Learned: Another key concept to reflect on is “success.” On one hand, treating success as an objective, logically derived conclusion of “neutral” analysis usually ignores its power dynamics and its intrinsically political and subjective dimensions. On the other hand, evaluation cultures that privilege narrow, funder-driven definitions of success reproduce ethnocentric perspectives, distorting experiences and findings and diminishing their relevance and usefulness.

Hot Tip: Openly discussing the client’s and donor’s ideas about “success,” and their expectations for a “good evaluation” beyond the terms of reference, diminishes resistance to rigorous analysis and constructive criticism.

Rad Resources:

Silvia Salinas-Mulder and Fabiola Amariles on Gender, Rights and Cultural Awareness in Development Evaluation

Batliwala, S. & Pittman, A. (2010). Capturing Change in Women’s Realities.

The American Evaluation Association is celebrating Feminist Issues in Evaluation (FIE) TIG Week with our colleagues in the FIE Topical Interest Group. The contributions all this week to aea365 come from our FIE TIG members.


We are Lisa Aponte-Soto and Leah Christina Neubauer from Chicago. Aponte-Soto teaches at DePaul University, is an independent consultant in the areas of cultural competency, Latino/a health, and diversity talent management, and is a member of the Graduate Education Diversity Internship (GEDI) program 2009-2010 cohort. Neubauer is based in DePaul’s MPH Program and is the current President of the Chicagoland Evaluation Association (CEA).

At Evaluation 2013, a group of Latino/a evaluators and evaluators working with Latino-serving organizations gathered in the session “Fueling the AEA pipeline for Latino Evaluator Practitioners and Researchers.”

The session highlighted the importance of developing a pipeline of Latino/a evaluators whose lived experiences position them to practice evaluation through a culturally responsive lens. Drawing on personal and professional experiences, panelists Aponte-Soto, Neubauer, Maria Jimenez, and Saul Maldonado, contributor Gabriela Garcia, and discussants Debra Joy Perez and Rodney Hopson shared their personal and multi-ethnic identities and how these influence engaging in culturally responsive evaluation (CRE) practices within and among Latino cultures.

Did you know that the AEA 2007/2008 member scan report identified only 5% of members as Latino/a evaluators? Yet Latinos comprise the fastest-growing population in the U.S., presently accounting for 16.3% of Americans (U.S. Census, 2010) and a projected one-third of the population by 2050. As the U.S. Latino population continues to grow, evaluators and evaluation practices must responsively address the varied needs of Latino communities and cultures in order to determine the appropriateness of programs serving Latinos.

Lessons Learned: Top 5:

  1. Future directions include creating a formalized space for dialogue and knowledge sharing around Latino issues that impact evaluation practice by establishing an AEA Latino Issues TIG.
  2. Novice Latino evaluators need additional professional leadership development that provides formal training and supportive mentoring from senior evaluators.
  3. Cross-gender, same-gender, Latino, and non-Latino mentoring relationships are all valuable to the development of emerging evaluators; senior evaluators must be willing to invest in their protégés.
  4. Cross-cultural partners are needed to meet the growing needs of the Latino community and to assess the appropriateness of the programs serving it.
  5. Developing a CRE framework calls for expanding the existing critical paradigm by including LatCrit theory and the voices of other indigenous Latino-focused writers.

Hot Tip: Latino students interested in pursuing a career in evaluation practice should acquire academic training from a graduate program with an evaluation component, or seek supplemental training in a supportive professional environment like the AEA GEDI program.

Rad Resource: The Latina Researchers Network (http://latinaresearchers.com/) provides ongoing mentoring support, employment opportunities, and professional resources, including webinars on scholarly evidence-based knowledge sharing and talent development. The Network is open to both men and women online and through social media portals. The group will host a conference at John Jay College from April 3-5, 2014.

This week, we’re diving into issues of Cultural Competence in Evaluation with AEA’s Statement on Cultural Competence in Evaluation Dissemination Working Group.



Greetings, I am June Gothberg, incoming Director of the Michigan Transition Outcomes Project and past co-chair of the Disabilities and Other Vulnerable Populations Topical Interest Group (TIG) at AEA. I hope you’ve enjoyed a great week of information specific to projects involving these populations. As a wrap-up, I thought I’d end with broad information on involving vulnerable populations in your evaluation and research projects.

Lessons Learned: Definition of “vulnerable population”

  • The TIG’s big ah-ha.  When I came in as TIG co-chair, I conducted a content analysis of our TIG’s presentations over the past 25 years.  We had a big ah-ha when we realized who and what had been identified as “vulnerable populations.”  The list included:
    • Abused
    • Abusers
    • Chronically ill
    • Culturally different
    • Economically disadvantaged
    • Educationally disadvantaged
    • Elderly
    • Foster care
    • Homeless
    • Illiterate
    • Indigenous
    • Mentally ill
    • Migrants
    • Minorities
    • People with disabilities
    • Prisoners
    • Second language
    • Veterans – “wounded warriors”
  • Determining vulnerability.  The University of South Florida provides the following criteria for determining vulnerability in research:
    • Any individual whose ability to make fully informed decisions for him- or herself is diminished, due to conditions either acute or chronic, can be considered vulnerable.
    • Any population that, due to circumstances, may be vulnerable to coercion or undue influence to participate in research projects.


Hot Tips:  Considerations for including vulnerable populations.

  • Procedures.  Use procedures to protect and honor participant rights.
  • Protection.  Use procedures to minimize the possibility of participant coercion or undue influence.
  • Accommodation.  Prior to the start, determine and disseminate how participants will be accommodated with regard to recruitment, informed consent, protocols and questions asked, retention, and research procedures, including accommodations for participants with literacy, communication, and second-language needs.
  • Risk.  Minimize any unnecessary risk to participants.

Hot Tips:  When your study is targeted at vulnerable populations.

  • Use members of the targeted group to recruit and retain subjects.
  • Collaborate with community programs and gatekeepers to share resources and information.
  • Know the formal and informal community.
  • Examine cultural beliefs, norms, and values.
  • Disseminate materials and results in an appropriate manner for the participant population.


The American Evaluation Association is celebrating Disabilities and Other Vulnerable Populations TIG (DOVP) Week. The contributions all week come from DOVP members.


We are Matt Militello and Chris Janson. We have been working together since 2002. Our first collaboration was as educators in a public high school in Michigan, where Militello was an assistant principal and Janson a counselor. We both left the K-12 setting to obtain doctoral degrees and now continue to work together on research and grants.

We also conduct a number of evaluations of projects funded by the National Science Foundation, the U.S. Department of Education, and the W.K. Kellogg Foundation. We created EduTrope for our evaluation and consulting work. What has set our evaluation and consulting work apart is our use of Q methodology, a means of quantifying people’s subjectivity.

Hot Tip: Q methodology begins with the construction of a set of statements. Prior to a meeting or gathering, participants sort the statements into a forced distribution, from least believed or perceived (on the left) to most believed or perceived (on the right).


The sorts are factor analyzed to create groups. When participants arrive at the meeting, we assign them to a table, where they sit with others who sorted the statements in a statistically similar fashion. Next, we empower participants to interpret their group’s distribution of statements and ask them to create a name for their group.

This video is a demonstration of the Q process for evaluation from beginning to end.

Lesson Learned: Currently we are evaluating a Kellogg Foundation initiative, the Community Learning Exchange (see communitylearningexchange.org). The Fall 2012 gathering was hosted by Salish & Kootenai College in Montana. The theme was “Transforming Education from an Instrument of Historical Trauma to an Instrument of Healing.” We created a video representation of an indigenous story narrated by community members. The video was then used to gather input from tribal elders. Based on the feedback, we created 31 statements.

Sixty gathering participants sorted the statements. Click the link below to participate in the actual sort; the process begins by watching the video embedded in the link.

Rad Resources: For more information on Q methodology visit www.qmethod.org.

Finally, this video provides testimony by people who have experienced the Q process in our evaluation work.

Want to learn more from Matt and Chris? Register for: Q Methodology: A Participatory Evaluation Approach That Quantifies Subjectivity at Evaluation 2013 in Washington, DC.

This week, we’re featuring posts by people who will be presenting Professional Development workshops at Evaluation 2013 in Washington, DC.
