AEA365 | A Tip-a-Day by and for Evaluators

Category: Indigenous Peoples in Evaluation

Hi! We are Morgan J Curtis (independent consultant) and Strong Oak Lafebvre (executive director of Visioning BEAR Circle Intertribal Coalition).  Along with Patrick Lemmon (independent consultant), we have the good fortune of serving as the evaluation team for the Walking in Balance (WIB) initiative.

WIB is an innovative approach to violence prevention that focuses on 12 important indigenous values that encourage better harmony with other people and the land. The primary component of WIB is a 13-session curriculum that is built on a Circle Process and that, with some adaptations, can be focused on different populations. The Circle Process involves storytelling and sharing by all participants, including the Circle Keeper who serves to move the conversation forward. A teaching team of four, seated in the four directions, diminishes the role of a single expert and promotes Circle members talking with each other rather than to the Circle Keeper.

Lessons Learned: This program presents many exciting evaluation opportunities and challenges. One challenge is ensuring that the evaluation is both culturally responsive and methodologically sound. Adding to this challenge, all members of the evaluation team are located in different cities, and the evaluation consultants have all been white folks. This process has involved much trial and error, both in our collaborative process and in the evaluation methodologies themselves. The team wanted to design an evaluation that aligned with the program’s principles and integrated into the Circle Process as seamlessly as possible. We currently have a pre- and post-question for each session; participants write their answers on notecards and share aloud with the Circle, which flows well with the storytelling focus of the Circles.  Additional questions at the beginning and end of the Circle invite participants to share aloud how each session transformed them and ways continued engagement in the Circle impacts their lives. We capture responses from all parties to track how the Circle Process transforms both the teaching team and participants.  The VBCIC teaching team loves the seamless nature of the evaluation process and finds that checking in about what happens between sessions captures changes in behavior based on learning directly linked to Circle teachings.

Hot Tip: Listening plays a key role in both the Circle Process itself and in developing the evaluation. We have established a process of following the lead of the Visioning BEAR team both by listening intently to their struggles and hopes and also by offering options for how to tweak the evaluation. They move forward with what feels right to them and report back to us. Then, we keep tweaking. We are working to make the data analysis and interpretation processes more collaborative as we move forward, too.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Roxann Lamar and I work in research and evaluation at the Center for Human Development, University of Alaska Anchorage. Our local AEA chapter, the Alaska Evaluation Network (AKEN), hosted a discussion on cultural competence, particularly relevant to Native cultures. About 19% of Alaskans have all or partial Native heritage.

The AEA’s statement on cultural competence in evaluation is comprehensive, covering a multitude of issues involved in working together in a diverse world. What is presented here is a perspective to think about – how people might respond to the language we choose to use – not that any language is universally right or wrong.

Lesson Learned: Our event was called, “Cultural Competence in Evaluation.” Our panel of cross-cultural experts included persons of DegXit’an Athabascan, Gwich’in Athabascan, Navajo, and non-Native heritage. All had a lifetime of personal and professional experience with cultures indigenous to Alaska. They reminded us at the start that the words we use are important and informed us that they found the term “cultural competence” to be distasteful. They highly encouraged us to use the term “cultural humility” and noted it is not a new idea. They also suggested “cultural relevance” as an acceptable alternative that makes more sense in some contexts.

Our panelists explained that the problem with “competence” is that it implies we will reach a point where we can say, “We are culturally competent.” That is what is implied when people go to a workshop for a certain number of hours and earn a certificate in cultural competence. Our panelists pointed out that these trainings often do more harm than good. For example, focusing on characteristics of specific cultures inadvertently encourages stereotyping. The panel’s audience was intrigued, and discussions among colleagues continued long after the event.

Hot Tip: In many places or contexts, a term like “cultural humility” is a respectful choice. Without a lot of explanation it conveys a humble posture of learning about self and others. It implies openness, equity, and flexibility in working with anyone.

Rad Resource: With a little looking around, I found Cultural Humility: People, Principles, & Practices. This is a 30-minute, 4-part documentary by Vivian Chávez (2012). It is focused on relationships between physicians and patients, but the principles can be applied in other contexts.


Greetings, I am June Gothberg, incoming Director of the Michigan Transition Outcomes Project and past co-chair of the Disabilities and Other Vulnerable Populations topical interest group at AEA.  I hope you’ve enjoyed a great week of information specific to projects involving these populations.  As a wrap-up, I thought I’d end with broad information on involving vulnerable populations in your evaluation and research projects.

Lessons Learned: Definition of “vulnerable population”

  • The TIG’s big ah-ha.  When I came in as TIG co-chair, I conducted a content analysis of our TIG’s presentations over the past 25 years.  We had a big ah-ha when we realized what and who is identified as “vulnerable populations”.  The list included:
    • Abused
    • Abusers
    • Chronically ill
    • Culturally different
    • Economically disadvantaged
    • Educationally disadvantaged
    • Elderly
    • Foster care
    • Homeless
    • Illiterate
    • Indigenous
    • Mentally ill
    • Migrants
    • Minorities
    • People with disabilities
    • Prisoners
    • Second language
    • Veterans – “wounded warriors”
  • Determining vulnerability.  The University of South Florida provides the following to determine vulnerability in research:
    • Any individual who, due to conditions either acute or chronic, has a diminished ability to make fully informed decisions for him/herself can be considered vulnerable.
    • Any population that, due to circumstances, may be vulnerable to coercion or undue influence to participate in research projects.


Hot Tips:  Considerations for including vulnerable populations.

  • Procedures.  Use procedures to protect and honor participant rights.
  • Protection.  Use procedures to minimize the possibility of participant coercion or undue influence.
  • Accommodation.  Prior to the start, determine and disseminate how participants will be accommodated with regard to recruitment, informed consent, protocols and questions asked, retention, and research procedures, including for those with literacy, communication, and second-language needs.
  • Risk.  Minimize any unnecessary risk to participation.

Hot Tips:  When your study is targeted at vulnerable populations.

  • Use members of the targeted group to recruit and retain subjects.
  • Collaborate with community programs and gatekeepers to share resources and information.
  • Know the formal and informal community.
  • Examine cultural beliefs, norms, and values.
  • Disseminate materials and results in an appropriate manner for the participant population.

Rad Resources:

The American Evaluation Association is celebrating the Disabilities and Other Vulnerable Populations TIG (DOVP) Week. The contributions all week come from DOVP members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Dan McDonnell, and I am a Community Manager at the American Evaluation Association.

For me, social media is life, both personally and professionally. I met my wife on Twitter and had a live tweet feed and hashtag at our wedding. But what I love most about social media is the data: I spend hours analyzing the growth, engagement and effectiveness of various social media communities. What types of Facebook posts get people talking? What tweets are being retweeted the most?

To help you through the social media evaluation process, I’ve put together a couple of tips to help you evaluate the impact of your social media campaigns.

Hot Tip: Determine Key Performance Indicators
What goals are you trying to accomplish with your social media presence? Select key performance indicators (KPIs) that tie to your overall marketing strategies. If you’re trying to increase traffic to your website through social media, look at social media referrals in Google Analytics. If you’re trying to promote year-round engagement and in-depth discussion with your audience, focus on Twitter @mentions, Facebook comments and shares, as well as comments on LinkedIn as measures of success.
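
To make the KPI idea concrete, here is a minimal sketch of how one engagement measure might be computed from exported post data. The field names and numbers are hypothetical, not drawn from any particular platform’s API.

```python
# Hypothetical sketch: computing a simple engagement-rate KPI from
# exported post data. Field names and figures are illustrative only.

def engagement_rate(posts):
    """Total interactions (likes + comments + shares) divided by total impressions."""
    interactions = sum(p["likes"] + p["comments"] + p["shares"] for p in posts)
    impressions = sum(p["impressions"] for p in posts)
    return interactions / impressions if impressions else 0.0

posts = [
    {"likes": 12, "comments": 3, "shares": 5, "impressions": 400},
    {"likes": 30, "comments": 10, "shares": 10, "impressions": 600},
]

# 70 total interactions across 1,000 impressions
print(f"Engagement rate: {engagement_rate(posts):.1%}")
```

Whatever tool exports the data, anchoring each KPI to a small, explicit calculation like this keeps the metric tied to the marketing goal it is meant to measure.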


Image courtesy of Google Analytics.

Hot Tip: Conduct a Listening Campaign
A tried and true social media listening campaign will provide you with a wealth of intelligence to evaluate. See what industry leaders and influencers are talking about on their social media channels. Research what hashtags and topics are trending on Twitter and which keywords appear commonly in social media conversation around your industry.


 Image courtesy of Wordle.com.

Rad Resource: Measure with Sprout Social


Image courtesy of Sprout Social.

Social media analytics software Sprout Social is a great resource for immediately starting to track your campaign performance. Normally, you have to dig individually into each platform to get this data, and Twitter’s baseline reporting is notoriously limited, necessitating the use of third-party applications. Sprout streamlines the process, allowing you to easily track stats across Twitter, Facebook, Google Analytics, Google+ and more. You can review the demographics of your followers or measure numerous reach and engagement metrics, like new followers on Twitter or shares on Facebook.

Once you’ve connected your profiles to Sprout and have tracked a meaningful data set (a minimum of 3-6 months is recommended), the more in-depth analysis can truly begin.
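
Once a few months of data are in hand, even simple month-over-month calculations become informative. As a rough illustration (the follower counts below are invented), growth might be summarized like this:

```python
# Hypothetical sketch: month-over-month follower growth from a series
# of monthly follower-count snapshots (invented numbers, any platform).

def monthly_growth(followers):
    """Percent change between consecutive monthly follower counts, one decimal place."""
    return [
        round(100 * (current - previous) / previous, 1)
        for previous, current in zip(followers, followers[1:])
    ]

counts = [1000, 1100, 1210, 1180]  # e.g., four monthly snapshots
print(monthly_growth(counts))      # percent change for each month-to-month step
```

A dip in the series stands out immediately, which is exactly the kind of signal worth investigating before diving into deeper analysis.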

One word of caution: there is such a wealth of data available that it can be easy to go deep into the rabbit hole of analytics. My recommendation is to focus on a handful of measurements to maximize the time you spend on evaluation and minimize the time spent pulling data. Happy evaluating!



Hello, I am Vanessa Hiratsuka, secretary of the Alaska Evaluation Network (AKEN) and a senior researcher at Southcentral Foundation (SCF), a tribally owned and managed regional health corporation based in Anchorage, Alaska, which serves Alaska Native and American Indian people.

As part of Commitment to Quality, a key organizational value, Southcentral Foundation (SCF) prioritizes continuous quality improvement (CQI), quality assurance, program evaluation, and research.

Although the strategies and tools used in CQI, quality assurance, program evaluation, and research are similar, we do different things. One of our challenges is to help staff across the organization understand who does what. Because these four fields differ in aim and audience, exploring the goals of a project (aim) and who will use its findings (audience) provides a useful framework to determine where a project fits.


At SCF, improvement staff work directly with SCF department and clinic processes to develop and implement project performance measures and outcome indicators as well as help staff (audience) improve processes to better meet customer-owner needs and inform business directions (aim).  Quality Assurance staff conduct quality monitoring to ensure programs are complying (aim) with SCF processes and the requirements of our accrediting bodies (internal and external audiences).

SCF internal evaluators measure programs’ performance (aim) and provide feedback to programmatic stakeholders — including staff, leadership, and funders (audience). The SCF research department’s projects address questions of clinical significance to contribute to generalizable knowledge (aim) for use within SCF and for dissemination in the scientific literature around American Indian and Alaska Native health (audience).

Lessons Learned:

–        Define the aim and intended audience early in the process! This helps identify the stakeholders, level of review, and oversight needed during all stages of a project, including development, implementation, and dissemination of findings.

–        Broadly disseminate findings! Findings and recommendations from all disciplines are only useful when they are shared. At SCF, findings are shared at interdivisional committee meetings and with staff who oversee the work of departments. Multipronged dissemination ensures involvement from all levels of SCF and supports innovation and the spread of new knowledge.

–        Project review can be complicated!  At SCF, research projects must be vetted through a tribal concept review phase, an Institutional Review Board review, and finally a tribal review of the proposal.  Later, all research dissemination products (abstracts for presentation, manuscripts, and final reports) are also required to undergo a tribal research review process. These take time, so it is important to understand the processes and timelines and build review time into your project management timelines.

Check out these posts on understanding evaluation:

  1. Gisele Tchamba on Learning the Difference between Evaluation and Research
  2. John LaVelle on Describing Evaluation

The American Evaluation Association is celebrating Alaska Evaluation Network (AKEN) Affiliate Week. The contributions all this week to aea365 come from AKEN members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello! My name is Amelia Ruerup, I am Tlingit, originally from Hoonah, Alaska although I currently reside in Fairbanks, Alaska.  I have been working part-time in evaluation for over a year at Evaluation Research Associates and have spent approximately five years developing my understanding of Indigenous Evaluation through the mentorship and guidance of Sandy Kerr, Maori from New Zealand.  I consider myself a developing evaluator and continue to develop my understanding of what Indigenous Evaluation means in an Alaska Native context.

I have come to appreciate that Alaska Natives are historic and contemporary social innovators who have always evaluated to determine the best ways of not only living, but thriving in some of the most dynamic and, at times, harshest conditions in the world.  We have honed skills and crafted strict protocols while cultivating rich, guiding values.  The quality of our programs, projects, businesses and organizations is shaped by our traditions, wisdom, knowledge and values.  It is with this lens that Indigenous Evaluation makes sense for an Alaska Native context, as a way to establish the value, worth and merit of our work where Alaska Native values and knowledge both frame and guide the evaluation process.

Amidst the great diversity within Alaska Native cultures, we share certain collective traditions and values.  As Alaska Native peoples, we share a historical richness in the use of oral narratives.  Integral information, necessary for thriving societies and for passing on cultural intelligence, has long been passed on to the next generation through storytelling. It is also one commonality that connects us to the heart of Indigenous Evaluation.  In the Indigenous Evaluation Framework book, the authors explain that, “Telling the program’s story is the primary function of Indigenous evaluation…Evaluation, as story telling, becomes a way of understanding the content of our program as well as the methodology to learn from our story.” To tell a story is an honor.  In modern Alaska Native gatherings, we still practice the tradition of only certain people being allowed to speak or tell stories.  This begs the question: Who do you want to tell your story, and do they understand the values that are the foundation and framework for your program?

Hot Tip: Context before methods.  It is essential to understand the Alaska Native values and traditions that are the core of Alaska Native serving programs, institutions and organizations.  Indigenous Evaluation is an excellent approach to telling our stories.

Rad Resource: The Alaskool website hosts a wealth of information on Alaska Native cultures and values.  This link will take you to a map of “Indigenous Peoples and Languages of Alaska”

The American Evaluation Association is celebrating Alaska Evaluation Network Week. The contributions all this week to aea365 come from AKEN members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings from Alaska. I’m kas aruskevich, principal of Evaluation Research Associates (ERA). I work in rural Alaska with a great team of evaluators, associates, and local intermediaries. In the unique Alaskan context in which we work, telling the story through video helps us to show the context of people, place, and situations. Video clips, compiled into a video report, can be used as evidence of accomplishment as well as to educate an audience (often the funder) holistically about a project. Shorter impact videos can also motivate participants, giving the evaluation an effect beyond reporting.

Most of us have used written interview quotes in our evaluation reports. As an example, below is a quote from an interview with a Gaalee’ya STEM project student:

Uvana atiga Nanuuraq (my name is Nanuuraq) I’m from a place called Noatak, my name is Brett James Kirk, 18 years old, incoming freshman at the University here in Fairbanks. So far what I know about STEM seems great. I really agree with how they’re incorporating the indigenous ways with the western ways here because we have a chance to talk about the similarities and differences between the two. And I’m looking forward to all the other meetings throughout the school year.

Compare the 40-second video clip below with the text quoted above. If the video does not show in your browser or email reader, go to https://vimeo.com/62366707 to view it on Vimeo.

Gaalee’ya-AEA from kas aruskevich on Vimeo.

Lessons Learned – Generally:

  1. Good audio is EXTREMELY important.
  2. Shooting footage is easy, editing the video is challenging.
  3. Editing is time consuming. One minute of finished video may take 8 or more hours of editing – and that’s after clips are selected and cut to approximate size.
  4. Take good pictures. It’s easy to add motion to a photograph and use it as background for an audio quote taken from an interview.

A good evaluative video starts with data collection in the form of video and photos that gives evidence of accomplishment and provides visual description.

Lessons Learned – Taming the technology:
For the majority of video reports I work with a local videographer who has also mentored me in both camera use (Canon 7D) and audio (Zoom H4n 4-Track Recorder). After three years of video production, I primarily stick to photographs and video editing (Final Cut Pro 7). I’ve produced video reports 20 minutes in length and shorter; however, now I prefer to produce supplemental impact videos that are three minutes or less. Remember it’s technology, and with technology comes glitches.

Rad Resources to explore:

But most important, know how to conduct an appropriate evaluation, be reciprocal, gather good evidence, and report out. The rest is technology.

We’re focusing on video use in evaluation all this week, learning from colleagues using video in different aspects of their practice. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Karen Vocke, and I am an English Education professor at Western Michigan University.  My work has long focused on migrant farm worker education at multiple levels—families, educators, and programming.

Migrant workers are often referred to as the “invisible people” because of their status as one of America’s most marginalized, vulnerable, and undereducated populations. These families’ livelihoods derive from harvesting a variety of crops; they often move frequently in order to remain employed. Migrant farm workers are a resource vital to the nation’s agricultural industry and are part of many rural communities, yet the educational progress of their children lags far behind mainstream standards. Services and opportunities for these students and their families are often fragmented, both from an educational standpoint and from an auxiliary community support perspective.  These children, many of whom speak little or no English, may attend as many as three schools in one academic year as families travel from worksite to worksite. Still other families “settle out,” remaining in a community and working in agriculture-related jobs when they can. Literacy education and language support opportunities for families are limited.  Educational and service programs vary dramatically in resources and services.

Lesson Learned:

Evaluators and researchers need to consider the issues of transiency and culture when working with this population.  Gaining access, whether it be to examine a program or the population itself, is a time-consuming process.  Trust is paramount. My own research has been based on access to migrant families attained by spending weeks in the company of community insiders, visiting camps and educational programs.  Access, based on trust and mutual respect, elicits the most authentic responses to evaluation and research query.

Hot Tips:

Most importantly, access to migrant populations must be facilitated by one of that community, a “gatekeeper” of sorts.  For example, my own visits to migrant camps were always in the company of the school district “recruiter,” a liaison between the school and camps.  Access and participation can only be authentic when a collaborative and culturally sensitive foundation has been built.

Rad Resources:

Learn as much as possible about the unique migrant culture before attempting any evaluation or research endeavor.  Several exemplary programs and informational websites include the following:


I am Humberto Reynoso-Vallejo, a Director for Program Evaluation with the Center for Health Policy and Research at the University of Massachusetts Medical School.

Conducting evaluation entails the incorporation of multicultural features of individuals and/or organizations into the process of data collection and analysis. These multicultural features are socially constructed and are translated into layers of identity reflected as multicultural identity. Multicultural identity has a powerful impact on individuals since it confers particular social meanings to each layer (e.g. we may have to work with a lesbian African-American woman, or a recent male immigrant from Guatemala with a particular medical condition, or an organization with employees from diverse racial/ethnic backgrounds). Life chances and opportunities are determined by these layers of identity that place some individuals in privileged positions based on certain dominant principles in society (e.g. White, male, heterosexual, able-bodied.).

Multicultural processes include the incorporation of multiple layers of identity in the evaluation. These layers work in complex ways and we may significantly enrich our work when including them in the analysis. Organizations or stakeholders can be seen as entities with a multicultural identity, some more aware than others.

Hot tips:

  • Take appropriate notes on multicultural processes during the evaluation; these may eventually be included in your reports or articles as important material.
  • Be aware of your own multicultural identity and how this interacts with stakeholders. Be open and reject myths such as color blindness. Try to build coalitions with diverse groups.
  • Conduct an assessment of the multiculturality of the organization/stakeholders in terms of their social, cultural, and political representation; value, celebrate and capitalize on differences; and level of engagement in eliminating forms of oppression such as racism, sexism, and ageism. Quantity and quality of services may be related to an organization’s ability to provide a socially just working environment for all its employees.
  • Be empathetic and place yourself in the other person’s shoes, paying attention to your reactions during that process.
  • Be aware of your reactions to difference and, when pertinent, include this in your deliverables.
  • Using “I” statements rather than “we” or “you” allows you to be more fully present in the interchange and avoids the mistake of trying to represent people whose multicultural background is similar to yours, or making erroneous assumptions about people you are interacting with who may share similar identities.
  • Avoid jumping to conclusions about people you are interacting with based on socially learned preconceived notions about certain population groups.
  • Pay special attention to content (what we say) and process (how we say it).
  • Be aware of intent (what you are trying to convey) and impact (the person’s reaction to what you are saying).



Hi, I’m Dominica McBride, President of The HELP Institute, Inc. and a member of the AEA Public Statement on Cultural Competence in Evaluation Dissemination Working Group. I also have my PhD in Counseling Psychology and have provided psychotherapy. This tip focuses on the affective and psychological side of cultural competence in everyday evaluation practice.

Culturally competent evaluation practice requires self-awareness and self-reflection. So much of our evaluation practice is guided by our decisions. It would be nice to think that our decisions are mostly driven by our frontal lobes – the seat of deliberation and reasoning; however, as recent neuroscience research has discovered, most of our choices are influenced by our subconscious mind. Our subconscious mind (influenced by the limbic part of our brains, also known as the emotional brain) consists of our experiences, exposures, and emotions. Our experiences literally shape the wiring of our brains, and repeated exposure to similar messages connects our brain cells, which leads to more automatic thoughts. So, if someone is exposed to repeated messages depicting Muslims as terrorists, for example, their brain begins to incorporate this. These thoughts become a part of us, even subconsciously, and can negatively affect our interactions and decisions in working with the group, especially in the absence of antithetical experiences. Microaggressions, which are unintentional slights toward a person related to their group affiliation, can begin to develop. They can also show up in interactions and decision making within an evaluation, like forgetting or overlooking the inclusion of a certain group in research or evaluation design. For example, a 21st-century study “found” a lack of facial recognition abilities in African-Americans compared to Euro-Americans. However, due to cultural incompetence, the participants were only shown Caucasian faces. When the study was corrected with cultural competence, there was no difference.

The Statement states “cultural competence is a stance taken toward culture” and “culturally competent evaluators respect the cultures represented in the evaluation.” To be culturally competent and value and respect culture and different communities, the Statement asserts that we must challenge our stereotypes and ameliorate our biases. We have to examine and address the biases hidden in our subconscious that influence our decision making, interactions with others, and evaluation practice.

Hot Tips:

  • Take the Implicit Association Test. This test will inform you of some of your implicit biases.
  • Examine your biases through journaling, deliberately find and create experiences that counter your stereotypes, and make conscious note of experiences that do not support them.

Rad Resources:

  • Blink is a good book that describes our subconscious mind and its influences on decision-making and interactions with others
  • Crash is a provocative movie graphically demonstrating explicit and implicit biases and their effects on others

The American Evaluation Association will be celebrating Cultural Competence Week. The contributions all this week come from the Cultural Competence committee. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

