AEA365 | A Tip-a-Day by and for Evaluators


My name is Tanya Ostrogorsky, Assistant Vice Provost for Assessment and Evaluation at Oregon Health & Science University, and I’ve been involved with the Oregon Program Evaluators Network (OPEN) since 2002. I ‘grew up’ studying research methods and data analysis and, looking back, I was functioning as an evaluator before I knew what that meant. It wasn’t until my doctoral program that I took my first program evaluation course and attended an OPEN conference. Since then I’ve held leadership positions on seven different occasions, including a long stretch as OPEN President during a difficult time in the organization’s history.

The purpose of this post is not to tell you about my trajectory as a local affiliate leader, but to share lessons learned through my observations about the role and function of the local affiliates in supporting AEA’s mission. I also want to remind us how critical the local affiliates are to the development of local talent as well as the national leadership pipeline. Finally, I want to highlight the under-realized sources of energy, excitement, and real diversity in our midst.

Recently, 126 conference attendees ranging from students to newly minted graduates to early careerists to long-timers gathered to hear about the Top 10 Trends in Evaluation with Dr. Stewart Donaldson. My first reaction to that day was a strong sense of pride in watching a local affiliate consistently deliver significant professional development opportunities for 16 years. My second reaction, as I scanned the room, was to the diverse and exciting mix of attendees representing our past, our present, and our future.

So, what’s my point? Just as AEA needs to leverage and develop the local affiliates, past local affiliate leaders need to ensure the next generations of evaluators are provided the organizational history and encouragement to pick up where we left off. In both cases, we have a professional responsibility to support and encourage our peers in taking the next step in their leadership development. We need to offer encouragement and harness their energy. Yes, they will stumble and they will reinvent the wheel, but so did we.

Lesson learned: We must leverage the talents and energy of the local affiliates to develop the leadership pipeline. My hope is that AEA can bring its focus to the power of local affiliates to create a strong organizational legacy. At the same time, it is local affiliate leaders’ responsibility to ensure that we do our part and maintain a strong community to support AEA.

Hot Tip: Local/regional AEA affiliates offer many opportunities to build our evaluation community. Find yours at http://www.eval.org/p/cm/ld/fid=12 and take the next step!


The American Evaluation Association is celebrating Oregon Program Evaluators Network (OPEN) Affiliate Week. The contributions all this week to aea365 come from OPEN members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Kim Firth Leonard, and I have the honor of authoring the first post on the aea365 blog for Oregon Program Evaluators Network (OPEN) week. I have been an AEA member since 2008, and am currently President of OPEN, a local affiliate of AEA founded in 1997. I work as Assessment Research Coordinator at Marylhurst University in Portland, Oregon, and do contract work in program evaluation via Leonard Research and Evaluation, LLC.

This week’s posts were by OPEN members who have played important volunteer and leadership roles for OPEN. The posts demonstrate the value of our local network by sharing lessons we’ve gathered in reflecting on our work together as evaluators and as volunteers with OPEN.

I have learned much about evaluation and about building learning communities through OPEN. The bulk of the work done by OPEN’s volunteer Council and Committees is in organizing and supporting local events. OPEN’s mission is to provide a regional, interdisciplinary forum for professional development, networking, and exchange of practical, methodological, and theoretical knowledge in the field of evaluation. It is through these events that we build learning communities, and in doing so strengthen our work individually, and as a field.  

Get Involved: Whether you have a local affiliate or just an informal network of other evaluators in your area, you too can host, lead, contribute to, or benefit from local evaluation events.

  • Host: Events don’t have to be massive undertakings to be successful. Small, informal gatherings can be just as valuable as large conferences. “Have an idea? Go for it” is practically our events committee motto.
  • Lead: Local events are great places to practice your presentation and training skills. Discussion groups, like OPEN’s new-ish Book Club, are low-pressure and offer opportunities to discuss emerging topics.
  • Contribute: Volunteer to help organize events for unique networking opportunities. Learning event planning skills is icing on the cake.
  • Benefit: It’s all about learning together. Valuable learning about one another and the field can happen at any get-together – so attend local events whenever you’re able.

Lesson Learned: OPEN has always been welcoming to community members who don’t identify as evaluators, exactly, but do related work or want to learn more about evaluation. In the last year or so we’ve been emphasizing this openness (ha!) and we’ve found that collaborating with and learning from others in related fields greatly enriches our evaluation learning community. Sessions at our recent conference, intended to create opportunities to learn from and with others in our community, including non-profit leaders, were well received.

Rad Resource: Materials from our 2013 conference are available on our website, http://oregoneval.org/.

Rad Resource: Your own learning community is at your local affiliate or among other local AEA members.


The American Evaluation Association is celebrating Oregon Program Evaluators Network (OPEN) Affiliate Week. The contributions all this week to aea365 come from OPEN members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello, I am Vanessa Hiratsuka, secretary of the Alaska Evaluation Network (AKEN) and a senior researcher at Southcentral Foundation (SCF), a tribally owned and managed regional health corporation based in Anchorage, Alaska, which serves Alaska Native and American Indian people.

As part of its Commitment to Quality, a key organizational value, SCF prioritizes continuous quality improvement (CQI), quality assurance, program evaluation, and research.

Although the strategies and tools used in CQI, quality assurance, program evaluation, and research are similar, the four functions do different things. One of our challenges is to help staff across the organization understand who does what. Because these four fields differ in aim and audience, exploring the goals of a project (aim) and who will use its findings (audience) provides a useful framework for determining where a project fits.

[Hiratsuka graphic: the aims and audiences of CQI, quality assurance, program evaluation, and research at SCF]

At SCF, improvement staff work directly with SCF department and clinic processes to develop and implement project performance measures and outcome indicators, as well as to help staff (audience) improve processes to better meet customer-owner needs and inform business directions (aim). Quality Assurance staff conduct quality monitoring to ensure programs are complying (aim) with SCF processes and the requirements of our accrediting bodies (internal and external audiences).

SCF internal evaluators measure programs’ performance (aim) and provide feedback to programmatic stakeholders — including staff, leadership, and funders (audience). The SCF research department’s projects address questions of clinical significance to contribute to generalizable knowledge (aim) for use within SCF and for dissemination in the scientific literature around American Indian and Alaska Native health (audience).
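To illustrate, the aim-and-audience lens described above can be captured in a simple lookup table. The sketch below is a hypothetical paraphrase in Python – the wording is ours, drawn from this post, not an official SCF tool:

```python
# Hypothetical sketch of the aim-and-audience framework described above;
# phrasing is paraphrased from this post, not SCF's official definitions.

FRAMEWORK = {
    "continuous quality improvement": {
        "aim": "improve processes to better meet customer-owner needs",
        "audience": "department and clinic staff",
    },
    "quality assurance": {
        "aim": "monitor compliance with SCF processes and accreditation requirements",
        "audience": "internal staff and external accrediting bodies",
    },
    "program evaluation": {
        "aim": "measure program performance and provide feedback",
        "audience": "staff, leadership, and funders",
    },
    "research": {
        "aim": "answer clinically significant questions for generalizable knowledge",
        "audience": "SCF and the wider scientific literature",
    },
}

# Asking "what is this project's aim, and who is its audience?" places the
# project in one of the four rows above.
for field, lens in FRAMEWORK.items():
    print(f"{field}: aim = {lens['aim']}; audience = {lens['audience']}")
```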

Lessons Learned:

  • Define the aim and intended audience early in the process! This helps identify the stakeholders, level of review, and oversight needed during all stages of a project, including development, implementation, and dissemination of findings.

  • Broadly disseminate findings! Findings and recommendations from all disciplines are only useful when they are shared. At SCF, findings are shared at interdivisional committee meetings and with staff who oversee the work of departments. Multipronged dissemination ensures involvement from all levels of SCF and supports innovation and the spread of new knowledge.

  • Project review can be complicated! At SCF, research projects must be vetted through a tribal concept review phase, an Institutional Review Board review, and finally a tribal review of the proposal. Later, all research dissemination products (abstracts for presentation, manuscripts, and final reports) are also required to undergo a tribal research review process. These reviews take time, so it is important to understand the processes and timelines and build review time into your project management timelines.

Check out these posts on understanding evaluation:

  1. Gisele Tchamba on Learning the Difference between Evaluation and Research
  2. John LaVelle on Describing Evaluation

The American Evaluation Association is celebrating Alaska Evaluation Network (AKEN) Affiliate Week. The contributions all this week to aea365 come from AKEN members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings! My name is kas aruskevich and I am principal of Evaluation Research Associates LLC. I live in Fairbanks and work primarily in rural Alaska. Alaska, the largest state in the U.S., is known for its great natural beauty, extreme temperatures, and unique context of diverse and far-flung communities accessible only by air.

[Image: map of Alaska]

Rural communities often have small populations and rarely have a local evaluator for hire. Consequently, a program evaluator is most often hired from outside the community or region. “Helicopter evaluation” is a deprecating term used to describe a drop-in, evaluate, depart approach. Today’s post talks about methods to strengthen and add depth to evaluations that involve distance between evaluator and evaluand.

Hot Tip: First, context is important. Familiarize yourself with the community and region before you travel. Gather demographic data on the community, its leading industries, and its cultural composition. Learn about the organization hosting the program before your first contact. Plan your site visit around a community event so you can see the community in a broader context.

Rad Resource: The importance of context is discussed in New Directions for Evaluation Fall 2012, Issue 135.

Hot Tip: Next, work to build open communication with program staff. Begin with a teleconference to provide an opportunity to meet the staff and the organization and to discuss program status. Teleconferences also give you a chance to describe your evaluation style and see if you are a ‘fit’ for the organization and the evaluation project.

Hot Tip: ALWAYS include participatory methods. I don’t ‘come in’ as the expert with an unchangeable evaluation design, but instead write up suggestions for the evaluation that we negotiate before a plan is finalized. As an itinerant evaluator you can’t be on site as often as you might like. With a participatory evaluation approach, program staff can be involved in the evaluation by taking photos or identifying program participants or stakeholders to interview.

Rad Resource: Read more about participatory evaluation in Cousins and Chouinard’s new book Participatory Evaluation Up Close.

Hot Tip: Lastly, work to build a friendly relationship based on mutual interests with at least one person in the organization or community. Over my years of conducting evaluations, friendly working relationships have evolved into continuing friendships. These friendships have mutual benefits: in part, they are a bridge for the evaluator to learn community-specific cultural protocols, which are very important for conducting evaluations in cross-cultural settings and which in turn can strengthen the program through appropriate evaluation.

Lesson Learned: Itinerant evaluation can be much more than a helicopter site-visit approach. Regular communication and working together with program staff as a team can expand the evaluative evidence collected and increase report credibility, relevance, and use by the program staff.

The American Evaluation Association is celebrating Alaska Evaluation Network (AKEN) Affiliate Week. The contributions all this week to aea365 come from AKEN members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


We are Alexandra Hill and Diane Hirshberg, and we are part of the Center for Alaska Education Policy Research at the University of Alaska Anchorage.  The evaluation part of our work ranges from tiny projects – just a few hours spent helping someone design their own internal evaluation – to rigorous and formal evaluations of large projects.

In Alaska, we often face the challenge of conducting evaluations with very small numbers of participants in small, remote communities. Even in Anchorage, our largest city, there are only 300,000 residents. We also work with very diverse populations, both in our urban and rural communities. Much of our evaluation work is on federal grants, which need to both meet federal requirements for rigor and power, and be culturally responsive across many settings.

Lesson Learned: Using mixed-methods approaches allows us to both 1) create a more culturally responsive evaluation; and 2) provide useful evaluation information despite small “sample” sizes. Quantitative analyses often have less statistical power in our small samples than in larger studies, but we don’t simply want to accept lower levels of statistical significance, or report ‘no effect’ when low statistical power is unavoidable.

Rather, we start with a logic model to ensure we’ve fully explored pathways through which the intervention being evaluated might work, and those through which it might not work as well.  This allows us to structure our qualitative data collection to explore and examine the evidence for both sets of pathways.  Then we can triangulate with quantitative results to provide our clients with a better sense of how their interventions are working.
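To make the small-sample constraint concrete, here is a minimal sketch of a power calculation, assuming Python with the statsmodels package; the effect and group sizes are illustrative, not drawn from any of our actual studies:

```python
# Minimal sketch: how statistical power falls with sample size.
# Assumes Python with statsmodels installed; numbers are illustrative.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Power of a two-sided, two-sample t-test (alpha = 0.05) to detect a
# medium effect (Cohen's d = 0.5) at several per-group sample sizes.
for n_per_group in (10, 20, 50, 200):
    power = analysis.power(effect_size=0.5, nobs1=n_per_group, alpha=0.05)
    print(f"n = {n_per_group:>3} per group -> power = {power:.2f}")

# With 10 participants per group, power is only about 0.18: a genuine
# medium-sized effect would usually go undetected. This is why we
# triangulate with qualitative evidence rather than simply relaxing
# significance thresholds.
```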

At the same time, the qualitative side of our evaluation lets us build in measures that are responsive to local cultures, include and respect local expertise, and (when we’re lucky) build bridges between western academic analyses and indigenous knowledge. Most important, it allows us to employ different and more appropriate ways of gathering and sharing information across indigenous and other diverse communities.

Rad Resource: For those of you at universities or other large institutions that can purchase access to it, we recommend SAGE Research Methods. This online resource provides access to full-text versions of most SAGE research publications, including handbooks of research, encyclopedias, dictionaries, journals, and ALL the Little Green Books and Little Blue Books.

Rad Resource: Another SAGE-sponsored resource is Methodspace (http://www.methodspace.com/), an online network for researchers. Sign-up is free, and Methodspace posts selected journal articles, book chapters, and other resources, and hosts online discussions and blogs about different research methods.

Rad Resource: For developing logic models, we recommend the W.K. Kellogg Foundation Logic Model Development Guide.


The American Evaluation Association is celebrating Alaska Evaluation Network (AKEN) Affiliate Week. The contributions all this week to aea365 come from AKEN members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings from the Last Frontier. I’m Alda Norris, webmaster for the Alaska Evaluation Network (AKEN) and evaluation specialist for the University of Alaska Fairbanks Cooperative Extension Service (CES).

The faculty and staff I work with at CES are experts in a variety of fields, from horticulture, entomology and forestry to economics, nutrition and child development. That adds up to quite an interdisciplinary organization! Our diversity makes for fantastic collaborations, as well as complicated syntheses. Lucky for me, my PhD is in interpersonal communication, which applies across the board.

Lessons Learned: Ask people to tell you the inspiration behind their projects. Every group has a story to tell. What common goals bring these people together? Inquiring about the “why” and not just the “what” of a program really benefits capacity-building efforts. I got to know CES better while writing a Wikipedia entry. Hearing and reading about the contributions Extension has made in Alaska since the 1930s deepened my understanding of what led up to each of our programs’ current priorities and logic models.

  • Help yourself with history. Too often we are mired in a static view of where an organization is now, rather than having an appreciation for how it has changed, and continues to change, over time. Even in a “young” state like Alaska, there is rich historical data we can learn from.
  • Boost your evaluation planning by gathering information on your/the client organization’s “story” from a variety of sources. Talk to emeritus professors, compare the org chart of today to past decades, and comb through newspaper archives. Becoming familiar with past waves of change is very helpful in understanding the meaning behind current missions, goals and structures (and people’s attachments to them).

Hot Tip: Communicate about communication! Add a question about communication preferences to your next needs assessment. Don’t assume you know what level of technology and form(s) of interaction your colleagues and clients are comfortable with. Before you do a survey, figure out what modes of communication the target population values. For example, if oral history is a large part of a sample group’s culture, how well will a paper-and-pencil form be received?

Rad Resources:

  1. The National Communication Association (NCA) can help you step up your message design game. Take advantage of free advice from experts on verbal and nonverbal communication by reading NCA’s newsletter, Communication Currents.
  2. AnyMeeting is a free tool that you can use to reach a wider audience. With it, you can host online meetings and make instructional videos, both of which are really handy when working in a geographically diverse setting. AnyMeeting also offers a screen-share clarity in its recordings that Google Hangouts lacks.

The American Evaluation Association is celebrating Alaska Evaluation Network (AKEN) Affiliate Week. The contributions all this week to aea365 come from AKEN members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello! My name is Amelia Ruerup. I am Tlingit, originally from Hoonah, Alaska, although I currently reside in Fairbanks. I have been working part-time in evaluation for over a year at Evaluation Research Associates and have spent approximately five years developing my understanding of Indigenous Evaluation through the mentorship and guidance of Sandy Kerr, a Maori from New Zealand. I consider myself a developing evaluator and continue to deepen my understanding of what Indigenous Evaluation means in an Alaska Native context.

I have come to appreciate that Alaska Natives are historic and contemporary social innovators who have always evaluated to determine the best ways of not only living but thriving in some of the most dynamic and, at times, harshest conditions in the world. We have honed our skills and skillfully crafted strict protocols while cultivating rich, guiding values. The quality of our programs, projects, businesses, and organizations is shaped by our traditions, wisdom, knowledge, and values. It is with this lens that Indigenous Evaluation makes sense for an Alaska Native context: a way to establish the value, worth, and merit of our work in which Alaska Native values and knowledge both frame and guide the evaluation process.

Amidst the great diversity within Alaska Native cultures, we share certain collective traditions and values. As Alaska Native peoples, we share a historical richness in the use of oral narratives. Integral information, necessary for thriving societies and for passing on cultural intelligence, has long been transmitted to the next generation through storytelling. It is also one commonality that connects us to the heart of Indigenous Evaluation. In the Indigenous Evaluation Framework book, the authors explain that, “Telling the program’s story is the primary function of Indigenous evaluation…Evaluation, as story telling, becomes a way of understanding the content of our program as well as the methodology to learn from our story.” To tell a story is an honor. In modern Alaska Native gatherings, we still practice the tradition of certain people being allowed to speak or tell stories. This raises the question: Who do you want to tell your story, and do they understand the values that are the foundation and framework for your program?

Hot Tip: Context before methods. It is essential to understand the Alaska Native values and traditions that are at the core of Alaska Native-serving programs, institutions, and organizations. Indigenous Evaluation is an excellent approach to telling our stories.

Rad Resource: The Alaskool website hosts a wealth of information on Alaska Native cultures and values, including a map of “Indigenous Peoples and Languages of Alaska.”

The American Evaluation Association is celebrating Alaska Evaluation Network Week. The contributions all this week to aea365 come from AKEN members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m Corrie Whitmore, president of the Alaska Evaluation Network (AKEN) and an internal evaluator working for Southcentral Foundation (SCF).  SCF is an Alaska Native owned and operated health care organization serving approximately 60,000 Alaska Native and American Indian people living in Anchorage, the Matanuska-Susitna Valley, and 60 rural villages in the Anchorage Service Unit. SCF has had program evaluation in-house since 2009. We are a small department with two evaluators, so it is important for us to build our skills and keep up to date with changes in evaluation practice by staying engaged with the American Evaluation Association (AEA) and local evaluation practitioners.

The Alaska Evaluation Network (AKEN), a great resource for all Alaskans interested in evaluation, was founded in 2012 with an emphasis on improving the quality of evaluation research, theory, and practice in Alaska and creating forums for dialogue, relationship-building, learning, and collaboration.

Alaska offers a unique environment for evaluation. According to the 2010 census, we have 730,000 people spread over an area larger than Texas, New Mexico, and Arizona combined. Population density is low, some communities are accessible only by airplane or boat, and many evaluators work in tribal contexts. Building a community of practice encourages AKEN’s members to support evaluation practices that are responsive to the uniqueness of Alaska’s geographic, social, cultural, and administrative context; to encourage effective evaluation; to improve evaluation capacity within the state; and to advocate for evaluation leadership.

AKEN’s goals are to: increase the understanding of evaluation’s purpose and use in Alaska; build evaluator and organization capacity around evaluation approaches, methods, and cultural competency; promote evaluation as a profession; and support the contribution of evaluation to the generation of theory and knowledge about effective human action in Alaska and the circumpolar north.

To date, AKEN has more than 50 members spread from Fairbanks, Alaska, to Southern California (spanning 3000 miles). This geographic dispersal is a strength – we are committed to including evaluators across the state and those living elsewhere who work in Alaska – and a challenge. To address that reality, all of our meetings have a teleconference or web conferencing option, meeting minutes are posted to our website, and much business is done by email.

Lessons Learned:  Connecting with other evaluators in your region can enrich your work and support capacity building. While it would be great to sit across the table from each other, it can be just as valuable to connect using technology!

Get Involved: Join AKEN or your own local affiliate. These groups are listed online at the AEA Affiliate List. If your area doesn’t have an affiliate yet – start one! We’d be glad to share our experience with you.

The American Evaluation Association is celebrating Alaska Evaluation Network (AKEN) Affiliate Week. The contributions all this week to aea365 come from AKEN members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Holly Lewandowski. I am the owner of Evaluation for Change, Inc., a consulting firm that specializes in program evaluation, grant writing, and research for nonprofits, state agencies, and universities. I worked as an internal evaluator for nonprofits for ten years prior to starting my business four years ago.

There have been some major changes in the nonprofit world as a result of the economic downturn, especially within the last four years. I’ve witnessed nonprofits that were mainstays in the community shut their doors because the major funding source they relied on for years dried up. Funding has become scarcer and much more competitive. Funders are demanding that grantees demonstrate strong outcomes in order to qualify for funding. As a result, many of my clients are placing much greater emphasis on evaluating outcomes and impact, and less on evaluating program implementation, in order to compete. The problem is you can’t have one without the other. Strong programs produce strong outcomes.

Here are some tips and resources I use to encourage my clients to think evaluatively to strengthen their programs and thus produce quality outcomes.

Hot Tips:

  • Take time to think. As an outside evaluator, I am very aware of the stress program staff and leadership are under to keep their nonprofits running. I am also aware of the pressure on nonprofits to produce in order to keep their boards and funders happy. What gets lost, though, is time to think creatively and reflect on what’s going well and what needs to be improved. Therefore, I build time into my work plan to facilitate brainstorming and reflection sessions around program implementation. What we do in those sessions is described in the following tips.
  • Learn by doing. During these sessions, program staff learn how to develop evaluation questions and logic models.
  • Cultivate a culture of continuous improvement through data sharing. Also at these sessions, process evaluation data is shared and discussed. The discussions are centered on using data to reinforce what staff already knows about programs, celebrate successes, and identify areas for improvement.

Rad Resources:

  • The AEA Public eLibrary has a wealth of presentations and Coffee Break Demonstrations on evaluative thinking and building capacity in nonprofits.
  • If you are new to facilitating adults in learning about evaluation, check out some websites on Adult Learning Theory. About.com is a good place to start.

The American Evaluation Association is celebrating Chicagoland Evaluation Association (CEA) Affiliate Week with our colleagues in the CEA AEA Affiliate. The contributions all this week to aea365 come from our CEA members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.



My name is Mimi Doll, and I am the owner of Candeo Consulting, Inc., an independent consulting firm that builds organizations’ capacity to create meaningful change in the communities they serve. Today I want to talk about scope creep. Sometimes we can prevent it with good planning; other times, no matter how good our preparation is, clients either don’t have a clear sense of what they want or simply change their minds.

Hot Tip:

  • Always Develop a Scope of Services and Contract. Developing a detailed scope of services (including project tasks, work hours, pricing, timeline, and roles and responsibilities) makes clear to the client what services and deliverables you plan to provide, and those you don’t. Your scope serves as a communication tool about how you will proceed with the project and provides your client an opportunity to react and clarify their expectations about the work. Similarly, your contract lays out a legally enforceable agreement about how you and your client will conduct business together, including key issues such as services offered, payment terms, data ownership, contract termination, and renewability. Should you hit that “worst case” scenario where you and your client reach an impasse, your contract makes clear the parameters to which you’ve agreed.

Rad Resource: For more information about contracts and small business-related legal issues, see Nolo’s Online Legal Forms.

Hot Tip:

  • Hone Those Communication Skills. Sometimes there are client-consultant disagreements about how a project should proceed, even after the contract has been signed. These moments call for strong communication skills: listen actively to your client, state your positions clearly, manage strong emotions (yours and your client’s), and maintain professionalism. Remember, conflicts often arise from differing perceptions of a situation rather than from objective facts; it’s important to be able to take the client’s perspective. Make coming to a mutual agreement your goal.

Rad Resource: See HelpGuide.org’s guide to conflict resolution skills.

Hot Tip:

  • Be Clear on Your Own Standards. When the client’s expectations about the project change between the start and finish of the work, it’s important to be clear about your own standards by writing them down. Consider the following:
  • Logistics & Scope Changes: How does this impact your project’s time frame, budget, and staffing? Where can you be flexible and where can you not? Do alterations erase company profits or place too great a burden on your time/staffing capacity?
  • Work Quality/Integrity & Scope Changes: Do requested alterations reduce the quality or rigor of data collection, create conflicts of interest, or lessen the impact of your work? In some cases these decisions are clearly outlined by professional standards, while at other times we must develop our own.

Rad Resource: See AEA Guiding Principles for Evaluators.

The American Evaluation Association is celebrating Chicagoland Evaluation Association (CEA) Affiliate Week with our colleagues in the CEA AEA Affiliate. The contributions all this week to aea365 come from our CEA members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

