AEA365 | A Tip-a-Day by and for Evaluators

TAG | strategic planning

I’m Tom Chapel. My “day job” is Chief Evaluation Officer at the CDC, where I help our programs and partners with evaluation and strategic planning. I took on both roles because large organizations do strategic planning and evaluation in different silos, even though both silos start with the same questions: “Who are we?” “What are we trying to accomplish?” and “What does success look like?”

In response, we’ve crafted an approach to strategic planning that employs logic models, but in a different way than for evaluation. The key steps: First, compose a simple logic model of activities and outcomes (or what some might call a “theory of change”). I want stakeholders to understand the “what” of their program (activities) and the “so what” (the sequence of outcomes/impacts). Usually, we add arrows to reflect the underlying logic/theory.

  1. Choose/affirm an “accountable outcome.” It’s great to include “reduced morbidity and mortality” in the model as a reminder of what we’re about. But be sure to explain that these are areas for “contribution,” not outcomes attributable solely to the program’s efforts.
  2. Have the “output talk”. The model shows which activities drive which outcomes. Outputs are the chance to define how the activity MUST be implemented for those outcomes to occur. This discussion sets up creation of process measures for the evaluator later on but at this point provides clarity for planners and implementers on the levels of intensity/quality/quantity needed.
  3. Help them identify “killer assumptions”. There are dozens of inputs and moderating factors (context) over which a program has less or no control. Look for ones so serious that, if that input or moderator is not dealt with, the program really can’t achieve its intended outcomes. Depressing as this exercise can be, it spurs creative thinking: how might we work around or refine our activities to accommodate it?
  4. Tie it all together with a (short) list of key strategic issues. Hit the high points (mission, vision, SWOT) and move on to goals and objectives. This technique avoids the painful wordsmithing that often comes with traditional strategic planning.

Lessons Learned:

  • Use existing resources. The organization may have a mission and vision, an existing strategic plan, a business plan, or a set of performance measures. Extract the starter model from these resources so they see the logic model as a visual depiction of how they already think about their program and not something completely new.
  • Do the process in digestible bites and WITH the program. You want people to follow the storyline and that happens more often if they are part of the model construction.
  • If in return for minimal word-smithing we inflict endless arrow-smithing, fatigue will soon set in. Declare victory when the group is 85% in agreement with the picture.

Rad Resource: Knowlton and Phillips: The Logic Model Guidebook (2nd edition)

The American Evaluation Association is celebrating Logic Model Week. The contributions all this week to aea365 come from evaluators who have used logic models in their practice. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


June 12, 2014

Fred Nickols on Simplified Strategic Planning

Hello, I am Fred Nickols, a consultant, writer, former executive, and currently Managing Partner of Distance Consulting LLC.  I fancy myself a toolmaker to knowledge workers and part of that involves eliminating unnecessary complexity.  Nowhere is that more needed than in strategic planning.

The topic of strategic planning is an absolute quagmire of competing and conflicting models, concepts, and practices.  Some ruthless simplification is needed.  To start, consider this:  planning is nothing more (or less) than the formulation of a set of intended outcomes accompanied by a set of activities intended to lead to those outcomes.

A simplified view of strategic planning (or any other kind of planning for that matter) suggests a three-stage process:

1)   Taking stock of the situation

2)   Picking your targets and setting your objectives

3)   Formulating your plans

Lessons Learned: Taking Stock of the Situation. This can range from informal discussions focusing on the conditions confronting or existing within the company, through more formal, structured exercises, all the way up to carrying out sophisticated analyses (e.g., the classic SWOT analysis).  Do what fits.

Picking Your Targets and Setting Your Objectives. “Target” refers to some variable that is the focus of your attention (e.g., sales, profits, share of market, or productivity rate).  “Objective” refers to some specified value for a target (e.g., an increase in gross sales of 20% per year for the next four years, or capturing 40% of a particular market).  Being clear about targets and objectives is quite helpful.
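The target/objective distinction above can be made concrete as a small data structure. This is a hypothetical sketch (the names and example values are illustrative, not from the post):

```python
from dataclasses import dataclass

@dataclass
class Target:
    """A variable that is the focus of your attention, e.g. sales or market share."""
    name: str

@dataclass
class Objective:
    """A specified value for a Target, usually with a time frame attached."""
    target: Target
    description: str

# A target is the variable; the objective pins a value and time frame to it.
sales = Target("gross sales")
growth = Objective(sales, "increase by 20% per year for the next four years")

print(f"Target: {sales.name}")
print(f"Objective: {growth.description}")
```

Keeping the two as separate objects makes the later planning step natural: plans are formulated per target, while progress is judged against each objective.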

Formulating Your Plans. With targets selected and objectives set, next comes figuring out how you’re going to realize them.  What avenues are open to you for influencing the targets you’ve selected?  Can you influence them enough to reach the objectives you’ve set?  What kinds of actions and resources are required?  Who does what when?  Does your company have the resources and the capabilities to achieve the objectives it set?  Can it carry out the actions it is contemplating?  If so, keep going.  If not, perhaps you need to rethink matters.  Such rethinking might lead you to revise your objectives, pick a different target or set of targets, or perhaps go all the way back to the beginning and take a fresh look at the situation confronting you and your company.  The iterative nature of planning is indicated by the little circle of arrows in the center of Figure 1.

Some people are overwhelmed by the apparent complexity of the strategic planning process.  Two pieces of advice: (1) Relax, because it isn’t really all that complex once you strip it down to its essentials, and (2) No company has ever been known to fail for lack of a strategic plan.

Rad Resource:  An expanded version of this blog post.


Hello! I’m Michelle Baron, an Independent Evaluation Strategist. In my work in higher education, I’ve encountered a mixture of evaluation champions and critics. Today I’d like to address the importance of strategic planning in building a culture of evaluation.

Strategic planning is considered by many to be an organizational road map: it outlines the organizational vision and mission, establishes clear and attainable goals and objectives, and develops processes for how to achieve them. Strategic planning and evaluation go hand in hand in moving the organization and its programs forward to benefit its stakeholders. Strategic planning is crucial to the evaluation process: without a road map of criteria, standards, and goals, it’s almost impossible to achieve desired success.

Evaluators have a unique role in helping organizations with both ends of the spectrum: creating a foundation through strategic planning, and then conducting evaluations to examine and monitor progress.

Hot Tip #1: Start at the top. Buy-in from top management for strategic planning is of the utmost importance for its success.

Hot Tip #2: Conduct a SWOT analysis (Strengths, Weaknesses, Opportunities, and Threats) of the entity or its programs/services. Doing so not only enlightens people to a variety of ideas and questions to consider, but can also indicate the level of support for those topics.

Cool Trick: Brainstorming sessions are often an excellent starting point for the organization itself or smaller group within that organization. The evaluator or designated member of the organization can facilitate the discussion by developing questions beforehand that may serve as prompts for the discussion, such as those dealing with objectives, goals, and resources.

Rad Resource #1: Strategic Planning for Public & Nonprofit Organizations by John Bryson, and related books by the same author, provide the groundwork and tools necessary for organizations to develop and sustain their strategic planning process.

Rad Resource #2: The Fifth Discipline: The Art and Practice of the Learning Organization by Peter Senge helps leaders establish the foundation and philosophy behind strategic planning, and helps them develop their long-term thinking for organizational growth.

With these tools and resources, evaluators may be better prepared to assist organizations in strategic planning, and to build greater support for, and effectiveness of, the evaluations they conduct for those organizations.

The American Evaluation Association is celebrating Business, Leadership and Performance (BLP) TIG Week with our colleagues in the BLP AEA Topical Interest Group. The contributions all this week to aea365 come from our BLP TIG members.

 


Hello, I’m Tamara Bertrand Jones. I’m an Assistant Professor of Higher Education at Florida State University and a former Director of Research and Evaluation in Student Affairs.

Assessment in higher education has largely been used for external accountability. Divisions and departments can use these external mandates for internal improvement and make assessment part of daily practice. Cultivating a culture of assessment on your campus or in your division/department requires a few steps.

Hot Tip #1: Develop Divisional Commitment

A lone department’s assessment efforts or even those of a few innovative areas will not permeate an entire Division without support from the Vice President and their associates.  Gaining VP support for assessment efforts is key to integrating these efforts into your work and throughout the Division.  Some areas even have their own assessment staff dedicated to this work.

Hot Tip #2: Cultivate Departmental Commitment

Once the commitment from the appropriate Division-level or other administrator is received, departmental support has to be cultivated.  I hate to encourage a top-down initiative at any time, but if there is any aspect that requires a top-down approach, it is assessment.  Often upper-level administrators can incentivize assessment or other activities in order to build support for this work.   Of course, if other professionals at all levels in the department are proponents, then these activities will only be easier.

Hot Tip #3: Solicit Student Involvement

Involving students in your assessment efforts not only helps build their capacity to conduct assessment and become better consumers of it, but also creates buy-in for your efforts.  Student responses to surveys and participation in other assessment efforts increase as a result.

Hot Tip #4: Relate to Institutional Strategic Plan

Divisions or departments usually develop strategic plans used to guide their work.  Linking the Division’s plan or Departmental plan to the University’s broader strategic plan ensures a direct connection.  This intentional action demonstrates how the Division/Department contributes to larger university goals and can reap many benefits for the Division/Department, including increased financial support or additional human resources.

Hot Tip #5: Ensure Accountability

Lastly, an assessment culture encourages accountability.  Programs are developed using a solid foundation of assessment, not using gut feelings, or what you think students need.  Our work becomes intentional and we also build accountability into our daily work.  Our actions become even more meaningful as every action can be tied back to a larger purpose.

Rad Resource: The Association for the Assessment of Learning in Higher Education’s ASSESS listserv is a great source of current discussion and practice related to assessment.  To subscribe, visit  http://www.coe.uky.edu/lists/helists.php

The American Evaluation Association is celebrating Assessment in Higher Education (AHE) TIG Week. The contributions all this week to aea365 come from AHE TIG members.


Hello! I’m Silvana Bialosiewicz, a doctoral student at Claremont Graduate University and Senior Research Associate at the Claremont Evaluation Center. Lately I’ve been giving a lot of thought to the relationship between evaluation and organizational strategic development, and how evaluators can support their clients by participating in strategic development efforts.

I’d like to share some of the valuable lessons I learned engaging in this process as part of a formative evaluation conducted on behalf of a local chapter of Habitat for Humanity. The factors that motivated me to build strategic planning into the evaluation were my action-oriented stakeholders and the Organizational Development expertise of my co-evaluator.

Word of Caution: As an evaluator, engaging in strategic development activities isn’t always appropriate. Let the context of the program, the goals of the evaluation, and your relationship with the organization be your guide when deciding if the situation lends itself to these activities.

Hot Tip: Start the strategic planning conversation early. In the evaluation planning phase, open up a dialogue with your client regarding their strategic planning processes and the ways in which evaluative data could be used to create action plans and direct energy and resources toward organizational goals.

Hot Tip: Build strategic planning into the evaluation contract. Knowing that these activities will take place from the start can help foster buy-in into the evaluation process and help the organization hold itself accountable to taking action based on evaluation findings.

Hot Tip: Develop personalized reflective self-assessment tools to support the process. Using the program’s theory of change and the evaluation findings, pose a series of questions to your stakeholders to stimulate reflection and discussion regarding where they are versus where they’d like to be.

Hot Tip: Be the voice of the data. If you find yourself sitting at the strategic planning table, speak on behalf of the data. As the evaluator you can support your stakeholders in making data-driven decisions by helping them continuously examine whether strategic plans are aligned with evaluation findings.

Hot Tip: Manage your expectations for organizational change. Understand that organizational development and change is a slow process, and program stakeholders have competing priorities and busy schedules. Allow the key decision-makers to captain the ship and dictate the scope and timeline of the development activities.

Rad Resource: The Readiness for Organization Learning and Evaluation (ROLE) Instrument developed by Preskill and Torres is a great tool to help you and the organization determine if strategic planning is appropriate.

Rad Resource: Cassandra O’Neill recently hosted an AEA webinar on this topic entitled The Intersection of Evaluation and Strategy. This quick presentation provides an interesting perspective on the topic, as well as additional resources for interested parties.



Hello, I am Leah Goldstein Moses, founder and CEO of the Improve Group and 2012 President of the Minnesota Evaluation Association. When I founded the Improve Group in 2000, I was learning to be a consultant at the same time I was refining my evaluation skills. My practice has grown from myself and a loose network of other independent consultants to a consulting firm of 18 staff.  Running a company is different from independent consulting. It took nearly 4 years before I had my first employee. From that first day, I had new responsibilities – making payroll, setting HR policies, and developing a new network of advisers and resources. I’ve learned a lot of lessons along the way on how to be successful as a ‘not so small’ independent evaluation firm.

Lessons Learned:

  • Take strategic risks – but prepare for the consequences. As our company grew, the risks also grew. I now take a strategic approach to big decisions, such as a new hire or pursuing a large project. I ask myself: what if this doesn’t work out? What are the potential risks? What choices will we need to make in reaction to those risks?
  • Find a focus and identity. We love evaluation. We also do strategic planning and research, but always with an underlying evaluation focus. We are known as evaluators but we decided to work across sectors.  So we do evaluation in arts, human services, formal education and informal learning, health, transportation, development, and corrections. We are experts in evaluation and have found ways to supplement our expertise in these sectors. You might find a different identity and focus – either in one sector, a set of methods, or something else. If you can describe who you are to your clients, collaborators, and community, you will be fine.

Rad Resources:

  • Interested in learning more about growing and sustaining an evaluation business? Attend Improving Evaluation Practice Management During Chaotic Economic Times: Three CEOs Reflect on Strategic and Innovative Diversification, Budgeting, and Employee Support and Development at the AEA conference on Thursday, Oct 25, 4:30-6:00 PM, where I’ll be presenting with Gary Ciurczak, Richard Hezel and Samantha Hagel.
  • I use the AEA365 blog, the mande listserv and the evaltalk listserv to stay fresh on current evaluation topics.
  • I enjoyed The Momentum Effect by J.C. Larreche, a book that examines the factors that help companies grow year after year.

The American Evaluation Association is celebrating the Independent Consulting TIG (IC) Week. The contributions all week come from IC members.

Greetings aea365 community! I’m Ann Emery and I’ve been both an external evaluator and an internal evaluator. Today I’d like to share a few of the reasons why I absolutely love internal evaluation.

Lessons Learned: Internal evaluation is a great career option for fans of utilization-focused evaluation. It gives me opportunities to:

  • Meet regularly with Chief Operating Officers and Executive Directors, so evaluation results get put into action after weekly staff meetings instead of after annual reports.
  • Participate on strategic planning committees, where I can make sure that evaluation results get used for long-term planning.

Lessons Learned: Internal evaluators often have an intimate understanding of organizational history, which allows us to:

  • Build an organizational culture of learning where staff is committed to making data-driven decisions.
  • Create a casual, non-threatening atmosphere by simply walking down the hallway to chat face-to-face with our “clients.” I hold my best client meetings in the hallways and in the mailroom.
  • Use our organizational knowledge to plan feasible evaluations that take into account inevitable staff turnover.
  • Tailor dissemination formats to user preferences, like dashboards for one manager and oral presentations for another.
  • Participate in annual retreats and weekly meetings. Data’s always on the agenda.

Lessons Learned: Internal evaluators can build evaluation capacity within their organizations in various ways:

  • I’ve co-taught Excel certification courses to non-evaluators. Spreadsheet skills can help non-evaluators feel more comfortable with evaluation because it takes some of the mystery out of data analysis.
  • I’ve also led brown bags about everything from logic models to research design. As a result, I’ve been more of a data “coach,” guiding staff through evaluation rather than making decisions on their behalf.

Hot Tips: Internal evaluators can use their skills to help their organizations in other ways, including:

  • Volunteering at program events. When I served food to child and teen participants at Thanksgiving, my time spent chatting with them helped me design more responsive data collection instruments.
  • Contributing to organization-wide research projects, such as looking for patterns in data across the participants that programs serve each year.
  • Partnering with graduate interns and external evaluators to conduct more in-depth research on key aspects of the organization.

Cool Trick: Eun Kyeng Baek and SeriaShia Chatters wrote about the Risks in Internal Evaluation. When internal evaluators get wrapped inside internal politics, we can partner with external evaluators like consulting firms, independent consultants, and even graduate interns. Outsider perspectives are valuable and keep things transparent.

Rad Resources:

AEA is celebrating Internal Evaluators TIG Week. The contributions all week come from IE members.


Hello, my name is Kori Kanayama of Kaoru Kanayama Consulting.  I’m going to share an approach to assessing a nonprofit organization’s profitability and mission impact as a springboard to organizational sustainability. I presented this assessment methodology at the Spring AZENet conference.

This methodology can be done by nonprofit leaders, and by internal or external evaluators. It requires two kinds of data.

The first type of data is financial, and the task is to determine if each of the organization’s programs is making or losing money. The second type is mission impact data, for which a decision must be made on which indicators will be used to measure the impact of each program.  Examples of indicators that could be used to assess impact include: Alignment with Core Mission, Program Excellence, Community Building Value, and Leverage.

After determining whether each program (or line of business) is making or losing money and whether the impact of each program is high or low, the data can be displayed on a simple quadrant matrix.  Each program will end up in one of the four quadrants (or could straddle two), which show implications for the programs:

  • HEART (high impact, low profitability): Avoid making the choice to either shut it down or raise more money: keep it, but control its costs. Too many of these make an organization unsustainable.
  • STAR (high impact, high profitability): Seems to run itself, but resist the temptation to loosen oversight. Invest in these programs, because this is where strategic growth opportunities are.
  • STOP SIGN (low impact, low profitability): Instead of holding onto this program, close it down or give it away. If making one last effort, do it with a budget and a deadline. Do an analysis on the impact of closing a business line, and pay attention to its effect on shared administrative costs.
  • MONEY TREE (low impact, high profitability): Programs in this category need to be nurtured to increase their impact. Until impact is increased, resist expanding, adding to, or growing the program.
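The quadrant logic above can be sketched as a tiny classifier. This is a minimal illustration; the program names and the boolean inputs are hypothetical, and in practice each program's profitability and impact would come from the financial and mission-impact data described earlier:

```python
def classify_program(profitable: bool, high_impact: bool) -> str:
    """Map a program's profitability and mission impact to its quadrant."""
    if high_impact and not profitable:
        return "Heart"       # keep it, but control its costs
    if high_impact and profitable:
        return "Star"        # invest: strategic growth opportunity
    if not high_impact and not profitable:
        return "Stop Sign"   # close it down or give it away
    return "Money Tree"      # nurture impact before expanding

# Hypothetical portfolio: (profitable?, high impact?) per line of business
programs = {
    "Job training": (False, True),
    "Thrift store": (True, False),
    "Annual gala": (True, True),
    "Newsletter": (False, False),
}

for name, (profitable, high_impact) in programs.items():
    print(f"{name}: {classify_program(profitable, high_impact)}")
```

In a real analysis the two inputs would be continuous (net revenue, impact score) rather than booleans, which is also how a program can "straddle" two quadrants near a threshold.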

Hot Tip: Allocate income and expenses as accurately as possible. Trying to prop up unprofitable programs defeats the purpose of the analysis.  The point is to clarify the subsidizing relationships between business lines.

Hot Tip: This analysis can be incorporated into strategic planning, implemented at a board or senior staff retreat, or used as a stand-alone operational planning tool. See Jan Masaoka’s Blue Avocado blog post on alternatives to strategic planning which highlights the need to look at program impact and financial stability together.

Rad Resources:

Source Document: Nonprofit Sustainability by Jeanne Bell, Jan Masaoka and Steve Zimmerman.

My presentation at the AZENet conference.

The American Evaluation Association is celebrating Arizona Evaluation Network (AZENet) Affiliate Week with our colleagues in the AZENet AEA Affiliate. The contributions all this week to aea365 come from our AZE members.


For the past five decades Roger Kaufman (RK) has been a leader in the field of needs assessment – contributing 40 books and 200+ articles, and consulting with organizations around the world. He is a professor emeritus from Florida State University and a Distinguished Research Professor, Sonora (Mexico) Institute of Technology. For aea365, Dr. Kaufman spoke with Ryan Watkins, responding to questions that can help guide your needs assessment efforts.

Lessons learned:

Convincing managers that it is worth paying for a needs assessment rather than jumping into the design, implementation, and evaluation of an activity

(RK) This is all about incentives. Ask first: “What rewards will you be given, and what punishments can you expect, for failure?” Simply showing people the costs of not getting the alignment between societal value-added, organizational contributions, and individual and small-group results can be sobering. A look at the international companies that have gone bankrupt in the last 20 years reveals that not having this complete alignment can be devastating to all. Of course, someone who suggests a needs assessment first and has that suggestion rejected will still be blamed if desired results and payoffs are not achieved.

Relationship between needs assessments and evaluations

(RK) Needs assessment is proactive and evaluation is reactive/after-the-fact. Both look at gaps in results and consequences, but one attempts to fill the gaps before starting interventions, and the other looks at whether the gaps have been successfully addressed. Both are vital.

Three additional skills critical for any evaluator to develop before conducting a needs assessment
(RK)

  1. Understand and perform on the basis of the differences and relationships between needs assessment and evaluation.
  2. Be data and results driven.
  3. Include, align, and integrate all three levels of planning: Mega/societal, Macro/organizational, and Micro/individual and small group.

Role of society in needs assessments and evaluations

(RK) We all live in this shared and shrinking world. We realize that a change in any part of our shared world has potential impacts on all other parts of the world. Organizations are simply means to societal ends, and, as noted by professor emeritus Dale Brethower, if you are not adding value to society you are subtracting value.

Focusing on Mega is not just professionally required, it is also ethical to do so.

Rad resources: For more in-depth guidance, try:

The American Evaluation Association is celebrating Needs Assessment (NA) Week with our colleagues in the NA AEA Topical Interest Group. The contributions all this week to aea365 come from our NA TIG colleagues.

My name is Frank Meintjies and I work as a consultant in South Africa. While most of my evaluation work has related to poverty reduction initiatives, I have also undertaken evaluation work on HIV and AIDS programmes.

Evaluation and strategic planning are closely linked. Some organisations want to dive into strategic planning without doing a good evaluation first. That would be a mistake: a classic case of ‘more haste, less speed’.

If you evaluate first, you lay a solid foundation, on so many levels. The review will build a common base of understanding among those involved in crafting the new strategy. Participants will develop (or renew) a shared understanding of what they do; how and why they do what they do; what progress is being made; and, whether the work and objectives are still relevant to a fast-changing context.

Hot Tip:
Make time to take a cool look at your organisation, its achievements, capabilities and impact, before plunging into strategic planning. Get information from beneficiaries and data from the field; this will serve as fuel for the creative thinking processes.

Tip: If you are under time pressure to hold a strategic planning session, you may have to opt for a rapid evaluation. Such an evaluation is undertaken within a short time frame, but levels of validity and reliability are high enough to enable program staff to make confident, informed decisions. To learn more about the use of rapid evaluation, see the International Training & Education Center for Health’s resource at http://tinyurl.com/7huxffe.

Lesson Learned: If you aren’t clear what the current state of play is, then you are trying to look into the future from a muddy and clouded vantage point. Having a clear perspective of who and what your organisation is will sharpen your gaze as you examine the future, with all its constraints and possibilities.

Lesson Learned: If you have undertaken a recent evaluation, or even if you undertake ongoing monitoring and evaluation, a major evaluation exercise before strategic planning will most likely be unnecessary. However, it will be good to revisit the key points that emerge out of such evaluations as a precursor to, or during, the strategic planning activity.

