AEA365 | A Tip-a-Day by and for Evaluators


My name is Tamara Hamai, and I am President of Hamai Consulting, an evaluation firm focused on improving child well-being from cradle through career. I've been consulting since 2007 and have learned a lot of lessons the hard way. Ever since adopting the Book Yourself Solid system (created by Michael Port), I've benefited from 90-day planning to stabilize my cash flow and workload.

Hot Tips:

Every 90 days, set aside a few hours to map out your revenue goal and how you will achieve it.

Step 1: Budget your known and projected expenses for the next 90 days. Make sure to include your salary! Set a gross revenue goal that will cover all of your expenses, plus some profit as an emergency fund for the future. What services do you need to sell, and at what prices, to bring in this revenue? Identify the best-case (e.g., one big contract) and worst-case (e.g., lots of little gigs) scenarios.
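The Step 1 arithmetic can be sketched in a few lines. The expense categories, amounts, profit margin, and contract price below are hypothetical examples, not figures from the post:

```python
import math

# Hypothetical 90-day expense budget (amounts are illustrative only)
expenses = {
    "salary": 18_000,     # your own pay for the quarter
    "software": 600,
    "insurance": 450,
    "marketing": 1_200,
}
profit_margin = 0.15      # extra set aside as an emergency fund

total_expenses = sum(expenses.values())
revenue_goal = total_expenses * (1 + profit_margin)

print(f"90-day expenses: ${total_expenses:,.2f}")
print(f"Gross revenue goal: ${revenue_goal:,.2f}")

# How many projects at a given price would reach the goal?
project_price = 7_500     # hypothetical average contract value
projects_needed = math.ceil(revenue_goal / project_price)
print(f"Projects needed at ${project_price:,}: {projects_needed}")
```

Swapping in a best-case price (one big contract) and a worst-case price (many small gigs) gives the two scenarios the step asks for.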

Step 2: What marketing do you need to generate the leads that will produce the sales needed to achieve your revenue goal?  For example, looking back at your past networking activities, what and how much did you have to do to land a new project?  What and how much would you need to do of those same activities over the next 90 days to hit your target revenue?  Identify at least 2 marketing strategies (e.g., networking and proposals) that would each get you to your target, just in case one fails.

Step 3: Tackle at least one system improvement each 90 days that will help you get things done more efficiently and effectively. For example, you might realize you don't have the data to know what types of networking, and how much of it, you need to engage in before you win a contract. You could plan to build a networking tracking system (which could be as simple as a spreadsheet) to start collecting this information.
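The tracking system described above can start as a few spreadsheet columns. Here is a minimal sketch, with hypothetical activities and counts, that computes the kind of "what did it take to land a project" figure Step 2 calls for:

```python
import csv
import io

# Hypothetical networking log; in practice this would be a CSV file
# exported from your spreadsheet.
log = io.StringIO("""date,activity,contacts_made,led_to_contract
2024-01-10,conference,5,no
2024-02-03,coffee meeting,1,yes
2024-02-20,webinar,3,no
2024-03-12,coffee meeting,1,yes
""")

rows = list(csv.DictReader(log))
total_contacts = sum(int(r["contacts_made"]) for r in rows)
contracts = sum(1 for r in rows if r["led_to_contract"] == "yes")

print(f"Contacts per contract won: {total_contacts / contracts:.1f}")
```

Once a few quarters of data accumulate, the same log can be broken down by activity type to see which marketing strategies actually convert.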

Step 4: Create a daily action plan. What projects need to be done to achieve your 90-day goals? Map out each task that must be completed to implement your marketing and system-improvement plans, then set deadlines. Identify tasks for each day for all 90 days.

Lessons Learned:

Be realistic with your goals. Set small revenue goals at first, and increase them as you become more successful.

Add extra profit to your revenue goals early on to build up an emergency fund that will eventually eliminate the feast-or-famine roller coaster.

Rad Resource:

Book Yourself Solid, by Michael Port: http://bit.ly/2e6CrTz

The American Evaluation Association is celebrating Independent Consulting TIG Week with our colleagues in the IC AEA Topical Interest Group. The contributions all this week to aea365 come from our IC TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


My name is Elayne Vlahaki and I am the President of Catalyst Consulting Inc., an independent evaluation consulting firm based in Vancouver, British Columbia. I have held multiple sessional appointments teaching the Program Planning and Evaluation course within the Faculty of Health Sciences at Simon Fraser University. You can follow me on Twitter @Catalyst_Tweets.

A colleague recently explained to me that his organization has been underwhelmed by evaluation consultants it has hired in the past. He emphasized the gap between what they were looking for and what was ultimately produced. I left the conversation thinking about how important it is to understand your clients' needs, and then deliver on them, for projects to be successful. Here are some tips for being systematic about identifying and understanding potential clients' needs.

Hot Tips:

  • Do Your Homework. Prior to meeting with a potential client, aim to learn as much as possible about the organization and broader environment in which it operates. This will demonstrate your credibility and improve the relevance of your proposed evaluation approach. Cruising the Internet ten minutes before your meeting won’t cut it. Be systematic about your research process and be sure to explore the wide range of information sources available to you, from program documents to industry reports.
  • Listen Carefully. How can you understand a potential client’s needs without giving them the opportunity to tell you? This may sound simple but meaningfully listening without planning a response or thinking about solutions to their challenges can be difficult. Actively listening will show that you are genuinely interested in learning about their organization.
  • Ask Questions. Asking thoughtful questions is one of the most valuable tools you have to learn about your potential client. Ask open-ended questions that will help you define the scope of their needs, which will then help you define the rough boundaries for your proposed solutions.
  • Ongoing Assessment. Remember that clients’ evaluation needs will shift over time. It is critical that you review and respond to their changing needs to ensure that your evaluation findings will actually be used to inform decision-making and change.
  • Learn From Business. Evaluation consultants are most often technical and subject-matter experts but tend to be less familiar with the consulting process from a business perspective. Developing and refining our consulting skills can help us better identify our clients' needs, and can also equip us with a range of other tools to be successful. Gail Barrington has produced a wide range of training resources about independent consulting skills for evaluators. Check out her book, Consulting Start-up and Management: A Guide for Evaluators and Applied Researchers.


My name is Jennifer Dewey, and I am a Senior Director with Walter R. McDonald and Associates, Inc. (WRMA). Over the years, I have worked for organizations that receive a substantial amount of business in health and human services from Federal, state, and local-level entities. A key method to obtaining this work is partnering with subcontractors and consultants to respond to requests for proposals, or RFPs.

“Prime” responders (those who will take 51% or more of the work) look to subcontractors (an organization) and independent consultants (an individual) to enhance their bids. Subcontractors and consultants do this by providing content or technical knowledge that the prime doesn't have enough of, or doesn't have at all, such as a history of working with certain populations (e.g., military and veterans, indigenous peoples) or specialized statistical expertise (e.g., social network analysis). Subcontractors and consultants may also enhance a bid by being based in one or more locations where the project will take place, bringing their knowledge of the local government, population(s), and community structure to the work.

Many of these partnerships are generated through networking, where a prime representative knows an independent consultant, or a staff member at a potential subcontractor, who can bring the needed knowledge and skills to an RFP response.

Rad Resource: Familiarize yourself with available Federal contract vehicles, such as AHRQ (www.ahrq.gov), CDC (www.cdc.gov), GSA MOBIS (www.gsa.gov), HHS PSC (www.ngsservices.com/program_support_center.html), HRSA (www.hrsa.gov), SAMHSA (www.samhsa.gov), and others, to learn about past and future contracts. Consulting organizations often list their contract vehicles on their websites.

Hot Tip: Make yourself and/or your organization easy to find through LinkedIn profiles with direct contact information, and websites with detailed descriptions of services, projects, and staff member qualifications.

Once you establish a partnership, prove your worth by delivering high-quality, timely work as part of the RFP process. Brainstorming and generating ideas about the scope of work, while challenging in itself, is easy compared to the business of putting pen to paper (or fingers to keyboard).

Hot Tip: Leverage your unique subject matter expertise and technical knowledge by being a thinking partner with the prime, helping them understand and work through the challenges implicit in the project. As requested, follow up with well-written tasks that address the RFP’s evaluation criteria within the allotted page count.

Hot Tip: Cement your value by providing professional bios, resumes, project examples, and organizational capacity statements per the prime’s timeline and in the requested format.

Primes view subcontractor and independent consultant contributions to the RFP process as a litmus test for contract performance. Whether the bid is won or lost, high performance will increase your opportunities for future work.


Hi! My name is Laura Keene, owner of Keene Insights in Los Angeles, CA. Not surprisingly, networking is an important part of my job, but the truth is, we all have to network. Even if you have a 9-to-5 job, you may be searching for new staff, collaborators, or other resources for your company; looking for a new avenue of work within your organization; or hunting for a dream job elsewhere.

Lesson Learned: Networking ain’t what it used to be.

When I first started my business, the idea of networking was daunting. Like many of you, I imagined that in order to sell my services I needed to mold myself into a 1950s used car salesman, i.e., be schmoozey and pushy. Turns out, a lot has changed since then.

In his book To Sell Is Human, Dan Pink writes: “Selling in all its dimensions – whether pushing Buicks on a car lot or pitching ideas in a meeting – has changed more in the last ten years than it did over the previous hundred.” He argues that because we live in a world where we have a mountain of information at our fingertips, sellers no longer have an advantage over buyers.

As a result, selling, and the use of networking as a sales strategy, has become more about connecting, sharing, and building strong relationships with people over time. When I learned that networking was less about closing deals and more about meeting new people, developing friendships, and sharing myself and my work with those friends (without worrying about when or if they’re going to hire me), it became a lot easier to do.

Hot Tip: Connect instead of network

Networking is still hard work, especially for us introverts, but the pressure is off. You don’t need to get the contract or land the new job.  You just need to meet and get to know some cool new people. Here are a few tips for doing so:

  • Relax and be yourself
  • Ask questions; find out about their work, their hobbies, their family
  • Share; let them learn about your work (and your passion for it), your hobbies, your family
  • Ask for a business card and jot notes about the person on the card…because the next step is to follow up, share your connections and expertise, and build a relationship of trust.

Rad Resources

Check out Daniel Pink’s book To Sell is Human and, for those consultants out there who want a new angle on growing your business, pick up Michael Port’s Book Yourself Solid.


Hello, I am Matt Feldmann, the principal researcher and owner of Goshen Education Consulting, Inc. (http://www.gosheneducationconsulting.com) and the chair for the Independent Consulting TIG (IC TIG). My company focuses on educational evaluation and data support for clients in Southern Illinois. I had a great time at the annual conference. While it was exhausting, I also, weirdly, felt refreshed by meeting with all of my colleagues.

This week we are featuring presenters from our Meet the Pros: Intermediate Consulting Skill-Building Self-help Fair conference session. I believe this annual conference session is our best IC TIG session because it provides immediately useful information and is often the gateway for new independent consultants to become involved with our TIG. The session was particularly useful for attendees because we had eight tables set up for folks to circulate in a “speed-dating” format. Attendees received information quickly, and the session was a great opportunity to network with others.

Lesson Learned: The IC TIG is for more than just independent consultants.

Before I go any further, please note that the Independent Consulting TIG is all about good consulting business practices. Many of our topics are relevant beyond small businesses and have ready application to small evaluation shops such as university centers and institutes, internal evaluation practices, and evaluation departments within larger organizations. Our TIG-oriented posts this week are relevant to any evaluator who would like to learn better business practices.

Rad Resources:

If you are interested in previous blog posts similar to our Meet the Pros topics, check out these past aea365 posts:

Holly Lewandowski on being new to independent consulting

Matthew Von Hendy’s take on searching for RFPs

Judah Viola’s perspective on building capacity to succeed as an independent consultant

My take on focusing on a niche for your practice.


My name is Sarah Brewer and I am a performance management specialist for Deloitte Consulting LLP. I help federal leaders design and build their performance management capability to improve their ability to lead, manage and communicate program performance.

Over the past several years, the Office of Management and Budget (OMB) has provided guidance to improve the use of administrative data to provide evidence of program performance and guide data-driven decision making. Administrative data is the data government programs collect through the implementation and management of their program activities. Despite these efforts, many program managers have struggled to turn this data into meaningful insights. As a result, program managers continue to implement their programs without using the administrative data they have available. To make better use of administrative data, program managers should consider investing in three actions: (1) defining meaningful metrics; (2) setting performance targets; and (3) conducting performance reviews.

Lesson Learned: Define meaningful metrics. With so much data capturing the management and implementation of a program, it can become overwhelming for program managers to select which of the many possible metrics are the most meaningful. Program managers need to prioritize metrics that measure activities they are intentionally trying to change. For example, if a communications campaign is trying to reach the public, it should prioritize metrics that measure its outreach.

Rad Resource: Check out Deloitte’s one-pager for tips on how to identify specific priority areas and focus on measuring and monitoring those metrics.

Lesson Learned: Set performance targets. Administrative data on its own reflects what the program is doing, but not how well the program is doing it. As a result, program managers need to set targets to define performance. For example, if the communications campaign reaches one person, it achieves its broad goal of “outreach,” but when the program manager sets a goal of reaching 10,000 people in one month, the metric becomes a performance target.

Hot Tip: Insights are gleaned from administrative data when program managers set expectations on where they want to be and measure whether or not they have reached the performance target.
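The target-setting logic above amounts to comparing an administrative count against an expectation. A minimal sketch, with hypothetical numbers:

```python
# Hypothetical outreach figures for the communications campaign example
target_reach = 10_000      # people to reach in one month (the target)
actual_reach = 8_450       # count from the program's administrative data

pct_of_target = actual_reach / target_reach * 100

# An illustrative 90% threshold; a real program would set its own
status = "on track" if pct_of_target >= 90 else "needs attention"
print(f"{pct_of_target:.0f}% of target ({status})")
```

The same raw count reads very differently once a target is attached, which is exactly the point of the lesson.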

Lesson Learned: Conduct performance reviews. Finally, to turn administrative data into insights, program managers should conduct quarterly performance reviews that bring the program manager together with the leadership team to discuss the prioritized metrics and performance against targets.

Hot Tip: Only through conversation can leaders discuss what is working and what is not working and identify ways they can help improve areas of underperformance and capitalize on leading practices.

The American Evaluation Association is celebrating Deloitte Consulting LLP’s Program Evaluation Center of Excellence (PE CoE) week. The contributions all this week to aea365 come from PE CoE team members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m Meklit Berhan Hailemeskal, an evaluation specialist with Deloitte Consulting LLP. I work with both US federal agencies and global development agencies to plan, design, and implement program evaluation and performance measurement initiatives. Working fluidly across federal and global health programs, I have learned specific lessons and found helpful resources from one side that can readily be useful for the other.  Today I want to share a couple of useful resources from the global health evaluation space that I believe can be valuable for the federal evaluation space.

My work with federal health programs often involves supporting grantees, in one form or another, in implementing required performance measurement and evaluation activities. The questions I often hear from grantees are how the data provided to the funding agency will be used, and how they can be useful to the grantee. While I will not attempt to answer those questions here, I would like to share some examples of learning platforms used to facilitate evaluation data use in the global evaluation space.

Lesson Learned: Use of evaluation results requires an intentional and systematic approach to translate evaluation/performance measurement findings into realistic and meaningful programmatic recommendations, and a mechanism to work with program managers to monitor the implementation of those recommendations.

Rad Resource: The Independent Evaluation Group of the World Bank Group maintains a Management Action Record database to document and monitor post-evaluation action. The database lists the key findings and recommendations that emerge from evaluations and tracks the progress of implementing those recommendations at the program level. This database serves as a tool to promote and build accountability for the use of evaluation results.

Lesson Learned: From time to time, it is necessary to reflect in a systematic way on what the impact of evaluation/performance measurement actually is – what have we collectively learned from our evaluation/performance measurement efforts and how has that influenced how we work and what we are able to achieve? Having a systematic process and standardized tools to facilitate this reflection helps hold evaluators and program teams accountable for incorporating evaluation findings and recommendations into program planning and implementation.

Rad Resource: Better Evaluation recently published Evaluations That Make a Difference – a collection of stories from eight countries about how evaluation results (and processes) have been used to influence change within organizations and the lives of people. These stories are great illustrative examples of the meaningful difference that evaluation can bring about when there is intentional and strategic reflection on evaluation results.

While the resources presented are from a global context, I have found that their intent and use continues to inspire and influence domestic evaluations.


I am Jonathan Pearson, a senior manager in Deloitte Consulting LLP’s government performance practice. I support government health programs overseas and right here at home to develop and implement program performance systems.

I’m here to tell you that government programs should be more “selfish.”  Yes, “selfish.”  At least when it comes to the design of their performance frameworks. They put in the work to create frameworks and systems to gather and report data, but rarely do those performance frameworks give back equally. Let’s just say it’s not a symbiotic relationship.

After all, what’s the purpose of government performance systems? To check the box? To be compliant with GPRAMA? Compliance is obviously important, but it’s also a pretty low bar given the exciting things performance systems can achieve for government. So why not dream big? Why shouldn’t government expect more of its performance frameworks? Well, it should!

Hot Tip #1: Think about who determines your funding and what they expect from your program. What’s the lifeblood of government programs? Sustainable funding! Those who hold the purse strings want to know about the impact their investments are making. So why not build the information needs of funders into government performance systems from the start? Do some research into congressional inquiries about your programs. What about senior agency leadership? What do they want to know? Use your performance systems to brag about your programs to the specific stakeholders who influence your financing. And give them something to tweet, not only a 200-page evaluation report.

Rad Resource: Search congress.gov or gao.gov to find inquiries and reports about your program.

Hot Tip #2: Empower your programs. What’s just about as important as sustainable funding?  Providing high-quality services! So why not design performance metrics that give program leaders the information they need to measure and improve the performance of their programs? Seems obvious, right? Except sometimes it’s not.  Metrics should identify positive (and negative) outliers and measure achievements across the logic model.  Interview your program managers to see what their critical business questions are and convert their responses into measures.  Empower the program managers!

Hot Tip #3: Avoid confusing implementers on the ground with irrelevant data clutter. You can develop spectacular technical guidance (and you should), but what really sends a clear message about program priorities are performance measures.  Implementers at the state and local level interpret measures as the priority activities for the program. Because programs wouldn’t collect performance measures on something that wasn’t a program priority, right? Right. So let’s use measures to clearly communicate program priorities to implementers and not distract them with irrelevant data clutter.

Voila!


Hi! I am Jenica Reed, an evaluation specialist with Deloitte Consulting LLP. While working remotely with team members and clients, I’ve learned that communication, interpersonal skills, and other so-called “soft skills” have each played a role in the success of our work. The importance of these skills is often left out of trainings. If you take the time to incorporate them into your work, you may be amazed at how much richer your interactions become, and how the quality of your data and the depth of your understanding improve.

Lesson Learned: Interpersonal skills and building rapport matter. Meeting in person, or taking the time to get to know primary contacts at the start of an engagement, helps build rapport that lasts throughout the project. Building these relationships, and learning personalities and mannerisms, can aid communication and cooperation for the rest of the project.

Building rapport early on can:

  • Set the stage for honest discourse, trust and credibility
  • Provide a clearer picture of the context and players
  • Enable access to people and information
  • Create an open atmosphere for questions or concerns, and
  • Alleviate concerns of being tested or judged

Hot Tip: I have found that simple questions, such as a contact’s preferred method of communication (phone, email, even text) and preferred time of day for meetings, can go a long way toward improving communication.

Lesson Learned: Credibility can be built through skill and rapport. Push-back often comes from somewhere: is it a lack of trust in your skills? Do they consider the evaluation just a requirement, or worse, an audit? Are there unknown pressures, or implications of the findings, weighing on their minds?

Some considerations:

  • Identify pressures and stressors they face
  • Discuss “what’s in it for them?” and gather data that is most relevant and able to be fully utilized
  • Identify leading ways to report data that will meet their circumstances and resonate with decision-makers
  • Incorporate contingency planning into the design and data collection: is there other information you may want to know about negative findings that may be uncovered? What could be explored further?

Hot Tip: Realize that valuable information isn’t provided only in formal meetings. Nervous jokes like “I wonder what – will say about this” often have real meaning and may need to be considered in design and data-collection decisions. Maybe this points to a new stakeholder to include, a potential critic of findings, or an unexpected decision-maker. Addressing these issues can improve trust in the process and, ultimately, in the conclusions drawn and recommendations made. There is often more openness to receiving negative or mixed findings when credibility has been established and rapport built.


I’m Patrick Koeppl, cultural anthropologist, mixed-methods scientist, Halloween enthusiast, and Managing Director at Deloitte Consulting LLP. Throughout my career, I have found mixed methods are often the leading way to conduct broad evaluations of complex systems and situations. Qualitative approaches like in-depth interviews, focus groups, participant observation, policy reviews, and many others have a place in developing understanding. Determining the validity and reliability of qualitative data collected via mixed methods poses both challenges and opportunities for authentic understanding of complex systems and phenomena.

Lesson Learned: The science of numbers, statistics, randomized samples and double-blind studies may indeed be described as “hard,” but qualitative approaches are not “soft.” Rather, they are “difficult.”

Practitioners of the “soft sciences” often face criticisms that their endeavors are not scientific. Nay-sayers may claim that qualitative research is somehow illegitimate—and too often anthropologists, sociologists and others hide in the dark, brooding corners of the application of their craft, frustrated that their methods, approaches and findings may not be taken seriously by the “real scientists” who frame the discussion. Qualitative evaluators fall into this trap at their own peril—there is nothing inherently unscientific about qualitative methods and the findings and inferences drawn from qualitative data.

Hot Tip: It is the practitioner, the scientist, who should bring rigor and science to qualitative methods. Set up your approach with rigor by asking yourself:

  • Are the evaluation questions clear?
  • Is the evaluation design congruent with the evaluation questions?
  • How well do findings show meaningful parallelism across data sources?
  • Did coding checks show agreement across interviewers and coders?
  • Do the conclusions ring true, make sense, and seem convincing to the reader?
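The coding-check question above can be made concrete with an intercoder agreement statistic such as Cohen's kappa, which corrects raw agreement for chance. This sketch uses hypothetical code labels assigned by two coders to the same six excerpts:

```python
from collections import Counter

# Hypothetical codes applied by two coders to the same six excerpts
coder_a = ["barrier", "facilitator", "barrier", "context", "barrier", "context"]
coder_b = ["barrier", "facilitator", "context", "context", "barrier", "context"]

n = len(coder_a)
# Proportion of excerpts where the two coders agreed
observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n

# Agreement expected by chance, from each coder's marginal frequencies
counts_a, counts_b = Counter(coder_a), Counter(coder_b)
expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n**2

kappa = (observed - expected) / (1 - expected)
print(f"Observed agreement: {observed:.2f}, Cohen's kappa: {kappa:.2f}")
```

In practice, low kappa on an early batch of transcripts is a signal to refine the codebook and re-train coders before analysis proceeds.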

Lesson Learned: Qualitative data are the source of well-grounded, richly descriptive insights and explanations of complex events and occurrences in local contexts. They often lead to serendipitous findings and launch new theoretical integrations. When reached properly, findings from qualitative data have a quality of authenticity and undeniability (what Stephen Colbert calls “truthiness”).

Hot Tip: Establish scientific rigor to determine reliability and validity in the following ways:

  • Use computer-assisted data analysis tools such as ATLAS.ti or NVivo
  • Develop a codebook and data collection protocols to improve consistency and dependability
  • Engage in triangulation with complementary methods and data sources to draw converging conclusions

Finally, putting qualitative results into the context of a story and narrative, to convey a concrete, vivid, and meaningful result, makes them convincing and compelling to evaluators, policy makers, and practitioners. Such questions and tools support the scientific use of qualitative data collection and analysis in the quest for “useful” evaluation.

