AEA365 | A Tip-a-Day by and for Evaluators

Hello, my name is Evangeline Danseco and I work as a Performance Measurement Coach at the Ontario Centre of Excellence for Child and Youth Mental Health. I help community-based non-profit agencies develop a performance measurement framework for their organization.

In my recent work with community-based agencies, I encourage them to include both SMART indicators and WISE indicators in their performance measurement framework. Key performance indicators let an organization know how well it is progressing toward the goals it has set. SMART indicators are Specific, Measurable, Achievable, Relevant and Timely. When indicators are framed the SMART way, staff have greater clarity when collecting and analyzing the data.

Sometimes it is difficult to summarize our work with numbers using the SMART criteria. The stories and inspiring elements of the work do not surface enough, yet they exert a tremendous influence on the organization’s development and culture. WISE indicators consider the Whole system, are Inspiring, attend to the Story and the Synergy among indicators, and are Engaging. Considering the Whole system means that a holistic perspective of the organization’s mission is evident among the indicators, and that a systems approach or theory of change is reflected across all of them. Indicators can be inspiring, pointing to what the organization is improving rather than focusing only on errors or mandatory requirements. The Story and Synergy among the indicators provide an accurate picture of what is happening across the entire organization. Finally, Engaging indicators involve key stakeholders at every step of defining, collecting and using the indicators. When identifying key performance indicators, make sure they can galvanize people into action. Make space for these WISE indicators; they complement indicators measured in more traditional ways.

Hot Tip: A scorecard is a useful way of summarizing key performance indicators that agencies want to monitor and improve upon. Domains in a scorecard typically include financial indicators, staff or human resource indicators, program effectiveness or impact, and indicators relating to the key stakeholders of the agency.
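The Hot Tip above can be sketched as a simple data structure. This is a minimal, hypothetical illustration, not an established scorecard tool: the domain names follow the four listed above, while the field names, example indicators, and targets are my own assumptions.

```python
from dataclasses import dataclass

# The four scorecard domains named in the Hot Tip above.
DOMAINS = ("financial", "staff", "program effectiveness", "stakeholder")

@dataclass
class Indicator:
    """One key performance indicator, framed the SMART way."""
    name: str                       # Specific: what exactly is measured
    domain: str                     # which scorecard domain it belongs to
    target: float                   # an Achievable target value
    actual: float                   # the Measurable current value
    higher_is_better: bool = True   # e.g., turnover rates reverse this

    def on_track(self) -> bool:
        """Compare actual to target in the right direction."""
        if self.higher_is_better:
            return self.actual >= self.target
        return self.actual <= self.target

def scorecard(indicators):
    """Group indicators by domain, flagging any that miss their target."""
    board = {d: [] for d in DOMAINS}
    for ind in indicators:
        status = "on track" if ind.on_track() else "needs attention"
        board[ind.domain].append(f"{ind.name}: {status}")
    return board

board = scorecard([
    Indicator("Client satisfaction (1-5)", "stakeholder", 4.0, 4.2),
    Indicator("Staff turnover (%)", "staff", 15.0, 18.0, higher_is_better=False),
])
```

Grouping by domain this way makes it easy to spot a domain with no indicators at all, which is itself a gap worth discussing with stakeholders.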

Rad Resource: Our recent report summarizes some of the domains and examples of child and youth mental health indicators measured at a system level such as state-wide or country level indicators. These indicators help different stakeholders such as policy-makers, researchers, families and clinicians understand how well a system is doing.

The American Evaluation Association is celebrating Behavioral Health (BH) TIG Week with our colleagues in the Behavioral Health Topical Interest Group. All this week, contributions to aea365 come from our BH TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi, I’m Lisa Melchior, President of The Measurement Group LLC, a consulting firm focused on the evaluation of health and social services for at-risk and vulnerable populations. I’d like to share some lessons I’ve learned in close to 30 years of working with community-based behavioral health and other programs that have addressed issues in populations with co-occurring substance use and mental health problems.

Effectively addressing multiple vulnerabilities in behavioral health is critical to optimizing participant outcomes. For example, Vivian Brown, George Huba, and I found that women in substance abuse treatment with high therapeutic burden (multiple co-occurring behavioral health, physical health, and other vulnerabilities) were more likely to terminate treatment early. However, if they were retained long enough, they were just as likely to succeed as other clients with less “burden.” Engaging and retaining participants long enough to benefit from the programs we evaluate is critical to achieving the intended outcomes of those interventions.

Hot Tip:

What does this mean for us as evaluators? Our evaluation designs, measures, and analyses need to address the multiple vulnerabilities that clients hope to address through the programs we evaluate – which are often addressed by multiple systems of care. The Transtheoretical Model can be a useful framework for approaching the measurement of cross-systems outcomes in client-focused evaluations. Yet we also need to be mindful of efficiency in our measurement and consider data collection within the context of the program.

Lessons Learned:

Don’t be limited by a program’s label or funding source. People in behavioral health programs have multiple needs, addressed by multiple systems. For example, employment may be an important predictor of subsequent success in reducing criminal justice recidivism; as a recent LA Times article put it, “in addition to substance abuse and mental health issues, chronic unemployment is one of the primary barriers to smooth re-entry.” Even if a program’s funding is for behavioral health services, don’t overlook other indicators that are pertinent to the intervention and its evaluation.

Take time to learn how multiple systems interact in the context of your program so the evaluation reflects those relationships accurately. For example, in a program we currently evaluate – a housing and treatment intervention for homeless young adults with behavioral health conditions – case-finding is conducted by a team specializing in outreach to the homeless, as opposed to staff from the behavioral health treatment team. As these are separate divisions within the organization, with different funders, it was important to understand these details and not make assumptions based on similar programs we previously evaluated.

As a practical issue, having a dedicated point person on the evaluation team who coordinates with program staff is critical! Especially with multisystem programs, there are many moving parts. Having an evaluation team member who is seen by program staff as an extension of their team is invaluable for ensuring high quality data.



Hello, I’m Roger A. Boothroyd from the University of South Florida. One thing I have learned from conducting mental health services research over the past 30 years: research repeatedly documents that approximately two-thirds of adults diagnosed with a mental health disorder have at least one physical health condition. It is also well known that comorbidity of mental health and substance abuse disorders is high, ranging from 35% to 45%. Further, many adults with mental illness are likely to be arrested. Finally, over half of adults with a mental health disorder do not receive treatment. Thus: 1) mental health issues seldom occur alone but often come with other comorbid conditions; and 2) for adults with mental health disorders who do enter treatment, comorbid conditions often mean being served simultaneously by multiple service systems.

For children and youth with emotional and behavioral challenges, the issue of simultaneous multiple service system involvement is even more complex. Children and youth attend school, so the educational system is necessarily involved. Often, they are involved with the child welfare and/or juvenile justice systems; and, of course, their families play a significant role in their day-to-day lives. Thus, the question for us as evaluators is: How can we realistically evaluate the effectiveness of a program or an intervention without assuming a more systems level evaluative perspective?

Lesson Learned: Some 20 years ago, I was involved in an evaluation that explored why so few adults with severe mental illness who sought vocational rehabilitation services received them and succeeded in obtaining jobs. Our evaluation used a systems thinking framework that modeled how individuals with severe mental illness entered and moved through the mental health and vocational rehabilitation systems. At the start of the evaluation, the prevailing hypothesis (mine included) was that there were not enough resources available for vocational rehabilitation services for adults with severe mental illness. Yet when the cross-systems model was constructed, it showed that many adults with severe mental illness were in fact receiving vocational rehabilitation services. The real problem was a lack of sufficient jobs for the adults who were trained, and the lack of jobs prevented them from exiting the vocational rehabilitation system. In fact, the model predicted that if more resources had been devoted to vocational rehabilitation services, the functioning of both systems would have gotten much worse.

The answer was straightforward: open up more jobs. The county mental health and vocational rehabilitation departments worked with their Chamber of Commerce and local businesses to secure job placements for adults who had completed vocational rehabilitation training. As the flow of adults through these systems improved, the capacity to train other adults increased – all without new resources. This was my first introduction to systems thinking, and my first firsthand look at the importance of assuming a broader evaluation perspective.

This week, evaluators from the Behavioral Health (formerly Alcohol, Drug Abuse, and Mental Health) Topical Interest Group will share their strategies, experiences, and insights gained from conducting behavioral health-related evaluations that assumed this broader systems-level perspective.


Hello loyal readers! I am Sheila B. Robinson, aea365’s Lead Curator and sometimes Saturday contributor, and I’m looking for a new partner!

Get Involved: We’re looking for an aea365 intern – a volunteer curator willing to donate an hour or two a week to help aea365 continue its commitment to maintaining a high-quality daily blog by and for evaluators.

The aea365 intern will primarily assist with:

  • Recruiting contributors – sending invitations, communicating with leaders of sponsoring groups
  • Shepherding contributors – sending reminders, asking questions, giving thanks
  • Uploading contributions – entering posts into the aea365 wordpress-based website (very easy to learn!)
  • Contributing – one aea365 post per month

The commitment requires on average approximately 1-2 hours per week for six months (with the option to continue if it’s working well!) beginning approximately October 1 and running through approximately April 1.

Lesson Learned: I began my position as Lead Curator with a six-month commitment and that was 4.5 years ago! Yes, it’s just that fun and rewarding. I learn so much from reading each post, and I love “meeting” evaluators through my communication with them.

Hot Tip: The ideal intern has contributed to aea365 before, or at a minimum is a regular reader familiar with the format, breadth, and style of entries. This individual has strong writing and communication skills and is interested in making connections across the evaluation community. Finally, the work can be done remotely, from anywhere, so the intern should be self-directed, organized, and adept at meeting deadlines.

Serving as an aea365 intern is a great way to build your professional network and expand your knowledge of the breadth and depth of the field. The intern will receive ongoing mentoring throughout the term of the internship as well as support in learning how to use WordPress.

This is a volunteer position, and as such compensation will be in the form of our sincerest gratitude, thanks, and recognition of your contribution!

To apply – on or before Friday, September 8, send the following to aea365@eval.org: (1) a brief letter of interest noting your favorite type(s) of aea365 posts and why, and (2) an example original aea365 post following our contribution guidelines and demonstrating your writing/editing capacity (this can be one that has been published or a new one you compose for this purpose).

Cool Trick: Be sure to follow the contribution guidelines, and proofread your work when submitting your example post.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi, I’m Courtney Clingan, a Senior Research Analyst with The Improve Group, a research and evaluation consulting firm based in St. Paul, Minnesota. To wrap up our week of posts on evaluation teams, I want to share a memorable experience I had when staff from a client organization truly became part of the evaluation team.

We recently started working with HOPE Coalition, a Red Wing, Minnesota-based agency that serves people who have lived through domestic violence, sexual assault, child abuse, and homelessness. Like other nonprofits that serve clients with complex needs and have small budgets, HOPE Coalition was looking to establish sustainable and realistic evaluation methods.

To empower HOPE Coalition staff to use their knowledge of client needs, and to give them skills they could apply after our project ended, we designed a workshop that made the staff part of the team. We started with a clear overview of logic models. Then we shifted into a more hands-off coaching role at the critical “stepping back” point, handing responsibility to staff, who took what we had taught them and designed logic models for other agency programs.

Lesson Learned: This workshop design created space for a valuable collaboration. While my colleague Stacy Johnson and I contributed our expertise in evaluation design, HOPE Coalition staff contributed their experience from working with a hard-to-survey population.

Jo Seton, Grant Writing and Communications Coordinator for HOPE Coalition, said she liked how we explained logic models by building one for HOPE Coalition on the wall with sticky notes, and kept it clear and simple, without being condescending. Then we tapped into the true experts on our evaluation team – HOPE Coalition staff – and they took the reins.

“By logic model number five we had it down to a fine art,” Jo said.
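The logic models the staff built follow the standard left-to-right columns (inputs, activities, outputs, outcomes), which can be sketched as a simple data structure. This is a hypothetical illustration only; the program name and entries below are invented, not taken from HOPE Coalition’s actual models.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """The standard logic-model columns, read left to right."""
    program: str
    inputs: list = field(default_factory=list)      # resources invested
    activities: list = field(default_factory=list)  # what the program does
    outputs: list = field(default_factory=list)     # direct, countable products
    outcomes: list = field(default_factory=list)    # changes for participants

# Hypothetical example for one agency program.
model = LogicModel(
    program="Emergency shelter services",
    inputs=["advocates", "shelter facility", "grant funding"],
    activities=["crisis line", "safety planning", "case management"],
    outputs=["number of callers served", "number of safety plans completed"],
    outcomes=["increased client safety", "stable housing"],
)
```

Keeping each program’s model in a shared, structured form like this makes it easy to compare columns across programs, much as the sticky-note wall did in the workshop.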

Hot Tip: We studied up on the client in a way that complemented the client’s own expertise. Jo sent us information on HOPE Coalition to read beforehand, and we took full advantage of it in our preparation. She said she appreciated that we had a “nuts and bolts” understanding of her agency before we arrived, even as her staff provided the deep understanding of programs and services. It meant we could spend less time learning about the client and more time on evaluation strategy.

As a result of our work with HOPE Coalition, the agency has developed a “culture of evaluation,” one of its goals before starting the work. By engaging staff as members of the evaluation team, the agency built a sense of ownership that will sustain its evaluation efforts.

The American Evaluation Association is celebrating Evaluation Teams Week. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi, we’re Audrey McIntyre and Michael Prideaux. We recently completed internships at The Improve Group, an evaluation consulting firm based in Minnesota.

As new evaluators, we have interesting perspectives from serving on evaluation teams during our internships. We worked with The Improve Group colleagues and clients on projects for the Angel Foundation, the Highland Friendship Club, and the Minnesota Department of Agriculture. In our work, we were truly members of the evaluation teams: We designed surveys and conducted interviews, analyzed data, and helped clients understand the findings.

Lesson Learned: Communication is key

One of the main avenues to success in an evaluation project team is strong communication. That extends to sharing core values. If you’re working from the same premise to the same goal, you only have to figure out the steps in between, rather than also having to put in time to determine a shared starting point.

Aside from moving a project forward, we found that strong communication allows team members to learn from each other. Especially being new to evaluation, we really valued hearing others’ ideas on projects and learned a lot just from listening to what our team members suggested. We met regularly with organization leaders about our projects to co-develop ideas on how to engage clients – it was through these meetings that we, too, were able to contribute our ideas and perspectives to The Improve Group’s work.

Hot Tip: Take full advantage of bright, engaged interns on evaluation teams

By being integrated into The Improve Group’s project work, we were able to contribute fully to the organization. Organizations often delegate less interesting tasks, like data entry, to interns. While data entry is a useful skill to build, and we did some of it, we also did a lot of brainstorming, problem-solving, and development work that made a difference to the team – which is what we loved the most.

Working on projects as interns also allowed us to be contributing to a larger goal as we were learning. Take Audrey’s contributions to The Improve Group’s project providing technical assistance and program evaluation to Minnesota Alcohol, Tobacco, and Other Drugs grantees as an example. She had some experience with data analysis at the time, but not enough to think of half the things the team suggested regarding how to analyze, visualize, and report the information we had gathered. If she hadn’t worked on a team, she wouldn’t have been able to do the good work she did on that project.

Rad Resource: AEA’s Graduate Education Diversity Internship program provides paid internship and training opportunities during the academic year. Additional internship opportunities are posted on local AEA affiliate sites in April each year.


Hello! I am Jennifer Obinna, Ph.D., M.S.S.W., and I lead evaluation teams at The Improve Group, a research and evaluation consulting firm in St. Paul, Minnesota. Whether I am leading an internal team that reports to me or a collaborative team with clients and community partners, I look for ways to share power so that everyone can contribute their best.

My approach to leading teams has three dimensions: facilitative leadership, building on strengths, and supporting collaboration.

“Facilitative leadership” is my preferred style because it supports a collaborative environment. People sometimes compare the characteristics of facilitative and hierarchical leaders this way:

  • Hierarchical leaders assume top-down authority, know what to do, seek the “right” decision, and rely on individuals.
  • Facilitative leaders assume the power of the team as a unit, know the various “how to” methods, seek team ownership, and rely on the team’s collective ability to take action.

Facilitative leadership is important when you want buy-in, adaptability, and creativity from the team. In our work, we are seeing a shift toward facilitative leadership because it is dynamic and empowering.

Lesson Learned: Build on strengths and passions

In our team discussions, we often think about how some tasks come easily to us, while others take more time and effort to master. We can liken this to using one’s dominant hand compared to the non-dominant one. Knowing what comes easily to each team member and which tasks will take longer helps us think through task assignments and what kind of support team members will need to succeed.

Similarly, we pay attention to which tasks make us feel energized and which leave us feeling cold and depleted. A team member may be competent at tasks that leave them uninspired. Others may have a passion to learn something new, but need more time and coaching as they master new skills. Paying attention to strengths and passions helps develop team members individually, shows the team how to support each other with a balanced workload, and allows teams to thrive.

Lesson Learned: Support collaboration

At its best, collaboration is a form of co-creation. Early in a project is a fertile time to collaborate. We start each workplan with a sketch. Instead of having one team member responsible for the initial sketch, we encourage all team members to brainstorm ideas to define and refine the actions together. This connects team members to the work and creates better products and outcomes.




Hello! We are Chithra Adams, Director of Evaluations at the Human Development Institute at the University of Kentucky and Rebecca Stewart, Chief Practice Officer at The Improve Group, a research and evaluation firm based in Minnesota. For today’s post on evaluation teams, we are tackling a topic that no team leader relishes – what to do when something goes wrong.

Anyone who leads an evaluation team knows that sooner or later, something will go wrong. It could be related to managing an evaluation project (e.g., missed deadlines or underestimating the effort needed for a task) or to the evaluation itself (e.g., poor quality data or misinterpreted findings). While teams can have processes in place to prevent or minimize mishaps, it’s also important for leaders to reflect on their reaction to mistakes.

What should you do when something goes wrong?

Lessons Learned

  • Anticipate challenges and prevent them before they happen. Brainstorm possible problems you might encounter, as well as some solutions. For example, our team conducted an evaluation involving site visits all over Minnesota in the wintertime. After a harrowing journey through a blizzard, we collaborated with the client on how to rearrange site visits if travel became unsafe due to weather conditions.
  • Intervene early. Sometimes there are early warning signs of a looming problem. Maybe you are not getting the expected response rate to a survey or feedback has been delayed. In these cases, having a thoughtful conversation with the team can help identify solutions before the problem worsens.
  • If the problem has already occurred, acknowledge your frustration with the situation (not the person) to the team. Remind yourself and the team that it is human to make a mistake and it is human to be upset about it. Then, move on to identifying how to prevent the situation from happening again.
  • Acknowledge the mistake to the client. This shows the team members that you have their back. More importantly, when leaders absorb the risk of mistakes, it allows teams to be creative.
  • Give time and space to reflect. Often when teams reflect together, solutions arise organically. Leaders can practice deep listening (rather than problem-solving) during team reflection. This provides insight into the circumstances that led to the mistake and may uncover ways to avoid it in the future.

Hot tips

  • Having a conversation early in the project about “when we fail” rather than “if we fail” helps the team prepare for setbacks.
  • Adopting a curious or learning mindset, rather than a punitive or blame-shifting one, builds trust with team members and allows for growth as a team.

Rad Resource:

Check out this article from the Harvard Business Review (HBR) on learning from failure.

This article from HBR gives good tips for how to have a difficult conversation with a team member.

For resources on mindfulness for work and life, see mindful.org


Hello, my name is David McKay and I am a senior evaluator at the University of Kentucky’s Center for Excellence in Developmental Disabilities Education, Research, and Service. We help to improve lifelong opportunities and services for individuals with disabilities, their families, and the community. Our evaluation team works on a variety of projects, and their number has grown over the years we’ve worked together. I want to share some lessons I’ve learned on the advantages and challenges of a team approach.

Lessons Learned: Highlighting Advantages

  • The biggest advantage is that work products are much better as a result of working in teams. Working collaboratively, we create higher quality analysis and better applications of that analysis to client needs.
  • Another advantage is a team’s diverse points of view. We all bring different complementary strengths. We’re able to see a more accurate picture because sometimes we bump heads and we listen to one another.
  • Teamwork makes everyone better. Team members step up their game because no one wants to let the team down or hold the team back. And we set the bar collectively because we all bring our own strengths.
  • Working in teams fosters creativity and nurtures new ideas. Working together, someone will have one idea, someone else will have another idea, and that will feed into another. Pretty soon we’re moving in a really fun way. All of our different directions weave together in a way that works.

Lessons Learned: Addressing Challenges

  • In general, teams move at a slower pace than individuals. The tradeoff for a better product is worth the time once you understand and accept that reality.
  • We need more thoughtful and reflective hiring practices. You can’t just hire for skills; you have to hire people who will complement the team. This can be a delicate balance between hiring people who “fit” and hiring people who bring valuable, diverse styles and perspectives. Cultivate a team with different areas of expertise, backgrounds, experiences and points of view. You can do this by casting a wide net into places where talent and ideas may be hiding (such as nonprofit networks, the sciences, corporate spaces, etc.) and by onboarding people carefully so that they acclimate early.
  • Make sure that roles, responsibilities and tasks are delineated clearly. Sometimes if tasks aren’t clearly assigned, productivity slows down and it can be difficult to assess the performance of individuals or the team.

Cool Tricks

  • Empathy and trust are a must, which can be hard for some people.
  • The team must understand the vision behind the work. The team must understand what it is trying to accomplish. A wonderful team with no vision will just spin in circles.

Hot Tips

  • Be flexible.
  • Share calendars. Write everything down. Set deadlines.
  • Establish and communicate your culture early.
  • Build trust early and consistently.
  • Establish a culture of gratitude.


Hello, we are Chithra Adams, Director of Evaluations at the Human Development Institute at the University of Kentucky, and Leah Goldstein Moses, CEO of The Improve Group, a research and evaluation consulting firm based in St. Paul, Minnesota.

There is an African proverb that says, “If you want to go fast, go alone. If you want to go far, go together.” This week, we are curating a series of posts about the efficacy and potential pitfalls of working in teams. This post offers reflections on some key lessons learned in leading evaluation teams.

Lessons Learned:

  • Different situations call for different types of leadership. But some leadership behaviors may conflict with each other. For example, a leader might adopt an open and facilitative mindset during a brainstorming session. On the other hand, a task-oriented mindset might be needed to ensure that an evaluation project is implemented on time.
  • Teams adopt different strategies for getting their work done along a continuum from full co-creation and collaboration to splitting tasks among team members. It can be useful to determine which strategy is best suited to the project at hand. If teams fall into a habit of working the same way all the time, they miss out on the benefits of other ways of working.
  • The level of team functioning contributes to both the success of a project and the experience of a project. When your team works well together, you can get more done, your team can go into greater depth by attending to various perspectives, AND the members of the team can have a great time with each other.
  • Finally, sometimes it takes a while to see the impact of evaluation. So take time to celebrate incremental successes and practice gratitude.

Hot Tips:

  • Like any skill, leadership improves with practice. Use time to reflect and foster self-awareness about your own responses and how they may influence your team’s dynamic. Remember to be kind to yourself and forgive your mistakes.
  • Likewise, extend this attitude of kindness and forgiveness to team members. Many teams adopt norms or values as they form. “Assume good intent” can be a useful motto.

Rad Resources:

Could clarifying values help your team? Try this exercise from Entrepreneur Magazine on developing your team’s shared values.

If you feel your team is in a rut, this blog lists great activities for team-building.


