AEA365 | A Tip-a-Day by and for Evaluators

I’m Erika Steele, a Health Professions Education Evaluation Research Fellow at the Veterans Affairs (VA) National Center for Patient Safety (NCPS). For the past few years I have been collaborating with Research Associates at the Center for Program Design and Evaluation (CDPE) at the Dartmouth Institute for Health Policy and Clinical Practice to evaluate the VA’s Chief Resident in Quality and Safety (CRQS) program. The CRQS is a one-year post-residency experience to develop skills in leadership and in teaching quality improvement (QI) and patient safety (PS). Since 2014, Research Associates at CDPE have conducted annual evaluations of the CRQS program. In 2015, we began evaluating the CRQS curriculum by developing a reliable tool to assess QI/PS projects led by chief residents (CRs).

One of the joys and frustrations of being an education evaluator is designing an assessment tool, testing it, and discovering that your clients apply the tool inconsistently. This blog will focus on lessons learned about norming, or calibrating, a rubric for rater consistency while pilot testing the Quality Improvement Project Evaluation Rubric (QIPER) with faculty at NCPS.

Hot Tips:

  1. Develop understanding of the goals of the assessment tool
    Sometimes raters have a hard time separating grading from assessing how well the program’s curriculum prepares learners. To help faculty at NCPS view the QIPER as a tool for program evaluation, we pointed out patterns in CRs’ scores. Once faculty started to see the patterns themselves, the conversations moved away from individual performance on the QIPER and back to evaluating how well the curriculum prepares CRs to lead a QI/PS project.

Once raters understood the goal of using the QIPER, instances of leniency, strictness, and first-impression errors were reduced and rater agreement improved (one way to quantify that agreement is sketched after these tips).

  2. Create an environment of respect
    All raters need the opportunity to share their ideas with others for score negotiation and consensus building to occur.  We used the Round Robin Sharing (RRS) technique to allow faculty to discuss their expectations, rationale for scoring, and ways to make reaching consensus easier.  We used the graphic organizer in Figure 1 to guide discussions.

RRS helped faculty develop common goals related to program expectations for leading QI/PS projects which led to increased rater agreement on scoring projects.

Figure 1: Round Robin Sharing Conversation Guidance

  3. Build strong consensus
    Clear instructions are important for ensuring that raters apply assessment tools consistently. Using the ideas generated during RRS, we engaged the faculty in building a document that operationalizes the items on the QIPER and offers guidance in applying the rating scale. The guidance document served as a reference for faculty when rating presentations.

Having a reminder of the agreed-upon standards helped raters apply the QIPER more consistently when scoring presentations.
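To make rater agreement concrete, here is a minimal sketch, in Python, of how agreement between two raters might be quantified before and after a norming session using Cohen’s kappa. The 1-4 scale and all scores below are hypothetical placeholders, not actual QIPER data.

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Cohen's kappa for two raters scoring the same set of projects."""
        n = len(rater_a)
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        counts_a, counts_b = Counter(rater_a), Counter(rater_b)
        # Chance agreement: probability both raters pick the same category,
        # given each rater's observed category frequencies.
        expected = sum((counts_a[c] / n) * (counts_b[c] / n)
                       for c in set(counts_a) | set(counts_b))
        return (observed - expected) / (1 - expected)

    # Hypothetical 1-4 rubric scores on ten projects, before and after norming.
    before_a = [2, 3, 1, 4, 2, 3, 2, 1, 3, 4]
    before_b = [3, 4, 2, 4, 1, 2, 3, 2, 4, 3]
    after_a  = [2, 3, 2, 4, 2, 3, 2, 1, 3, 4]
    after_b  = [2, 3, 2, 4, 2, 3, 3, 1, 3, 4]

    print(f"kappa before norming: {cohens_kappa(before_a, before_b):.2f}")
    print(f"kappa after norming:  {cohens_kappa(after_a, after_b):.2f}")

Tracking a statistic like this across norming sessions gives a check that consensus-building is actually improving consistency, rather than relying on impressions alone.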

Rad Resources:

  1. Strategies and Tools for Group Processing.
  2. Trace J, Meier V, Janssen G. “I can see that”: Developing shared rubric category interpretations through score negotiation. Assessing Writing. 2016;30:32-43.
  3. Quick Guide to Norming on Student Work for Program Level Assessment.

The American Evaluation Association is celebrating MVE TIG Week with our colleagues in the Military and Veteran’s Issues Topical Interest Group. The contributions to aea365 all this week come from our MVE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello, I am Stephen Axelrad, the chair of the Military and Veteran Evaluation (MVE) Topical Interest Group. One of the reasons I wanted to start this TIG was to help evaluators navigate the complex web of stakeholders in the military community. Many evaluators have little to no experience with the military and its formal structure. However, the military has a long history of valuing systematic evidence to inform decisions about policies and programs. Many uniformed leaders turn to civilian sources to understand innovative, evidence-based methods for addressing national security issues as well as social and organizational problems that affect the military (suicide, sexual harassment, sexual assault, financial literacy, domestic violence, opioid abuse).

Hot Tips:

Civilian evaluators are not expected to know everything about the military to make effective connections. They just need to apply to military stakeholders the same culturally responsive methods they apply to other subcultures. Here are some tips that can set culturally responsive evaluators up for success.

  • The military is not monolithic: the popular press often refers to the military as the “Pentagon” and makes it seem like there is only one military perspective, but the reality is far more complex. The military community is composed of communities that vary by Service branch (Army, Navy, Air Force, Marines, Coast Guard), component (Active duty, Reserve, National Guard), rank (commissioned officer, non-commissioned officer, enlisted), career field, and other factors.
  • Not all members of the uniformed military are soldiers: another common mistake in the popular press is to refer to all uniformed military members as soldiers, but that term applies only to the Army. The other terms are sailors (Navy), airmen (Air Force), marines (Marine Corps), and guardsmen (National Guard, Coast Guard). These terms are central to service members’ identities, so getting the term right will help you build rapport with the uniformed military.
  • Military installations are like mini-cities: the installation commander is like the mayor, and there are usually one or two commands that act as the major employer; installations attract workforces with specific skill sets and interests that give each installation a unique culture.
  • Leaders are change agents: one of the few consistent qualities across the military system is the value placed on leadership. Leadership is frequently defined through rank and other formal authority; however, the military sees leaders at all ranks and leverages peer leaders to create positive social change.

Rad Resources:

The following websites were developed to help civilian professionals understand military structure.

Lesson Learned: The best opportunity for evaluators to help with data-driven decision making comes within the first 90 days of a senior military leader’s taking command. During this period, leaders are in a learning mode, want data relevant to the command, and want to understand ways of improving their commands.

Collaborative evaluation principles have been used to bolster projects and gain representative stakeholder input. I’m Julianne Rush-Manchester of the Military and Veterans TIG. I’m an implementation science and evaluation professional working in the Department of Defense. I’ve learned some tips for facilitating stakeholder input in clinical settings that may be more hierarchical (rather than collaborative) in nature.  These tips could be applied in military and non-military settings.

Lessons Learned: 

  • Push for early involvement of stakeholders, with targeted discussions, to execute projects successfully (according to plan).  It is expected that adjustments to the implementation and evaluation plan will occur; however, these should be modest rather than substantive if stakeholders have provided input on timing, metrics, access to data, program dosage, recruitment challenges, and so forth.  This is particularly true in military settings, where bureaucratic structures dictate logistics and access.
  • Plan for unintended effects, along with intended ones, in new contexts for the program. A replicated program may look slightly different as it must accommodate for nuances of the organization (military member participants, contractors, mandatory vs. volunteer programs, program support from senior leadership). Expected outcomes may be variations of intended ones as the program adjusts to its host setting.

Rad Resources:

This article refers to the use of collaborative evaluation principles when systems change is anticipated as a result of implementation (Manchester et al., 2014). The paper may be helpful in strategizing for collaborative evaluations around evidence-based practices in clinical and non-clinical settings, military or otherwise.

I am Annette L. Gardner, a faculty member at the University of California, San Francisco. Developing information-rich case studies can be one of the most rewarding evaluation methods. Not only do they speak to stakeholders on a deep level, but, as described below, they can create a legacy that endures and has the potential to reach a broad base of stakeholders.

In 2012, the Veterans Administration Office of Academic Affiliations launched the Centers of Excellence in Primary Care Education (CoEPCE) to seek and implement improvement strategies for interprofessional, patient-centered clinical education and methods to prepare health professions team leaders. A mixed-method study was conducted to assess implementation, trainee outcomes, and new approaches to team-based interprofessional care.

I worked closely with the CoEPCE Coordinating Center and the five Centers to develop The Centers of Excellence in Primary Care Education Compendium of Five Case Studies: Lessons for Interprofessional Teamwork in Education and Clinical Learning Environments 2011-2016. Each case describes the contextual and developmental issues behind a unique example of integrated interprofessional curriculum to support the clinical education workplace. Peer-reviewed by the National Center for Interprofessional Practice and Education, the compendium provides tools and resources to help prepare professionals for interprofessional collaborative practice. The cases include:

  • Boise VA Medical Center and the CoEPCE’s “Interprofessional Case Conferences for High Risk/High Need Patients - The PACT ICU Model”
  • Louis Stokes Cleveland VA Medical Center and the CoEPCE’s “Dyad Model”
  • San Francisco VA Health Care System and the CoEPCE’s “Huddling for Higher Performing Teams”
  • VA Puget Sound Health Care System Seattle Division CoEPCE “Panel Management Model”
  • Connecticut VA Health Care System West Haven Campus CoEPCE “Initiative to Minimize Pharmaceutical Risk in Older Veterans (IMPROVE) Polypharmacy Model”

Hot Tips:

So what makes these cases different from other case studies? For starters, they were developed in an environment that values experimental designs and has the sample sizes to support them, so sensitivity to stakeholder perceptions of ‘evidence’ was critical. A contributing factor to the positive reception of the cases may have been the sharing of the case study initiatives across sites and with VA leadership prior to the development of the compendium; their preparation represents a partnership effort with high Center involvement. Second, there was a strong desire to support adoption in other training settings. The VA took dissemination very seriously and launched an aggressive campaign to distribute the compendium through multiple platforms, including the VA website, the Government Printing Office, LinkedIn, and the Institute for Healthcare Improvement Playbook. Third, VA staff are monitoring the uptake and use of the cases, a rare occurrence in evaluation design, and are soliciting input on their impact using an online questionnaire.

Lessons Learned:

Partnerships and a creative approach to dissemination have the potential to keep evaluation findings from being consigned to the ‘dustbin of history’ and facilitate learning beyond the immediate program stakeholders.

Rad Resource:

VA CoEPCE Case Studies Quality Improvement Questionnaire

Hello, I’m Stephen Axelrad. As the chair of the Military and Veteran Evaluation (MVE) Topical Interest Group, one of the reasons I wanted to start this TIG was to broaden the military’s understanding of what program evaluation is and to broaden program evaluators’ understanding of what the military community is. While there are pockets of the military that conduct evaluations from a health, behavioral, and social science perspective, the majority of program evaluation in the military is conducted from a resource management perspective. In the Department of Defense and the military service branches, program evaluation is embedded in what is known as the planning, programming, budgeting, and execution (PPBE) process. Although different terms are used, the goals of the PPBE process are very similar to what we would recognize in most evaluations:

  1. Describe and understand the key assumptions, inputs, activities, outputs, and outcomes associated with a program, strategy, or initiative.
  2. Identify areas of effectiveness and ineffectiveness, gaps, and innovations in current programs, strategies, initiatives, and systems.
  3. Provide rigorous evidence to support quality decisions from leaders.
  4. Use data to improve program planning and execution.

Hot Tip: Many program evaluators, especially those outside of the military healthcare and community support agencies, have financial management, operations research, and systems engineering backgrounds. These fields understand and use some of the same methodological (e.g., surveys, interviews) and analytic techniques (e.g., regression, structural modeling) that evaluators within AEA frequently use. Finding common ground on methodology can help avoid pedagogically based misunderstandings.
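As one illustration of that shared analytic vocabulary, here is a minimal sketch of an ordinary least squares regression relating a hypothetical program outcome to program dosage. The variable names and data are invented for illustration only.

    import numpy as np

    rng = np.random.default_rng(0)
    dosage = rng.uniform(1, 10, size=50)                 # e.g., training hours
    outcome = 2.0 + 0.5 * dosage + rng.normal(0, 1, 50)  # synthetic outcome

    # Design matrix with an intercept column, then the OLS fit.
    X = np.column_stack([np.ones_like(dosage), dosage])
    beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
    print(f"intercept = {beta[0]:.2f}, slope = {beta[1]:.2f}")

Whether this is framed as “program evaluation” or “operations research,” both communities would read the model the same way, which is exactly the common ground worth finding early.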

Rad Resources: Before engaging with a project or initiative that involves DoD or military service branch officials, review the following websites to become more familiar with how they may approach evaluation.

Lesson Learned: Evaluators do not have to be in the military in order to understand how to design and execute evaluations affecting the military community. Evaluators who demonstrate an understanding of the military decision-making process (MDMP) and can structure their evaluations as closely as possible to MDMP can maximize buy-in from uniformed and civilian leaders and the use of their evaluation results.

My name is Stanley Capela. I am the VP for Quality Management at HeartShare Human Services of New York. In addition, I am a military peer reviewer for a national accreditation organization, where I have had the opportunity to participate in reviews of military family readiness programs and after-school programs on military bases throughout the United States, as well as in Germany, Japan, and Guam. The purpose of this aea365 post is to share some tips I have learned from these reviews on how best to evaluate military programs.

As a military reviewer, I had to undergo training to make sure I have a good understanding of how to approach these reviews. My comments are based on my experiences as well as what I learned in the training programs that prepared me for these reviews.

Lesson Learned: First, hierarchy is very important in the military. One key is to be sensitive to this respect for hierarchy and anticipate how you may address it. For example, when doing entrance and exit conferences, you may see others stand as a higher-ranking officer enters the room.

Second, the military has a language that is part of its culture. Therefore, it is important to understand the heavy use of acronyms.

Third, be aware of the particular terms they use: customers rather than clients; staff instead of personnel; service member instead of soldier, sailor, or Marine; In-Briefing instead of Entrance Meeting; and Customer File Review instead of Consumer Record Review.

Fourth, when conducting interviews, it is good to start by letting people know who you are and why you are there. In my case, when doing a review, we let interviewees know about the accrediting organization and why they are being interviewed.

Fifth, make sure you stick to the schedule: let people know there may be follow-up interviews, avoid casual conversations, and keep any interviews outside the original schedule to a minimum.

Sixth, when interviewing customers who are veterans, it is important to use the following guidelines: be understanding; be an active listener; listen with empathy, but minimize sympathy; and provide background and context to your interview questions.

These are some tips that I have learned over time.

Rad Resource: If you want to learn more, I suggest visiting the Council on Accreditation website (www.coanet.org) to review the standards and explore opportunities to become a military reviewer. It has been a very valuable experience, and I have gained a great deal of knowledge that enhanced my evaluation skills.

I’m Pat Clifford, a consultant working with the Tristate Veterans Community Alliance and Program Chair of the Military and Veteran Evaluation (MVE) TIG. A prominent issue for our collaborative and many other veteran support organizations is the successful transition from military to civilian life. When approaching this task, it’s important for evaluators to recognize some of the primary contextual drivers.

  • Military transition is a life transition: Like other transitional periods, it comes with unique risk and protective factors.
  • Some stakeholders and factors are common: For example, there are shared national stakeholders like the Department of Veterans Affairs (VA) and the Department of Defense (DoD), as well as a strong military culture. National changes like drawdowns and restructuring impact everyone across the board.
  • However, transitions play out in local contexts: Each region has different demographics, organizational players, and linkages to bases and installations.
  • And the “Sea of Goodwill” can intensify problems: Many organizations and individuals are quick to start providing services, thinking they are the only ones out there. This means that resources are often not targeted effectively or coordinated.

All of these factors contribute to a complex environment for evaluators. Well-meaning programs and models (especially those implemented top-down) often run into roadblocks in practice. To counteract this, there are things evaluators can do to engage with their local context and drive relevance.

Hot Tips:

  • Prioritize relationships: Evaluations often treat programs as if they operated in a social vacuum. The reality is that program outcomes hinge on credibility with the target population, and that means relationships. Intentionally explore how programs build trust and display credibility. In my experience, word-of-mouth is key in the veteran space, and cultural missteps can lead to long-term consequences.
  • Help programs recognize and agree on roles: No organization can do everything well. Too often programs want to be “one-stop shops” that solve all of a veteran’s or military family’s needs. Evaluators can work with stakeholders to think critically about their scope, identify their roles, and set up processes to ensure they work in partnership, not in isolation.
  • Encourage leadership by empowering: Work with programs to explore how they create space for indigenous leadership. While national models and top-down initiatives can help bring solutions “to scale,” they often disempower local leaders and veteran stakeholders. Evaluators can become a voice for the local knowledge and expertise that can inform larger initiatives.

Rad Resources:

Tristate Veterans Community Alliance

VA Office of Policy and Planning

Ending Veteran Homelessness by Addressing Failed Transition Policies

I’m Nathan Graeser with the Center for Innovation and Research on Veterans & Military Families (CIR) at the University of Southern California. Over the last three years, we have led a collective impact movement dedicated to addressing the reintegration challenges faced by our veterans, service members, and their families. As the director/facilitator of the Los Angeles Veterans Collaborative, it has become increasingly clear that despite massive efforts to collect data on the needs of Veterans (see the LA/OC Veterans Study), our strategies were still making little dent in the outcomes we hoped for. Our current efforts, while thoughtful and robust, still failed to address the scale and magnitude of the problem. Much more needed to be done to support creative outreach, connect Veterans back into their communities, provide employment support, and improve access to physical and behavioral health care. Despite the tremendous momentum and collective work being done, the list of needs and systemic failures was still daunting.

We needed a process for trying out new, data-informed strategies so we could learn from and adapt our current efforts. Using a developmental evaluation process, we decided to encourage our community to try out new ideas in the hope that we might learn new ways of addressing some of these challenges. It was our hope that by trying out new ideas we might learn as a community and thus change the way we deliver our services and programs, which, just maybe, might change the way Veterans experience transition in Los Angeles.

Lesson Learned: To encourage these new ideas in our collective impact movement, in October we launched a Community Initiative Grant offering pilot funding as a way to do this in real time. Members of the Los Angeles Veterans Collaborative now have access to seed money to try out new ideas, with the specific purpose of sharing failures, successes, and lessons learned with the larger community. Our hope is that this will encourage members to try out ideas and to work toward the intended outcome of growing our community’s capacity, not necessarily only their own. The first seven pilots have been submitted, and it has already been an amazing process to watch them, and the community behind them, unfold. We are looking forward to seeing what happens next, hoping to adopt strategies that will allow our evaluation to be fast enough to meet the growing needs of the thousands of Veterans and their families transitioning out of the military.

My name is Sarah Baughman and I am the National Project Leader for the Military Families Learning Network (MFLN). The MFLN connects the resources of Land Grant Universities and Cooperative Extension with military audiences. Working across institutional cultural contexts has been one of our challenges. Our academic institutions have very different cultures than the military, and helping each learn to work with the other has been key to our success. The hierarchical nature of the military service branches is distinctly different from the flat, individualistic culture of academia. Understanding the work culture of our partners and clients has improved our communications and our programming.

We’ve learned quite a bit as our program has matured and hope you find the tips and resources below useful in your work with military service members and their families.

Hot Tips: There are five branches in the military, each with a unique culture, language, values and identity. The five branches are: Army, Marine Corps, Air Force, Navy and Coast Guard. Be sure you are using the proper terminology and images with each branch.

Rank matters.

Check your assumptions. The popular narrative around military families is deficit-based. While military families certainly have challenges unknown to non-military families, they are also resilient.

Include or recruit an advisor or key staff member with military experience to help guide your work with service members and their families.

Operation Iraqi Freedom and Operation Enduring Freedom relied heavily on the National Guard and Reserves. Guard and Reserve families are more likely to be in rural areas or far from the military services commonly available to active duty service members on or near installations. This presents unique opportunities and challenges for these families, especially during times of deployment.

Military service members are mission oriented. The mission comes before the individual. Be clear about the mission of your program or project and who is tasked with each component.

Military families often do not identify themselves as military. Consider asking on your program applications or intake forms whether clients are affiliated with the military so you can better serve them.
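As a sketch of what capturing that affiliation might look like, here is a hypothetical set of intake-form fields in Python. The field names and categories are illustrative assumptions, not a standard instrument.

    from dataclasses import dataclass
    from typing import Optional

    BRANCHES = ["Army", "Marine Corps", "Air Force", "Navy", "Coast Guard"]
    COMPONENTS = ["Active Duty", "Reserve", "National Guard"]

    @dataclass
    class MilitaryAffiliation:
        is_affiliated: bool                 # service member, veteran, or family
        relationship: Optional[str] = None  # "self", "spouse", "child", ...
        branch: Optional[str] = None        # one of BRANCHES
        component: Optional[str] = None     # one of COMPONENTS

    # Example: a Guard family member living far from an installation.
    client = MilitaryAffiliation(is_affiliated=True, relationship="spouse",
                                 branch="Army", component="National Guard")

Even a few fields like these let a program report outcomes separately for Guard and Reserve families, who, as noted above, often live far from installation-based services.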

Rad Resources:

Dictionary of Military Terms

Understanding rank – What are those stripes and bars?

Guide to Military Uniforms and Insignia

Common military terms and lingo

Evidence based programs and practices for the military can be found at the Penn State Clearinghouse

Research on Military Families can be found at the University of Minnesota REACH program

The Military Family Research Institute at Purdue University

Hello, I’m Stephen Axelrad. As the chair of the Military and Veteran Evaluation (MVE) Topical Interest Group, I am very excited to introduce this upcoming week of aea365 dedicated to military and veteran issues in evaluation. Last month, we made our Annual Conference debut. This week is an opportunity for us to introduce our TIG to members of the broader evaluation community who were unable to attend our conference and are interested in deepening their understanding of the application of evaluation theory, methods, and practice in active duty military, veteran, military family, and national security settings.

Hot Tip: While the military and veteran community goes beyond the Departments of Defense (DoD) and Veterans Affairs (VA), both agencies produce many of the policies and programs that affect millions of Service members, veterans, and their families. Getting smart on DoD and VA will give you a good foundation for understanding the issues affecting military and veteran evaluations.

Rad Resource: VA’s Veterans Policy Research Agenda available at http://www.va.gov/op3/docs/StrategicPlanning/VPRA_FY15_Published_Version_9_5_14.pdf describes issues in which the VA needs more evaluation. The Defense Centers of Excellence for Psychological Health and Traumatic Brain Injury’s resource page at http://www.dcoe.mil/About_DCoE/Program_Evaluation.aspx describes how program evaluation is being leveraged to understand the effectiveness of military health and social support programs.

Lesson Learned: Cultural competency is a critical success factor for evaluators who have never served in the military but are working with military and veteran communities. These communities have their own norms, values, and vernacular that vary from the general population. The Uniformed Services University of the Health Sciences’ Center for Deployment Psychology and the University of Southern California School of Social Work’s Military Social Work program offer great resources to help evaluators, applied researchers, and providers understand military and veteran communities.
