AEA365 | A Tip-a-Day by and for Evaluators


Hello. I’m Pat Clifford of Clifford Consulting and the Tristate Veterans Community Alliance. The Tristate Veterans Community Alliance (TVCA) has been working with partners to develop a way to track veterans’ progress toward a successful transition. It has been challenging to identify relevant indicators that are both applicable across sectors (e.g., employment, education, health) and simple enough to be incorporated into brief interactions with veterans and/or program intake and update processes.

In the fall of 2015, through ongoing collaborative committee work and discussions with community partners, TVCA identified 30 questions across 13 dimensions deemed most relevant for building a sound, client-level picture of the transition experience. In January 2016, TVCA conducted a brief survey that narrowed the list even further. In June 2016, TVCA and its partners pre-tested the intake questionnaire and analyzed the information it yielded.

Post-testing follow-ups with partners enabled TVCA to revise its question list to include more veteran-centered language, clarify terms, and bring more emphasis and detail to the questions that matter most to a veteran’s transition process.

TVCA then partnered with researchers at the University of Cincinnati to distill the original list of 30 questions into 15 indicators. The indicators focus on three primary challenges that transitioning veterans face: underemployment, social connection, and social support.

Underemployment is characterized by a poor match between a veteran’s current and ideal job: whether the veteran’s skills and personality fit the job, and whether the job meets the veteran’s expectations. A related indicator is the veteran’s educational attainment, as advancement in this area can lead to expanded career opportunities.

Social support can take many forms, such as emotional, belonging, companionship, tangible, and informational. All of these are important for successful veteran transitions and ultimate wellness. TVCA focuses on providing informational and tangible support that enables veterans to solve problems as well as to build awareness and connection with sources of support, including from other veterans. Social support could also mean connection to specific VA support services for which veterans and their families are eligible, including reviewing and upgrading the veteran’s disability rating when justified.

Lessons Learned:

Developing a relevant evaluation framework from the ground up can be quite challenging, especially for an area as complex and wide-reaching as the military-to-civilian transition. However, in the absence of clear guidance it is better to attempt something, always looking to those who are most directly impacted as your best guides. Looking toward the future, I believe our grassroots efforts will help us be more effective advocates with federal stakeholders such as the Department of Defense and the Department of Veterans Affairs as a national veteran transition framework is developed.

Rad Resource:

To learn more about how TVCA uses data to learn and improve, check out our regional veteran data portal at https://www.tristatevca.org/resources/regional-veteran-data/

The American Evaluation Association is celebrating MVE TIG Week with our colleagues in the Military and Veteran’s Issues Topical Interest Group. The contributions all this week to aea365 come from our MVE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello, I’m Julianne Rush-Manchester of the Military and Veterans TIG. I’m a career evaluator of nearly 20 years who has been working in the Department of Defense space since 2015. My work has been interdisciplinary (education, criminal justice, prevention, health, public health); yet, across those areas, I’ve learned a few important lessons for conducting needs assessments that can be applied in military settings. I recently worked with colleagues in the military health system to create a needs assessment to ascertain knowledge needs among healthcare providers working at traumatic brain injury clinics.

Lessons Learned: 

  • Work with clinical experts in the military system to operationalize needs and develop metrics accordingly. I have spent most of my career developing evaluation resources in partnership with content subject matter experts. It would be difficult to develop a logic model for a diabetes or traumatic brain injury program without input from providers working with patients in these areas. This is why so many evaluators come from education and/or psychology: we often wear both hats as we enter an unknown context, gleaning nuggets of wisdom while facilitating and creating evaluation products. I’ve been in the position of developing metrics in areas where I have no clinical expertise. Although we always have a literature base to review, it doesn’t come close to dialogue with experts who can place the information in perspective (feasibility, utility, etc.).
  • Needs assessments should be exploratory, and stakeholders with agendas may be surprised at the results. Stakeholders, the Department of Defense included, can have agendas for steering an evaluation or needs assessment in particular directions. For example, the options created for a needs assessment may be chosen purely to push certain agendas forward (new resource development, training content) rather than out of interest in relevant (but perhaps less salient) topics. As an evaluator, it may be useful to inform and educate stakeholders on the importance of being broadly representative of the targeted knowledge needs (more akin to content validity).

Rad Resources: This article touches on these lessons learned and a few others: The Benefits and Risks of Energy Drinks in Young Adults and Military Service Members (Manchester, Eschel & Marion, 2017). The paper may be helpful in strategizing for needs assessments that use a gap-analytic approach, in military and non-military settings.
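For readers new to the gap-analytic approach mentioned above, here is a minimal sketch in Python of one common variant, a Borich-style discrepancy score that weights each topic’s importance-versus-proficiency gap by its importance. The topics and ratings are hypothetical illustrations, not data from the needs assessment described in this post.

```python
# Minimal sketch of a gap-analytic needs assessment (Borich-style
# discrepancy scores). Assumes providers rated each topic's importance
# and their own proficiency on 1-5 scales; all numbers are hypothetical.

topics = {
    "TBI screening":           {"importance": 4.6, "proficiency": 3.9},
    "Return-to-duty guidance": {"importance": 4.8, "proficiency": 3.1},
    "Sleep disturbance care":  {"importance": 4.2, "proficiency": 4.0},
}

def weighted_gap(ratings):
    # Gap = (importance - proficiency), weighted by importance so that
    # high-importance shortfalls rise to the top of the priority list.
    return (ratings["importance"] - ratings["proficiency"]) * ratings["importance"]

# Rank topics by weighted gap to prioritize training content.
for name, ratings in sorted(topics.items(), key=lambda kv: weighted_gap(kv[1]), reverse=True):
    print(f"{name}: weighted gap = {weighted_gap(ratings):.2f}")
```

Ranked this way, high-importance topics where providers feel least proficient surface first, which is one defensible way to turn stakeholder ratings into a broadly representative training agenda.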



I’m Erika Steele, a Health Professions Education Evaluation Research Fellow at the Veterans Affairs (VA) National Center for Patient Safety (NCPS). For the past few years I have been collaborating with Research Associates at the Center for Program Design and Evaluation (CDPE) at The Dartmouth Institute for Health Policy and Clinical Practice to evaluate the VA’s Chief Resident in Quality and Safety (CRQS) program. CRQS is a one-year post-residency experience for developing skills in leadership and in teaching quality improvement (QI) and patient safety (PS). Since 2014, Research Associates at CDPE have conducted annual evaluations of the CRQS program. In 2015, we began evaluating the CRQS curriculum by developing a reliable tool to assess QI/PS projects led by chief residents (CRs).

One of the joys and frustrations of being an education evaluator is designing an assessment tool, testing it, and discovering that your clients apply the tool inconsistently. This post focuses on lessons learned about norming, or calibrating, a rubric for rater consistency while pilot testing the Quality Improvement Project Evaluation Rubric (QIPER) with faculty at NCPS.

Hot Tips:

  1. Develop understanding of the goals of the assessment tool
    Sometimes raters have a hard time separating grading from assessing how well the program’s curriculum prepares learners. To help faculty at NCPS view the QIPER as a tool for program evaluation, we pointed out patterns in CRs’ scores. Once faculty started to see patterns in scores themselves, the conversations moved away from individual performance on the QIPER and back to evaluating how well the curriculum prepares CRs to lead a QI/PS project.

Once raters understood the goal of using the QIPER, instances of leniency, strictness, and first-impression errors were reduced and rater agreement improved.

  2. Create an environment of respect
    All raters need the opportunity to share their ideas with others for score negotiation and consensus building to occur. We used the Round Robin Sharing (RRS) technique to allow faculty to discuss their expectations, their rationale for scoring, and ways to make reaching consensus easier. We used the graphic organizer in Figure 1 to guide discussions.

RRS helped faculty develop common goals related to program expectations for leading QI/PS projects, which led to increased rater agreement when scoring projects.

Figure 1: Round Robin Sharing Conversation Guidance

  3. Build strong consensus
    Clear instructions are important for ensuring that raters apply assessment tools consistently. Using the ideas generated during RRS, we engaged the faculty in building a document that operationalizes the items on the QIPER and offers guidance in applying the rating scale. The guidance document served as a reference for faculty when rating presentations.

Having a reminder of the agreed-upon standards helped raters apply the QIPER more consistently when scoring presentations.
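As a supplement to these tips, here is a minimal sketch in Python of one way to quantify whether norming is paying off: compute percent exact agreement and a chance-corrected statistic (quadratic-weighted Cohen’s kappa) after each session. The scores below are hypothetical, not actual QIPER ratings, and scikit-learn is assumed to be available.

```python
# Minimal sketch: tracking rater agreement during rubric norming.
# Assumes two raters scored the same projects on a 0-4 rubric scale;
# the scores below are hypothetical, not actual QIPER data.
from sklearn.metrics import cohen_kappa_score

rater_a = [3, 2, 4, 1, 3, 2, 0, 4]
rater_b = [3, 2, 3, 1, 3, 1, 0, 4]

# Percent exact agreement: how often the raters gave identical scores.
exact = sum(a == b for a, b in zip(rater_a, rater_b)) / len(rater_a)

# Quadratic-weighted kappa credits near-misses and corrects for chance
# agreement; values above roughly 0.6 are often read as substantial.
kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")

print(f"Exact agreement: {exact:.2f}, weighted kappa: {kappa:.2f}")
```

Recomputing these statistics after each norming session gives a quick, shareable read on whether the discussions and guidance document are actually improving consistency.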

Rad Resources:

  1. Strategies and Tools for Group Processing.
  2. Trace J, Meier V, Janssen G. “I can see that”: Developing shared rubric category interpretations through score negotiation. Assessing Writing. 2016;30:32-43.
  3. Quick Guide to Norming on Student Work for Program Level Assessment.


Hello, I am Stephen Axelrad, the chair of the Military and Veteran Evaluation (MVE) Topical Interest Group. One of the reasons I wanted to start this TIG was to help evaluators navigate the complex web of stakeholders in the military community. Many evaluators have little to no experience with the military and its formal structure. However, the military has a long history of valuing systematic evidence to inform decisions about policies and programs. Many uniformed leaders turn to civilian sources to understand innovative, evidence-based methods for addressing national security issues as well as social and organizational problems that affect the military (suicide, sexual harassment, sexual assault, financial literacy, domestic violence, opioid abuse).

Hot Tips:

Civilian evaluators are not expected to know everything about the military to make effective connections. Evaluators just need to apply to military stakeholders the same culturally responsive methods they apply to other subcultures. Here are some tips that can set culturally responsive evaluators up for success.

  • The military is not monolithic: The popular press often refers to the military as the “Pentagon” and makes it seem as if there were only one military perspective; the reality is far more varied. The military community is composed of communities that vary by Service branch (Army, Navy, Air Force, Marines, Coast Guard), component (Active Duty, Reserve, National Guard), rank (commissioned officer, non-commissioned officer, enlisted), career field, and other factors.
  • Not all members of the uniformed military are soldiers: Another common mistake in the popular press is to refer to all uniformed military members as soldiers, but that term only applies to the Army. The other terms are sailors (Navy), airmen (Air Force), Marines (Marine Corps), and guardsmen (National Guard, Coast Guard). These terms are central to their identities, so getting the term right will help you build rapport with the uniformed military.
  • Military installations are like mini-cities: The installation commander is like the mayor, and there are usually one or two commands that act like the major employer. Installations attract workforces with specific skill sets and interests that give each installation a unique culture.
  • Leaders are change agents: One of the few consistent qualities across the military system is the value placed on leadership. Leadership is frequently defined through rank and other formal authority; however, the military sees leaders at all ranks and leverages peer leaders to create positive social change.

Rad Resources:

The following websites were developed to help civilian professionals understand military structure.

Lesson Learned: The best opportunity for evaluators to support data-driven decision making comes within the first 90 days of a senior military leader’s taking command. During this period, leaders are in learning mode, want data relevant to the command, and want to understand ways of improving their commands.



Collaborative evaluation principles have been used to bolster projects and gain representative stakeholder input. I’m Julianne Rush-Manchester of the Military and Veterans TIG, an implementation science and evaluation professional working in the Department of Defense. I’ve learned some tips for facilitating stakeholder input in clinical settings that may be more hierarchical (rather than collaborative) in nature. These tips can be applied in military and non-military settings.

Lessons Learned: 

  • Push for early involvement of stakeholders, with targeted discussions, to execute projects successfully (according to plan). Adjustments to the implementation and evaluation plan are to be expected; however, they should be modest rather than substantive if stakeholders have provided input on timing, metrics, access to data, program dosage, recruitment challenges, and so forth. This is particularly true in military settings, where bureaucratic structures dictate logistics and access.
  • Plan for unintended effects, along with intended ones, when a program moves into a new context. A replicated program may look slightly different as it accommodates the nuances of the host organization (military member participants, contractors, mandatory vs. voluntary programs, program support from senior leadership). Actual outcomes may be variations of the intended ones as the program adjusts to its host setting.

Rad Resources:

This article discusses the use of collaborative evaluation principles when systems change is anticipated as a result of implementation (Manchester et al., 2014). The paper may be helpful in strategizing for collaborative evaluations of evidence-based practices in clinical and non-clinical settings, military or otherwise.



I am Annette L. Gardner, a faculty member at the University of California, San Francisco. Developing information-rich case studies can be one of the most rewarding evaluation methods. Not only do they speak to stakeholders on a deep level, but, as described below, they can create a legacy that endures and has the potential to reach a broad base of stakeholders.

In 2012, the Veterans Administration Office of Academic Affiliations launched the Centers of Excellence in Primary Care Education (CoEPCE) to seek and implement improvement strategies for interprofessional, patient-centered clinical education and methods to prepare health professions team leaders. A mixed-methods study was conducted to assess implementation, trainee outcomes, and new approaches to team-based interprofessional care.

I worked closely with the CoEPCE Coordinating Center and the five Centers to develop The Centers of Excellence in Primary Care Education Compendium of Five Case Studies: Lessons for Interprofessional Teamwork in Education and Clinical Learning Environments 2011-2016. Each case describes the contextual and developmental issues behind one of five unique examples of integrated interprofessional curricula that support the clinical education workplace. Peer-reviewed by the National Center for Interprofessional Practice and Education, the compendium provides tools and resources to help prepare professionals for interprofessional collaborative practice. The cases include:

  • Boise VA Medical Center and the CoEPCE’s “Interprofessional Case Conferences for High Risk/High Need Patients: The PACT ICU Model”
  • Louis Stokes Cleveland VA Medical Center and the CoEPCE’s “Dyad Model”
  • San Francisco VA Health Care System and the CoEPCE’s “Huddling for Higher Performing Teams”
  • VA Puget Sound Health Care System Seattle Division CoEPCE “Panel Management Model”
  • Connecticut VA Health Care System West Haven Campus CoEPCE “Initiative to Minimize Pharmaceutical Risk in Older Veterans (IMPROVE) Polypharmacy Model”

Hot Tips:

So what makes these cases different from other case studies? For starters, they were developed in an environment that values experimental designs and has the sample sizes to support them, so sensitivity to stakeholder perceptions of ‘evidence’ was critical. A contributing factor to the positive reception of the cases may have been the sharing of the case study initiatives across sites and with VA leadership prior to the development of the compendium; their preparation represents a partnership effort with high Center involvement. Second, there was a strong desire to support adoption in other training settings: the VA took dissemination very seriously and launched an aggressive campaign to distribute the compendium through multiple platforms, including the VA website, the Government Printing Office, LinkedIn, and the Institute for Healthcare Improvement Playbook. Third, VA staff are monitoring the uptake and use of the cases, a rare occurrence in evaluation design, and are soliciting input on impact using an online questionnaire.

Lessons Learned:

Partnerships and a creative approach to dissemination have the potential to keep evaluation findings from being consigned to the ‘dustbin of history’ and facilitate learning beyond the immediate program stakeholders.

Rad Resource:

VA CoEPCE Case Studies Quality Improvement Questionnaire


 


Hello, I’m Stephen Axelrad. As the chair of the Military and Veteran Evaluation (MVE) Topical Interest Group, one of my reasons for starting this TIG was to broaden the military’s understanding of what program evaluation is and to broaden program evaluators’ understanding of what the military community is. While there are pockets of the military that conduct evaluations from a health, behavioral, and social science perspective, the majority of program evaluation in the military is conducted from a resource management perspective. In the Department of Defense and the military service branches, program evaluation is embedded in what is known as the planning, programming, budgeting, and execution (PPBE) process. Although different terms are used, the goals of the PPBE process are very similar to what we would recognize in most evaluations:

  1. Describe and understand the key assumptions, inputs, activities, outputs, and outcomes associated with a program, strategy, or initiative.
  2. Identify areas of effectiveness and ineffectiveness, gaps, and innovations in current programs, strategies, initiatives, and systems.
  3. Provide rigorous evidence to support quality decision making by leaders.
  4. Use data to improve program planning and execution.

Hot Tip: Many program evaluators in the military, especially those outside of the military healthcare and community support agencies, have financial management, operations research, and systems engineering backgrounds. These fields understand and use some of the same methodological (e.g., surveys, interviews) and analytic (e.g., regression, structural modeling) techniques that evaluators within AEA frequently use. Finding common ground on methodology can help avoid misunderstandings rooted in different disciplinary traditions.

Rad Resources: Before engaging with a project or initiative that involves DoD or military service branch officials, review the following websites to become more familiar with how they may approach evaluation.

Lesson Learned: Evaluators do not have to be in the military in order to understand how to design and execute evaluations affecting the military community. Evaluators who demonstrate an understanding of the military decision-making process (MDMP) and can structure their evaluations as closely as possible to MDMP can maximize buy-in from uniformed and civilian leaders and the use of their evaluation results.



My name is Stanley Capela. I am the VP for Quality Management for HeartShare Human Services of New York. In addition, I am a military peer reviewer for a national accreditation organization, where I have had the opportunity to participate in reviews of military family readiness programs as well as after-school programs on military bases throughout the United States and on bases in Germany, Japan, and Guam. The purpose of this aea365 post is to share some tips I have learned from these reviews on how best to evaluate military programs.

As a military reviewer I had to undergo training to make sure I had a good understanding of how to approach these reviews. My comments are based on my experiences as well as on what I learned in the training programs that prepared me for these reviews.

Lesson Learned: First, hierarchy is very important in the military. One key is to be sensitive to this respect for hierarchy and anticipate how you may address it. For example, when conducting entrance and exit conferences, you may see others stand as a higher-ranking officer enters the room.

Second, the military has a language that is part of its culture. Therefore, it is important to understand the heavy use of acronyms.

Third, be aware of differences in terminology: customers versus clients; staff instead of personnel; service member instead of soldier, sailor, or Marine; in-briefing instead of entrance meeting; customer file review instead of consumer record review.

Fourth, when conducting interviews, it is good to start by letting interviewees know who you are and why you are there. In my case, when doing a review, we let them know about the accrediting organization and why they are being interviewed.

Fifth, make sure you stick to the schedule: let people know there may be follow-ups, avoid casual conversations, and keep interviews of people outside the original schedule to a minimum.

Sixth, when interviewing customers who are veterans, it is important to use the following guidelines: be understanding; be an active listener; listen with empathy, but minimize sympathy; and provide background and context to your interview questions.

These are some tips that I have learned over time.

Rad Resource: If you want to learn more, I suggest visiting the Council on Accreditation website (www.coanet.org) to learn about the standards as well as opportunities to become a military reviewer. It has been a very valuable experience, and the knowledge I gained has enhanced my evaluation skills.



I’m Pat Clifford, a consultant working with the Tristate Veterans Community Alliance and Program Chair of the Military and Veteran Evaluation (MVE) TIG. A prominent issue for our collaborative and many other veteran support organizations is the successful transition from military to civilian life. When approaching this work, it’s important for evaluators to recognize some of the primary contextual drivers:

  • Military transition is a life transition: Like other transitional periods, it comes with unique risk and protective factors.
  • Some stakeholders and factors are common: For example, there are shared national stakeholders like the Department of Veterans Affairs (VA) and the Department of Defense (DoD), as well as a strong military culture. National changes like drawdowns and restructuring impact everyone across the board.
  • However, transitions play out in local contexts: Each region has different demographics, organizational players, and linkages to bases and installations.
  • And the “Sea of Goodwill” can intensify problems: Many organizations and individuals are quick to start providing services, thinking they are the only ones out there. This means that resources are often not targeted effectively or coordinated.

All of these factors contribute to a complex environment for evaluators. Well-meaning programs and models (especially those implemented top-down) often run into roadblocks in practice. To counteract this, there are things evaluators can do to engage with their local context and drive relevance.

Hot Tips:

  • Prioritize relationships: Evaluations often treat programs as if they operated in a social vacuum. The reality is that program outcomes hinge on credibility with the target population, and that means relationships. Intentionally explore how programs build trust and display credibility. In my experience, word-of-mouth is key in the veteran space, and cultural missteps can lead to long-term consequences.
  • Help programs recognize and agree on roles: No organization can do everything well. Too often programs want to be the “one-stop shop” that solves all of a veteran’s or military family’s needs. Evaluators can work with stakeholders to think critically about their scope, identify their roles, and set up processes to ensure they work in partnership, not in isolation.
  • Encourage leadership by empowering: Work with programs to explore how they create space for indigenous leadership. While national models and top-down initiatives can help bring solutions “to scale,” they often disempower local leaders and veteran stakeholders. Evaluators can become a voice for the local knowledge and expertise that can inform larger initiatives.

Rad Resources:

Tristate Veterans Community Alliance

VA Office of Policy and Planning

Ending Veteran Homelessness by Addressing Failed Transition Policies



I’m Nathan Graeser with the Center for Innovation and Research on Veterans & Military Families (CIR) at the University of Southern California. Over the last three years, we have led a collective impact movement dedicated to addressing the reintegration challenges faced by our veterans, service members, and their families. As the director/facilitator of the Los Angeles Veterans Collaborative, it has become increasingly clear that despite massive efforts to collect data on the needs of veterans (see the LA/OC Veterans Study), our strategies were still making barely a dent in the outcomes we hoped for. Our efforts, while thoughtful and robust, failed to match the scale and magnitude of the problem. Much more needed to be done to support creative outreach, connect veterans back into their communities, provide employment support, and improve access to physical and behavioral health care. Despite the tremendous momentum and collective work being done, the list of needs and systemic failures was still daunting.

We needed a process for trying out new, data-informed strategies so we could learn from and adapt our current efforts. Using a developmental evaluation process, we encouraged our community to try out new ideas in the hope of learning new ways to address these challenges. By trying out new ideas, we might learn as a community and change the way we deliver services and programs, which, just maybe, might change the way veterans experience transition in Los Angeles.

Lesson Learned: To encourage these new ideas in our collective impact movement, in October we launched a Community Initiative Grant offering pilot funding as a way to do this in real time. Members of the Los Angeles Veterans Collaborative now have access to seed money to try out new ideas, with the specific purpose of sharing failures, successes, and lessons learned with the larger community. Our hope is that this will encourage members to try out ideas and to work toward the intended outcome of growing our community’s capacity, not necessarily only their own. The first seven pilots have been submitted, and it has already been amazing to watch them, and the community behind them, unfold. We look forward to seeing what happens next, hoping to adopt strategies that will allow our evaluation to be fast enough to meet the growing needs of the thousands of veterans and their families transitioning out of the military.


