AEA365 | A Tip-a-Day by and for Evaluators


Hi! I'm David McCarthy, a 4th-year medical student at the University of Massachusetts Medical School. I had the opportunity to get involved in the Prevention and Wellness Trust Fund (PWTF), a project run by the Massachusetts Department of Public Health that works to combat treatable chronic medical conditions by integrating clinical and community interventions. I chose to focus on the pediatric asthma intervention of the City of Worcester's PWTF, which used a series of Community Health Worker (CHW) home visits. As part of this project's evaluation, I interviewed CHWs and Care Coordinators about their experiences providing home visits for pediatric asthma patients and their families. In this post, I summarize some tips and lessons I learned that could help refine a community-based care model and serve as benchmarks for future care model evaluations.

Hot Tip: Let those with the contacts help with the networking

Initially, getting patients referred for enrollment in the intervention was difficult due to a lack of provider education about the program. The solution had two components. First, increasing the frequency of Worcester PWTF asthma workgroup meetings improved coordination between the different groups involved and overall program engagement. Second, provider champions at each site reached out directly to other providers caring for patients within the focus population, which expanded the project's reach. Eventually, referral numbers improved, with referrals coming in from nearly all care team members.

Hot Tip: Think outside of office hours when coordinating visits with families

We needed to be flexible in scheduling home visits outside of typical business hours, including weekends, to accommodate families' schedules. CHWs also needed to be available to patients by cell phone for calls and text messages. This scheduling flexibility and availability helped build trust with families and improved retention of patients in the program.

Hot Tip: Consider care providers' safety

As with any intervention that requires home visits or meeting parents/families in their own space, it’s always good to remember that the safety of study team members is paramount when going to unfamiliar sites. As part of this project, we provided personal safety training for CHWs who were entering patient homes. Where possible, a team of 2 CHWs conducted each home visit and CHWs confirmed dates and times with families before each visit.

Lesson Learned: Account for the varied needs of patients and families

CHWs provided a standardized set of asthma management supplies to families at each visit, including medication pill boxes, trash cans, mattress and pillow covers, and vacuums. This was designed to incentivize families' engagement and compliance with their asthma management plans. However, the supplies didn't always match individual families' needs. Future intervention efforts should tailor supply sets to each family based on their existing home environment.

Overall, our evaluation identified that an integrated clinical program addressing social determinants of health through CHWs represents an innovative healthcare delivery model and is feasible to implement.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I am Kate Cartwright, a 2016 AEA Minority Serving Institution Fellow and an Assistant Professor of Health Administration in the School of Public Administration at the University of New Mexico in Albuquerque. I study racial and ethnic health equity in regard to healthcare access, quality, and outcomes.

As an evaluator who values health equity, I am greatly concerned by how little funding and research prioritizes the health of underrepresented and underserved populations. Researchers and evaluators alike can follow the field's best practices; however, too often those "best" practices reify inequities, including through practices that leave out underrepresented groups.

A provocative essay published in The Atlantic in the summer of 2016 investigates why health studies are frequently so white when our population is so diverse. The article offers several theories, but repeatedly reveals that best practices in research fail to hold researchers accountable for non-inclusive sampling strategies. A recent PLoS Medicine article notes that even though the 1993 National Institutes of Health (NIH) Revitalization Act mandates that federally funded clinical research prioritize the inclusion of women and minorities, the act has not yielded parity in clinical study inclusion (for example, less than 2% of National Cancer Institute funded cancer trials from 1993 to 2013 met the NIH inclusion criteria).

Lesson Learned: Design Inclusive Sampling Strategies

Evaluators must design evaluations which have inclusive sampling strategies if they hope to improve the efficacy, effectiveness, and equity of evaluations.
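
As a concrete illustration (not from the original post), here is a minimal Python sketch of one inclusive sampling tactic: proportional stratified sampling with a minimum-count floor, so small underrepresented groups are not lost to chance. All names and numbers below are hypothetical.

```python
import random
from collections import defaultdict

def stratified_sample(population, strata_key, total_n, floor=30):
    """Proportional stratified sample that guarantees each stratum
    at least `floor` cases so small groups remain analyzable."""
    strata = defaultdict(list)
    for person in population:
        strata[strata_key(person)].append(person)

    sample = []
    for group, members in strata.items():
        share = len(members) / len(population)
        n = max(floor, round(total_n * share))  # floor protects small strata
        n = min(n, len(members))                # can't draw more than exist
        sample.extend(random.sample(members, n))
    return sample

# Hypothetical usage with self-reported race/ethnicity as the stratum:
# people = [{"id": 1, "race_ethnicity": "Hispanic"}, ...]
# sample = stratified_sample(people, lambda p: p["race_ethnicity"], total_n=400)
```

Note that oversampling small strata means the final analysis should apply sampling weights to recover population-level estimates.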

Hot Tip: Always Include the Community as a Stakeholder

In one workshop on culturally responsive evaluation I attended at Evaluation 2016, some participants lamented that they would like to be more inclusive of community members when evaluating community health programs, but that they had to respond to the priorities of their stakeholders first. Thankfully, we were in a session with a great leader who gently, but firmly, challenged them (and all of us) to remember that community members must be counted as primary stakeholders in all evaluations.

Rad Resources: The Atlantic essay and the PLoS Medicine article cited above are good starting points for thinking through inclusive sampling.

The American Evaluation Association is celebrating AEA Minority Serving Institution (MSI) Fellowship Experience week. The contributions all this week to aea365 come from AEA's MSI Fellows. For more information on the MSI fellowship, see this webpage: http://www.eval.org/p/cm/ld/fid=230

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

We are Valerie Hutcherson and Rebekah Hudgins, Research and Evaluation Consultants with the Georgia Family Connection Partnership (GaFCP) (gafcp.org). Started with 15 communities in 1991, Family Connection is the only statewide network of its kind in the nation, with collaboratives in all 159 counties dedicated to the health and well-being of families and communities. Through local collaboratives, partners come together to identify critical issues facing the community and to develop and implement strategies to improve outcomes for children and families. GaFCP strongly believes that collaboration and collective effort yield collective impact. Evaluation has always been a significant part of Family Connection, though capacity differs greatly across local collaboratives.

In 2013, GaFCP invited 6 counties to participate in a cohort focused on early childhood health and education (EC-HEED) using the Developmental Evaluation (DE) framework developed by Michael Quinn Patton (Patton, 2011. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use). Each county was identified by GaFCP based on need and interest in developing an EC-HEED strategy and had the autonomy to identify collaborative partners, programs, and activities to create a strategy tailored to the county's needs and resources. As evaluators, we recognized that the collaboratives and their strategy formation existed in a complex system with multiple partners and no single model to follow. The DE approach was the best fit for capturing data on the complexity of the collaborative process of developing and implementing these strategies. DE allows for and encourages innovation, which is a cornerstone of the Family Connection collaborative model. Further, this cohort work gave us, as evaluation consultants, a unique opportunity to implement an evaluation system that treated understanding this complexity and innovation as being as important as collecting child and family outcome data. With DE, the evaluator's primary functions are to elucidate the innovation and adaptation processes, track their implications and results, and facilitate ongoing, real-time, data-based decision-making. Using this approach, we were able to engage in and document the decision-making process, the complexity of the relationships among partners, and how those interactions impact the work.

Lessons Learned: Just a few of the lessons we’ve learned are:

  1. Participants using a DE approach may not recognize real-time feedback and evaluation support as “evaluation”. Efforts must be taken throughout the project to clarify the role of evaluation as an integral part of the work.
  2. Successful DE evaluation in a collaborative setting requires attention to the needs of individual partners and organizations.
  3. The DE evaluator is part anthropologist and thus must be comfortable in the emic-etic (insider-outsider) role: a member of the team who is also charged with elucidating the team's practice and work.

We’re looking forward to October and the Evaluation 2016 annual conference all this week with our colleagues in the Local Arrangements Working Group (LAWG). Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.

Hi all, I’m Mariel Harding, a Program Coordinator with Prevention Institute (PI). PI is a national non-profit dedicated to improving community health and well-being by building momentum for effective primary prevention. PI has 18 years’ experience advancing policy and supporting communities in improving environments for health and health equity.

When gathering data on health, we often measure only individual conditions and fail to count the community-level elements that shape health. Yet we know that access to quality education and housing, clean air, safe places to play, strong social networks, and more are essential for health and well-being. For example, when identifying key metrics for diabetes, it's not enough to measure blood sugar levels; we also need to look at walkability, the existence of safe parks and open spaces nearby, and food access.

Prevention Institute’s (PI) recent report, Measuring What Works to Achieve Health Equity: Metrics for the Determinants of Health, represents a paradigm shift in thinking about helping communities count what matters when it comes to health, safety and equity. The report lays out the determinants of health – including structural drivers, community determinants, and healthcare – that must be improved to achieve health equity. It also describes the methods and criteria we applied to identify health equity metrics.

We identify 35 recommended metrics for the determinants of health that could track progress toward achieving health equity. However, not all of the metrics presented in the report currently exist, because many important determinants of health equity are not regularly measured, or, where they are, the data aren't compiled in meaningful ways. Where metrics didn't exist, we suggest new metrics to fill the gap.

Lessons Learned

  • What we count reflects what we think matters. If health equity is important, we must note it, count it, measure it, and track it.
  • Good metrics foster an understanding of the problem and the solution. For example, measuring neighborhood access to healthy food vendors may prompt efforts to facilitate healthy eating that address the underlying causes of illness, going beyond education campaigns. Such efforts may include recruiting vendors, zoning changes, and/or community gardens.
  • Metrics should gain the attention of the public: they should be designed not only as measurement tools but also as communication tools that help inform the public about health inequity and what will reduce it. Composite measures, which combine multiple indicators, do this well because they express the complexity of the environments that produce health inequity (see the sketch below for one way a composite might be computed).
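
To make the composite idea concrete, here is a minimal, hypothetical sketch (the indicator names, values, and equal weights are invented for illustration and are not drawn from the PI report): each indicator is first normalized to a 0-1 scale, then combined into a single weighted score.

```python
def min_max_normalize(value, worst, best):
    """Rescale a raw indicator so 0 = worst observed and 1 = best observed."""
    return (value - worst) / (best - worst)

def composite_score(indicators, weights=None):
    """Combine 0-1 normalized indicators into one composite score
    (equal weights unless specified)."""
    if weights is None:
        weights = {name: 1.0 for name in indicators}
    total = sum(weights[name] for name in indicators)
    return sum(indicators[name] * weights[name] for name in indicators) / total

# Hypothetical neighborhood indicators, already normalized to 0-1:
neighborhood = {
    "walkability": 0.62,          # e.g., from a walkability index
    "healthy_food_access": 0.35,  # share of residents near a healthy vendor
    "park_access": 0.48,          # safe park/open-space availability
}
print(round(composite_score(neighborhood), 2))  # -> 0.48
```

A single number like this is easier to communicate than three separate indicators, while the components preserve the detail needed to act.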

Metrics can help clarify and measure the sources of inequity and foster understanding of solutions and actions that can lower the cost of healthcare, keep all people healthy, and ensure equal opportunities to thrive.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello! I'm Julie Goldman, a librarian at the University of Massachusetts Medical School's Lamar Soutter Library. I want to introduce you to a number of high-quality public health resources, most of them freely available, all of which can help you as evaluators understand and support your communities, your research, and your work. Public health information comes in many forms, from education materials, research articles, white papers, and policies to raw data on lifestyle characteristics, disease prevalence, and healthcare utilization – basically, all scientific output that applies to day-to-day life.

Here are a few takeaways when looking for public health information:

Lessons Learned:

  • Public health is multidisciplinary; it is not just doctors and nurses! Other contributors include:
    • Health educators
    • Community planners and policy makers (e.g., local, regional or state health boards)
    • Scientists and researchers
    • Biostatisticians
    • Occupational health and safety professionals
  • Health promotion should happen everywhere – and include everyone!
    • Information should be available to all communities, domestic and international
    • Addressing health disparities should be a key focus. Examples include: healthcare access, infectious diseases, environmental hazards, violence, substance abuse, and injury.

Hot Tips: Evidence-based practices inform decisions:

  • Using the best available scientific evidence leads to informed decisions
  • Pro-active prevention can lead to measurable impact
  • Spending on prevention saves money long-term

Visit the American Public Health Association’s website for public health news, webinars and useful infographics like the one below that visually tell the public health story. Many of these (or similar ones from a variety of sources) can be used to help relay the messages of evaluation data.


Public Health Infographic, © 2016, American Public Health Association

Rad Resources: The list below provides a brief overview of several public health collections that offer freely available resources on populations, agencies, public health news, policy briefs, and much more.

  • Lamar Soutter Library Evidence-Based Public Health Portal: a collaboration of U.S. government agencies, public health organizations, and health sciences libraries highlighting news, public health topics, policies, jobs, and education.
  • The National Library of Medicine, National Institutes of Health, which offers many public health resources such as Haz-Mat, Toxnet, and IRIS.
  • A free digital archive of scientific research and literature.
  • A Robert Wood Johnson Foundation collection for exploring and evaluating projects aimed at reducing racial and ethnic health disparities.
  • The world's most comprehensive collection of population, family planning, and related reproductive health and development literature.
  • An image-based review of world demographics and statistics.

 

Libraries at both public and private institutions, as well as public libraries, can assist with evaluation and research and with access to all levels of public health information. Many librarians are highly motivated to work with a research and/or evaluation team to help navigate public health data and resources. Re-read these two blog posts from previous years to learn more about collaborating with a librarian to explore the vast array of information that can help with the development, conduct, and analysis of evaluation projects: "Library Resources and the Important Role They Play in Evaluation Work" and "Today's Librarian and Building an Evaluation Team."

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I am Elizabeth Tully, the Online Toolkit Manager at the Johns Hopkins Bloomberg School of Public Health's Center for Communication Programs (JHU·CCP). One of the toolkits I work on regularly is the Measurement, Learning & Evaluation (MLE) Project's Measuring Success Toolkit. This toolkit provides guidance on how to use data to plan a health program and to measure its success through monitoring and evaluation (M&E). Using data to design and implement health programs leads to more successful and impactful programs. Data can be used to solve health programming problems, inform new program design, assess program effectiveness and efficiency, and suggest evidence-based adaptations and improvements.

But this post is especially about the importance of designing useful resources for M&E practitioners – and the Measuring Success Toolkit is Rad!

Hot Tip #1: Using the Toolkit. The Toolkit is meant to be used! It offers full-text documents and usable tools, either as uploaded files that users can download or as hyperlinks to other websites' rad resources. It is organized both by steps in an M&E plan and by health topic. A handy tutorial video is also available to help new users navigate the Measuring Success Toolkit.

Hot Tip #2: Curated Content Focuses on Use. The Measuring Success Toolkit team updates the toolkit with useful resources every quarter! This means that the content is curated by M&E experts and includes guides, checklists, protocols, indicators, and other tools that can be used by M&E practitioners in the field. While you won't find peer-reviewed journal articles or lengthy end-of-project reports in this toolkit, you will find the tools and resources to help you plan for, monitor, and evaluate a program that would be worthy of an esteemed journal. You'll certainly be prepared to document your project's success!

Rad Resources: Within the toolkit you’ll find plenty of great tools – 160 and counting! Here’s a quick list of our most popular (and most downloaded) resources on the site:

Sample Outline of an M&E Plan

Qualitative Research Methods: A Data Collector’s Field Guide

A Guide for Developing a Logical Framework

Please send suggestions for resources to include to contactus@urbanreproductivehealth.org. Our next update will be in mid-October!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hi. We’re Sherry Campanelli, Program Compliance Manager and Laura Newhall, Clinical Training Coordinator, from the University of Massachusetts Medical School’s Disability Evaluation Services. As an organization, we are committed to ongoing quality improvement efforts to enhance our services for people with disabilities. Earlier, we wrote about common techniques that help a quality improvement (QI) team to be successful. Today we share some potholes and pitfalls we’ve encountered in group facilitation and our tips for negotiating them successfully:

Lessons Learned:

  • New problems or issues frequently arise in the middle of a QI project. Team members, management, or external events (such as changes in the industry) can generate issues unrelated to the original charge. This can be discouraging for the team members and leader and can delay completion of the project. The following may be helpful.
    • Reaffirm the team's goals and mission, and review data as a group to ascertain whether the new issue should be addressed in this venue or in another way.
    • Allow team members to opt out of participating in the new task. Seek new members for the team as needed to address the new issue(s).
    • Keep a “hot” list of issues that arise to be addressed by future QI teams.
  • Recommendations from the team are not fully accepted. A less-than-enthusiastic response from decision-makers to a team's recommendations is a challenge for any team.
    • Set expectations with the group up front that recommendations might be accepted, rejected or amended.
    • Sustain the group’s enthusiasm during the revision process by reminding them of the importance of their work and input regardless of the outcome.
    • Emphasize the positive feedback before sharing constructive feedback. Thank team members for their efforts.
    • Ensure that relevant decision-makers are regularly briefed so the team can make “mid-course corrections” toward options likely to be approved.
  • Difficulty achieving full team consensus. This can be due to dominating or defensive team member(s), incomplete information or team members needing more time for analysis.
    • Encourage subgroup and individual work on the issue between meetings.
    • Allow the team to live with ambiguity for a while to enable consensus to develop.
    • Document what’s already been decided and refer team members back to prior discussions.

Thoughts to Ponder:

"The best-laid plans of mice and men / Often go awry" – from Robert Burns's poem "To a Mouse." The QI team process does not always go smoothly; however, these unexpected challenges present opportunities for better overall outcomes.

As a 1939 British government motivational poster advised, the facilitator must "keep calm and carry on" through the potholes and pitfalls of the QI team process.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi. We're Sherry Campanelli, Program Compliance Manager, and Laura Newhall, Clinical Training Coordinator, from the University of Massachusetts Medical School's Disability Evaluation Services (DES). Although DES conducts evaluations of whether an applicant for public benefits can be found disabled, evaluation as a research endeavor is not our primary focus. Nevertheless, as an organization, we are committed to ongoing quality improvement efforts to enhance our services for people with disabilities. We use a team-based, iterative approach to define and address problem functions and processes.

For example, we used the process described here to develop quality assurance systems for our clinical, clerical, and technical support processes. We have also used this method to tackle caseload backlogs and to improve the processing of incomplete applications.

We’ve discovered over time, regardless of the issue or problem involved, that there are common techniques that help a quality improvement (QI) team be successful. We would like to share some of these lessons learned with you.

Lessons Learned:

  • Determine and clearly state the issues to be solved and team goals.
  • Involve key staff (line staff doing the work and managers supervising the work) in the development of any QI initiative. They are in “the know” about areas that may be problematic.
  • Incorporate non-judgmental facilitation to keep up the momentum. Key components include:
    • Involving all participants in decision making/discussion;
    • Keeping meeting minutes and agendas;
    • Keeping track of and sharing "to do" lists, "next steps," and progress toward goals;
    • Meeting on a regular and ongoing basis (don't cancel meetings unless absolutely necessary);
    • Seeking management decisions and input as needed; and
    • Making sure you hear from the quiet folks in the room – they may need a little encouragement to speak up, but they often offer great insights.

  • Utilize team members/subcommittees to perform specific tasks between meetings.
  • Utilize available qualitative and quantitative data.
  • Collect specific data, as necessary, to help define the problem and suggest solutions.
  • Do fact finding to support decision-making.
  • Maintain "living" working documents that capture decisions as they are made, to be incorporated into a final product.

  • Utilize pilot testing to determine feasibility and make changes (i.e., "fix bugs") prior to full implementation.

  • Provide periodic communication to the rest of the department or organization during the project and at its conclusion.
  • Train all impacted staff on process improvements.
  • Conduct periodic assessments after implementation to assess success of the project.
  • Refine processes as new issues and changes occur.

Hot Tips:

  • Sometimes QI processes take longer than expected. "Keep going even when the going is slow and uncertain." – G.G. Renee Hill
  • "To discover new ways of doing something – look at a process as though you were seeing it either for the first or last time." – Mitchel Martin

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

Hello, we are Linda Cabral and Jillian Richard-Daniels from the Center for Health Policy and Research at the University of Massachusetts Medical School. Collecting and reporting data on a set of predefined measures is something many evaluators are asked to do. This typically quantitative process involves gathering data, often from multiple sources and stakeholders, to assess the amount, cost, or result of a particular activity. But have you ever thought about what goes into measure development, collection, and reporting? A recent evaluation we completed included interviews with the people involved in this process.

A federal grant to test a set of child health quality measures was awarded to a group of stakeholders in Massachusetts. An example of such a quality measure is the percentage of infants who reached age 15 months during the measurement year and who had six or more well-infant visits during their first 15 months of life. Given that this was the first time this particular set of measures was being collected, there was interest in learning about the feasibility, relevance, and usefulness of the measures. Qualitative data collection, in the form of interviews and focus groups, was conducted with a variety of stakeholders, ranging from the people who defined and calculated the measures to the providers who were seeing measure results specific to their site.
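
For readers curious about what calculating such a measure involves, here is a minimal sketch under simplifying assumptions (the data layout is invented, and 15 months is approximated as 456 days; real specifications add continuous-enrollment and claims-coding rules):

```python
from datetime import date, timedelta

FIFTEEN_MONTHS = timedelta(days=456)  # rough approximation of 15 calendar months

def pct_six_plus_well_visits(infants, measurement_year):
    """Percentage of infants turning 15 months old during the measurement
    year who had 6+ well-infant visits in their first 15 months of life.
    `infants` maps an id to (date_of_birth, list_of_well_visit_dates)."""
    eligible = met = 0
    for dob, visits in infants.values():
        turns_15m = dob + FIFTEEN_MONTHS
        if turns_15m.year != measurement_year:
            continue  # not in this year's denominator
        eligible += 1
        if sum(1 for v in visits if dob <= v <= turns_15m) >= 6:
            met += 1
    return 100 * met / eligible if eligible else None

# Hypothetical usage:
# rate = pct_six_plus_well_visits(
#     {"a1": (date(2015, 3, 2), [date(2015, 4, 1), date(2015, 6, 1)])}, 2016)
```

Even this toy version surfaces the feasibility questions our interviewees faced: defining the denominator, linking birth and visit records, and handling edge cases.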

Lessons Learned:

  • Do your homework ahead of time – When talking to people involved in the nitty-gritty of data measurement and calculation, you need a solid understanding of the technical aspects involved so that you don't spend the entire interview asking background questions. Be comfortable with the subject matter.
  • Be flexible – The measure development process takes time, and unanticipated challenges or circumstances can delay any component of the project. If interviews are planned for the end of a particular component, be flexible with the timing.
  • Orient interviewees, if necessary – Not all stakeholders, particularly consumers, will have a strong understanding of what a measure is. To get the desired feedback, you may need to spend time providing some background and education before you start asking questions.

Hot Tips:

  • In a project that takes place over a number of years with several different components, try to complete interviews when the information will be most current for both the interviewee and you.
  • Have a tracking mechanism set up to help you stay organized with your data collection. For us, this takes the form of an Excel spreadsheet containing fields such as interviewee contact information, dates of contact and data collection, and staff responsible for transcription and quality assurance (a lightweight code-based alternative is sketched below).
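
For teams that prefer code to spreadsheets, a tracker like this could be kept in a plain CSV file; here is a minimal sketch (the column names mirror the fields we describe, but the function and file name are illustrative):

```python
import csv
import os

FIELDS = ["interviewee", "contact_info", "date_contacted",
          "date_interviewed", "transcription_staff", "qa_staff", "status"]

def log_interview(path, **row):
    """Append one interview record, writing a header row if the file is new."""
    new_file = not os.path.exists(path) or os.path.getsize(path) == 0
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(row)  # missing fields are left blank

# Hypothetical usage:
# log_interview("tracker.csv", interviewee="J. Doe",
#               contact_info="jdoe@example.org", date_contacted="2016-09-01",
#               date_interviewed="2016-09-15", transcription_staff="LC",
#               qa_staff="JRD", status="complete")
```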

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m Lisle Hites, Director of the Evaluation and Assessment Unit (EAU) at the University of Alabama at Birmingham (UAB). I’m writing to share my team’s experiences in conducting needs assessments.

We frequently have opportunities to work with our colleagues on campus to conduct needs assessments for grant-funded projects. One such example was a training grant through the School of Nursing, and we provide it to highlight the value of gathering more than one perspective in assessing needs.

In 2012, CDC data revealed that the South is the epicenter of new HIV infections: 46% of all new infections occurred in the region, and women (24%) and African-Americans (58%) made up a higher share of new infections there than in other regions. It is therefore critically important that healthcare providers receive HIV/AIDS training so they can provide the HIV/AIDS primary care needed to meet current and future healthcare demands.

To establish workforce training capacity, we sent surveys to two key healthcare audiences: (1) potential training sites (Ryan White Grantees) and (2) future family nurse practitioners (FNPs). Responses identified both a shortage of trained HIV/AIDS healthcare providers and an interest among providers and students in establishing clinical training opportunities. Additionally, 78% of current FNP students enrolled at one research institution in the South resided within 60 miles of a Ryan White Grantee site in a tri-state region (the sketch below shows how this kind of proximity figure can be computed).
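
A proximity figure like the 78% above comes from comparing student and site locations; one way to sketch that calculation is with the haversine great-circle distance (the coordinates and the distance workflow below are illustrative assumptions, not our actual geocoding method):

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_MILES = 3958.8

def miles_between(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two points, in miles."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * EARTH_RADIUS_MILES * asin(sqrt(a))

def share_within(students, sites, miles=60):
    """Fraction of students within `miles` of at least one site."""
    near = sum(1 for s in students
               if any(miles_between(*s, *site) <= miles for site in sites))
    return near / len(students)

# Hypothetical usage with (latitude, longitude) pairs:
# students = [(33.52, -86.81), (32.37, -86.30)]
# sites = [(33.50, -86.80)]
# print(share_within(students, sites))  # -> 0.5
```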

Lessons Learned:

  • The design of this needs assessment allowed us to consider the capacity of Ryan White Grantee sites to provide clinical training opportunities for FNP students.
  • The survey captured the interest and desire of FNP students to seek the skills necessary to provide HIV/AIDS primary care.

Despite the current and future needs for a trained healthcare workforce, healthcare providers in the Deep South still encounter many of the same attitudes toward people living with HIV/AIDS as were found in the early years of the epidemic; therefore, it was necessary to identify a pool of potential candidates for training (i.e., FNP students). At the same time, little was known regarding the capacity and willingness of Ryan White Grantee sites to provide an adequate number of opportunities to meet the training needs of these students. By considering both sides of the equation, we could accurately match the number of students and training sites to ensure a high degree of satisfaction and success for both parties.

Rad Resources: 

The American Evaluation Association is celebrating Needs Assessment (NA) TIG Week with our colleagues in the Needs Assessment Topical Interest Group. The contributions all this week to aea365 come from our NA TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

