AEA365 | A Tip-a-Day by and for Evaluators

Greetings! I am Kendra Lewis, Evaluation Coordinator for the California 4-H Youth Development Program at University of California, Agriculture and Natural Resources. Today I am going to share my experience with a “data party” as a way to engage stakeholders in evaluation data. We recently held a one-day workshop with 4-H camp staff (youth and adults) to review evaluation data collected at their camps last summer. We had nearly 30 people representing 6 camps attend. We presented results from across the state as a whole as well as specific results for each camp. Evaluation data was collected from two sources: youth campers and teen camp staff. I presented data in multiple representations (graphs, tables, word clouds) and posed open-ended prompts to initiate conversation.

Lessons Learned: Participants loved having the opportunity to explore the data, discuss what they thought the results meant, and formulate action plans with their camp team. The data party made the results accessible and understandable. All camps signed up to participate in the evaluation again, and we already have our next data party planned for Fall 2017 after this summer’s camps.

Hot Tip: Start with a “gallery walk” that gives an overview of the results. We had large posters that presented data from the state results, and had small groups of people walk around to review the posters. We made sure to mix youth with adults, and to put people from different camps together to ensure diversity in camp experiences.

Hot Tip: Create a “data placemat” for each site. We made a data placemat specific to each camp that they could review with their team. We made one placemat for camper data and another for the teen data so that those different experiences could be reviewed separately.

Hot Tip: Word clouds are a great way to introduce qualitative data. Before giving attendees all the qualitative data, we presented word clouds so as not to overwhelm them. After reviewing the word clouds, each camp had the opportunity to go over all their qualitative data in full.

Rad Resource: Check out all these great ideas and pins from Kylie Hutchinson on data parties.

Rad Resource: See the Innovation Network’s slide deck on Data Placemats for more information about this cool tool.

Gallery Walk


Reviewing Data Placemats


Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings, colleagues! We are Becca Sanders (Hood River, Oregon) and Heather Robinson (Rochester, New York) of Iteration Evaluation LLC. Our forte is the “…ahhh… there is no button on your system for that” analysis request. We work mostly with behavioral health organizations whose current informational needs often far exceed what their existing systems can pull off.

Enter VLOOKUP. VLOOKUP is an Excel function that can be used to amalgamate data that lives on separate spreadsheets. It’s like a four-wheel-drive vehicle that can get stakeholders what they need across the potholes that result from siloed management information systems.

For example, say we have a direct service provider with a caseload of folks he or she is trying to help. We also have spreadsheet X, which lives in one place and gives the provider a list of something important—say, emergency room visits. And we have spreadsheet Y, which lives elsewhere and contains some other important piece of information about the individuals the provider is trying to help—say, the existence of a chronic condition. The provider wants this information all in one place for a particular group of individuals—say, a caseload—in order to do their job well. Enter VLOOKUP: a solution while you wait for relational table builds, system integration, or the like. One common identifier across the different spreadsheets (for example, a person’s ID number), PLUS a bazillion exported spreadsheets, none of which quite does the trick, PLUS VLOOKUP is all you need. VLOOKUP essentially says, “Hey, here is an ID number for an individual in this table and that one (and that one and that one, ad nauseam). Let’s use it to pull values from one spreadsheet into the other (and so on) until all the information you need is in one place!” In that way, VLOOKUP lets you deal with situations where the snapshot you are trying to take involves variables that live in different systems—especially systems that don’t speak to one another very well. Get good at it! It will take you far! It is like a Subaru in the winter: it can get you through just about any wacky data setup you are faced with.
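As a concrete sketch (the sheet and column layout here is hypothetical, adjust the ranges to your own data): suppose your caseload sheet lists person IDs in column A, and an exported sheet named ER_Visits lists IDs in its column A and emergency room visit counts in column B. In the caseload sheet, next to the ID in A2, a formula like this pulls each person’s visit count over:

```excel
=VLOOKUP(A2, ER_Visits!$A$2:$B$500, 2, FALSE)
```

Here A2 is the ID to look up; ER_Visits!$A$2:$B$500 is the lookup table (the IDs must be in its first column); 2 means “return the second column of that table” (the visit counts); and FALSE forces an exact match, which you almost always want with ID numbers. Fill the formula down your caseload, repeat in the next column with the chronic-condition sheet, and the snapshot lives in one place. Wrapping the formula in IFERROR(…, "") keeps IDs with no match from showing #N/A.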

Rad Resource: Need more of a leg up to get started? Check out the free resource www.chandoo.org! Chandoo has a free blog with a mission to help anyone become awesome in Excel.

Go VLOOKUP! Don’t wait to kick your car into VLOOKUP 4 Wheel Drive!



My name is Awab, and I work as a Monitoring & Evaluation Specialist for the Tertiary Education Support Project (TESP) at the Higher Education Commission (HEC), Islamabad, Pakistan.

In my experience, the most challenging task in any evaluation is to sell the findings and recommendations to the decision makers and make the evaluation usable. Many evaluations stay on the shelf and do not go beyond the covers of the report as their findings are not owned and used by the management and implementation team.

After conducting the Level-1 & 2 evaluations (shared here earlier: https://goo.gl/gyit55), we recently conducted the Level-3 evaluation of the TESP training programs (please find the full report at https://goo.gl/AELJtU). The overall purpose of the evaluation was to learn whether the learning from training had translated into improved performance in the workplace. We also wanted to document the lessons learnt from the training and incorporate them into strategies for improving future training programs.

Cool Tricks:

In order to ensure that the findings and recommendations of the Level-3 Evaluation of the TESP training program would be used, we adopted the following strategies:

  1. Drafted the scope of work for the Level-3 Evaluation and shared it with top management and the implementation team. As a result, they clearly understood the purpose and importance of the Level-3 Evaluation in measuring the effects of training on participants’ performance.
  2. Engaged the implementation team in drafting and finalizing the survey questionnaire. As a result, they eagerly awaited the evaluation results so that they could learn how well their training program had done in improving performance.
  3. Presented the overall results first to make them easy to understand. Then we disaggregated the information and explained the results by training theme and by implementation partner (IP). This way, the implementation team knew the problem areas very precisely, avoiding over-generalizations.
  4. Used data visualization techniques and presented the information as attractive graphs with appropriate highlights. This made the findings easy to understand.
  5. Adopted a sandwich approach in presenting the findings: we highlighted the achievements of the training program before pointing out the gaps, and closed the presentation with a note of appreciation for the implementation team. This helped the implementation team accept the not-so-good feedback.

All the above tricks helped the management in acknowledging the findings of the evaluation and adopting its recommendations. Interestingly, at the end of our final presentation, the Leader of the training implementation team was the one to lead the applause.



Greetings! I am Tania Rempert, Evaluation Coach at Planning, Implementation, Evaluation Org. This post is written together with my colleagues Molly Baltman from the McCormick Foundation, Mary Reynolds from Casa Central, and Anne Wells from Children’s Research Triangle. We would like to share our experience speaking at the White House Office of Social Innovation convening on Outcomes Focused Technical Assistance (OFTA).

The purpose of this convening was to advance an outcomes mindset in government, across the public, private, and philanthropic sectors.  David Wilkinson shared the vision of OFTA to focus on building the capacity of social service providers using data to inform smarter service delivery and to implement evidence-based practices in local communities.  Wilkinson began the convening by pointing out,

“Government pays for 90% of the funding for social services in this country, but typically pays for outputs and compliance rather than outcomes and impact.  As a result, many social service providers do not have outcomes they are actively pursuing….and less likely to have consistent outcomes useful for comparison with their peers.” 

The White House Office of Social Innovation and Civic Participation would like to change that. This convening was meant to draw attention to the technical assistance social service agencies need when tasked with measuring, reporting, and using outcomes.

Hot Tip: Principles of OFTA:

  • Identify the most important measurable outcomes
  • Implement evidence-based practices
  • Use data to inform research-based service delivery

We were asked to speak based on our experience with the Unified Outcomes Project. We shared our experiences focusing on increasing grantees’ capacity to report outcome measures and to utilize this evidence for program improvement, while streamlining the number of tools used to collect data across cohort members. Our model emphasizes communities of practice, evaluation coaching, and collaboration between the foundation and 29 grantees to affect evaluation outcomes across grantee contexts.

Lessons Learned:

  • It takes at least two years to see measurable outcomes and to be able to model the use of those data at the cohort level of shared outcomes.
  • Grantees are experts through lived experience. They use their community voice to determine specific strategies, and because they have the language and experience to take each other to the next level, a learning community develops organically when they are brought together.
  • The beauty of having an evaluation coach visit organizations on-site to provide technical assistance is that each organization’s distinct needs for data-informed decision making can be addressed directly.

We hope that this initial convening will encourage ongoing discussion and development of strategies in OFTA for evaluation practice and government policy making.  Since it is not a thing unless it has an acronym, let all of us in the evaluation community commit to “OFTA often!”




Hello!  I’m Christy Metzler, Director, Program Evaluation for NeighborWorks America®, a Congressionally chartered community development intermediary.  As an internal evaluator, I often work closely with program staff to generate actionable learning about our programs and services.  I find that more meaningful participation of the program staff throughout the evaluation process promotes richer strategic conversations, yields actionable and useful recommendations, and ultimately contributes to organizational effectiveness and impact.

Hot Tip #1: Connect to business planning.  Work with program staff to identify where they are in their business planning cycle and be intentional in connecting evaluation findings to the business plan.  Participatory sense-making sessions can be a natural launch pad for discussing program strategy and business plan priorities.  Allow the time and space for these discussions.

Hot Tip #2: Make it inclusive.  In designing evaluation efforts, find ways to include program staff across multiple levels of the organizational structure, from senior vice president to line staff.  Each position has a unique perspective to offer and can expose challenges that may not be evident to others.

Hot Tip #3: Embed program staff.  Solicit a program operations staff member to play a key role in data collection or other evaluation activities where possible. Not only does involvement in the evaluation effort build evaluation capacity, but it also lends greater credibility to the effort, increases ownership of the process, and can better support program staff in making program improvements after the evaluation is completed.

Lesson Learned: Remain flexible and responsive to program staff. In a recent evaluation effort, what started as an implementation review expanded, at the staff’s suggestion, to include a review of the business data in regular use and of the strategic conversations taking place, in order to identify knowledge gaps and barriers to implementing business plans. As a result, the evaluation was more relevant and useful for business planning efforts.


Hello! We are Elaine Donato, Maria Jimenez (Internal Evaluators), and Samantha Moreno (Research Assistant) from the Evaluation Department at Volunteers of America Greater Los Angeles (VOALA), a large non-profit organization with over 70 human services programs, whose mission is to enrich the lives of those in need.  We have pursued the recent eco-conscious trend to reap the many advantages of switching to paperless solutions.

Lessons Learned: Some benefits of going green have included:

Supporting Funder Goals and Mission. In sharing our vision and having conversations with our funders, we have found that almost all our funders are being asked to do more with less. By adopting cost-effective methods, such as implementing a database (e.g., ServicePoint) and reducing paper management, our programs are able to provide the required data to our funders with a quick turnaround while supporting the needs of the funder and VOALA’s mission of sustainability.

In collaboration with the program’s funder, we added additional services (depicted in red text in image) into the database reflecting other grant-related activities and services provided within the program. These activities are now tracked in the database with just a “click of a box” rather than a written case note, which is a much simpler and time-efficient way of tracking data.

VOALA Runaway Homeless Youth Program Services database screenshot.


Boosting Staff Morale and Capacity. By training staff on how to use the database, we are able to present to them an opportunity to learn new ways to collect, enter, and manage data on clients’ progress. With time and practice, they begin to understand that collecting data in digital format increases their capacity to work more productively.

Increasing Secured Accessibility and Information Sharing. By moving into a database, program staff no longer have to sift through tons of paper to access client information. Staff can access information 24/7, which creates efficiencies in several directions: the important data is tracked, shared among staff, and used to meet clients’ needs.

Streamlining Information for Data Accuracy. By converting hard copy documents to digital format, programs can collect more precise data about client needs and services provided, while eliminating redundancy and duplication of questions. Accurate data also helps improve the quality and credibility of services and automates program workflow processes.

Rad Resource: Leverage Technology for Informed Decision-making

By introducing staff to free, paperless tools such as Google Forms, real-time data and analysis become easily accessible. Technology is leveraged to assist programs not only in collecting robust, cleaner data more quickly, but also in providing real-time summary snapshots of progress for informed decision-making.



Hi, my name is Jayne Corso, and I am the Community Manager for AEA. I was recently asked how I find and choose articles to post on the AEA social media sites, so I thought I would share my resources with everyone. When posting on social media, I try to maintain a good mix of association news, to keep our community informed about AEA, and evaluation news, to keep our community informed about trends and lessons learned in evaluation. Here is where I pull my information from:

Rad Resource: Twitter

Twitter is an excellent resource for finding content. I will often search relevant hashtags such as #Eval, #Evaluation, and #DataViz to find posts relating to these topics. I do have to do a little digging to make sure I find articles and resources that are informative, reliable, and can relate back to our community – but the content I find is often very rich and diverse.

In addition to searching on Twitter, I follow many evaluators who use the platform. This is helpful because I can see what other evaluators are posting, 1) to share their content on our sites and 2) to gain a better understanding of what content is relevant and trending in evaluation. Here are just a few evaluators I follow:

annkemery | Ann K. Emery

clysy | Christopher Lysy

EJaneDavidson | Jane Davidson

EvaluationMaven | Kylie Hutchinson

John_Gargani | John Gargani

Rad Resource: Evaluation Blogs

I follow a lot of evaluation blogs to find insights from our members. I often share posts that I believe are relevant and will resonate with our community. These blog posts allow AEA to share multiple points of view on evaluation related topics. Below are a few blogs that I use for my “go-to” resources:

BetterEvaluation

Evaluation is an Everyday Activity

Evergreen Data Blog

Ann K. Emery’s blog

Eval Central

Rad Resource: Resources from AEA

AEA has a whole page of great resources for finding evaluation content. Click here to see evaluators that are active on social media and an array of evaluation related blogs. This is a great starting point for curating content for your social media posts!

I hope this information is helpful. If you have other great evaluation resources, please share them in the comments. Get busy posting!



Hello to the young and emerging evaluators (YEEs), and to all partners and friends of EvalYouth! My name is Marie Gervais, and I co-chair EvalYouth along with Bianca Montrosse-Moorhead and Khalil Bitar. I also serve as vice-president of the IOCE. What achievements EvalYouth has made since its launch 16 months ago: a strong mobilization of YEEs around the world, dynamic working groups with effective YEE leadership, strategic actions successfully implemented, and a very promising future. In addition to the actions already mentioned by EvalYouth Task Force 1, Task Force 2, and Task Force 3 in previous posts, today I will talk about EvalYouth’s promising milestones for 2017.

Milestones Set for 2017

EvalYouth will build upon its successes and has set a number of milestones for 2017.  A few are included below:

  • A fruitful contribution of EvalYouth to the EvalPartners 3rd Global Evaluation Forum (GEF) organized by EvalPartners, together with IOCE, UNEG, the Global Parliamentarians Forum for Evaluation, the Kyrgyz M&E Network, and the Eurasian Evaluation Network to be held April 26 to 28, 2017 in Bishkek, Kyrgyzstan. EvalYouth and a YEE delegation will be there! We will review progress of the EvalAgenda 2020, particularly in support of the SDGs, and will map out productive partnerships for the future. An excellent opportunity for EvalYouth to celebrate accomplishments, draw lessons learned, identify good practices, explore fruitful collaborations, raise awareness of responsibilities, and plan for more results in coming years.
  • An effective co-sponsorship of the “youth and emerging evaluators strand” at the 8th AfrEA Conference to be held March 27 to 31 in Kampala, Uganda. A very interesting program for YEE, good opportunities to meet seasoned evaluators, and to delve into the challenges of evaluation in Africa and beyond.
  • Enhanced membership and new partnerships to build the capacity of EvalYouth and its Task Forces to realize their ambitions.
    • A call to the YEE: Become ambassadors for EvalYouth and actively participate in its activities. Engage in your national and regional VOPEs and set up YEE groups. Leverage networking and capacity building opportunities!
    • A call to partners: Are you interested in supporting young or new evaluators who will be tomorrow’s leaders in the field? Join us to create enabling opportunities for and with them!

Rad Resources:

There are multiple ways to stay tuned to the future of EvalYouth and to contribute to the future of evaluation:

  • Become familiar with and advocate for the EvalAgenda 2020;
  • Check EvalPartners website for updates about the 3rd GEF and for opportunities for self-improvement and for connections with the global community in evaluation;
  • Follow EvalYouth on social media: Facebook, Twitter, LinkedIn and YouTube;
  • Check IOCE website to find information about the VOPE of your country or region.

Join us, together we can!

The American Evaluation Association is celebrating EvalYouth week. EvalYouth addresses the need to include youth and young people in evaluation. The contributions all this week to aea365 come from members of EvalYouth.

 


Hello, my name is Josette Arevalo, EvalYouth Co-Chair of Task Force 3 (Organization of conferences) and Co-Chair of EvalYouth LAC (Latin America and the Caribbean). Since November 2015, I have been part of a group of motivated young people that seek to improve the evaluation capacities of young and emerging evaluators worldwide.  This post focuses on sharing hot tips, rad resources, and lessons learned related to the first virtual EvalYouth conference, which took place on December 3rd, 2016.

Lessons Learned: Below I highlight some of the accomplishments achieved prior to and during the first virtual EvalYouth conference.

Planning Milestones:

  • Commitment: EvalYouth appointed two co-chairs (Sofia Estevez and myself) and a secretariat (Antonina Rishko-Porcescu), who, along with a small group of EvalYouth members, started planning the virtual conference through monthly online meetings.
  • Inclusion: The Task Force conducted an online poll to choose the conference topic and sub-themes in a participatory process.
  • Exchange: Task Force leaders developed a conference program and confirmed the participation of renowned international evaluation experts.
  • Innovation: The Task Force developed and implemented a multilingual (English, Spanish, and French) promotion strategy using social media, email, and VOPE support, to ensure young and emerging evaluators (YEEs) across the globe were aware of this opportunity.

Conference Achievements:

  • Reach and Diversity: Almost 600 participants from every continent registered for the conference.
  • Inclusiveness: The conference was broadcast in English, with real-time translation into Spanish and French.
  • Engagement: The conference chat was actively used and the presentations included time for audience questions and presenter responses.
  • Efficiency: The conference was made possible with just under $5,000 USD. We received several positive comments and reactions about the virtual conference. Among the most notable, Ziad Moussa, President of the International Organization for Cooperation in Evaluation (IOCE), said, “If opportunity doesn’t knock, build a door. EvalYouth proved that evaluation conferences can be done differently! Congratulations!”
  • Relevance: In a post-conference survey, conference attendees indicated that the conference had a noticeable or strong impact on their knowledge about how to build an evaluation career.

Rad Resources: A recent aea365 blog post noted that readers were interested in resources on (i) how to get started in this field for emerging evaluators and tips on how to become a full-time evaluator, (ii) skills you need to develop to become an evaluator, and (iii) different ways to enter into the field.  These topics were covered in the first virtual EvalYouth conference, which was recorded and is available for FREE on EvalYouth’s YouTube channel.  Like the conference, the recordings are available in English, Spanish, and French.

Get Involved: We are currently planning the first face-to-face EvalYouth conference. We hope it will be as successful as our first virtual conference. Please send us an email (EvalYouth@gmail.com) if you would like to join the conference planning team.  To receive updates, follow us on Facebook, Twitter, or LinkedIn.

The American Evaluation Association is celebrating EvalYouth week. EvalYouth addresses the need to include youth and young people in evaluation. The contributions all this week to aea365 come from members of EvalYouth.

Hi Colleagues! We are Antonella Guidoccio, Mohamed Rage, and Qudratullah Jahid and we coordinate Task Force 2 charged with developing a mentoring program for Young and Emerging Evaluators (YEEs). We want to share our experience of how to design a mentoring program in a collaborative and inclusive way.

We started by conducting a needs and assets assessment to: a) understand the ways in which organizations that commission and/or use evaluation engage in YEE mentoring; and b) identify YEE mentoring gaps and opportunities for potential solutions.

Hot Tips: Gathering the same type of data across multiple regions is possible, but takes planning.  Below are the steps we implemented, which might be of use to others who are interested in doing surveys in more than one language:

  1. Identify multilingual volunteers, in our case YEEs, who can help translate the survey into six languages (English, French, Spanish, Arabic, Russian, and Ukrainian).
  2. Pilot the survey to assess its clarity and check for respondent fatigue.
  3. Partner with VOPEs (Voluntary Organizations for Professional Evaluation) to distribute the survey to their members.
  4. Engage multilingual volunteers, again YEEs in our case, to do the data analysis in the different languages.
  5. Conduct online workshops with Task Force members to discuss the findings question by question, and draw inferences from the results about the design of a mentoring program (our goal).

Lessons Learned: We heard from over 300 individuals across 19 countries. We are still in the design phase of our program, but some interesting findings have emerged:

  • 90% of respondents characterized the need to have an evaluation mentoring program as a “high” priority;
  • 91% of respondents described unmet mentoring needs of YEEs in their countries;
  • Unmet mentoring needs of YEEs include more support in terms of work and internship opportunities, opportunities to network with experts, and more training in evaluation design, including report writing and evaluation proposal formulation; and
  • Overwhelmingly, respondents mentioned that the most appealing format for the mentoring program is an initial face-to-face meeting, with online follow up.

Below is an infographic that was designed by one of our task force members, Antonina Rishko-Porcescu, which summarizes the most important findings of the survey:

EvalYouth mentoring survey infographic

Get Involved: These are a few rad resources for you to look at:

  • Are you or have you been an evaluation mentor or mentee? Did you have experiences that you believe we should hear about?  If so, please send us an email (EvalYouth@gmail.com).
  • The Task Force is working on the formal dissemination of the results of the survey and on finalizing the design of the Mentoring Program. To be among the first to receive updates, follow us on Facebook, Twitter, or LinkedIn.

The American Evaluation Association is celebrating EvalYouth week. EvalYouth addresses the need to include youth and young people in evaluation. The contributions all this week to aea365 come from members of EvalYouth.

