AEA365 | A Tip-a-Day by and for Evaluators

Hello! I'm Christy Metzler, Director, Program Evaluation for NeighborWorks America®, a Congressionally chartered community development intermediary. As an internal evaluator, I often work closely with program staff to generate actionable learning about our programs and services. I find that meaningful participation by program staff throughout the evaluation process promotes richer strategic conversations, yields actionable and useful recommendations, and ultimately contributes to organizational effectiveness and impact.

Hot Tip #1: Connect to business planning.  Work with program staff to identify where they are in their business planning cycle and be intentional in connecting evaluation findings to the business plan.  Participatory sense-making sessions can be a natural launch pad for discussing program strategy and business plan priorities.  Allow the time and space for these discussions.

Hot Tip #2: Make it inclusive.  In designing evaluation efforts, find ways to include program staff across multiple levels of the organizational structure, from senior vice president to line staff.  Each position has a unique perspective to offer and can expose challenges that may not be evident to others.

Hot Tip #3: Embed program staff. Where possible, recruit a program operations staff member to play a key role in data collection or other evaluation activities. Not only does this involvement build evaluation capacity, it also lends greater credibility to the effort, increases ownership of the process, and better positions program staff to make program improvements after the evaluation is completed.

Lesson Learned: Remain flexible and responsive to program staff. In a recent evaluation effort, what started out as an implementation review expanded, at the staff's suggestion, to include a review of the business data in regular use and of the strategic conversations taking place, in order to identify knowledge gaps and barriers to implementing business plans. As a result, the evaluation was more relevant and useful for business planning efforts.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello! We are Elaine Donato, Maria Jimenez (Internal Evaluators), and Samantha Moreno (Research Assistant) from the Evaluation Department at Volunteers of America Greater Los Angeles (VOALA), a large non-profit organization with over 70 human services programs, whose mission is to enrich the lives of those in need. We have embraced the recent eco-conscious trend and switched to paperless solutions to reap their many advantages.

Lessons Learned: Some benefits of going green have included:

Supporting Funder Goals and Mission. In sharing our vision and having conversations with our funders, we have found that almost all of them are being asked to do more with less. By adopting cost-effective methods, such as implementing a database (e.g., ServicePoint) and reducing paper management, our programs can provide required data to funders with a quick turnaround while supporting both funder needs and VOALA's commitment to sustainability.

In collaboration with the program's funder, we added services (shown in red text in the image below) to the database, reflecting other grant-related activities and services provided within the program. These activities are now tracked in the database with just a "click of a box" rather than a written case note, which is a much simpler and more time-efficient way of tracking data.

VOALA Runaway Homeless Youth Program Services database screenshot.

Boosting Staff Morale and Capacity. By training staff on how to use the database, we give them an opportunity to learn new ways to collect, enter, and manage data on clients' progress. With time and practice, they begin to understand that collecting data in digital format increases their capacity to work more productively.

Increasing Secure Accessibility and Information Sharing. By moving to a database, program staff no longer have to sift through stacks of paper to access client information. Staff can access information 24/7, which improves efficiency on several fronts: important data is tracked, shared among staff, and used to meet clients' needs.

Streamlining Information for Data Accuracy. By converting hard-copy documents to digital format, programs can collect more precise data about client needs and services provided while eliminating redundant and duplicate questions. Accurate data also helps improve the quality and credibility of services and automates program workflow processes.

Rad Resource: Leverage Technology for Informed Decision-making

Introducing staff to free, paperless tools such as Google Forms makes real-time data and analysis easily accessible. Technology is leveraged to help programs not only collect robust, cleaner data more quickly, but also provide real-time summary snapshots of progress for informed decision-making.
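As a rough illustration (not VOALA's actual setup), here is a minimal Python sketch of this kind of snapshot, summarizing a Google Forms response export; the file name and column names are hypothetical placeholders:

    # Minimal sketch: turn a Google Forms CSV export into a quick
    # progress snapshot. "form_responses.csv", "Timestamp", and
    # "Service Received" are hypothetical placeholders.
    import pandas as pd

    responses = pd.read_csv("form_responses.csv", parse_dates=["Timestamp"])

    # Count responses per week for each service type.
    snapshot = (
        responses
        .assign(week=responses["Timestamp"].dt.to_period("W"))
        .groupby(["week", "Service Received"])
        .size()
        .unstack(fill_value=0)
    )
    print(snapshot)

A summary like this can be regenerated each time new responses arrive, giving programs the real-time view described above.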


We are Rebekah Sobel, Manager, Planning & Evaluation, and Dana Burns, Data Analyst, in the United States Holocaust Memorial Museum's Office of Planning. The Museum is a dynamic institution with a bold and ambitious strategic plan. We are a small unit responsible for leading and strengthening the Museum's annual and long-term planning, evaluation, and monitoring practices. One of our top priorities — and greatest challenges — has been standardizing processes, systems, and language across our wide-ranging programs. The Museum is still in the early years of taking a more comprehensive and consistent approach to planning and evaluation. To help our colleagues better prioritize projects, set goals, and measure success, we quickly made templates our "go-to" tool.

Lessons Learned: Conversation and customization are key!

While every Museum program area has established priority outcomes, there is great variation in how they integrate evaluation and measurement in their work. Over many conversations with our colleagues, we have come to understand their challenges and how they operate, which allows us to mix and match our templates to address their needs.  In meeting our colleagues where they are, our office is building important relationships. These partnerships are key to ensuring our ability to achieve our strategic plan goals and set up our work to measure success. Sometimes, our program colleagues approach us with their own internal evaluation and planning documents that the Museum could use more widely. We will adjust their language and format, and then share the revised version with other potential users for feedback. Occasionally, our collaborative work results in a new tool that can be consistently used across the Museum.

Hot Tips:

  • It is important to remember that templates are a means to an end; they are tools to help us maximize our impact.  In order to be effective, they must address a need or solve a problem, and be both user-friendly and efficient.
  • When responsibility for next steps in evaluation work is vague or when enthusiasm for trying out new templates stalls, we keep the ball rolling and offer hands-on support. Sometimes that means filling out the template the first time or providing more one-on-one guidance.
  • We do the project management for evaluation planning, which includes managing the schedule, setting meetings, sending the meeting notes and any follow-up. We ask our partners for their input as we develop templates, and assign small tasks for them to undertake with their teams so they gain experience using new tools. If they don’t own the template, they won’t use it.
  • We bring candy or other treats to our planning meetings. We have also been told that playing music in meetings helps to set a lighter mood; we are trying that next!

Rad Resources:

Here are a few of the templates we have tried and adjusted, while making new friends and co-conspirators in organizational change along the way. Try them yourself, change them up, and let us know what works for you:

The American Evaluation Association is celebrating Internal Evaluation (IE) Topical Interest Group Week. The contributions all this week to aea365 come from our IE TIG members.

Greetings from Toronto! My name is Roxana Salehi and I am the principal consultant at Vitus Consulting and a Senior Research Associate at the Centre for Global Child Health, Hospital for Sick Children. Today I will share with you four ways you can help your organizations use evaluation data to make strategic decisions.

Lesson Learned 1: Create your own business processes if none exist. Having clearly defined and agreed-upon processes in place is essential for getting useful results that can lead to action. Evaluators can proactively help to create these processes. For example:

  • Create a Terms of Reference document outlining the roles of working group members, the decision-making structure, etc.; this will teach you a lot about how business is done in the place you are working. In some countries, you may have to pay people to attend meetings – good to know when you are setting your budget! See an example from the Partnership for Maternal, Newborn & Child Health.
  • Create simple protocols to clarify who is doing what, and when, for key tasks, such as data collection. Protocol development makes you consider issues that you may not typically consider. For example, it may not occur to you that in some places certain documents need to be hand delivered.
  • Ask for time on the agenda of meetings, so that evaluation remains an organizational priority.

Lesson Learned 2: Build your utilization-focused philosophy into your plan. In your evaluation plan, list "Utility" as an explicit evaluation standard you want to adhere to. Utility means that evaluation should be planned and conducted in ways that meet the needs of stakeholders and increase the likelihood of results leading to action. The evaluation plan can act as a strong reference document for bringing people back to the question: "So what?" For more on this, see Michael Quinn Patton's Utilization-Focused Evaluation.

Lesson Learned 3: Let your stakeholders help you create meaning out of data. Instead of holding a "presentation of evaluation results," consider convening a "data sense-making session." My experience? Stakeholders love it! It intrigues them; they come in curious and willing to help pull out the most important findings and actions that can be taken. Just make sure you have some worthy questions for your stakeholders, or else it will be just a presentation of results.

Lesson Learned 4: Stay on top of your game. And I don't mean just in terms of evaluation knowledge. That is a given! Also learn from the data visualization and communication fields so that you can tell a compelling story that enables action. For quantitative data, I found Stephanie Evergreen's Effective Data Visualization useful; I am still searching for good resources for qualitative data display.

There you have it. I’d love to hear about other ways you help put data into action!


We are Ana Flores and Joshua Paul at Volunteers of America – Los Angeles. At the AEA Evaluation 2016 conference, we presented a panel entitled “A Picture is Worth a Thousand Words…But Will They Use It?”. Today, we want to provide additional information regarding how to make data more user-friendly.

The Evaluation Department at Volunteers of America – Los Angeles (VOALA) is tasked with providing evaluation services as needed to more than 70 social service programs. Staff in these programs are dedicated to helping people, and many find data unappealing. Addressing communication barriers has given us the opportunity to learn a number of lessons.

Lessons Learned #1: Needs Change; Open a Dialog with Succinct Visualizations

Understanding stakeholder needs and how they fit into a program model is a major part of any evaluator’s task. Unfortunately, we have found that stakeholder needs and program models can change rapidly, and stakeholders do not always volunteer information about these changes.

We were once mystified as to why one of our programs — whose initial purpose was to connect with and refer homeless veterans to local services — had such poor monitoring results. Traditional reporting methods failed to open a dialog that could bring the core problem to light. After months of discussion, we tried a new visualization-based design (see image) that demonstrated the discrepancy between the goal and present performance and prompted program leadership to identify the issue. The staff had been focused on the transportation of clients to appointments, a secondary program activity, which had not originally been designated as important to track.


Lessons Learned #2: Only Show What You Need to Show

 

Past reports for many of our programs provided detailed data, presenting every single outcome for individual clients. However, this level of information was not necessary for program performance discussions and distracted from the overall outcomes in the report. Using Tableau, we removed the detailed information and kept only overall outcome percentages and targets on the graph. With outcomes presented this way, VOALA upper management was able to get the information they needed to make program recommendations and help program directors implement better practices.

 

Lessons Learned #3: Use Interactivity

Giving your audience an opportunity to control the data makes it easier for them to make inferences about the information. Visual analysis programs, like Tableau, allow us to provide interactive reports so that upper management and program directors can filter results by key demographics or periods of time, depending on what is useful to them.
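For readers without Tableau, a bare-bones sketch of this filter-then-summarize pattern might look like the following Python; the data file and column names are invented for illustration:

    # Rough sketch of the filter-then-summarize pattern behind an
    # interactive outcomes report. File and column names are illustrative.
    import pandas as pd

    outcomes = pd.read_csv("client_outcomes.csv")

    def outcome_snapshot(df, demographic=None, value=None):
        """Percent of clients achieving each outcome, optionally
        filtered by one demographic column."""
        if demographic is not None:
            df = df[df[demographic] == value]
        return (
            df.groupby("outcome_measure")["achieved"]  # "achieved" assumed 0/1
            .mean()
            .mul(100)
            .round(1)
            .rename("percent_achieved")
        )

    print(outcome_snapshot(outcomes))                        # overall view
    print(outcome_snapshot(outcomes, "age_group", "18-24"))  # one group

In a tool like Tableau, the same filtering happens through on-screen controls rather than function arguments.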

 

Having these types of “quick snapshot” visualizations has helped upper management at VOALA communicate recommendations with programs.

Experimenting with these different data visualization techniques has improved our discussions with key staff, helping us ask hard questions while reducing staff resistance to data. Otherwise, the response to “Why is this benchmark never reached?” might just be silence.


Hi! We are Laura Beals, Director, and Barbara Perry, Evaluation Manager, of the Department of Evaluation and Learning at Jewish Family and Children’s Service Boston, a multi-service nonprofit in Massachusetts. At Eval 2015, we learned about “Data Placemats” from Veena Pankaj of the Innovation Network. Recently, we held several placemat-focused “Learning Conversations” with one of our multi-program divisions. We created seven placemats for these meetings:

  1. An overview of the Learning Conversation and placemat process.
  2. Client census—new and active—over the past four years for each program.
  3. Client demographics by program.
  4. Client geographic distribution heat map. This placemat was interactive, using Tableau. We wanted not only to show the geographic distribution of clients in Massachusetts, but also to provide an opportunity to explore the data further, through the use of filters for program and key demographics.
  5. A network analysis showing referral sources.
  6. A network analysis showing how clients were served by multiple programs at the agency.
  7. A learning and dissemination plan. This placemat encouraged meeting participants to use the data and allowed our team to create specific follow-up documents and undertake follow-up analyses.

Lessons Learned:

  • During the planning stages, check in with stakeholders from around the organization. We asked the program director, division director, grant writers, and development associates what they wanted to learn about the division. Their responses allowed us to tailor the placemats to be as useful to as many people as possible.
  • Don't forget to include the staff! To share the placemats and get feedback from direct-service staff, we held a shorter discussion at an all-staff meeting, focusing on two placemats; the others were provided for later review. We also hung the placemats near the staff offices and provided sticky notes for feedback and observations.
  • Be ready to "go on the road" with your placemats. We found that word spread about our placemats, and there was interest from various stakeholders who had not been able to be part of the original few meetings. By continuing the conversations, we were able to increase learning and generate new ideas.
  • Bring data chocolates! We had been waiting for an opportunity to create data chocolates after being inspired by Susan Kistler. We wrapped shrunken versions of several of the graphs around chocolates. They put everyone in a good mood to talk data; the lighthearted gesture helped break down barriers and was a great conversation starter.

Rad Resources:


My name is Stanley Capela and I am the Vice President for Quality Management and Corporate Compliance Officer at HeartShare Human Services of New York.

I have been an internal evaluator for 38 years in the non-profit sector and have reviewed more than 115 organizations in 35 states and four countries as a Peer Reviewer for the Council on Accreditation. Through those experiences, I have come to realize the importance of communication skills in conducting any evaluation. I offer the following tips when conducting reviews.

First, be contextual. Take the time to understand the world of the program and, more specifically, the staff's work. This context gives the internal evaluator a better understanding of how to approach the evaluation and, more importantly, how to present its value so that it leads to a positive outcome.

Second, communicate in a language that the stakeholder understands. Very often, when conducting internal evaluation and presenting the results, evaluators get caught up in their own world and forget that the primary role of evaluation is to assist key stakeholders, specifically the leadership, in developing a better understanding of whether or not the program is achieving its goals. If not, what are the issues, and how can the evaluator help resolve them?

Finally, choose your words carefully. The language that you use in your evaluation will shape how stakeholders interpret your findings. It also affects whether you will face resistance to the evaluation. For example, when I conduct reviews, I like to use the word "challenges" rather than "deficits." Also, stakeholders often view evaluation as a tool to identify program weaknesses, so I make an effort to identify program strengths as well. These are basic strategies to ensure that stakeholders are more receptive to the evaluation.

Rad Resources: To learn more, I refer you to Michael Quinn Patton's work on Utilization-Focused Evaluation and David Fetterman's work on Empowerment Evaluation for further understanding of the importance of communication skills in evaluation. In addition, I refer you to a presentation that I made at last year's AEA conference titled "Turning Communication into Collaboration: The Development of an Outcomes-Based Management Training Program." Finally, check out the Council on Accreditation website at www.coanet.org and look at their Performance Quality Improvement standards and tools.


Welcome to Internal Evaluation week, hosted by the Internal Evaluation TIG! Thanks to all of the internal evaluators who contributed to this week's blogs and to Annie Gleason (Good Shepherd Services in New York City) for coordinating the work! This week is dedicated to sharing insights and tools from evaluators across the country (and Canada!) about how they help organizations put data and evaluation findings into action. Blogs this week offer a broad range of suggestions for helping stakeholders use data, drawing on technology, data visualization, communication and framing, and participatory methods.

Hello! I am Jay Szkola, senior program analyst in the Strategy, Evaluation and Learning Division at Good Shepherd Services. As an internal evaluator, I have the opportunity not only to evaluate the programs in my portfolio, but also to assist staff in using data in their day-to-day work with participants. This doesn't have to involve a great deal of complexity; it can be as simple as developing tools that translate data into an accessible format.

All of the community-based youth justice programs at our agency use the Positive Youth Development Inventory as a tool to help evaluate our impact on key youth development constructs, such as friendship and future orientation. We use the PYDI to assess our participants at baseline, and gauge pre-to-post changes upon program completion. These insights are useful for program planning and advocacy.

I also wanted to make sure staff could use the results to inform their work with participants directly, but initially experienced limited success with this goal. One barrier was that while staff liked the survey, they were unsure how to translate individual survey responses into insights and action.

Cool Trick:

In response, I constructed a simple Excel sheet to help with this translation. Program staff enter participant survey responses and get easy-to-read graphs showing how the participant scored on each construct. As staff enter the data, a table aggregates the responses by construct, which then feeds the graphs.
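For readers who prefer code to spreadsheets, the tool's core logic (average each construct's items, then chart the construct scores) could be sketched roughly like this in Python; the construct-to-item mapping and response values below are invented, not the actual PYDI key:

    import pandas as pd
    import matplotlib.pyplot as plt

    # Hypothetical construct-to-item mapping (not the real PYDI scoring key).
    constructs = {
        "Friendship": ["q1", "q2", "q3"],
        "Future Orientation": ["q4", "q5", "q6"],
    }

    # One participant's item responses on a 1-4 scale (made-up values).
    responses = pd.Series({"q1": 3, "q2": 4, "q3": 3, "q4": 2, "q5": 3, "q6": 2})

    # Average the items within each construct, as the Excel tool does.
    scores = pd.Series(
        {name: responses[items].mean() for name, items in constructs.items()}
    )

    scores.plot(kind="bar", ylim=(0, 4), title="PYDI construct scores")
    plt.tight_layout()
    plt.show()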

Program staff and leadership liked the tool so much that when the NYC Department of Probation adopted the PYDI as a measure, the Excel tool was shared and is now used by other youth justice programs in the city.

PYDI Template Input

PYDI Template Output

Lessons Learned:

Offer a guide. For those who do not have a high comfort level with data, colorful graphs alone will not bring about understanding. Included in the Excel tool are a brief, jargon-free definition, a sample question, and a list of questions for each construct on the survey.

Think about multiple uses to get maximum impact. Whatever the data source (surveys, administrative data, etc.), talk to staff about how they could use the data. Once staff in our youth justice programs began seeing the PYDI as something they could use in day-to-day practice, it became less of a task completed "for the evaluators" and more of a shared project.


Greetings! I am Robert Hoke, an independent evaluation consultant based in Indianapolis, Indiana, and former chair of the Independent Consulting TIG. I am a big proponent of shaking up the traditional conference session with a little "speed-dating."

We have all sat through sessions in which presenters give a good presentation but there is only time for one question. (And usually the "question" is more of a comment.) Everyone then leaves the room and moves to the next session with no opportunity to meet other audience members. For the Independent Consulting TIG, our response has been to mix things up with one session that uses a fast-paced, more interactive, speed-dating approach. (This technique is also known as SpeedGeeking.)

The Independent Consulting TIG has offered versions of this session nine times since Gail Barrington first shared the process in 2005.

The Format:

  • Tables of six to nine participants
  • Each table has a table leader/presenter, and each table leader presents on a different topic
  • The table leader presents their topic for 10-15 minutes, with half of the time devoted to questions
  • Participants then move to another table
  • Repeat, Repeat, Speed Up, Repeat…..

The table leaders prepare a two-page summary of helpful hints and resources for the participants. The written summaries are very similar to the type of information included in the aea365 blog: best practices, helpful hints, key resources, etc.

What I like about “SpeedGeeking”:

  • Like a tapas restaurant, participants can "taste" a wide variety of topics in a short period.
  • The method allows for much more interaction between the participants and the speaker with each round being almost equally divided between presentation and questions.
  • Participants (especially introverts) are more likely to ask a question in a small group setting.
  • The opportunities for networking and information exchange are off the charts.

Hot Tips:

  • Avoid the temptation to make the session more "organized" by having the table leaders move instead of the participants. The disorder of the transition between rounds increases the energy and allows participants to meet more people.
  • The length of each round can vary. For example, our sessions start with three fifteen-minute rounds and then four faster ten-minute rounds; the table leaders become more comfortable with their topics and can move faster.
  • It is critical that the timekeeper is very firm about keeping things moving and gives the table leader a two-minute warning.

Rad Resources

The American Evaluation Association is celebrating Independent Consulting TIG Week with our colleagues in the IC AEA Topical Interest Group. The contributions all this week to aea365 come from our IC TIG members.

My name is Charmagne Campbell-Patton and I am an independent evaluation consultant. About a year ago, I transitioned from my role as program manager and internal evaluator at an education nonprofit to that of an external evaluation consultant. I continued working with my former employer as a client and, in my naiveté, I thought the transition would be relatively straightforward. I figured that since I knew the inner workings of the organization and had strong relationships with most staff members, it would be easy to continue to conduct useful evaluation.

Lessons Learned: My first mistake was failing to recognize and address that as the program manager, I used to be the primary intended user of the evaluation results. When I made the transition to an external consultant, I needed to be much more intentional about designing evaluations that met the needs of the new intended users.

Hot Tip: Be aware of how your position affects use. The personal factor is different in different relationships – internal and external.

Lesson Learned: Process use is different internally and externally. As a staff member, I used to be able to identify opportunities for process use in an ongoing and informal way. As an external consultant, however, I again had to be much more intentional about identifying opportunities and planning for process use.

Hot Tip: External evaluators need to be intentional about seeking opportunities to support evaluative thinking across the organization through more formalized process use.

Cool Trick: One way to engage staff is a reflective practice exercise. Bring staff together to reflect on the question: "What are things you know you should be doing but aren't?" This question gets people thinking about potential personal barriers to using information, which sets the stage for discussing organizational barriers to evaluation use. Next, identify enabling factors that support and enhance use, and ways to overcome the barriers.

It’s also worth noting that despite some of the challenges noted above, the transition from internal to external also gave me a new perspective on evaluation use. Once I recognized some of the barriers to use as an external consultant, I was actually able to use my position to promote use more effectively than I did while internal. The added distance gave me some leverage that I lacked as a staff member to call attention to opportunities and challenges to evaluation use across the organization.

Rad Resources: Essentials of Utilization-Focused Evaluation, Michael Quinn Patton, Sage (2012).

Consulting Start-Up and Management, Gail Barrington, Sage (2012).

Using Reflective Practice for Developmental Evaluation, Charmagne Campbell-Patton, AEA365 March 2015.

The American Evaluation Association is celebrating Evaluation Use (Eval Use) Topical Interest Group Week. The contributions all this week to aea365 come from our Eval Use TIG members.
