AEA365 | A Tip-a-Day by and for Evaluators


My name is Amy A. Germuth, and I am Founder and President of EvalWorks, LLC in Durham, NC; I blog at EvalThoughts.com. Over the last year I have worked on improving my reporting of findings to better meet my clients’ needs, and I have a few great resources to help you do the same.

Rad Resource: “Unlearning Some of our Social Scientist Habits” by Jane Davidson (independent consultant and evaluator extraordinaire, as well as AEA member and TIG leader). She added some additional thoughts to this work and presented them at AEA’s 2009 annual conference in Orlando. Her PowerPoint slides for this presentation can be found at: http://bit.ly/7RcDso.

Frankly, I think this great article has been overlooked for its valuable contributions. Among other great advice for evaluators (including warnings about using models or theories without using them evaluatively, and about leaping to measurement too quickly), she addresses these common pitfalls when reporting evaluation findings: (1) not answering (and in some cases not even identifying!) the evaluation questions that guided the methodology, (2) reporting results separately by data type or source, and (3) ordering evaluation report sections like a Master’s thesis. This entertaining article and the additional PowerPoint slides really make a case for using the questions that guide the evaluation to guide the report as well.

2015 UPDATE
Rad Resource:
Data visualization can help make reporting more accessible and visually captivating. There is a great post on “What is data visualization?” as well as many related posts from other aea365 authors.
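To make the idea concrete, here is a minimal sketch of one habit those resources teach – a direct-labeled, low-clutter horizontal bar chart. The programs and percentages are invented placeholders, not findings from any evaluation.

```python
# Minimal sketch of a report-ready chart: direct labels, no chart junk.
# The programs and percentages below are invented placeholders.
import matplotlib.pyplot as plt

labels = ["Program A", "Program B", "Program C"]  # hypothetical programs
values = [72, 55, 23]                             # hypothetical percentages

fig, ax = plt.subplots(figsize=(6, 2.5))
bars = ax.barh(labels, values, color="#4a7ebb")
ax.invert_yaxis()               # largest category on top
ax.bar_label(bars, fmt="%d%%")  # label each bar directly
ax.set_xlim(0, 100)
ax.set_xticks([])               # direct labels make the axis redundant
for spine in ("top", "right", "bottom"):
    ax.spines[spine].set_visible(False)
ax.set_title("Participants reporting improvement, by program (placeholder data)")
fig.tight_layout()
fig.savefig("finding.png", dpi=200)
```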

Rad Resource: Why assume all findings have to be reported as a paper? Try reporting using PowerPoint, and heed the advice Garr Reynolds provides in his great book “Presentation Zen Design” to ensure that you do not subject your clients to DBP (death by PowerPoint).

This post is a modified version of a previously published aea365 post in an occasional series, “Best of aea365.”  Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· ·

I am Megan Greeson, an Assistant Professor of Clinical/Community Psychology at DePaul University. I conduct collaborative, utilization-focused program evaluations with community-based organizations and I also teach a graduate seminar course in Program Evaluation.

I am Adrienne Adams, an Assistant Professor of Ecological-Community Psychology and Program Evaluation at Michigan State University. I use collaborative evaluation approaches to help human service organizations build evaluation capacity and foster organizational learning.

We are interested in helping community organizations engage with program evaluation and evaluation findings in order to promote ownership over the evaluation process and findings, and to increase evaluation capacity and use. The ways in which we communicate with communities about evaluation and evaluation data matter and can improve their retention and understanding of important material.

Hot Tip: Multimedia reports (involving text, audio, and video components) can be a great way to reach evaluation stakeholders from a distance. The addition of audio and video to a text-based report can draw attention, create a connection to the evaluator, and help stakeholders interpret difficult technical topics.

Hot Tip: Since multimedia reports can be viewed or heard on a variety of devices at times and locations most convenient for evaluation stakeholders, they can be a good way to reach busy stakeholders with little time to digest a written report.

Hot Tip: Audio and video components can be easily edited to tailor parts of a presentation to individual sites or stakeholders, or to correct presentational errors, like “ums” or animation glitches.

Hot Tip: Adding a worksheet or activity that the audience completes while watching the presentation can assess retention and/or encourage a plan for use.

Rad Resources: Jing, Camtasia, and Adobe Presenter are tools that can be used to create presentations that involve text/figures, audio, and video components.

Jing: http://www.techsmith.com/jing.html  (FREE!)

Camtasia: http://www.techsmith.com/camtasia.html (Free trial)

Adobe Presenter: http://www.adobe.com/products/presenter.html

Rad Resource:  Attend our demonstration “Technological Tools for Creating Multimedia Evaluation Reports” at AEA 2013 to see how we have used these tools in our own work.

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org. Want to learn more from Megan and Adrienne? They’ll be presenting as part of the Evaluation 2013 Conference Program, October 14-19 in Washington, D.C.

·

Hello! We are Johanna Morariu, Kat Athanasiades, and Ann Emery from Innovation Network. For 20 years, Innovation Network has helped nonprofits and foundations evaluate and learn from their work.

In 2010, Innovation Network set out to answer a question that was previously unaddressed in the evaluation field—what is the state of nonprofit evaluation practice and capacity?—and initiated the first iteration of the State of Evaluation project. In 2012 we launched the second installment of the State of Evaluation project. A total of 546 representatives of 501(c)3 nonprofit organizations nationwide responded to our 2012 survey.

Lessons Learned–So what’s the state of evaluation among nonprofits? Here are the top ten highlights from our research:

1. 90% of nonprofits evaluated some part of their work in the past year. However, only 28% of nonprofits exhibit what we feel are promising capacities and behaviors to meaningfully engage in evaluation.

2. The use of qualitative practices (e.g. case studies, focus groups, and interviews—used by fewer than 50% of organizations) has increased, though quantitative practices (e.g. compiling statistics, feedback forms, and internal tracking forms—used by more than 50% of organizations) still reign supreme.

3. 18% of nonprofits had a full-time employee dedicated to evaluation.

Morariu graphic 1

4. Organizations were positive about working with external evaluators: 69% rated the experience as excellent or good.

5. 100% of organizations that engaged in evaluation used their findings.

Morariu graphic 2

6. Large and small organizations faced different barriers to evaluation: 28% of large organizations named “funders asking you to report on the wrong data” as a barrier, compared to 12% overall.

7. 82% of nonprofits believe that discussing evaluation results with funders is useful.

8. 10% of nonprofits felt that you don’t need evaluation to know that your organization’s approach is working.

9. Evaluation is a low priority among nonprofits: it was ranked second to last in a list of 10 priorities, only coming ahead of research.

10. Among both funders and nonprofits, the primary audience of evaluation results is internal: for nonprofits, it is the CEO/ED/management, and for funders, it is the Board of Directors.

Rad Resource—The State of Evaluation 2010 and 2012 reports are available online for your reading pleasure.

Rad Resource—What are evaluators saying about the State of Evaluation 2012 data? Look no further! You can see examples from Matt Forti and Tom Kelly.

Rad Resource—Measuring evaluation in the social sector: Check out the Center for Effective Philanthropy’s 2012 Room for Improvement and New Philanthropy Capital’s 2012 Making an Impact.

Hot Tip—Want to discuss the State of Evaluation? Leave a comment below, or tweet us (@InnoNet_Eval) using #SOE2012!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · · · ·

Hello! I am Manny Straehle and I work with International Credentialing Associates (ICA). ICA is a measurement consulting firm with expertise in developing certificates, certification, and licensing programs. We are committed to producing reports and visualizations that are simple and easy to understand for the clients we serve, but we need your help.

Hot Tip – Enter our Data Visualization Competition: With the sponsorship of AEA’s Data Visualization and Reporting TIG, we are launching a data visualization competition. The winner will receive $500.

ICA often generates data reports for our clients – and revising the look and ease of use of one such report is the basis for this data visualization competition.

Don’t worry if you aren’t familiar with certification and licensing programs, or if some of the terms below are new to you. The contest involves improved organization and display – no data analysis required.  The submissions will be reviewed by our panel of judges, including me, Rory McCorkle, DVR TIG founder Stephanie Evergreen, DVR TIG Co-chair Johanna Morariu, and other experts.

Task: To create a data visualization-centered report, using a provided dummy set of data.

Hot Tip – Include These Required Report Components:

  • ICA’s name, logo, and color scheme (available from the ICA website)
  • Space for customer name
  • The following data points, from the provided Excel spreadsheet:
    • Key (correct answer)
    • P+ (item difficulty)
      • Percent of candidates who answer the question correctly
    • Distractor Analysis (number and percentage) across three performance levels
    • Rbis (item discrimination)
      • Correlation between performance on an item and total performance
    • Recommended actions
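No data analysis is required for the contest, but if you are curious how these statistics are typically derived, here is a minimal Python sketch. It is not ICA’s scoring code: the responses and column names are invented, Rbis is approximated by the point-biserial (item–total) correlation, and the breakout across three performance levels is omitted for brevity.

```python
# Minimal sketch of the item statistics listed above; dummy data, not ICA's code.
import pandas as pd

# Hypothetical responses: one row per candidate, one column per item,
# each cell holding the option the candidate selected.
responses = pd.DataFrame({
    "Q1": ["A", "A", "B", "A", "C", "A"],
    "Q2": ["D", "C", "D", "D", "D", "B"],
})
key = {"Q1": "A", "Q2": "D"}  # Key: the correct answer for each item

# Score each item (1 = correct) and total each candidate's score.
scored = responses.apply(lambda col: (col == key[col.name]).astype(int))
totals = scored.sum(axis=1)

for item in responses.columns:
    p_plus = scored[item].mean()      # P+: proportion answering correctly
    rbis = scored[item].corr(totals)  # discrimination: item-total correlation
    # Distractor analysis: share of candidates choosing each option.
    distractors = responses[item].value_counts(normalize=True)
    print(f"{item}: P+ = {p_plus:.2f}, discrimination = {rbis:.2f}")
    print(distractors.to_string())
```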

Hot Tip – Attend to the Evaluation Criteria:

  • Uses basic spreadsheet software (Excel, Open Office, etc.) for data visualizations
  • Report fits on one page
  • Required report components are present
  • Text is legible
  • Data visualizations are easily interpreted
  • Transfer of graphs from spreadsheet to report is easy
  • Report uses best practices in data visualization design
    • Layout and use of space
    • Use of color and contrast
    • Inclusion of appropriate text
    • Use of data visualizations appropriate to data
    • Contribution of design to overall comprehension of information

Hot Tip – Follow Our Submission Instructions:

  • Download the Excel spreadsheet with a data set you can use (please note the file is dummy data).
  • Submit your revision – both the report in PDF format and the spreadsheet file – to info@intlcred.com by 15 April 2013.
  • The winner will be announced on 15 May 2013 and contacted by email.
  • All entries become the property of ICA upon submission.  By submitting an entry, all Entrants agree to the Official Rules of Entry.
  • Questions can be directed to info@intlcred.com

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

·

My name is Stan Capela, and I am the VP for Quality Management and the Corporate Compliance Officer for HeartShare Human Services of New York. I have devoted my entire career, since 1978, to being an internal evaluator in the nonprofit sector.

In graduate school, you develop a wide range of skills for conducting program evaluation. However, there is one skill that schools don’t focus on – how an internal evaluator develops a brand that clearly shows that s/he adds value to the organizational culture.

Developing a personal brand can be a challenge, given workplace perceptions, pressures, and stresses. For example, program staff may have varying perceptions of my dual roles as an internal evaluator, which involve supporting their efforts and pointing out deficiencies. In addition, I often conduct simultaneous projects that combine formative and summative evaluations and may involve quality and performance improvement. Finally, my attention often gets split between internal reports and external reviews.

Lesson Learned: Producing quality reports that are clearly utilization-focused is important. But I’ve found that the secret ingredient to making my work valued and developing a brand within the organization is simply the ability to help answer questions related to programmatic and organizational problems.

Lesson Learned:  Get to know program staff and their work.  In my early years, I found it especially helpful to spend time talking to program staff. It provided an opportunity to understand their work and the various issues that can impact a program’s ability to meet the needs of the individuals and families served. Ultimately, this helped me to communicate more effectively with staff and about programs.

Lesson Learned: Find additional outlets to build your networks. I have had the opportunity to be a Council on Accreditation (COA) Team Leader and Peer Reviewer and have developed contacts by participating in 70 site visits throughout the US, Canada, Germany, Guam, and Japan. Over the span of 34 years, I have developed a network of contacts that have helped me respond expeditiously – sometimes through a single email – when a question arises from management. As a result, I became known as a person with ways to find answers to problems.

RAD Resources: Many of my key resources are listservs. These include Evaltalk – a listserv of worldwide program evaluators; the Appreciative Inquiry Listserv (AILIST); and the List of Catholic Charities Agencies (CCUSA). Other helpful affiliations include the Council on Accreditation (COA), the Canadian Evaluation Society, and the American Society for Quality.

If you have any questions, let me know by emailing me or sharing them via the comments below.

The American Evaluation Association is celebrating Internal Evaluators TIG Week. The contributions all week come from IE members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · · · ·

Greetings from Boise, the city of trees! We are Rakesh Mohan (director) and Margaret Campbell (administrative coordinator) of Idaho’s legislative Office of Performance Evaluations (OPE). Margaret reviews drafts of our reports from a nonevaluator’s perspective, as well as copyedits and desktop publishes each report. In this post, we share our thoughts on the importance of writing evaluation reports with users in mind. Some of our users are legislators, the governor, agency officials, program managers, the public, and the press.

Lessons Learned: Writing effective reports for busy policymakers requires attention to several criteria, such as logic, organization, and message. But in our experience, if your writing doesn’t have clarity, the report will not be used. Clear writing takes time and can be difficult to accomplish. We have examined some reasons why reports may not be written clearly and declare these reasons to be myths:

Myth 1: I have to dumb down the report to write simply. Policymakers are generally sharp individuals with a multitude of issues on their minds and competing time demands. If we want their attention, we cannot rely on the academic writing style. Instead, we write clear and concise reports so that policymakers can glean the main message in a few minutes.

Myth 2: Complex or technical issues can’t be easily explained. When evaluators thoroughly understand the issue and write in active sentences from a broad perspective, they can explain complex and technical issues clearly.

Myth 3: Some edits are only cosmetic changes. Evaluators who seek excellence will welcome feedback on their draft reports. Seemingly minor changes can improve the rhythm of the text, which increases readability and clarity.

Our goal is to write concise, easy-to-understand reports so that end users can make good use of our evaluation work. We put our reports through a collaborative edit process (see our flowchart) to ensure we meet this goal. Two recent reports are products of our efforts:

Equity in Higher Education Funding

Reducing Barriers to Postsecondary Education

Hot Tips

  1. Have a nonevaluator review your draft report.
  2. Use a brief executive summary highlighting the report’s main message.
  3. Use simple active verbs.
  4. Avoid long strings of prepositional phrases.
  5. Pay attention to the rhythm of sentences.
  6. Vary your sentence length, avoiding long sentences.
  7. Write your key points first and follow with need-to-know details.
  8. Put technical details and other nonessential supporting information in appendices.
  9. Minimize jargon and acronyms.
  10. Use numbered and bulleted lists.
  11. Use headings and subheadings to guide the reader.
  12. Use sidebars to highlight key points.

Rad Resources

  • Revising Prose by Richard A. Lanham
  • Copyediting.com
  • Lapsing Into a Comma by Bill Walsh

We’re celebrating Data Visualization and Reporting Week with our colleagues in the DVR AEA Topical Interest Group. The contributions all this week to aea365 come from our DVR members and you may wish to consider subscribing to our weekly headlines and resources list where we’ll be highlighting DVR resources. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice.

·

My name is Penny Black and I am a Public Health Program Evaluator at the University of Wisconsin’s Population Health Institute.  As an evaluator, I am often called upon to present or facilitate discussions of evaluation results.

Rad Resource – Prezi: For years, the standard presentation tool has been Microsoft PowerPoint. Recently, however, I have begun to use Prezi, a presentation software product that uses “zooming” technology to create visually stimulating presentations (www.prezi.com).

For those who haven’t used Prezi (or even heard of it), I thought it might be helpful to have a comparison of industry-standard PowerPoint and newcomer Prezi. Please note that this review is based on my personal experience; others may have different experiences and opinions.

PowerPoint vs. Prezi:

  • Splash factor – PowerPoint: small. Prezi: huge (it’s new; it can’t help but make a cannonball-sized splash).
  • Design elements – PowerPoint: pretty basic; templates provide design elements such as columns for text and boxes for images. Prezi: pretty cool; frames, arrows, zooming, and free-style drawing tools.
  • User interface – PowerPoint: friendly; familiar to MS product users. Prezi: somewhat friendly; the website provides a helpful tutorial and examples.
  • Templates – PowerPoint: available and modifiable. Prezi: available by searching the Prezi public database and using others’ prezis as templates; also modifiable.
  • Progression – PowerPoint: linear. Prezi: non-linear.
  • Ease of development – PowerPoint: check! Prezi: once you know what you’re doing, it’s smooth sailing.
  • Last-minute editing – PowerPoint: easy and quick. Prezi: a little cumbersome.
  • Video capability – PowerPoint: no; it must be minimized to access videos in another application. Prezi: yes; videos can be embedded in the presentation.
  • “Operating system” – PowerPoint: hard drive. Prezi: online (but presentations can be downloaded).
  • Presenter notes – PowerPoint: includes dedicated space for notes and printing options. Prezi: does not have this capability (but I predict this is coming).
  • Printer friendliness – PowerPoint: great. Prezi: not great (but again, I anticipate this getting better as more users request improvements).
  • Motion sickness – PowerPoint: not an issue. Prezi: can be an issue if you’re not careful with transition frequency and depth effects.
  • Cost – PowerPoint: usually sold as part of the Microsoft Office Suite; can cost several hundred dollars. Prezi: free! (See the Rad Resource below for license information.)

Hot Tip: Use a dark background for better visibility in large rooms. Light backgrounds look great on your computer monitor but wash out on a projection screen or whiteboard.

Rad Resource: Student/Teacher Licenses: Prezi offers three licenses: Public (free), Enjoy ($59/yr), and Pro ($159/yr), each license offering progressively more features. There is also a free 30-day trial for the Enjoy and Pro licenses. Students and Teachers can sign up for a free Enjoy license (called EduEnjoy) or a reduced price Pro license (EduPro: $59/yr).

Rad Resource: AEA Coffee Break webinar – Easy as Prezi: A Powerpoint Alternative. Free to AEA members, this webinar presents a great demonstration of creating and using Prezi. Watch it now!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

·

My name is Marion O’Reilly-Stanhauer. I am an independent evaluation consultant focusing on out-of-school time (OST) programs such as after-school enrichment, tutoring, and summer success. The programs I work with are affected by standardized testing in their states: low proficiency scores often prompt calls for additional OST support, while at other times concerns about scores prompt advocacy for redistributing funds away from OST toward core competencies. As such, I find standardized testing to be increasingly, and unexpectedly, important to the context in which I practice.

 

Clipped from: kingsburycenter.org

Rad Resource – The Kingsbury Center at Northwest Evaluation Association (NWEA) Data Gallery: The Data Gallery is a project from the Kingsbury Center that explains and shares their data on United States standardized testing and educational achievement. The Data Gallery is eye-opening on many levels. There are currently two online ‘exhibits.’ Each exhibit includes a video from a lead project researcher, four interactive graphs that provide comparative data, and links to research and commentary to help you learn more.

Exhibit 1 – State of Proficiency: This exhibit focuses on the inconsistency of state standards across states and grades, showing that states with easier standards report higher rates of student mastery.

Exhibit 2 – Achievement Gaps: This exhibit builds on the state of proficiency exhibit, providing profiles showing that a standout student in one state could be sub-par in another. As an example, 20% of the students in Clarkson Elementary (a hypothetical example provided by NWEA) would be deemed proficient in fifth grade mathematics in Massachusetts, while 80% would be deemed proficient in Colorado.

Lessons Learned for Accessibility and Transparency: I believe that the Data Gallery helps parents, teachers, and the public ask fundamental questions about standardized testing. It allows stakeholders to examine their state in relation to others, and ideally prompts questions to policymakers about group and test disparities. I look at what they have developed as an opportunity to consider how I could work with my own clients, mostly nonprofits, to share data with their stakeholders and prompt similar actions.

Lessons Learned for Data Exhibiting: The Data Gallery uses Tableau software to provide the interactive visualizations. I found them useful, and the available manipulations helped me answer some of my own questions (How did my state compare? Are there inconsistencies based on school type?) but not others (How do race and ethnicity enter into the equation? Do the state inconsistencies reflect curriculum foci or contextual differences?). In some cases, it was difficult to understand exactly what was being compared. As an evaluator, I wanted readier access to the source of the data and a fuller explanation of the data source and calculations right on the graphs (rather than accessible only within a PDF of the full report).

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · ·

Our names are Judy Savageau and Terri Anderson. Judy is Associate Professor of Family Medicine and Community Health at the University of Massachusetts Medical School. Terri is Director for Evaluation in the University of Massachusetts Medical School Commonwealth Medicine Center for Health Policy and Research.

Founded in 1999, Commonwealth Medicine’s mission is to apply knowledge to improve health outcomes for those served by public health and human service programs. To support that mission, the Center for Health Policy and Research’s Research and Evaluation Unit (Unit) conducts applied research and evaluation studies on health interventions and policies.

Recently the Unit established a strategic planning group to focus on standards for its evaluation report products, adopting numerous strategies outlined in several Robert Wood Johnson Foundation documents and peer-reviewed evaluation publications.

Hot Tip: The Unit also recently implemented an evaluation journal club as a means of collectively reviewing, with research collaborators, such topics as guiding principles for program evaluation, ethical considerations, methodologic and analytic design options, and stakeholder advocacy. We suggest this combination of resources for evaluators working with health service agencies.

Hot Tip: Evaluation reports are disseminated to many and varied audiences, and the evaluation designs employed are similarly numerous, including mixed-methods designs. The evaluation design and the intended audience(s) for dissemination of findings, coupled with a desire to bring a rigorous academic focus to the work and its outputs, call for a set of principles and standards for a formalized internal review process. Such a process increases the likelihood of publication and improves the marketability of the evaluation team.

Rad Resources: The RWJF has developed a series of reports that guide evaluators from beginning their assessment of programs (e.g., A Practice Guide for Engaging Stakeholders in Developing Evaluation Questions) to writing up results for dissemination (e.g., A Checklist for Evaluators). This RWJF series, with links to its easy-to-read and easy-to-implement individual reports, can be found at http://www.rwjf.org/pr/product.jsp?id=52588.

In addition, numerous peer-reviewed articles have outlined common pitfalls and recommendations in developing evaluation criteria for research in health care settings (e.g., Cohen and Crabtree. Evaluative Criteria for Qualitative Research in Health Care: Controversies and Recommendations. Ann Fam Med 2008;6:331-339).

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Want to hear more from Judy and Terri? Attend their session at AEA’s Annual Conference this November – Evaluation 2011. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

·

I’m Annelise Carleton-Hug, principal evaluator of Trillium Associates, a small evaluation company with a focus on environmental program evaluation. I’m also the Chair of the Environmental Program Evaluation TIG and I invite you to learn more about our TIG by visiting the EPE TIG website.

Hot Tip: Consider this aea365 post a conversation starter that encourages the AEA community to share tips and ideas for reducing the environmental impacts of our work. Personally, I’m much more attuned to living a greener life at home; I know I have a lot to learn about thinking sustainably in how I conduct my evaluation business. Here are a few ideas to get you thinking about how you can reduce your use of resources as well as the amount of waste you create.

Hot Tip – Marketing your business: Investing in a professionally crafted website is not only one of the best ways to market your evaluation services; it is also a sound ecological choice.

Hot Tip – Conducting the Evaluation: Consider how data will be collected. Can you effectively use online surveys rather than paper copies? Consider conducting interviews and focus groups via telephone, Skype, or another conferencing platform to reduce travel costs and environmental side effects.

Hot Tip – Reporting: Do you really need a paper report? Many times I’ve found that clients are satisfied with an electronic file; however, I don’t want to simply transfer the burden of printing the report to the recipient. Rather than producing one long report, it’s a good idea to “chunk” the electronic version of the report into sections (e.g., executive summary, separate reports for each stakeholder group) so that the client can choose to print only what is necessary – one way to script this is sketched below. Another tip: create a PowerPoint with pertinent data summaries so that the client can share the results in that format, forgoing the need for long reports. For times when I do produce a printed report, I make sure to use post-consumer recycled paper and earth-friendly inks, and I note this in the front pages.
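A minimal sketch of that chunking step, assuming the report is delivered as a single PDF and the open-source pypdf library is available; the file name and section page ranges are hypothetical placeholders.

```python
# Minimal sketch: split one report PDF into per-section PDFs using pypdf.
# "full_report.pdf" and the page ranges below are hypothetical placeholders.
from pypdf import PdfReader, PdfWriter

reader = PdfReader("full_report.pdf")

# Hypothetical sections mapped to zero-indexed (first, last) page ranges.
sections = {
    "executive_summary": (0, 2),
    "findings_stakeholder_group_a": (3, 9),
    "appendices": (10, len(reader.pages) - 1),
}

for name, (first, last) in sections.items():
    writer = PdfWriter()
    for page in reader.pages[first : last + 1]:
        writer.add_page(page)
    with open(f"{name}.pdf", "wb") as out:
        writer.write(out)
```

Each stakeholder then receives (and prints) only the file they need.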

Hot Tip – Office materials: When it comes time to upgrade your office equipment, keep your old computer and other electronic equipment out of the waste stream by recycling or donating. Check out these helpful resources from the EPA: http://www.epa.gov/osw/conserve/materials/ecycling/donate.htm

Hot Tip – Recharge your own batteries: Be sure to make time to spend in nature, to remind yourself why it is so vitally important that we reduce our negative environmental impact. Take a hike, work in your garden, ride the rapids, watch the sunset. Make connecting with nature a regular part of your life. In addition to the physical and mental health benefits, you’ll probably be inspired to learn more ways to protect and restore our planet.

The American Evaluation Association is celebrating Earthweek with our colleagues in the Environmental Program Evaluation AEA Topical Interest Group. The contributions all this week to aea365 come from our EPE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org.  aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· · ·
