AEA365 | A Tip-a-Day by and for Evaluators

Data Visualization and Reporting

Hello, and welcome to Cleveland!  I’m Douglas Clay, a longtime Cleveland resident and evaluator with Candor Consulting.  I focus on data analysis and assessment training, mostly with K-12 schools.  I’ve been in Cleveland long enough to remember when the Browns were a winning team, when the shopping centers were steel mills, and when trendy Tremont was the plain old Southside.  The city is on an upswing, so enjoy your stay and check out some of the revitalized sections of my town during the conference.

After you return to work, you might like to investigate the following resource, which I discovered years ago while working with the Cleveland Public Schools.  I was tasked with preparing student testing results for school administrators and teacher leaders ahead of a weekend planning retreat.  I was called down to the superintendent’s office on a Tuesday afternoon, handed a box of state testing reports, and made responsible for turning those stacks of little numbers into usable information for educators in three days.  What I learned that afternoon was that nothing focuses the mind like terror.  Luckily, I was able to draw upon the genius of Edward Tufte and his classic book, The Visual Display of Quantitative Information.

Rad Resource:

Tufte outlines the theory and practice of designing data graphics.  The book gives a detailed analysis of how to display data with precision, accuracy, a high data-ink ratio, and aesthetic appeal.  There are over 250 illustrations showing right (and wrong) ways to display lots of numbers in a small space for maximum understanding.  His theories of graphical excellence are extended in subsequent books on the subject, all worth your time if you need to make sense of complex data sets and engage stakeholders in meaningful discussions.  The quickest way to jump into graphical excellence is Tufte’s one-day course. He tours the country presenting to groups of academics, journalists, financial analysts, and policymakers.  The course includes all his books and is well worth your time.  When a colleague questioned my investment of time in creating graphs for schools by asking, “What, do you have to draw them a picture?” I responded, “Yes, yes, I do.” Tufte shows how to make that picture the most effective one for each type of data.
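To make the data-ink idea concrete, here is a minimal sketch (mine, not Tufte’s) of what raising the data-ink ratio can look like in practice, using Python’s matplotlib and made-up proficiency figures: every element that carries no data, such as the frame, gridlines, and axis ticks, is removed or replaced with direct labels.

```python
import matplotlib.pyplot as plt

# Hypothetical scores for illustration only.
subjects = ["Reading", "Math", "Science", "Writing"]
pct_proficient = [72, 65, 58, 70]

fig, ax = plt.subplots()
bars = ax.bar(subjects, pct_proficient, color="0.4")

# Raise the data-ink ratio: erase ink that carries no data.
for side in ("top", "right", "left"):
    ax.spines[side].set_visible(False)
ax.set_yticks([])  # direct labels on the bars replace the y-axis

# Label each bar directly instead of forcing eye travel to an axis.
ax.bar_label(bars, fmt="%d%%")

ax.set_title("Students scoring proficient, by subject")
plt.show()
```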

The following graphics are from work I’ve done recently, and I hope they illustrate Tufte’s principles.  The first displays 256 separate data points to show test scores over time.  It allows a school to see the overall direction as well as how its progress deviates from the average.  What the school viewed as a two-year plateau after years of rising scores is contrasted with similar neighboring districts falling off a cliff.

Performance Index 2001-2016 Solon, Neighboring and Similar Districts
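The original chart isn’t reproduced here, but the underlying technique, drawing one focal series in full ink against a muted field of comparison series, is easy to sketch in matplotlib. The data below are simulated stand-ins for the real performance-index figures (15 comparison districts plus one focal district over 16 years gives 16 × 16 = 256 plotted points):

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical performance-index series for illustration: one
# focal district plotted against a field of comparison districts.
years = np.arange(2001, 2017)
rng = np.random.default_rng(1)
comparison = [95 + np.cumsum(rng.normal(0.5, 1.5, len(years)))
              for _ in range(15)]
focal = 95 + np.cumsum(rng.normal(0.8, 1.0, len(years)))

fig, ax = plt.subplots()
# Muted gray lines give context without competing for attention.
for series in comparison:
    ax.plot(years, series, color="0.8", linewidth=1)
# The focal district is the only series drawn with full ink.
ax.plot(years, focal, color="black", linewidth=2, label="Focal district")

ax.set_ylabel("Performance index")
ax.legend(frameon=False)
plt.show()
```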

This graphic displays a high school’s recent graduates’ college enrollment.  The National Clearinghouse dataset follows high school graduates for years as they enroll in and complete college degrees, which allows for measurement of persistence and of transfers between schools.  Tufte will teach you why this graph is more effective as a donut than as a pie chart, and how it lets decision makers “see” the data without poring over tables of numbers.

Five-year average of first college enrollment after graduation
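As an illustration of the donut idea (with made-up shares, not the Clearinghouse figures), a pie becomes a donut in matplotlib by shrinking the wedge width, which frees the center for a single headline number:

```python
import matplotlib.pyplot as plt

# Hypothetical enrollment shares for illustration only.
labels = ["4-year public", "4-year private", "2-year", "Not enrolled"]
shares = [45, 20, 15, 20]

fig, ax = plt.subplots()
# A wedge width smaller than the radius turns the pie into a donut.
ax.pie(shares, labels=labels, autopct="%1.0f%%", pctdistance=0.8,
       wedgeprops={"width": 0.4}, startangle=90)
# The hole can carry the headline figure decision makers care about.
ax.text(0, 0, "80%\nenrolled", ha="center", va="center", fontsize=14)
ax.set_title("First college enrollment after graduation")
plt.show()
```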

We’re looking forward to the fall and the Evaluation 2018 conference all this week with our colleagues in the Local Arrangements Working Group (LAWG). Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.

Hi, I’m Sara Vaca, an independent evaluator working in international development evaluation. I help Sheila curate this blog and contribute on Saturdays, and I want to share with you a short reflection on something that has been bugging me lately.

Conducting an evaluation is a process: a rich, participatory, multi-partner, multidimensional process that later has to be condensed, distilled, and gathered into a report (a product) that is unidimensional, static, and written by the evaluation leader or team. I realized some time ago that those two parts of the assignment, equally important, are soooo different! And this represents a challenge in my practice, as I wonder: how can I make the report better represent the process?

Cool Trick: First, of course, try to make the evaluation process as technically sound and as adapted to the commissioners’ needs as possible. If the process is not good (see this related post), nobody is likely to care about the report. In general, I enjoy conducting the evaluation process very much, and the feedback about the process is usually good or very good.

Ok, but what about the report?

That is when the second phase starts, the one that in my experience is less rewarding.

No matter what you get as findings, the internal validity of the process has so far earned me acceptance during the debriefing and validation phase. However, the evaluation users are waiting to see it all in a document, clearly stated and articulated, so they can read it and assimilate what you saw throughout the process. And at that point, I notice a slight change of attitude in them, and consequently in me, as I have to adapt.

And I totally understand: the process is a soft activity whose traces are not easy to see, but the report is the hard artifact, and what it says stays there forever… Still, I find it very interesting how two parts of the same assignment differ in so many ways.

Hot Tip: The only thing I’m doing so far is to make the materials in the debriefing presentation very consistent with the evaluation report. How do I do that? I create summaries and visuals for my PowerPoint presentation (which I share with them when I leave the country) that are later reproduced quite similarly in the draft report.

However, looking forward to your ideas and tips… How do you reconcile these two equally important parts of evaluation? Thanks!

Hello, we’re Anthony Oboh, an I/O Psychology doctoral candidate at Keiser University and consulting research intern, and Sy Islam, Principal Consultant with Talent Metrics, a data-driven consulting firm. It is often said that “a picture is worth a thousand words,” which means visuals can help people understand data more effectively. At Talent Metrics, we help professional organizations evaluate the effectiveness of their meetings, and visualizations support that work. Data visualization is an easy way for people to understand and interpret information; the simpler and more dynamic the visualization, the easier it is to interpret. Professional organizations often evaluate their meetings on content, relevance, and the satisfaction of participating members. Such data can be presented in simple yet powerful visualizations that help these organizations evaluate their events more effectively.

Hot Tip: Use data visualization to report multiple pieces of data in a concise and simple manner.

A data visualization like the stacked bar chart below is an easy way of presenting and communicating information about participants’ experience of a meeting or event. Most meeting feedback is collected after the meeting and can be presented in a simple visual like this one. However, more meaningful findings can be discovered by using longitudinal data: professional organizations can evaluate how effective different subjects or meeting topics are over time.

Stacked Bar Indicating Members’ Reactions/Outcome of a Meeting

The data come from a survey conducted at the end of a networking event, used to understand how well the event went and, in particular, the impact it made on attendees.
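A stacked bar like this is straightforward to build. The sketch below uses matplotlib with invented survey counts and criterion names (the post’s actual data are not shown here), stacking each response category on top of the previous one:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical post-meeting survey counts for illustration.
topics = ["Content", "Relevance", "Networking"]
responses = {
    "Agree":    np.array([18, 15, 20]),
    "Neutral":  np.array([6, 8, 4]),
    "Disagree": np.array([2, 3, 2]),
}

fig, ax = plt.subplots()
bottom = np.zeros(len(topics))
# Stack each response category on top of the previous one.
for label, counts in responses.items():
    ax.bar(topics, counts, bottom=bottom, label=label)
    bottom += counts

ax.set_ylabel("Number of respondents")
ax.set_title("Members' reactions to a meeting, by criterion")
ax.legend()
plt.show()
```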

You can use a simple line graph to track these same results over time. Review the line graph below: from it we can evaluate the effectiveness of meetings on these criteria over time and curate appropriate content for future professional meetings.

Event evaluation responses
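Here is a minimal sketch of such a longitudinal view, again with invented ratings and meeting labels rather than the authors’ data:

```python
import matplotlib.pyplot as plt

# Hypothetical mean ratings (1-5) across four quarterly meetings.
meetings = ["Q1", "Q2", "Q3", "Q4"]
ratings = {
    "Content":      [3.8, 4.0, 4.2, 4.5],
    "Relevance":    [3.5, 3.9, 3.7, 4.1],
    "Satisfaction": [4.0, 4.1, 4.3, 4.4],
}

fig, ax = plt.subplots()
# One line per evaluation criterion, tracked meeting to meeting.
for criterion, values in ratings.items():
    ax.plot(meetings, values, marker="o", label=criterion)

ax.set_ylim(1, 5)
ax.set_ylabel("Mean rating (1-5)")
ax.set_title("Event evaluation responses over time")
ax.legend()
plt.show()
```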

Lessons Learned: Using longitudinal data, as in the line graph above, is extremely helpful for understanding how audience members perceive the content of professional networking events.

Rad Resources: Check out these resources to learn more about data visualization!


Hello from Katherine Bergmann and Eryn Collins from Choices Coordinated Care Solutions, a national non-profit organization that supports individuals with significant behavioral and emotional challenges in community settings. We accomplish this by using evidence-informed methods that build on the strengths of those individuals and everyone involved.

Data drives our focus on effectiveness and efficiency at Choices, and our Applied Research and Evaluation Team translates that data into consumable information that informs decision-making. We not only create detailed technical reports; we also develop infographics to present our analyses quickly and clearly for lay audiences.

We use several resources to better highlight actionable outcomes when presenting this information.

Rad Resource:  Piktochart, an infographic design program, has been essential to developing our graphic materials. The website offers free and low-cost templates to kickstart your creative efforts with no professional design experience required.

Rad Resource: Additionally, we use The Noun Project for icons and symbols to add visual interest and help lead consumers through the layout of our graphic presentations. The site lets you choose between royalty-free and Creative Commons licensing, so it can fit within any budget.

Rad Resource: Since our goal for these infographics is content in plain language, the Hemingway App is a great resource for making our writing clearer and more concise. This free resource highlights problem areas in our content, such as complicated sentence structures, redundant vocabulary, and use of the passive voice.

In a recent example, Choices was tasked with providing a summary of outcomes to an influential stakeholder in one of our service areas. Using the resources listed above, Choices created a visually appealing infographic that outlined the clear improvement youth and families were experiencing within our program (Figure 1). We accompanied statistics with graphic elements to clarify the patterns and trends in our analyses. This output illustrates how evaluators can use graphics to disseminate results that meet the information needs of multiple audiences. Creating a consumable infographic helped our stakeholder readily understand our program’s success and served as a resource to inform future decision-making.

Figure 1: Infographic created for an influential stakeholder.


Hi, I’m Sara Vaca, an independent consultant, helping Sheila curate this blog and an occasional Saturday contributor. I haven’t been an evaluator for long (about 5 years now), but I have facilitated or been part of 16 evaluations, so I am starting to get over the initial awe of the exercise and to attend to other dimensions beyond just “surviving” (that is: understanding the assignment, agreeing on the design, leading the data collection process, simultaneously doing the data analysis, validating the findings, debriefing the preliminary results, and finally digesting all this information to package it nice and easy in the report).

I would like to think that I incorporate (or at least try to incorporate) elements of Patton’s Utilization-Focused Evaluation during the process, but until recently my role as evaluator ended with the acceptance of the report (which is usually exhausting and challenging enough). I took no concrete actions once I had delivered it, partly because a) follow-up was not specified in the Terms of Reference (or included in the contracted days), and b) I usually didn’t have the energy or clarity to go further after the evaluation.

However, I’ve understood since the beginning of my practice that engaging in evaluation use is an ethical responsibility of the evaluator, so I’ve recently started making some tentative attempts to engage in it. Here are some ideas I have just begun implementing:

Cool Trick: Include a section in the report called “Use of the evaluation” or “Use of this report,” so you (and they) start thinking about the “So what?” once the evaluation exercise is finished.

Hot Tip: Another thing I did differently was to elaborate the Recommendations section in a non-prescriptive manner. Usually I would analyse all the evaluation’s ideas for improvement and prioritize them according to their relevance, feasibility, and impact. This time, I pointed out the priority areas I would focus on and listed ideas for improving each area, without spelling out exactly what to do. Then I invited the organization to discuss and make those decisions internally, perhaps forming internal teams to address each recommendation and so gain more ownership.

Although clients have occasionally reached out months or years after an evaluation for additional support, this time I proactively offered my out-of-contract commitment to support them, in case they think I could be of help later down the road.

Rad Resource: Doing proactive follow-up. I’ve read about this before but haven’t yet done it systematically. So, I will set a reminder for 3-6 months after the evaluation and check in on how they are doing.

Hot Tip: I just published a post on understanding Use and Misuse of Evaluation (based on this article by Marvin C. Alkin and Jean A. King), which helped me recognize some dimensions of use.

As you can see, I’m quite a newbie at introducing mechanisms and practical habits to foster use. Any ideas are welcome! Thanks!

 


Hi all! My name is Gaelyn West and I serve as the State Government Representative and Board Member for the Southeast Evaluation Association (SEA). My work experience includes program evaluation, strategic planning, and grant development in state government.

One of the most important tasks that evaluators face is relaying their evaluation findings to a lay audience. Most often, evaluators do this with charts and graphs. However, infographics are rising to the forefront as an innovative way to display data quickly and clearly. An infographic is a visual representation of data that can help tell the story of your evaluation in a creative and compelling way.

Hot Tips:

  1. The data used in your infographics should be reliable, timely, and relevant. Your data should always come from a credible source, should represent the most recent statistics available, and should not contain any distortions. In any case, your data should stand up to tests of reliability and validity.
  2. Remember the six basic principles of design:
  • Unity / Harmony – proximity, similarity, continuation, repetition, rhythm
  • Balance – symmetry, asymmetry, radial
  • Hierarchy – trees, nests, weights, timeliness
  • Scale / Proportion – size, ratio, divisions
  • Dominance / Emphasis – highlight, color, size
  • Similarity and Contrast – light and dark, bold and italic
  3. The purpose of an evaluation is to create an objective and compelling story that can help judge the merits of a program, improve a program, and/or generate new knowledge. Infographics can illustrate information about the program’s context, implementation, need, or outcomes/impact. Your infographics should highlight the story you want to tell about your evaluation findings.
  4. “Shareability.” Evaluators can customize infographics to fit any shareable platform, such as reports, presentations, posters, and social media marketing. The list goes on! It is important to know the communication method your audience will most likely use and prefer. Will a one-page memo be the most effective way to share your evaluation findings? Would an interactive presentation, a video, or an online article work better? You can easily tailor infographics to fit the chosen platform.

Rad Resources:

  • Microsoft Excel: A common and versatile spreadsheet software for creating visuals.
  • Microsoft PowerPoint: A common and easy to use presentation software.
  • Venngage: A simple infographics creator with templates and customization options.
  • Canva: A design program for creating Web or print visuals.
  • Infogram: A drag-and-drop infographics creator that offers flexibility and independence in designing visuals.
  • Piktochart: A good beginner infographics creator in which users can customize set templates.

 

The American Evaluation Association is celebrating Southeast Evaluation Association (SEA) Affiliate Professional Development Week with our colleagues in the SEA Affiliate. The contributions all this week to aea365 come from SEA Affiliate members.

Hello from two scholars and two coasts! We are Mike Osiemo Mwirigi, MS, and Glen Acheampong, MPP. During our GEDI program year, we learned that evaluators and stakeholders are increasingly using visuals to present data. Data visualization pioneers in evaluation have pointed out that a good visual can make evaluation results more user-friendly. Effective visuals capture people’s attention, substitute for text, and help reduce the lethargy of reading long reports. Lastly, they can tell a more memorable story.

We noticed that when talking about data visualization, cultural competency rarely comes up. Cultural competency in evaluation is the ability to engage with diverse stakeholders to “include cultural and contextual dimensions important to the evaluation” (American Evaluation Association, 2011). Data visuals can be interpreted differently across cultures; as a result, people interpret and react differently to the same stimulus.

The documentary West and East, Cultural Differences discusses how Easterners (Chinese, Japanese, and Korean) and Westerners (Americans and Europeans) are tuned to interpret visual information differently. That insight informs the tips below.

Hot Tip 1: Begin with a plan.

A data visualization can lose the intricacies of the story it’s telling. Further, some data visuals are complex and hard to interpret without an explanation. Evaluators should consider data visualization from the onset of the evaluation design so they can work out exactly what each image should convey.

Hot Tip 2: Check and reflect on stakeholders’ interpretations of data visuals.

When applying data visualization guidelines or rules of thumb, we must remember that they are not universal; what works for one population might be counterproductive for another. This is true of the meanings constructed around colors, shapes, and symbols. Instead, explore what stakeholders need and can digest.

Rad Resources:

American Evaluation Association. (2011). American Evaluation Association Public Statement on Cultural Competence in Evaluation. Fairhaven, MA. Retrieved from www.eval.org.

EBS. (2012, December 05). West and East, Cultural Differences. Retrieved July 06, 2017, from https://www.youtube.com/watch?v=ZoDtoB9Abck&index=302&list=LLaTQQZHp4uDV7ubqPoaq4NQ

Emery, A. K., & Evergreen, S. (2014). Data Visualization Checklist. http://stephanieevergreen.com/dataviz-checklist/

Links to West and East, Cultural Differences

Part 1: https://www.youtube.com/watch?v=ZoDtoB9Abck&index=302&list=LLaTQQZHp4uDV7ubqPoaq4NQ and

Part 2: https://www.youtube.com/watch?v=709jjq8qk0k&t=4s

The American Evaluation Association is celebrating Graduate Education Diversity Internship (GEDI) Program week. The contributions all this week to aea365 come from AEA’s GEDI Program and its interns. For more information on GEDI, see their webpage here: http://www.eval.org/GEDI

Hello! We are Carolyn Camman, Christopher Cook, Andrew Leyland, Suzie O’Shea, and Angela Towle of the UBC Learning Exchange, which is a little bit of the University of British Columbia in the heart of Vancouver’s Downtown Eastside (DTES). It’s a bridge for mutual understanding and learning between the University and a neighbourhood rich in community, art, and history, but whose residents face challenges, including homelessness, poverty, gentrification, and stigma. The UBC Learning Exchange acts as a member of the community, giving back to residents through community-based programming alongside experiential learning opportunities for students and support for community-based research.

The Learning Lab supports members of the DTES community in engaging in activities and scaling up their involvement by offering creative and educational activities in a flexible, low-barrier format. In keeping with the arts-based principles of the Learning Lab and the community engagement mission of the Learning Exchange, when it came time to make the results of a recent evaluation accessible to a community audience, the answer was obvious: put on a show!

Voices UP! is a theatrical performance co-written and co-performed with the community members who contributed to the original evaluation. It not only communicated evaluation results, but deepened the evaluation itself. Through writing and performing the play, the cast learned more about evaluation and shared new stories and insights. Over its four-performance run from Spring 2016 to Fall 2017, the show evolved and grew.

Hot Tip: There’s growing interest in using arts-based methods in evaluation. Live theatre is a dynamic and engaging approach that encourages people to connect with findings viscerally and immediately as part of a dialogue between performers and audience. In post-performance talk-back, one person said, “It was neat to hear the participants reflecting on what they had just done as well as what it meant to them to be a part of it.” Another commented that “seeing” the impact of the program was more persuasive than reading about it from a report or grant application.

Lessons Learned: A performance doesn’t have to be polished or “professional” to be effective. Community members speaking in their own words is powerful and there are many creative techniques (like puppets!) that can bring evaluation findings to life. Having a conversation with the cast and giving introductions to audiences before performances about different ways theatre can “look” helped set appropriate expectations.

Rad Resources: To keep Voices UP! going even after the curtains came down, the Learning Exchange staff and cast of program patrons came together to tell their story once more, this time in a comic book. You can download this resource online for free: http://learningexchange.ubc.ca/voicesupcomic It was created using the same participatory process as the original performance and tells the story of how Voices UP! came to be, with tips and insights for anyone interested in using theatre methods to tell their evaluation story. Look up “reader’s theatre” and “ethnodrama” for more ideas about turning evaluation and research into plays.

The cast and creators of Voices UP! Photo credit: The UBC Learning Exchange


Greetings! I’m Rose Hennessy, Adjunct Professor in the Helen Bader School of Social Welfare and a Doctoral Student at the Joseph J. Zilber School of Public Health, both at the University of Wisconsin-Milwaukee. In teaching Program Evaluation to MSW students, I’ve had the wonderful opportunity to collaborate with Jennifer Grove and Mo Lewis at the National Sexual Violence Resource Center (NSVRC).

In the short duration of a semester, it can be difficult to give students the opportunity to practice engaging with stakeholders and translating evaluation findings. In conjunction with NSVRC staff, we proactively identified recent research articles of interest to sexual violence prevention practitioners. Busy professionals frequently lack the time or access to read recent publications, but in academia we can play a role in getting current research out in digestible ways! Students are assigned articles and asked to create infographics of key themes and implications to meet stakeholder needs.

Lessons Learned:

  • Students learn a new technology best with hands-on learning. A free infographic program is taught to the class in a computer lab where students can learn and practice. Walking through the skills step-by-step with a guided handout helps them master both the new skill and the program.
  • Assignment scaffolding models the stakeholder process. Four different assignments are used for the project, allowing for feedback, revisions, and reflection. Students review the NSVRC website for content, design, and values. They critique their article to pull content specific to the stakeholder, create and present the infographic, and use class feedback to reflect and create revisions.
  • Presenting infographics allows for shared learning of evaluation concepts. Students review creative ways to share qualitative and quantitative findings, examine different study designs, discuss how to present null findings, explore various visualization options, and gain experience utilizing critical feedback from peers.
  • More time is needed to promote culturally responsive evaluation. Research with diverse populations was intentionally chosen for review, but many students lack prior experience translating findings across cultures. Providing readings to assist students, setting up ground rules, and allowing more time for reflection and discussion are necessary to help students process evaluation results in a culturally responsive manner. Conversations also highlighted the need to differentiate between collaborative approaches and culturally responsive evaluation, and new readings have been identified for future courses.

As an instructor, I find that a collaborative project with NSVRC gives students the opportunity to learn with real-world applications. Motivation ran high to create projects that could be used by a national leader in the field, and students leave the class with new skills in the translation of research, study design, visualization, and dissemination!

The American Evaluation Association is celebrating ¡Milwaukee Evaluation! Week with our colleagues in the Wisconsin statewide AEA Affiliate. The contributions all this week to aea365 come from our ¡Milwaukee Evaluation! members.

Hello, AEA365 readers! We are Antonina Rishko-Porcescu (Ukraine), Khalil Bitar (Palestine/Germany), and Bianca Montrosse-Moorhead (USA) from EvalYouth, a global, multi-stakeholder partnership that promotes young and emerging evaluators to become future leaders in evaluation.  Antonina leads an EvalYouth task force; Khalil is Vice-Chair of EvalYouth; and Bianca is Co-Chair of the network.  We are excited to share what we have learned from EvalYouth’s use of visualization when communicating with our young audiences of evaluators.

We communicate a lot with a broad range of evaluators, especially young and emerging evaluators, and with young people from around the world.  In this ever-changing, fast-paced world, we understand that words alone are not enough.  Information must be clear, direct, coherent, and compelling.  One question we have explored is: how should we disseminate information and evidence in a way that draws novice evaluators in and presents information in a meaningful way?

Examples:

  • Summarizing data collected through an international survey with young and emerging evaluators (e.g., here and here).
  • Summarizing results of received applications for the first EvalYouth International Mentoring Program (e.g., here).
  • Transforming the Network’s original logo to highlight special events and programs (e.g., here and here)

Hot Tips and Cool Tricks:

  • Visualization makes complex data easier to understand, but it is not easy to create good visualizations; it involves hard work and research. Do your homework.
  • Try to strike a balance between pictures and words. An infographic should include valuable information, not just cool graphics (though those help too).
  • Use colorful designs and, when appropriate, humor. Doing so invites readers, especially youth and young and emerging evaluators, to engage with the information.
  • Work collaboratively. Others bring fresh perspectives and new ideas, and they often feel more ownership of the project after it concludes.  Ownership means there is an excellent chance they will share it with relevant contacts and on their social media channels afterward.
  • It is not enough to make a great infographic and stop there. Disseminate your work widely through mailing lists and social media outlets, where people will see your message and, very importantly, engage with it.

Lessons Learned:

  • Data visualization used well is a powerful communication tool. It can distill complex ideas and big data into just a few items on an infographic.
  • Working with a team of people from many backgrounds or countries helps a lot. What might be right, appropriate, and trendy in one region or culture could be the opposite in another.  Diverse team perspectives can identify and overcome such issues.
  • The ethics of data visualization are also important to consider. A well-done data visualization is a powerful tool! As Uncle Ben in the Spider-Man series said, “with great power comes great responsibility.”  Care should be taken to ensure that the visualization’s message is accurate, valid, coherent, and just.

Want to learn more about EvalYouth? Follow EvalYouth on social media: Facebook, Twitter, LinkedIn, and YouTube. Or, write to us: evalyouth@gmail.com.
