AEA365 | A Tip-a-Day by and for Evaluators

CAT | Data Visualization and Reporting

I’m Lisle Hites, Chair of the Needs Assessment TIG and Director of the Evaluation and Assessment Unit (EAU) at the University of Alabama at Birmingham (UAB). Today’s posting is about the use of data visualization to enhance your needs assessment.

Recently, my team worked with a state agency to help identify potential sites for a pre-k development initiative. We used ArcGIS 10.2 Geographic Information Systems (GIS) technology to geocode and map all child care centers and grant applicants within the state. These data were then displayed on an interactive, web-based map built on ESRI’s ArcGIS Online platform. Supplemental data on the percentage of people living in poverty, drawn from the U.S. Census Bureau’s American Community Survey, were added to the map to support policy makers’ decision making.

Displaying these multiple data sets visually allowed state representatives to see the highest concentrations of four-year-olds in the state, as well as potential gaps in service coverage by existing pre-k programs. In other words, these data were used to reduce the potential for duplication of services and to identify areas of greatest need.
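For readers who want to experiment with this kind of map without an ArcGIS license, here is a minimal sketch of the same idea in open-source Python (geopy for geocoding, folium for the interactive web map). This is not the ArcGIS 10.2 / ArcGIS Online workflow the team actually used, and the facility names and addresses below are placeholders.

```python
import folium
from geopy.geocoders import Nominatim

# Placeholder facilities; in practice these would come from the state's
# child care center and grant applicant lists.
centers = {
    "Example Child Care Center A": "1720 2nd Ave S, Birmingham, AL",
    "Example Child Care Center B": "100 N Union St, Montgomery, AL",
}

geolocator = Nominatim(user_agent="needs-assessment-demo")
fmap = folium.Map(location=[32.8, -86.8], zoom_start=7)  # roughly centered on Alabama

for name, address in centers.items():
    loc = geolocator.geocode(address)            # geocode each facility
    if loc:
        folium.CircleMarker([loc.latitude, loc.longitude],
                            radius=6, popup=name).add_to(fmap)

# A poverty layer (e.g., folium.Choropleth fed with ACS data by county)
# could be added here to support the gap analysis described above.
fmap.save("childcare_map.html")                  # shareable interactive web map
```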

Lessons Learned:

  1. Needs assessments can draw on existing data in new and innovative ways.
  2. While state representatives had ideas of what they wanted to know, data visualization led them to refine their questions and identify additional sources of information to support their “data-driven” decision.
  3. Hardcopy paper maps of each county did not provide enough geographic detail about childcare facilities. To make the most of the large amount of disparate data, an online interactive mapping platform was critical to the project’s success.

Rad Resources:

ArcGIS Online (n.d.). The mapping platform for your organization. ESRI.

ArcNews (2013, Summer). ArcGIS 10.2 brings transformational capabilities to users. ESRI.

Azzam, T., & Robinson, D. (2013). GIS in evaluation: Utilizing the power of geographic information systems to represent evaluation data. American Journal of Evaluation, 34(2), 207-224. doi: 10.1177/1098214012461710

Evergreen, S. (2013). Presenting data effectively: Communicating your findings for maximum impact. Sage Publications.

United States Census Bureau (2015). American Community Survey.

The American Evaluation Association is celebrating Needs Assessment (NA) TIG Week with our colleagues in the Needs Assessment Topical Interest Group. The contributions all this week to aea365 come from our NA TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

My name is Eva Guenther, and I am a Project Manager for a for-profit, employee-owned US Government (USG) contractor in Washington, DC. I am responsible for the successful implementation of USG-funded projects in the Middle East and North Africa (MENA). While my heart beats for evaluation, it is not the primary focus of my role and responsibilities. Here are a few tips on how I support the local project teams from afar and ensure that excellent data-driven decision making, monitoring, and evaluation happen on our projects.

Hot Tip: Establish a culture of data-driven decision making. Evaluation is often an afterthought, treated as if it were the sole responsibility of the evaluation team rather than the entire project team. I try to set the expectation from the start that we will measure the outputs and outcomes of all activities, so that the team wants to know these data and proactively seeks them.

Hot Tip: Get involved early and set dates for revisits. After award, I help operationalize our evaluation plan with the local team, work with them on regular data quality assurance activities, and review evaluation data in regular reports to the client(s).

Hot Tip: Use web-based tools for data sharing. All project team members, who are often dispersed, should know the outputs and outcomes of project activities, and web-based tools make that easier. The company I work for has a proprietary project management tool with an evaluation module that makes capturing and sharing evaluation data easy. This lets everybody check progress quickly, including me from the US.

Hot Tip: Use data visualization. USG-funded projects lean heavily on narrative-centric reports. I help our implementation teams on the ground tell the project’s story better through data visualization and web-based interactive maps. This has led to deeper conversations because the information is more accessible. It has also led to outcome and impact data being shared outside the immediate project circle, since understanding them no longer requires deep background knowledge.

Rad Resource: Get inspired by others. There are a lot of brilliant minds out there, and looking at others’ infographics has given me great ideas for how to explain a project output or outcome better with the help of an image. I have drawn lots of inspiration from these blogs:

http://dailyinfographic.com/

http://www.visualizing.org/

http://www.coolinfographics.com/

Try out one of these free resources for creating infographics: http://www.creativebloq.com/infographic/tools-2131971

Lessons Learned: Keep up to date with new developments in data collection. Mobile data collection, sentiment analysis, and other approaches and tools can be helpful for local teams, where resources are often scarce.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings, I’m Ann K. Emery. I consult, instruct, and write on all things data visualization. One of the most common questions I receive during workshops and webinars is, “Ann, where do I start?”

Cool Tricks: As you’re overhauling your visualizations, these three edits are guaranteed to give you the biggest bang for your buck.

Remove unnecessary ink.

I immediately begin deleting or lightening everything without a purpose: the border, the grid lines, and the tick marks. Visualization guru Edward Tufte describes this strategy with his data-ink ratio: we intentionally remove any ink that isn’t directly related to the data itself. These edits ensure our viewers will focus attention where we need it: on the actual patterns, not on the software program’s outdated and clunky lines.
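For readers who chart in Python rather than Excel, here is a minimal matplotlib sketch of the same cleanup; the response-rate numbers are invented for illustration.

```python
import matplotlib.pyplot as plt

quarters = ["Q1", "Q2", "Q3", "Q4"]
response_rate = [62, 68, 71, 77]      # invented numbers for illustration

fig, ax = plt.subplots()
ax.bar(quarters, response_rate, color="steelblue")

# Delete the ink that carries no data: border, grid lines, tick marks.
for spine in ax.spines.values():
    spine.set_visible(False)
ax.grid(False)
ax.tick_params(length=0)              # keep the labels, drop the tick marks

plt.show()
```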


Read Muted Grid Lines: Small Details, Big Difference to explore before/after remakes in more detail and watch Removing Tick Marks and Grid Lines for a how-to lesson in Excel.

Customize your color palette.

Next, I swap my software program’s random color scheme for a customized palette.

  • As a consultant, I’m typically following my client’s branding. I scroll through the organization’s website, look at their logo, and skim publicly available reports that were created with the aid of a graphic designer.
  • In my former role as an internal evaluator, I would match my chart colors to my organization’s own logo and branding.
  • When I create graphs through my role with AEA’s Data Visualization and Reporting Topical Interest Group, I match AEA’s exact shade of burgundy—RGB code 149:8:4—rather than sloppily choosing any old shade of red.


Editing color codes is simple. Newer versions of Excel on both PCs and Macs have built-in eyedropper tools. If you’re using an older version of Excel, follow the Uganda Evaluation Capacity Development Project’s step-by-step instructions for using a free tool called Instant Eyedropper.
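If you build charts in code rather than Excel, the same principle applies: capture the brand color once and reuse it everywhere. A small sketch using the burgundy RGB values above (the bar values are invented):

```python
import matplotlib.pyplot as plt

# AEA's burgundy from the post: RGB 149:8:4
rgb = (149, 8, 4)
hex_color = "#{:02x}{:02x}{:02x}".format(*rgb)   # -> "#950804"
scaled = tuple(v / 255 for v in rgb)             # matplotlib also accepts 0-1 RGB tuples

fig, ax = plt.subplots()
ax.bar(["2013", "2014", "2015"], [12, 18, 25], color=hex_color)  # invented values
plt.show()
```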

Write a descriptive title and subtitle.

Today’s viewers want and deserve brevity, everyday language, and text that describes something about the actual finding—so that even the quickest report-skimmers will walk away having digested and retained the report’s contents. Bonus points: Select an important word or two from the title and make that word stand out (“chocolate” is in bold text and matches the graph’s dark brown color scheme). Then, add a one- or two-sentence subtitle (“Cookie dough was second most popular flavor”).
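Here is a rough matplotlib sketch of that title treatment: a plain-language headline with one emphasized, color-matched word, plus a smaller subtitle. The flavor numbers and text positions are illustrative only.

```python
import matplotlib.pyplot as plt

flavors = ["Chocolate", "Cookie dough", "Vanilla", "Strawberry"]
votes = [45, 30, 15, 10]                         # invented numbers
dark_brown, light_tan = "#4b2e2b", "#cbbba5"

fig, ax = plt.subplots()
# Color-match the emphasized category to the emphasized word in the title
ax.bar(flavors, votes, color=[dark_brown, light_tan, light_tan, light_tan])

# Headline with one bolded, color-matched word (positions are eyeballed)
fig.text(0.12, 0.96, "Chocolate", fontsize=13, fontweight="bold", color=dark_brown)
fig.text(0.28, 0.96, "was the most popular flavor", fontsize=13)
# One- or two-sentence subtitle in smaller, lighter text
fig.text(0.12, 0.91, "Cookie dough was the second most popular flavor.",
         fontsize=10, color="gray")

fig.subplots_adjust(top=0.85)                    # leave room for the two-line title
plt.show()
```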


Rad Resource: Want to master these skills and more? I’m leading a pre-conference workshop at the Eastern Evaluation Research Society’s conference in April 2015. Bring your laptop so we can build these charts and more from scratch. See you there!

The American Evaluation Association is celebrating Eastern Evaluation Research Society (EERS) Affiliate Week. The contributions all this week to aea365 come from EERS members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello, we are William Faulkner (i2i Institute) and João Martinho (PlanPP), writing here about our own poster design process, which apparently worked well enough to impress some of the judges at AEA 2014. We were guided by a simple principle: understand what the target audience considers relevant and where this overlaps with what we want to communicate.


 

(Here’s a larger pdf of this poster: Network Analysis on a Shoestring_AEA2014)

We organized the content in three blocks:

  1. Orientation: what are we talking about and for whom is it relevant?
    • Who are you? The target audience – for whom we thought the content would be useful – because poster content is never relevant for everyone.
    • How do you collect data? We wanted to at least orient the audience to the range of types of data which could be fed into this tool.
    • Why would you use this tool? This box attempts to correct two common misconceptions: (a) that network analysis is only useful to map relationships between people, and (b) that producing a network visualization is the end of the process. The latter misconception inspires complaints that network analysts often produce attractive visualizations with little to no interesting interpretations.
  2. Main Message: what are the basic steps of using this tool? This block leads the reader through a tutorial on the main steps of using NodeXL, emphasizing simplicity: in four steps, NodeXL transforms raw data into a visualization. The section should give a solitary reader enough information on its own, but during the poster session at AEA one of the authors also presented with a laptop so that anyone interested could play with a real dataset themselves, lowering some of the mental entry barriers to using the software (a rough code analogue of the four steps appears after this list).
  3. Examples/inspiration: The final block presents concrete examples that illustrate the insights network visualization alone (even without calculating statistics) can supply.
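For readers who do not work in Excel, the four-step flow described above has a rough open-source analogue in Python. The sketch below uses networkx and matplotlib rather than NodeXL itself, and the edge list is invented purely for illustration.

```python
# A rough analogue of the four-step NodeXL workflow, using networkx
# and matplotlib instead of the Excel-based NodeXL template.
import networkx as nx
import matplotlib.pyplot as plt

# Step 1: raw relational data as an edge list (who connects to whom)
edges = [("Alice", "Bob"), ("Bob", "Carol"), ("Carol", "Alice"),
         ("Carol", "Dan"), ("Dan", "Erin")]

# Step 2: build the graph object
G = nx.Graph(edges)

# Step 3: compute a layout and draw the network
pos = nx.spring_layout(G, seed=42)
nx.draw_networkx(G, pos, node_color="lightsteelblue", edge_color="gray")
plt.axis("off")

# Step 4: go beyond the picture - simple statistics help with interpretation
print(nx.degree_centrality(G))
plt.show()
```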


Hot Tip: Focus on content first. The choice of a design tool should come after you can clearly articulate what you want to communicate and how this information is relevant to the target audience. Think about the gap you are trying to fill in the reader’s mind, and research how others have communicated similar content. Second, as the design comes together, be strict about following the standard recommendations for visual communication (less text, leave empty space, give the reader cues about where their eye should go next). Once you have thoroughly thought through these aspects, the design should pretty much draw itself.

Rad Resource: NodeXL, of course! https://nodexl.codeplex.com/

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings, fellow evaluation enthusiasts! My name is John Murphy, and I am an Evaluation Associate on the Education Research and Measurement team at Cincinnati Children’s Hospital Medical Center. We provide evaluative support for various forms of learning and career development, ranging from clinical orientation to leadership and management training. One of our clients does extensive work in quality improvement education. While we have provided them with evaluations of their courses, we have also been fortunate to glean important tips and advice from their wealth of knowledge in measurement theory. One of the most basic tools used in improvement science is the annotated run chart, a simple form of the line chart. What makes the annotated run chart different from the typical line chart is the annotations: small text snippets that show when an intervention or event has taken place.
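For anyone who builds these charts in code, here is a small matplotlib sketch of an annotated run chart. The weekly values are invented, and matplotlib’s annotate() stands in for the Excel data labels or Tableau annotations discussed below.

```python
import statistics
import matplotlib.pyplot as plt

weeks = list(range(1, 13))
rate = [5.1, 4.8, 5.0, 4.9, 4.7, 3.9, 3.6, 3.4, 3.5, 3.2, 3.0, 2.9]  # invented

fig, ax = plt.subplots()
ax.plot(weeks, rate, marker="o")
ax.axhline(statistics.median(rate), linestyle="--", color="gray")  # run-chart center line

# The annotation supplies the context: what changed, and when
ax.annotate("New protocol introduced",
            xy=(6, rate[5]), xytext=(7, 4.8),
            arrowprops=dict(arrowstyle="->"))

ax.set_title("Weekly rate per 1,000 patient-days (illustrative data)")
ax.set_xlabel("Week")
plt.show()
```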


More and more, I have embraced annotations as crucial to providing context and a story for a data representation. As an aficionado of data visualization, I have begun the quest for the perfect annotation. Here is what I have found so far:

Rad Resource: Stephanie Evergreen and Ann Emery provided an amazing resource within the virtual pages of this very blog! Their data visualization checklist not only reaffirmed my enthusiasm for the annotation, it also gave concrete guides for font size and text direction.

Rad Resource: What discussion about data visualization would be complete without mentioning Edward Tufte? The first chapter of his 2006 book Beautiful Evidence, entitled “Mapped Pictures”, discusses, in rich detail, the benefits of various techniques of providing context to images. Placing content in its proper space, scale, and time is crucial for making all genres of data representation tell a compelling story.

Hot Tip: If you are creating many annotated run charts that are updated frequently, consider investing in BI software such as Tableau. While Excel data labels are functional for one-shot data representations, more dynamic software can save time and provide more flexibility so annotations fit the story instead of being limited by the medium.

Good luck in telling your data stories!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Christine Frank and I am an independent Canadian evaluator. I have a couple of questions for you. Do your reports intrigue your audience or send them for coffee? Do people grasp your message easily?

Although I am best known as a program evaluator, I have also taught courses on business communications and co-authored a textbook on that subject. Experts in business communications focus on dynamic, readable writing. Plain writing experts promote a similar style. Both areas of expertise afford simple strategies to make functional documents more inviting and compelling.

Evaluators sometimes hinder their effectiveness by writing in an overly academic style. For instance, in journal articles, you often find sentences 60 words in length or more. One of the pivotal rules of both business writing and plain writing is to limit sentence length. Even if readers have excellent reading skills and are grounded in the subject matter, you can construct your text to propel them forward, not slow them down. My own frustration in reading unnecessarily lengthy, wordy text drives me to strive for instant clarity.

Hot Tip: For evaluators, I suggest a maximum of 20 words per sentence. You might stretch this limit when a short sentence just won’t convey the message. However, another fundamental rule is to check your text to see whether you have used the fewest words possible; if you do, you may find you can meet the limit. Many strategies can be applied to maximize clarity. One is to avoid an over-abundance of nouns, especially in sequence. In the following sentence, adapted from an actual Request for Proposals, you will see eight nouns, five of them in a row.

  • Our first task is the development of a best practice guideline implementation evaluation plan.

Better

  • First we will develop a plan for evaluating the implementation of best practice guidelines.

Hot Tip: A strategy that reduces sentence length and makes the text more compelling is using the active voice.

  • The top three reasons given by students for choosing a career were successfully predicted by teachers.

Better

  • Teachers successfully predicted students’ top three reasons for choosing a career.

Rad Resource: Federal Plain Language Guidelines (2011)

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org. Want to learn more from Christine? She’ll be presenting as part of the Evaluation 2014 Conference Program, October 15-18 in Denver, Colorado.


Hello! I’m Molly Ryan, a Research Associate at the Institute for Community Health (ICH), a non-profit in Cambridge, MA, that specializes in community-based participatory research and evaluation. I am part of the team evaluating the Central Massachusetts Child Trauma Center (CMCTC) initiative, which seeks to strengthen and improve access to evidence-based, trauma-informed mental health treatment for children and adolescents. I would like to share a great resource that we use to visualize and communicate findings with our CMCTC partners.

Rad Resource: Icon Array. University of Michigan researchers developed Icon Array to communicate risks to patients simply and effectively. For more information on why icons are so rad, check out Icon Array’s explanation and bibliography.

Hot Tip: Icon Array offers 5 different icons to choose from.

6 out of 11 reassessments (54.5%) received

Hot Tip: Icons aren’t just for risk communication! We use icons to help our partners understand and visualize their progress collecting reassessment data for clients.


14 out of 24 reassessments (58.3%) received
• 9 out of 14 (64.3%) complete
• 5 out of 14 (35.7%) incomplete

Cool Trick: Icon Array allows you to illustrate partial risk by filling only a portion of the icon. We used this feature to communicate whether a reassessment was complete or incomplete for a given client.
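For anyone who wants to reproduce an icon-array-style visual outside the Icon Array tool itself, here is a rough matplotlib stand-in mirroring the “6 out of 11 reassessments received” example above.

```python
import matplotlib.pyplot as plt

total, received = 11, 6

fig, ax = plt.subplots(figsize=(6, 1.5))
for i in range(total):
    filled = i < received                 # fill the first six icons
    ax.scatter(i, 0, s=600, marker="o",
               facecolors="#2c7fb8" if filled else "none",
               edgecolors="#2c7fb8")

ax.set_xlim(-0.5, total - 0.5)
ax.axis("off")
ax.set_title(f"{received} out of {total} reassessments (54.5%) received")
plt.show()
```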

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello! I’m Lisa Kohne, an independent evaluator, and I work for SmartStart Consulting in Orange County, California. We specialize in conducting project evaluations for federally funded grants, primarily from the National Science Foundation. Most of our clients are math, science, and engineering professors at four-year universities. One of our big challenges is that some of our projects are multi-institutional, multi-state, and multi-country. It’s very difficult to bring multiple partners together at the same time to discuss evaluation findings, and most don’t have the time, inclination, or evaluation knowledge to read lengthy reports.

Hot Tip:
To overcome these challenges, we began developing Evaluation Newsletters. They are usually two pages with lots of graphs, maps, and tables. We try to make them colorful, high-interest, and eye-catching. Some are wordier than others, and our “skills” have evolved over the years. You can see the evolution from one of our earlier versions (very wordy) to our more recent versions (less wordy, more white space).


We only offer these to our larger, multi-site projects.  The reactions and feedback have been extremely positive.  No PI has ever turned down the offer to create a newsletter.  They are also great to distribute at advisory board meetings and project conferences.

Rad Resources:

  • Google Images works great for the simple clipart needed for newsletters. Simple is better. Just be careful to not use copyrighted ones.
  • Microsoft Publisher is our current choice of software.  We’ve tried Word but Publisher lines up the information much better.  Also, the new online subscriptions to MS Office 365 include Publisher.


SmartArt is our go-to graphics tool. It is available in MS Word but not in Publisher, so you need to create the graphic in Word and paste it into Publisher.

Lessons Learned:

  • Less is more.  Fewer words, more pictures, lots of bullet points.
  • Make it personal and make it positive.  Add university and project logos, project goals, maps that indicate locations of participating institutions, funders’ logos, and anonymous quotes from participants.
  • Newsletters take a lot of time so build the cost into the budget.
  • Recruit your most artistic employee to create your newsletters – someone who understands color, balance, and brevity of words.
  • Send a sample to stakeholders and ask whether it would be a helpful way to get evaluation results out.
  • Get commitment from your principal investigator to email the newsletters to all stakeholders and/or project participants and to post them on the project webpage.  Here is a webpage containing our newsletters for an NSF PIRE project.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hey friends! We are Ann Emery, Co-Chair of the DVRTIG, and Stephanie Evergreen, Founder of the DVRTIG – two evaluators who are crazy about data visualization.

“I love your examples, but how do I know what I should do next time I’m creating a graph?” We heard comments like this when we talked with evaluators about good graph design. We saw that evaluators had a thirst for better graphs and a clear understanding of why better graphs were necessary, but they lacked efficient guidance on how, exactly, to make a graph better.

Rad Resource: Introducing the Data Visualization Checklist

Take the guesswork out of your next graph. Download our 25-item checklist for clear guidelines on graph text, color, arrangement, lines, and overall messaging. Read about what makes a memorable graph title (spoiler alert: it’s not “Figure 1”). Learn how to arrange your bar chart based on whether your categories are nominal or ordinal. Decide which default settings in your software program to keep and which ones to toss.


Not familiar with the terminology? The last page is a Data Visualization Anatomy Chart. Watch that example’s before-and-after remake in Stephanie’s training on Ignite presentations.

Hot Tip: How can you use the Checklist?

Get in the habit of producing several drafts before sharing final graphs with your clients. Draft, score, edit, repeat!

In this example, we printed an existing graph (page 6 here) in both color and black and white to see how the final chart looked for viewers. We scribbled notes all over the graph and the checklist as we scored. Overall, the graph earned 91% of the possible points—just above the cutoff that enables viewers to read, interpret, and retain the content.


Cool Trick: What’s next for the Checklist?

We are publishing examples to illustrate the 25 items as well as before-and-after remakes. Check out the growing galleries at http://annkemery.com/tag/data-visualization-checklist/ and http://stephanieevergreen.com/tag/data-visualization-checklist/. And we’re taking requests: Which checklist items would you like examples for?

We’re also hoping to present the checklist at the American Evaluation Association’s annual conference in October. Let’s high five there! Please please please can you take a picture of your existing data visualization, apply the checklist, and then take another picture? Tweet your redesigns to @annkemery and @evalu8r. Email them to annkemery@gmail.com and stephanie@evergreenevaluation.com. Fold them into paper airplanes and fly them to us! Send your redesigns and show people how awesome you are!

Big thanks to our pilot reviewers: James Coyle, Amy Germuth, Chris Lysy, Johanna Morariu, Jon Schwabish, David Shellard, Rob Simmon, Kate Tinworth, Jeff Wasbes, and Trina Willard.

The American Evaluation Association is celebrating Data Visualization and Reporting (DVR) Week with our colleagues in the DVR Topical Interest Group. The contributions all this week to aea365 come from DVR TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Greetings AEA365! My name is Gretchen Biesecker, and I am the Vice President of Evaluation at City Year. City Year is an education-focused, nonprofit organization founded in 1988 that partners with public schools and teachers to keep students in school and on track to succeed.

This year I competed in my first storytelling slam—an event where people tell five-minute, first-person, true stories. Constructing and telling my story was really fun. I started thinking about new ways our staff at City Year could think about incorporating numbers or data into our communications and reporting.

Lessons Learned:

  • Sharing findings in context is important, especially for audiences that may be unfamiliar with the data. To someone within a school, improving the average daily attendance rate by 2% may be a huge win, but to someone outside education, without context, that increase may sound minuscule.
  • Taking a look at some resources on organizing good stories was really helpful to me. Reviewing the ways to construct a good story: 1) helped me generate ideas for using different kinds of numbers and data to tell our story; and 2) emerged as a foundational step before thinking about data visualizations and creating reports or presentations.

Rad Resource: Nancy Duarte’s book, Resonate, is now available for free! Duarte shares multiple examples of effective story structures, which may inspire you.



Hot Tips: Here are some additional ideas you might take from storytelling. You can use numbers to:

  • Create Drama—good stories may be formatted as a sweeping saga, and you can use numbers to convey scale (e.g., City Year serves in 242 schools, reaching 150,000 students). Pairing that with a personal story and results from one child or case is powerful.
  • Set the Stage—numbers can be used to share the problem or give the context for results (e.g., fewer than 40% of students in the nation’s schools score at or above proficiency in English/Language Arts and math).
  • Share the Transformation—good stories have a beginning, middle, and end, or conflict and resolution—so we can share the “before and after” through numbers. You can also “show your work” and the effort or conflict that it took to achieve the results (e.g., in 2012-2013, students in City Year schools spent over 589,100 hours in extended learning time programming).
  • Catch the eye or ear with Repetition—sometimes good stories or speeches have repeating rhymes, words, or numbers, so think about when repeating a particular number may be effective or impactful.

I encourage you to find inspiration and new ideas from the things you love. They may not be within evaluation, but translating them into our field will help us reach more people to put results into action.

The American Evaluation Association is celebrating Data Visualization and Reporting (DVR) Week with our colleagues in the DVR Topical Interest Group. The contributions all this week to aea365 come from DVR TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

