AEA365 | A Tip-a-Day by and for Evaluators

Category: Data Visualization and Reporting

Hi, I’m Sara Vaca, independent consultant, helping Sheila curate this blog and occasional Saturday contributor. I haven’t been an evaluator for long (about 5 years now), but I have facilitated or been part of 16 evaluations, so I am starting to get over the initial awe of the exercise and to attend to dimensions beyond just “surviving” (that is: understanding the assignment, agreeing on the design, leading the data collection process, simultaneously doing the data analysis, validating the findings, debriefing the preliminary results, and finally digesting all this information to package it nice and easy in the report).

I want to think that I incorporate (or at least try to) elements of Patton’s Utilization-Focused Evaluation during the process, but until recently my role as evaluator ended with the acceptance of the report (which is usually exhausting and challenging enough). I took no concrete action once I had delivered it, partly because: a) it was not specified in the Terms of Reference (or included in the contracted days), or b) I usually didn’t have the energy or clarity to go further after the evaluation.

However, I’ve understood since the beginning of my practice that engaging in evaluation use is an ethical responsibility of the evaluator, so I’ve recently started making some tentative attempts to engage in it. Here are some ideas I have just begun implementing:

Cool Trick: Include a section in the report called “Use of the evaluation” or “Use of this report,” so you (and they) start thinking about the “So what?” once the evaluation exercise is finished.

Hot Tip: Another thing I did differently was to elaborate the Recommendations section in a non-prescriptive manner. Usually I would analyse all the evaluation’s ideas for improvement and prioritize them according to their relevance, feasibility and impact. This time, I pointed out the priority areas I would focus on, with a list of ideas for improving each area, without spelling out exactly what to do. Then I invited the organization to discuss and make those decisions internally, perhaps forming internal teams to address each recommendation and so gain more ownership.

Although clients have occasionally reached out months or years after an evaluation for additional support, this time I proactively offered my out-of-contract commitment to support them, in case they think I could be of help later down the road.

Rad Resource: Do proactive follow-up. I’ve read about this before but haven’t done it systematically yet. So, I will set a reminder for 3-6 months after the evaluation and check in on how they are doing.

Hot Tip: I just published a post on understanding Use and Misuse of Evaluation (based on this article by Marvin C. Alkin and Jean A. King), which helped me recognize some dimensions of use.

As you can see, I’m quite a newbie at introducing mechanisms and practical steps to foster use. Any ideas are welcome! Thanks!

 

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Hi all! My name is Gaelyn West and I serve as the State Government Representative and Board Member for the Southeast Evaluation Association (SEA). My work experience includes program evaluation, strategic planning, and grant development in state government.

One of the most important tasks that evaluators face is relaying their evaluation findings to a lay audience. Most often, evaluators do this with charts and graphs. However, infographics are rising to the forefront as an innovative way to display data quickly and clearly. Infographics are the visual representation of data. They can help tell the story of your evaluation in a creative and compelling way.

Hot Tips:

  1. The data used in your infographics should be reliable, timely, and relevant. Your data should always come from a credible source, should represent the most recent statistics available, and should not contain any distortions. In any case, your data should stand up to tests of reliability and validity.
  2. Remember The Six Basic Principles of Design:
  • Unity / Harmony – proximity, similarity, continuation, repetition, rhythm
  • Balance – symmetry, asymmetry, radial
  • Hierarchy – trees, nests, weights, timelines
  • Scale / Proportion – size, ratio, divisions
  • Dominance / Emphasis – highlight, color, size
  • Similarity and Contrast – light and dark, bold and italic
  3. The purpose of an evaluation is to create an objective and compelling story that can help judge the merits of a program, improve a program, and/or generate new knowledge. Infographics can be used to illustrate information about the program context, implementation, need, or outcome/impact. Your infographics should highlight the story you want to tell about your evaluation findings.
  4. “Shareability.” Evaluators can customize infographics to fit any shareable platform, such as reports, presentations, posters, and social media marketing; the list goes on! It is important to know the communication method your audience will most likely use and prefer. Will a one-page memo be the most effective way to share your evaluation findings? Would an interactive presentation, a video, or an online article work better? You can easily tailor infographics to fit the chosen platform.

Rad Resources:

  • Microsoft Excel: A common and versatile spreadsheet software for creating visuals.
  • Microsoft PowerPoint: A common and easy-to-use presentation software.
  • Venngage: A simple infographics creator with templates and customization options.
  • Canva: A design program for creating Web or print visuals.
  • Infogram: A drag-and-drop infographics creator that offers flexibility and independence in designing visuals.
  • Piktochart: A good beginner infographics creator in which users can customize set templates.
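
If you are comfortable with a bit of scripting, charting libraries offer a code-based alternative to the point-and-click tools above. Below is a minimal sketch (my own illustration, not from the original post, with made-up numbers) using Python’s matplotlib to produce a simple infographic-style panel:

import matplotlib.pyplot as plt

# Hypothetical finding: share of participants reporting improvement, by year
years = ["2015", "2016", "2017"]
improved = [58, 66, 74]  # percent (made-up numbers for illustration)

fig, ax = plt.subplots(figsize=(5, 3))
bars = ax.bar(years, improved, color="#2a9d8f")
ax.bar_label(bars, fmt="%d%%")  # label bars directly instead of using gridlines
ax.set_ylim(0, 100)
ax.set_yticks([])               # the direct labels replace the axis
ax.set_title("Participants reporting improvement rose each year")
for side in ("top", "right", "left"):
    ax.spines[side].set_visible(False)  # remove chart junk
fig.tight_layout()
fig.savefig("panel.png", dpi=200)  # drop the image into a report, slide, or post

One advantage of a scripted chart over a hand-built graphic is that the same few lines can be re-run whenever the data are updated.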

 

The American Evaluation Association is celebrating Southeast Evaluation Association (SEA) Affiliate Professional Development Week with our colleagues in the SEA Affiliate. The contributions all this week to aea365 come from SEA Affiliate members.

Hello from two scholars and two coasts! We are Mike Osiemo Mwirigi, MS, and Glen Acheampong, MPP. During our GEDI program year, we learned that evaluators and stakeholders are increasing their use of visuals to present data. Data visualization pioneers in evaluation have pointed out that a good visual can make evaluation results more user-friendly. Effective visuals capture people’s attention, substitute for text, and help reduce the fatigue of reading long reports. Lastly, they can tell a more memorable story.

We noticed that when talking about data visualization, cultural competency rarely comes up. Cultural competency in evaluation is the ability to engage with diverse stakeholders to “include cultural and contextual dimensions important to the evaluation” (American Evaluation Association, 2011). Data visuals can be read differently across cultures; as a result, we interpret and react differently to the same stimulus.

The documentary West and East, Cultural Differences discusses how Easterners (Chinese, Japanese and Korean) and Westerners (Americans and Europeans) are attuned to interpret visual information differently. With that in mind, we share the following:

Hot Tip 1: Begin with a plan.

Data visualization can lose the intricacies of the story it’s telling. Further, some data visuals are complex and hard to interpret without an explanation. Evaluators should consider data visualization from the outset of the evaluation design to work out exactly what each image should convey.

Hot Tip 2: Check and reflect on stakeholders’ interpretations of data visuals.

When applying data visualization guidelines or rules of thumb, we must remember that these are not universal; what works for one population might be counterproductive for another. This is true for the constructed meanings around colors, shapes, and symbols. Instead, explore what stakeholders need and can digest.

Rad Resources:

American Evaluation Association. (2011). American Evaluation Association Public Statement on Cultural Competence in Evaluation. Fairhaven, MA. Retrieved from www.eval.org.

EBS. (2012, December 05). West and East, Cultural Differences. Retrieved July 06, 2017, from https://www.youtube.com/watch?v=ZoDtoB9Abck&index=302&list=LLaTQQZHp4uDV7ubqPoaq4NQ

Emery, A. K., & Evergreen, S. (2014). Data Visualization Checklist. http://stephanieevergreen.com/dataviz-checklist/

Links to West and East, Cultural Differences

Part 1: https://www.youtube.com/watch?v=ZoDtoB9Abck&index=302&list=LLaTQQZHp4uDV7ubqPoaq4NQ and

Part 2: https://www.youtube.com/watch?v=709jjq8qk0k&t=4s

The American Evaluation Association is celebrating Graduate Education Diversity Internship (GEDI) Program week. The contributions all this week to aea365 come from AEA’s GEDI Program and its interns. For more information on GEDI, see their webpage here: http://www.eval.org/GEDI

Hello! We are Carolyn Camman, Christopher Cook, Andrew Leyland, Suzie O’Shea, and Angela Towle of the UBC Learning Exchange, which is a little bit of the University of British Columbia in the heart of Vancouver’s Downtown Eastside (DTES). It’s a bridge for mutual understanding and learning between the University and a neighbourhood rich in community, art, and history, but whose residents face challenges, including homelessness, poverty, gentrification, and stigma. The UBC Learning Exchange acts as a member of the community, giving back to residents through community-based programming alongside experiential learning opportunities for students and support for community-based research.

The Learning Lab supports members of the DTES community to engage in activities and scale up their involvement by offering creative and educational activities in a flexible, low-barrier format. In keeping with the arts-based principles of the Learning Lab and the community engagement mission of the Learning Exchange, when it came time to make the results of a recent evaluation accessible to a community audience, the answer was obvious: put on a show!

Voices UP! is a theatrical performance co-written and co-performed with the community members who contributed to the original evaluation. It not only communicated evaluation results, but deepened the evaluation itself. Through writing and performing the play, the cast learned more about evaluation and shared new stories and insights. Over its four-performance run from Spring 2016 to Fall 2017, the show evolved and grew.

Hot Tip: There’s growing interest in using arts-based methods in evaluation. Live theatre is a dynamic and engaging approach that encourages people to connect with findings viscerally and immediately as part of a dialogue between performers and audience. In post-performance talk-back, one person said, “It was neat to hear the participants reflecting on what they had just done as well as what it meant to them to be a part of it.” Another commented that “seeing” the impact of the program was more persuasive than reading about it from a report or grant application.

Lessons Learned: A performance doesn’t have to be polished or “professional” to be effective. Community members speaking in their own words is powerful and there are many creative techniques (like puppets!) that can bring evaluation findings to life. Having a conversation with the cast and giving introductions to audiences before performances about different ways theatre can “look” helped set appropriate expectations.

Rad Resources: To keep Voices UP! going even after the curtains come down for the last time, the Learning Exchange staff and cast of program patrons came together to tell their story one last time, this time in a comic book. You can download this resource for free at http://learningexchange.ubc.ca/voicesupcomic. It was created using the same participatory process as the original performance and tells the story of how Voices UP! came to be, with tips and insights for anyone interested in using theatre methods to tell their evaluation story. Look up “reader’s theatre” and “ethnodrama” for more ideas about turning evaluation and research into plays.

The cast and creators of Voices UP! Photo credit: The UBC Learning Exchange



Greetings! I’m Rose Hennessy, Adjunct Professor in the Helen Bader School of Social Welfare and a Doctoral Student at the Joseph J. Zilber School of Public Health, both at the University of Wisconsin-Milwaukee. In teaching Program Evaluation to MSW students, I’ve had the wonderful opportunity to collaborate with Jennifer Grove and Mo Lewis at the National Sexual Violence Resource Center (NSVRC).

In the short duration of a semester, it can be difficult to give students the opportunity to practice engaging with stakeholders and translating evaluation findings. In conjunction with NSVRC staff, we proactively identified recent research articles of interest to sexual violence prevention practitioners. Busy professionals frequently do not have time or access to recent publications, but in academia we can play a role in getting current research out in digestible ways! Students are assigned articles and asked to create infographics of key themes and implications to meet stakeholder needs.

Lessons Learned:

  • Students learn a new technology best with hands-on learning. A free infographic program is taught to the class in a computer lab where they can learn and practice. Walking through skills step-by-step with a guided handout helps students pick up both the new tool and the new skill.
  • Assignment scaffolding models the stakeholder process. Four different assignments are used for the project, allowing for feedback, revisions, and reflection. Students review the NSVRC website for content, design, and values. They critique their article to pull content specific to the stakeholder, create and present the infographic, and use class feedback to reflect and create revisions.
  • Presenting infographics allows for shared learning of evaluation concepts. Students review creative ways to share qualitative and quantitative findings, examine different study designs, discuss how to present null findings, explore various visualization options, and gain experience utilizing critical feedback from peers.
  • More time is needed to promote culturally responsive evaluation. Research with diverse populations was intentionally chosen for review, but many students lack prior experience translating findings across cultures. Providing readings to assist students, setting up ground rules, and allowing more time for reflection and discussion are necessary to help students process evaluation results in a culturally responsive manner. Conversations also highlighted the need to differentiate between collaborative approaches and culturally responsive evaluation, and new readings have been identified for future courses.

As an instructor, I have found that a collaborative project with NSVRC gives students opportunities for learning with real-world applications. Students were highly motivated to create projects that can be used by a national leader in the field, and they leave the class with new skills in the translation of research, study design, visualization, and dissemination!

The American Evaluation Association is celebrating ¡Milwaukee Evaluation! Week with our colleagues in the Wisconsin statewide AEA Affiliate. The contributions all this week to aea365 come from our ¡Milwaukee Evaluation! members.

Hello, AEA365 readers! We are Antonina Rishko-Porcescu (Ukraine), Khalil Bitar (Palestine/Germany) and Bianca Montrosse-Moorhead (USA) from EvalYouth, a global, multi-stakeholder partnership that supports young and emerging evaluators in becoming future leaders in evaluation. Antonina leads an EvalYouth task force; Khalil is Vice-Chair of EvalYouth; and Bianca is Co-Chair of the network. We are excited to share what we have learned from EvalYouth’s use of visualization when communicating with our young audiences of evaluators.

We communicate a lot with a broad range of evaluators, especially young and emerging evaluators, and with young people from around the world. In this ever-changing, fast-paced world, we understand that words alone are not enough. Information must be clear, direct, coherent, and compelling. One question we have explored is: how should we disseminate information and evidence in a way that draws novice evaluators in and presents information meaningfully?

Examples:

  • Summarizing data collected through an international survey with young and emerging evaluators (e.g., here and here).
  • Summarizing results of received applications for the first EvalYouth International Mentoring Program (e.g., here).
  • Transforming the Network’s original logo to highlight special events and programs (e.g., here and here).

Hot Tips and Cool Tricks:

  • Visualization makes complex data easier to understand, but creating good visualizations is not easy; it takes hard work and research. Do your homework.
  • Strike a balance between pictures and words. An infographic should carry valuable information, not just cool graphics (though those help too).
  • Use colorful designs and, when appropriate, humor. Doing so invites readers, especially youth and young and emerging evaluators, into the information.
  • Work collaboratively. Others bring fresh perspectives and new ideas, and they often feel more ownership of a project after it concludes. Ownership means there is an excellent chance they will share it with relevant contacts and on their social media channels afterward.
  • It is not enough to make a great infographic and stop there. Disseminate your work widely through mailing lists and social media outlets. That is where people will see your message and, just as importantly, engage with it.

Lessons Learned:

  • Used well, data visualization is a powerful communication tool. It can distill complex ideas and big data into just a few elements on an infographic.
  • Working with a team of people from many backgrounds and countries helps a lot. What is right, appropriate, and trendy in one region or culture can be the opposite in another. Diverse team perspectives can identify and overcome such issues.
  • The ethics of data visualization is also important to consider. A well-done data visualization is a powerful tool! As Uncle Ben in the Spider-Man series said, “with great power comes great responsibility.” Take care to ensure that the visualization’s message is accurate, valid, coherent, and just.

Want to learn more about EvalYouth? Follow EvalYouth on social media: Facebook, Twitter, LinkedIn, and YouTube. Or, write to us: evalyouth@gmail.com.



Greetings everyone! Corey Newhouse here, Founder and Principal of Public Profit – a consulting firm helping mission-driven organizations measure and manage what matters. Have you ever toiled for hours on a data-heavy report, only to hear back from a client that it doesn’t look or feel quite right? Maybe they missed your point entirely?

As evaluators, we put a great deal of thought into how we present our findings in order to convey the right message. There is plenty of research showing that a well-formatted report increases comprehension and retention. But sometimes our more old-school clients get a little prickly about updated formatting styles and spend more time critiquing the layout than paying attention to the content. For exactly those situations, Public Profit has developed a short document to share with clients that outlines why our reports look the way they do.

Lessons Learned:

  • Show, don’t tell. It is tempting to think you can tell a client why you did something and they’ll just accept it. Trust me, it is much easier to show them why. That’s where our formatting style guide comes in handy: it outlines our rationale for choices like margins, layout, font sizing, and use of color, showing the client why our format style works well, all in a quick two-page document.
  • Show them before, not after. When possible, we show the formatting style guide to our clients early in the reporting phase. When the client receives the report, they will be more focused on the information, not the “funky” formatting.
  • Start a conversation. Clients sometimes have valid reasons (or at least strong feelings) for why they might prefer a different format. Having a short conversation early in the process often saves us from spending unnecessary time re-formatting documents at the last minute.

Rad Resources:

  • Download Public Profit’s short formatting style guide that we share with clients.
  • Stephanie Evergreen has a handy checklist to help you score your report layout.
  • Ann K. Emery has lots of great strategies on how to design and display information effectively.
  • Chris Lysy’s Creativity School offers short online courses aimed at helping you boost your creativity and leverage digital tools to better display information.



Greetings! My name is Jennifer Lyons, MSW, from Lyons Visualization, LLC. I am a social worker, data designer, and speaker. In my independent consulting business, I bring creative energy to making data intriguing and impactful while helping clients transform the way they communicate their story. Today I want to talk about a method I like to use that both engages clients in the interpretation of data and sets the stage for an impactful visual summary of findings.

In this post, I am going to focus on a process to use after data is collected and analyzed. After analysis, it is time to dive in and highlight the story within the data. Part of storytelling with data is making meaning of the information in context. Our clients are the experts on the delivery of their programs, the people they work with, and the reporting context, so it is important to include them in thoughtful interpretation of their data. Specifically, I will show how to use a worksheet to guide a data interpretation meeting and transform findings into a visual summary.

Hot Tip: Start by designing the data interpretation worksheet. This worksheet is the backbone of a visual executive summary of your findings. Below is an example of a simple data interpretation worksheet made for an evaluation of an after-school reading program. Included are graphic displays of the data with blank boxes that give clients space to add their interpretation. During the data interpretation meeting, you can use this worksheet to partner with clients to highlight and frame central findings in the data.

Hot Tip: Paste each graph from the worksheet onto an empty slide and ask your clients to examine each data point. Prompt them with questions about what they see as positive, negative, and surprising about the findings. It is also important to ask your clients to think of relevant context. As a group, process everyone’s recommendations and thoughts. There are often many important things being shown in one graph, but together you can decide what is most important. Then, write the most important takeaway(s) from the graph in the graph title. Repeat this process for each graph. By the end, you will have something like this:

Hot Tip: This completed worksheet can easily be transformed into a visual summary of your findings. For the worksheet to become a visual executive summary, a few key pieces are still missing: carry over the effective titles from the worksheet, use color to showcase your story, and add an engaging visual.
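
To make the “takeaway as title” idea concrete, here is a minimal sketch (my own illustration, with made-up numbers) in Python’s matplotlib. The key finding goes in the title, and color highlights the bar that carries the story:

import matplotlib.pyplot as plt

grades = ["Grade 1", "Grade 2", "Grade 3"]
gains = [12, 9, 21]  # hypothetical reading-score gains
# Gray for context, one accent color for the bar that carries the story
colors = ["#b0b0b0", "#b0b0b0", "#e07b39"]

fig, ax = plt.subplots(figsize=(6, 3))
ax.bar(grades, gains, color=colors)
ax.set_ylabel("Average reading-score gain")
# The takeaway, not a generic label, goes in the title
ax.set_title("Third graders gained the most from the after-school program")
fig.tight_layout()
fig.savefig("takeaway_chart.png", dpi=200)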

Ta-da!  You have a nice visual report based on thoughtful data interpretation using your client’s feedback and expertise.  My hope is that by reading this post, you are more inspired to think of new ways to engage your client in the data and visually display findings.

The American Evaluation Association is celebrating SW TIG Week with our colleagues in the Social Work Topical Interest Group. The contributions all this week to aea365 come from our SW TIG members.


I’m Kylie Hutchinson (a.k.a. @EvaluationMaven), independent evaluation consultant and trainer with Community Solutions Planning & Evaluation and author of Survive and Thrive: Three Steps to Enhancing Your Program Sustainability.

The word I chose to memorialize this week is a small but important one. It’s the letter “a”, as in “She asked for a final evaluation report”.

As evaluators, many of us are accustomed to providing a single (and lengthy) final report at the end of the evaluation. However, change is in the air, and many of us would also like to see the demise of the final report because it often goes unread by busy decision makers and sits on a shelf collecting dust. But a two-page briefing note doesn’t work in all situations either. Clearly, one type of report does not fit all, which is where the concept of layering comes in.

Hot Tip: Layering is a term I coined in 2008 to describe the simultaneous use of diverse report formats to communicate your evaluation results. The purpose of layering is to give different stakeholders the option to go as shallow or as deep as they choose into your evaluation findings.

It works like this. Imagine a lengthy report as the meat of a burger; it can be very heavy and take a long time to digest. Final reports are often very dense documents, and not all stakeholders have the time or appetite to eat them. Sometimes all they want is lettuce with a bit of tomato (e.g. a newsletter), or a slice of cheese (e.g. a podcast). Or they may be rushed and can only take a quick nibble of the bun (e.g. an infographic). Some of these users will be intrigued enough to eat the whole burger, appendices and all, while others might be satisfied with just a few bites. Layering works because each communication product contains the same key messages and is linked to a more detailed option, enticing the reader to learn more if they choose. By employing diverse communication strategies for these varying appetites, you give intended users the choice of how deeply they wish to delve into the results.

Rad Resource: A Short Primer on Effective Evaluation Reporting. In this upcoming book, I talk more about the concept of layering and present different ideas for communicating results beyond the traditional lengthy report.

The American Evaluation Association is celebrating Memorial Week in Evaluation. The contributions this week are remembrances of evaluation concepts, terms, or approaches.


Hey everyone! I’m Echo Rivera, owner of Creative Research Communications and research associate at Center for Policy Research. My passion is helping evaluators bring creativity to the research communication process. Today I want to talk about your presentation design workflow.

Let me ask you something: does the process of making a presentation stress you out? Do you find you’re always scrambling last minute to finish a presentation on time?

Yeah, I’ve been there. When we make several presentations throughout the year, an inefficient process adds up to a lot of wasted time.

One part of your workflow might be the biggest problem: Adding visuals.

Does your workflow look something like this?

  1. You’re working on your presentation,
  2. You look at your slide and think “is there an image for that?”,
  3. You search and search and search online until you find the right one,
  4. Then you add it to your presentation, aaaand
  5. Repeat

Am I close? Or did I miss the step where you get lost in the rabbit hole of news articles, blogs or YouTube?

This simple act of looking for one image at a time is extremely inefficient. When you’re in SPSS crunching numbers, do you suddenly stop and start searching for articles for your lit review, then come back later to finish and print your output?

That would be super inefficient, right? The same idea applies to presentations.

Hot Tip:

You will be most efficient if you approach each presentation activity as a separate, standalone task.

SUGGESTED STEPS TO AN EFFICIENT PRESENTATION DESIGN WORKFLOW:

[1] Set Presentation Goals & Figure Out Your Story

  1. Think through who the audience is and what will resonate with them the most.
  2. Decide on 1-3 key points to make in the presentation.
  3. Brainstorm a “storyboard” that funnels into the key points.

[2] Draft Your Presenter Notes

  1. Following your storyboard, draft what you want to say on the slides.
  2. Do a quick run-through, speaking aloud all of the notes, making adjustments to the order/organization, filling in any gaps, and removing “fluff.”
  3. Do another run-through to assess time. Add/remove content as necessary. Try to finalize content as much as possible.

[3] Design the Slides

  1. Copy & paste text from your slides into the presenter notes, adding back in only a few words onto the slide, and no more than 3 points per slide.
  2. Add high quality visuals on as many slides as possible, reducing excess text as you go along.
  3. Use design elements to make the remaining text effective (e.g., minimum font size 30; see the sketch after this list).
  4. If necessary, add simple animations (e.g., appear) to walk the audience through the content.
  5. Conduct a self-check on the overall presentation to ensure all content is readable & visually appealing.
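
As a concrete (and entirely optional) illustration of step 3, here is a minimal sketch using Python’s python-pptx library to build a slide programmatically while keeping text large; the file name, slide text, and sizes are my own made-up examples:

from pptx import Presentation
from pptx.util import Inches, Pt

prs = Presentation()
slide = prs.slides.add_slide(prs.slide_layouts[6])  # layout 6 is blank in the default template

# One takeaway per slide, in large type
box = slide.shapes.add_textbox(Inches(0.5), Inches(0.5), Inches(9), Inches(1.5))
tf = box.text_frame
tf.word_wrap = True
run = tf.paragraphs[0].add_run()
run.text = "Participation doubled after outreach began"
run.font.size = Pt(32)  # stays above the 30 pt minimum

prs.save("deck.pptx")

Whether you build slides by hand or in code, the point is the same: settle your text and sizing rules first, then apply them consistently across the deck.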

Rad Resource: Step 3b should not take very long. The trick is to have a Visual Database ready to go. I created a free 6-day email course that shows you how to create one. Check it out!


 
