AEA365 | A Tip-a-Day by and for Evaluators

I am David Bernstein, CEO of DJB Evaluation Consulting and Past-President of Washington Evaluators, the DC-based local AEA affiliate, and the Evaluation 2017 Conference Co-chair.

I have a career-long commitment to volunteering: as a Red Cross volunteer (CPR instructor, blood donor), as a Board member with Washington Evaluators, and as a frequent volunteer with AEA. Giving back is a gift to me because I learn so much and get to expand my leadership skills.

I have been a volunteer for the AEA annual conference nearly every year it has been held, and am a member of the AEA Conference Working Group. While the AEA staff do a remarkable amount of work to organize the conference, it is the membership that truly pulls it together. Most frequently I have volunteered to review conference proposals as part of the Topical Interest Group (TIG) review process, which establishes the conference strands.

Lesson Learned: It was through my role as a TIG Chair that I had the honor of knowing and learning from Bob Ingle, who was the AEA Annual Conference Chair for the first 10 years of the AEA conference. As Jean King so eloquently described him, “Bob Ingle knew how to put on a conference.” (See her post, “Memorial Week: Jean King on Remembering Bob Ingle (1926-1998), Pioneer in Establishing the Annual AEA Conference.”) What did I learn about volunteering for AEA and the AEA Conference from Bob Ingle? A lot, and I was not alone. AEA named its Service Award after Bob Ingle!

In 2002 and 2013 I had the honor of serving as AEA Conference Local Arrangements Working Group (LAWG) Co-chair. What I learned was that the most important role of the LAWG co-chairs is to recruit other volunteers. In 2013 my fellow LAWG Co-chair Valerie Caracelli (a Robert Ingle Service Award winner) and I worked with a group of over 70 volunteers from Washington Evaluators to provide local information about DC and help with conference planning and logistics.

Rad Resource: The 2017 LAWG Co-chairs, Giovanni Dazzo and Jonathan Jones, have been working with a large number of volunteers on several initiatives for the Conference. Stop by the “Ask Me About DC” table to say hi to a friendly volunteer or two who can give you all sorts of interesting information about the DC area.

For Evaluation 2017 I had the honor of working with Kathy Newcomer, our 2017 AEA President, and a diverse group of volunteers on Kathy’s Conference Program Committee. A group of 17 of us worked with Kathy to develop the conference theme and subthemes, coordinate with the TIGs, assemble the Presidential Strand, identify plenary session speakers, and help Kathy in a variety of other ways. Susan Tucker (AEA’s Treasurer, another important volunteer position), Donna Podems (a former AEA Board Member), and I served as the Conference Program Co-chairs.

Hot Tip: Want to be an AEA volunteer? Check the AEA Volunteer Opportunities page, and find something in which you are interested. You too can make a difference in AEA, and in the evaluation profession.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Greetings! This is Nicole Vicinanza, Senior Research Associate at JBS International and a conference co-chair of the AEA 2017 Presidential Strand committee. Our AEA president Kathy Newcomer and the members of this committee have worked together for over a year to develop the theme – From Learning to Action! You’ll find 29 exciting and thought-provoking sessions delivered by evaluation leaders and innovators around four conference sub-themes:

  • Learning to Enhance Evaluation Practices
  • Learning What Works and Why
  • Learning from Others
  • Learning about Evaluation Users and Uses

In addition, if you can’t be there in person, many of these sessions will be offered free through the virtual conference. Here are my tips to make the most of the AEA 2017 theme and the presidential strand:

Hot Tip: Use the Challenge Questions to organize your learning. To get ready for the conference, go to the AEA web page on From Learning to Action and click on each of the sub-themes to see challenge questions to consider as you attend sessions at AEA 2017. Think about which questions you’d like to get answered.

Cool Trick: Use the mobile app or on-line conference program to pick sessions related to your sub-themes. Look for your invitation to download the mobile app. If you click on a session in the mobile app schedule and scroll down to the 2017 Theme tag, you can see the sub-theme that session addresses. In the on-line program, select a sub-theme of interest in the “Theme” search box to see all the conference sessions related to that sub-theme. You can narrow your choices by picking a TIG or “Presidential Strand” in the “Track” search box. Consider the sub-themes as you choose which sessions to attend.

Hot Tip: Get the big picture – attend the Plenary Sessions. This year four plenary sessions will explore ways that our community can learn from evaluation to create better practices and outcomes. (Bonus: each session will also feature a winning video from our 2017 Video Contest.)

Rad Resource: Just can’t be there? Attend virtually. The conference has so many great offerings and opportunities to connect – you should be there in person if you can! But if you can’t, you can participate in over 20 sessions of the Presidential Strand, free, through the virtual conference. You and your colleagues can view the Virtual Conference Sessions and register now at https://aea.digitellinc.com/aea/live/6.

I’m looking forward to all the learning possibilities at AEA 2017! I’m particularly excited about the sub-theme Learning to Enhance Evaluation Practices and attending sessions on “Adopting Economic Cost-Effectiveness Analyses to Enhance Evaluation Practices” and “Checklists to Activate Learning and Improve Evaluation Practice.” What are you looking forward to?

Hello! I’m Kathy Newcomer. Serving as president of AEA this year has been an honor and privilege for many reasons. One of them is the opportunity to witness firsthand the incredible commitment and effort so many of our members exert on our behalf!

AEA members serve on TIGs, task forces, and working groups, and work diligently behind the scenes to move our association and profession forward on a variety of fronts. This year, through these groups, our members have accomplished a great deal that benefits us all, and I want to acknowledge their achievements.

  • Our Evaluation Policy Task Force (EPTF) strategically worked to develop and sustain a coalition of professional associations to provide input to the deliberations of the Commission on Evidence-Based Policymaking. Our EPTF provided valuable testimony, and our AEA Roadmap was cited multiple times in the Commission’s final report. The recommendations on evidence-building capacity reflected our roadmap, as well as the significant contributions of AEA member and Washington Evaluators president Nick Hart, who was a key author!
  • Our Race and Class Dialogues series led by Melvin Hall presented valuable forums for discussing how we as professional evaluators can address critical issues facing our society. Thanks to the dedication and time devoted by Melvin and his committee, and funding provided by the Kellogg Foundation, AEA will provide an outstanding training video on this vital topic.
  • Our Competencies Task Force led by Jean King moved toward completion of their multi-year effort to develop and vet a set of evaluator competencies. Members devoted an impressive amount of time conducting focus groups, surveying our membership, and consulting with evaluators globally to ensure our competencies are comprehensive, reliable and valuable.
  • Our Guiding Principles Review Task Force led by former AEA President Beverly Parsons reached out extensively to our membership, including via a survey this fall, to update our association’s guidance on ethical practice.
  • Many members helped shape our selection process for a new Executive Director, under the leadership of our ED Selection chair and President-elect Leslie Goodyear, by contributing valuable guidance on the job description and criteria.
  • Our Membership Engagement Task Force led by Melvin Hall and Robin Miller reviewed AEA records and actions and solicited members’ input to develop a set of actionable recommendations to strengthen our association’s commitment to diversity and inclusive leadership development and membership engagement.
  • Our AEA representative to the IOCE, Cindy Clapp-Wincek, represented us around the world and led a group of us to participate in an awe-inspiring EvalPartners summit in Kyrgyzstan.
  • My 2017 Conference Program Committee, composed of 17 members from 7 countries, worked to develop our themes, recruit speakers, and organize a video contest and sessions to ensure our conference provides a memorable learning experience for all.
  • Our network of affiliates led by Leah Neubauer and Allison Titcomb worked to enhance sharing across their organizations and planned their first-ever affiliates workshop for Evaluation 2017.
  • Needless to say, we have all benefited immeasurably from the efforts of our TIG leaders who worked long and hard to solicit and vet conference proposals, among other important services they provide to AEA.
  • And 21 working groups composed of more than 125 AEA members work closely with our Executive Director to conduct essential association business in a variety of areas, including elections, awards, and international outreach.

My most important role as outgoing president is to bear witness to the achievements of so many of our members who work on our behalf with little recognition other than seeing good work accomplished to move our profession forward. THANK YOU!!!!! We truly appreciate what you have done for us!

Hi, my name is Jayne Corso and I am the Community Manager for AEA. Evaluation 2017 is only days away, so let’s focus on social media for the conference! Here are a few tips I have compiled for staying social during an event, especially on Twitter. Why Twitter? Twitter makes your posts visible to people who aren’t already in your network, thereby increasing your reach.

Hot Tip: Use the Correct Hashtag

Almost all events have a hashtag, and Evaluation 2017 is no exception. Use #Eval17 when sharing posts related to the conference. Nothing is more frustrating than tweeting at an event and later realizing you used the wrong hashtag and missed out on conversations. Through #Eval17, you can connect with others at the event and share your thoughts and comments on sessions, content, and presenters.

Hot Tip: Find the Right Balance

Sometimes it can be difficult to figure out how often you should be tweeting at a conference. Your posts should be informative and valuable, so you want to find the happy medium in content. My suggestion is 2-3 tweets per session. This gives you the opportunity to still have a social media presence without missing valuable education and networking time.

Hot Tip: Share Valuable Information

As mentioned above, you want your posts to be valuable to other attendees or even evaluators who couldn’t attend the conference. So, how can you do this?

  1. Try asking questions – this can often spark a conversation.
  2. Be helpful – share session suggestions, speakers you have enjoyed, topics that were interesting, and even restaurant tips.
  3. Share photos and videos – visual content often receives more engagement than simple text postings, so this is a great way to share your experience with others.

Hot Tip: Have Fun

Have fun using social media! Your posts do not need to be serious or educational. We enjoy seeing your personality poke through and how you are interacting with all aspects of the conference.

And one for LinkedIn…

Hot Tip: Make Connections on LinkedIn instead of Twitter

If you want to connect with a speaker or attendee, try LinkedIn instead of Twitter. LinkedIn is a more professional social media platform and an introduction on LinkedIn is often more personal and noticeable.

I will be tweeting at Evaluation 2017! I look forward to seeing your posts!

I’m David Keyes and I’m an independent consultant based in Portland, Oregon.

Like many who work in evaluation, my career path has been less than completely straight. This, of course, mirrors the field of evaluation, which brings together many disciplines and is often a bit amorphous. One challenge this presents, and one I’ve struggled with myself, is how to find jobs in evaluation.

Given the fractured nature of evaluation, it is not surprising that jobs in the field are posted far and wide. Those looking for evaluation jobs can, of course, look in the usual places: LinkedIn, Indeed, Idealist, and the like. But that often ends up being a needle-in-a-haystack endeavor: sorting through hundreds of irrelevant jobs to find the few that are focused on evaluation.

Many of us also rely on local AEA affiliates to find jobs. My local affiliate, the Oregon Program Evaluators Network, does an excellent job of keeping its members aware of opportunities in the area. But these local affiliates are, by definition, local. If you want to find jobs beyond a single area, you’ll have to find the local affiliate in each area you’re interested in and hope they post jobs.

There are other places to look for evaluation jobs (Matthew Von Hendy brought together many of them recently), but here’s the point: there is no single place to find evaluation jobs.

Rad Resource:

That’s why I started a new website: Evaluation Jobs. By aggregating evaluation jobs from multiple sources, Evaluation Jobs provides a one-stop shop for those looking for work in this field. The website filters through the general jobs websites to pull out evaluation jobs, combines them with those posted on local AEA affiliate websites and sprinkles in job postings from various other dark corners of the internet. There are currently over 300 jobs posted and I anticipate being able to add several dozen new ones each week. On the website, you can filter the jobs by location, keyword, and more to identify opportunities that are relevant to you.

Having just launched Evaluation Jobs, I would love to get feedback on it. Please feel free to contact me through the website and let me know what you think. Above all, I hope that Evaluation Jobs might help you find your next opportunity in evaluation!

Hello! I’m Judy Savageau from the Center for Health Policy and Research at UMass Medical School following up on yesterday’s post with Part II of basic data analyses. A number of posts outlining statistical/analytic details are in AEA365’s archives. For example, there are some great posts on “Readings for Numbers People (or Those Who Wish They Were)”, “Starting a Statistics [Book] Club”, and “Explaining Statistical Significance”. These posts discuss multivariate modeling, longitudinal data analysis, propensity score matching, factor analysis, structural equation modeling, and more. But what defines multivariate analyses and how do they differ from bivariate analyses?

Hot Tip:

Decisions about bivariate statistics (i.e., assessing the relationship between 2 variables; e.g., gender and school performance) are made based on the ‘type’ of data (e.g., categorical vs. continuous; see yesterday’s Part I post). There are many reputable resources that show simple tables for determining which statistic to use (see Rad Resources below), including the following (a brief code sketch follows the list):

  • Chi-square test: 2 categorical variables (e.g., program participation: yes/no and job type)
  • T-test: 1 categorical variable with 2 levels (e.g., gender: male/female) and 1 continuous variable (e.g., IQ, SAT scores)
  • ANOVA – Analysis of Variance: 1 categorical variable with 3 or more levels (e.g., program performance: low / moderate / high) and 1 continuous variable (e.g., years of education)
  • Correlation coefficient: 2 continuous variables (e.g., years of employment and number of correct responses to knowledge about job-related standards)
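
To make those pairings concrete, here is a minimal sketch in Python using the scipy library; the data and variable names are made up purely for illustration:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Chi-square test: 2 categorical variables (participation x job type)
    observed = np.array([[30, 20, 10],   # participants, by job type
                         [25, 25, 15]])  # non-participants, by job type
    chi2, p, dof, expected = stats.chi2_contingency(observed)

    # T-test: 1 categorical variable with 2 levels vs. 1 continuous variable
    group_a = rng.normal(100, 15, 50)  # e.g., test scores for group A
    group_b = rng.normal(104, 15, 50)  # e.g., test scores for group B
    t, p = stats.ttest_ind(group_a, group_b)

    # ANOVA: 1 categorical variable with 3+ levels vs. 1 continuous variable
    low, moderate, high = (rng.normal(12 + k, 2, 40) for k in range(3))
    f, p = stats.f_oneway(low, moderate, high)

    # Correlation coefficient: 2 continuous variables
    years = rng.uniform(0, 20, 40)                   # years of employment
    correct = 50 + 2 * years + rng.normal(0, 5, 40)  # knowledge scores
    r, p = stats.pearsonr(years, correct)

Each function returns the test statistic along with a p-value, which is what you would examine when deciding whether the observed relationship is statistically significant.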

Hot Tip:

Finally, use multivariate analyses when you want to look at a large number of variables and their relationship (collectively) to one outcome. The most appropriate multivariate statistic depends, in large part, on the categorical or continuous nature of the outcome variable. For example, in one federally-funded study assessing the multiple factors related to return to work after a work-related injury (e.g., severity of injury, years until anticipated retirement, pre-injury job satisfaction, employer assessment of re-injury potential, etc.), our outcome variable was ‘return to work’ measured in multiple ways (a brief code sketch follows the list):

  • Categorical measure (return to work: yes/no): determining which factors are most predictive of whether a person with a work-related injury will return to work might best be explored using logistic regression.
  • Continuous measure (weeks until return to work): determining how quickly a person returns to work following a work-related injury might best be explored using linear regression.
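
As a minimal sketch of those two approaches in Python using the statsmodels library (the data frame, column names, and values here are hypothetical, not the study’s actual variables):

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "returned": rng.integers(0, 2, 200),      # return to work: 1=yes, 0=no
        "weeks_out": rng.gamma(4.0, 3.0, 200),    # weeks until return to work
        "severity": rng.normal(5, 2, 200),        # injury severity score
        "satisfaction": rng.normal(7, 1.5, 200),  # pre-injury job satisfaction
    })

    # Categorical (yes/no) outcome -> logistic regression
    logit_fit = smf.logit("returned ~ severity + satisfaction", data=df).fit()
    print(logit_fit.summary())

    # Continuous (weeks) outcome -> linear regression
    ols_fit = smf.ols("weeks_out ~ severity + satisfaction", data=df).fit()
    print(ols_fit.summary())

The summaries report a coefficient and p-value for each predictor, so you can see which factors contribute most to the outcome once the others are taken into account.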

There are many decisions to be made when developing a data analysis plan. I’m hoping that this 2-part introduction to the basics of statistical analyses gets you started in thinking about the best way to explore and analyze your quantitative data. Of course, having a statistician/data analyst sitting ‘at the table’ with the team as early as possible will ensure that you collect data in the best format to answer your research questions.

Rad Resources:

Here are just a couple of web pages that help with some decision-making about when it’s most appropriate to choose one statistical test over another – depending on the type of data you have.

Hello! I’m Judy Savageau from the Center for Health Policy and Research at UMass Medical School. A recent post from Pei-Pei Lei, my colleague in our Office of Survey Research, introduced some options for statistical programming in R. I wondered whether a basic introduction to statistics might be in order for those contemplating ‘where do I begin’, ‘what statistics do I need to compute’, and ‘how do I choose the appropriate statistical test’. While most AEA365 blogs don’t cover every topic in detail, perhaps a basic 2-part introduction will help here. Analyses are very different with qualitative versus quantitative data; thus, I’ve concentrated on the quantitative side of statistical computations.

Hot Tip:

Analyses fall into 3 general categories: descriptive, bivariate, and multivariate; they’re typically computed in that order (see the sketch after this list) as we:

  • explore our data (descriptive analyses) with frequencies, percentile distributions, means, medians, and other measures of ‘central tendency’;
  • begin to look at associations between an independent variable (e.g., age, gender, level of education) and an outcome variable (e.g., knowledge, attitudes, skills; bivariate analyses); and
  • try to identify a set of factors that might be most ‘predictive’ of the outcome of interest (multivariate analyses).
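
As a minimal, hypothetical sketch of that order in Python (pandas and statsmodels, with made-up column names and data):

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    df = pd.DataFrame({
        "age": rng.integers(20, 70, 100),
        "education_years": rng.integers(8, 20, 100),
        "skill_score": rng.normal(50, 10, 100),
    })

    # 1. Descriptive: explore each variable on its own
    print(df.describe())  # counts, means, medians (50%), percentiles

    # 2. Bivariate: one independent variable vs. the outcome
    print(df["education_years"].corr(df["skill_score"]))

    # 3. Multivariate: a set of factors predicting the outcome
    print(smf.ols("skill_score ~ age + education_years", data=df).fit().summary())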

Hot Tip:

The decision about what statistical test to use to describe data and its various relationships depends on the ‘nature’ of the data. Is it:

  • Categorical data:
    • nominal; e.g., gender, race, ethnicity, smoking status, participation in a program: yes/no;
    • ordinal: e.g., a Likert-type scale score of 1=Strongly disagree to 5=Strongly agree or 5 levels of education: ‘Less than high school’, ‘High school graduate/GED’, ‘Some college/Associate degree’, ‘College graduate – 4-year program’, and ‘Post-graduate (Masters or PhD degree)’;
    • interval: ordinal data in fixed/equal-sized categories; e.g., age groups in 10-year intervals or salary in $25,000 intervals; or is it:
  • Continuous data:
    • For example: age, years of education, days of school missed due to asthma exacerbations, etc.

Of course, data are often collected in one mode and then ‘collapsed’ for particular analyses (e.g., age recoded into meaningful age groups, Likert-type scales recoded as ‘agree’/‘neutral’/‘disagree’), as in the sketch below.
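
For example, here is a minimal sketch of that kind of recoding in Python using pandas (all values are hypothetical):

    import pandas as pd

    df = pd.DataFrame({"age": [23, 37, 41, 58, 64, 72],
                       "likert": [1, 2, 3, 4, 5, 4]})

    # Collapse continuous age into meaningful age groups
    df["age_group"] = pd.cut(df["age"], bins=[0, 44, 64, 120],
                             labels=["<45", "45-64", "65+"])

    # Collapse a 5-point Likert item into disagree / neutral / agree
    df["agreement"] = pd.cut(df["likert"], bins=[0, 2, 3, 5],
                             labels=["disagree", "neutral", "agree"])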

Hot Tip:

Decisions must also take into consideration whether the data are ‘normally distributed’ (i.e., is there ‘skewness’ in the data, such that most values for age fall under 45 but a small number of people are in their 60s, 70s, and 80s?). Most statistical tests have a number of underlying assumptions that one must meet – all starting with data being normally distributed. Thus, one typically begins by looking descriptively at the data: frequencies and percentile distributions, means, medians, and standard deviations. Sometimes, graphing the data shows the ‘devil in the detail’ with regard to how data are distributed. There are statistics one can compute to measure the degree of skewness in the data and whether distributions differ significantly from ‘normal’. And, if the data are not normally distributed, there are several non-parametric statistics that can take this into account.
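
Here is a minimal sketch of those descriptive checks in Python using scipy, with hypothetical right-skewed data:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    age = rng.gamma(2.0, 12.0, 300)  # mostly under 45, with a long right tail

    # Descriptive look: a mean pulled well above the median signals skew
    print(np.mean(age), np.median(age), np.std(age))

    # Quantify the skewness and test the departure from normality
    print(stats.skew(age))
    print(stats.shapiro(age))  # a small p-value -> not normally distributed

    # If the data are not normal, use a non-parametric alternative,
    # e.g., Mann-Whitney U in place of a t-test for two groups
    group_a, group_b = np.array_split(age, 2)
    print(stats.mannwhitneyu(group_a, group_b))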

Tomorrow’s post will focus on bivariate and multivariate statistics. Stay tuned!

Hi y’all, we are Anne Rudnicki, Bruce Niebuhr, Mary Jo Urbani and Virginia Niebuhr from the pedi.edtech faculty development project at the University of Texas Medical Branch. For our evaluation, we needed to transcribe faculty interviews. Transcription is time-consuming, and paying transcriptionists and/or purchasing voice recognition software can be expensive.

We found that Google Voice Typing makes excellent quality transcriptions, for free. You need a Google account, computer, external microphone, and an external audio recorder. The interview is transcribed in real time and the text automatically saved into a Google Doc by the time you finish the interview.

These instructions are for Windows computers. Mac instructions will vary.

Requirements:

  • Chrome browser
  • Google account. If none, create an account
  • Quality external microphone (not the built-in)
  • Separate digital audio recorder or smart phone for back-up
  • Quiet room

Instructions:

  1. Plug the mic into your PC. On the task bar at the lower right of your monitor, right-click the speaker icon. Click “Recording Devices.” Select your microphone.
  2. Using Chrome browser, go to www.google.com
  3. Click the Google Apps icon.
  4. Open Google Drive app. You may be asked to log into your Google account.
  5. Click NEW
  6. Choose Google Docs; open a Blank document.
  7. Open Tools
  8. Select Voice Typing

  9. Choose Click to Speak
  10. Begin interview (make sure the person being interviewed speaks into the microphone).
  11. Click the red recording icon to stop
  12. The file is immediately and automatically saved in Google Docs.
  13. To move the transcription out of the cloud, go to the File menu, choose “Download as,” and select the Word option. The transcription will be saved as an “untitled” Word document in the Downloads folder of your PC. Rename the file and save it elsewhere.

Lessons Learned:

  • Practice the above steps several times to become confident.
  • Because Google Voice Typing works over the Internet, there can be dropouts in the transcripts. Use the digital voice recording as a backup. Listen to this recording to fill in gaps and correct transcription errors.
  • We recommend using the Google Doc merely for temporary storage. After downloading the file, delete it from the cloud. This adds to the security and privacy of your data.
  • This does not work well for focus groups – too many voices at one time.

Rad Resource:

Linda Cabral and Laura Sefton describe how to use Dragon voice recognition software, a good tool but not free.

Hello, AEA365 readers! We are Antonina Rishko-Porcescu (Ukraine), Khalil Bitar (Palestine/Germany) and Bianca Montrosse-Moorhead (USA), from EvalYouth, a global, multi-stakeholder partnership that promotes young and emerging evaluators to become future leaders in evaluation. Antonina is leader of an EvalYouth task force; Khalil is Vice-Chair of EvalYouth; and Bianca is Co-Chair of the network. We are excited to share what we have learned from EvalYouth’s use of visualization when communicating with our young audiences of evaluators.

We communicate a lot with a broad range of evaluators, especially young and emerging evaluators, and young people from around the world. In this ever-changing, fast-paced world, we understand that words alone are not enough. Information must be clear, direct, coherent, and compelling. One question we have explored is: how should we disseminate information and evidence in a way that draws novice evaluators in and presents it in a meaningful way?

Examples:

  • Summarizing data collected through an international survey with young and emerging evaluators (e.g., here and here).
  • Summarizing results of received applications for the first EvalYouth International Mentoring Program (e.g., here).
  • Transforming the Network’s original logo to highlight special events and programs (e.g., here and here)

Hot Tips and Cool Tricks:

  • Visualization makes complex data easier to understand, but it is not easy to create good visualizations; it involves hard work and research. Do your homework.
  • Try to strike a balance between pictures and words. An infographic should include valuable information, not just cool graphics (though those help too).
  • Use colorful designs and, when appropriate, humor. Doing so invites readers, especially youth and young and emerging evaluators, to engage with the information.
  • Work collaboratively. Others bring fresh perspectives and new ideas, and often feel greater ownership of the project after it concludes. Ownership means there is an excellent chance they will share it with relevant contacts and on their social media channels afterward.
  • It is not enough to make a great infographic and stop there. Disseminate such work widely through mailing lists and social media outlets. There, people will see your message and, very importantly, engage with it.

Lesson Learned:

  • Data visualization used well is a powerful communication tool. It can distill complex ideas and big data into just a few elements of an infographic.
  • Working with a team of people from many backgrounds or countries helps a lot. What might be right, appropriate, and trendy in one region or culture could be the opposite in another. Diverse team perspectives can identify and overcome such issues.
  • The ethics of data visualization is also important to consider. A well-done data visualization is a powerful tool! As Uncle Ben in the Spider-Man series said, “With great power comes great responsibility.” Care should be taken to ensure that the visualization’s message is accurate, valid, coherent, and just.

Want to learn more about EvalYouth? Follow EvalYouth on social media: Facebook, Twitter, LinkedIn, and YouTube. Or, write to us: evalyouth@gmail.com.

Greetings everyone! Corey Newhouse here, Founder and Principal of Public Profit – a consulting firm helping mission-driven organizations measure and manage what matters. Have you ever toiled for hours on a data-heavy report, only to hear back from a client that it doesn’t look or feel quite right? Maybe they missed your point entirely?

As evaluators, we put a great deal of thought into how we present our findings in order to convey the right message. There is plenty of research to show that a well-formatted report increases comprehension and retention. But sometimes those old-school clients of ours get a little prickly about these updated formatting styles, and spend more time critiquing the layout than paying attention to the content. For exactly those situations, Public Profit has developed a short document to share with clients that outlines why our reports look the way they do.

Lessons Learned:

  • Show, don’t tell. It is easy (and wishful) to think you can tell a client why you did something and they’ll just accept it. Trust me, it is much easier to show them why. That’s where our formatting style guide comes in handy. It outlines our rationale behind things like margins, layout, font sizing, and use of color, among other elements. This shows the client why our format style works well, all in a quick two-page document.
  • Show them before, not after. When possible, we show the formatting style guide to our clients early in the reporting phase. When the client receives the report, they will be focused on the information rather than the “funky” formatting.
  • Start a conversation. Clients sometimes have valid reasons (or at least strong feelings) for why they might prefer a different format. Having a short conversation early in the process often saves us from spending unnecessary time re-formatting documents at the last minute.

Rad Resources:

  • Download Public Profit’s short formatting style guide that we share with clients.
  • Stephanie Evergreen has a handy checklist to help you score your report layout.
  • Ann K. Emery has lots of great strategies on how to design and display information effectively.
  • Chris Lysy’s Creativity School offers short online courses aimed at helping you boost your creativity and leverage digital tools to better display information.

