Displaying Expertise for All Audiences: Don’t Forget the Simple Things by Patricia Campion

Hello AEA community, my name is Patricia Campion. I hold a PhD in Sociology and am currently an independent evaluator with twenty years of experience. A valuable lesson I have learned over the years is that we shouldn’t be afraid to keep things simple out of consideration for our audience, and that doing so doesn’t mean we have to dumb things down.

We all like to showcase our expertise with advanced methodologies, since methods are so central to our profession. Many in our audience, however, are intimidated or confused by methods jargon. Reaching them effectively does not require forgoing methodological rigor.

I learned to combine complex and simple reporting techniques while evaluating an Intimate Partner Violence (IPV) prevention program for Sunrise of Pasco County (https://www.sunrisepasco.org/) in Central Florida. The program was funded over five years by a DELTA FOCUS grant from the CDC (https://www.cdc.gov/violenceprevention/deltafocus/index.html).

The evaluation relied on a pre-posttest for students enrolled in an after-school program. The logical approach for analyzing this type of data was a nonparametric test for paired samples. For Sunrise’s annual report, I carefully outlined the logic behind the test I had chosen, the Wilcoxon signed-rank test (https://www.youtube.com/watch?v=dkobjvhxTro). In case any reader was more familiar with the commonly used t-test, I also reported its results and explained why it wasn’t the best choice.
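The comparison is easy to reproduce outside SPSS. As a minimal sketch, here is the same pair of tests in Python with scipy, run on made-up scores rather than the program’s actual data:

from scipy import stats

# Hypothetical pre/post item scores, for illustration only.
pre  = [2, 3, 1, 4, 2, 3, 2, 1, 3, 2, 4, 2]
post = [4, 4, 3, 5, 3, 2, 4, 3, 4, 3, 5, 4]

# Wilcoxon signed-rank test: ranks the paired differences without
# assuming they are normally distributed, which suits small samples
# of ordinal survey scores.
w_stat, w_p = stats.wilcoxon(pre, post)
print(f"Wilcoxon signed-rank: W = {w_stat:.1f}, p = {w_p:.4f}")

# Paired t-test, reported for comparison; its normality assumption is
# hard to justify for short ordinal scales.
t_stat, t_p = stats.ttest_rel(pre, post)
print(f"Paired t-test: t = {t_stat:.2f}, p = {t_p:.4f}")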

The Wilcoxon test showed significant improvement on some of the measures. The program staff and state evaluators, however, weren’t as excited as I had expected: they did not connect with my SPSS result tables and their many statistics.

A conversation with the program coordinator led us to consider a simpler approach. The statistical tests were still deemed necessary, but we needed to add a more impactful visual tool. So I went back to basics: the data at the heart of the tests. Since we were looking at differences in average results on a number of survey items, I used Excel to build a simple graph showing the average class scores for each item on the pre- and posttests. The pretest showed as a red line, the posttest as a blue line. Each time the blue line rose above the red line, the program had made a difference.
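The same display is easy to reproduce outside Excel. As a minimal sketch, here it is in Python with matplotlib, using hypothetical item labels and averages in place of the program’s data:

import matplotlib.pyplot as plt

items = ["Item 1", "Item 2", "Item 3", "Item 4", "Item 5", "Item 6"]
pre_means  = [2.1, 2.8, 3.0, 2.4, 3.2, 2.6]   # hypothetical pretest averages
post_means = [2.9, 3.1, 3.0, 3.3, 3.1, 3.4]   # hypothetical posttest averages

# Pretest as a red line, posttest as a blue line, one point per item.
plt.plot(items, pre_means, color="red", marker="o", label="Pretest")
plt.plot(items, post_means, color="blue", marker="o", label="Posttest")
plt.ylabel("Average class score")
plt.title("Average class scores by survey item")
plt.legend()
plt.tight_layout()
plt.show()

Wherever the blue line sits above the red one, the audience can see at a glance where scores improved.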

The simpler display, of course, did not have the statistical rigor of my SPSS tables: not every difference it showed was statistically significant. However, everyone connected with it. And they were not interested in examining each individual item on its own; they read the graph holistically, as showing that the program was able to make a difference, though not always and not to the same extent for all topics.

Hot Tips:

  • Be ready for people at various levels of quantitative literacy to read your reports: Mix statistical and visual content at different levels of expertise.
  • Don’t reinvent the wheel. If you already use Excel, it provides a reliable source of quality graphs. It allows you to easily import data from SPSS and copy/paste graphs into other Microsoft apps. (For those who script their workflow instead, see the sketch below.)
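A comparable SPSS-to-spreadsheet bridge exists in Python, for readers who prefer scripting. This is a hedged sketch, assuming a hypothetical file named survey.sav and the pandas library with the pyreadstat and openpyxl packages installed:

import pandas as pd

# Read a hypothetical SPSS data file (requires the pyreadstat package).
df = pd.read_spss("survey.sav")

# Write it out as an Excel workbook (requires the openpyxl package).
df.to_excel("survey.xlsx", index=False)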

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.

5 thoughts on “Displaying Expertise for All Audiences: Don’t Forget the Simple Things by Patricia Campion”

  1. Tyler Vaillancourt

    Hello Patricia,

    Like some of the other commenters here, I am currently taking a course on Program Inquiry and Evaluation at Queen’s University, and as such, I appreciate the recommendation to keep it simple. As someone who was not familiar with all the jargon of evaluation, I can attest that clear language goes a long way toward making knowledge accessible.

    I also think your message is rather timely; looking at the entries during decolonization week, I think keeping it simple is an important part of that work. By opening up the language being used and removing barriers to participation, it becomes easier to invite new voices into the conversation. In the same way that you opened the dialogue around your evaluation by simplifying your SPSS results, this helps people focus on the big picture instead of getting lost in the data.

    In any field, experts like to demonstrate their expertise with a flourish, but as you say, this can serve to alienate or intimidate our audience. Your idea of keeping the complex reporting techniques while including more accessible ones is welcome, especially to someone who is just learning about the field of evaluation.

    Aside from clear visuals and simple language, what do you think is the best way to engage with stakeholders?

  2. Hi Sheila,
    My name is Tom Steer. I am working on completing a graduate diploma in education through Queen’s University, and one of the courses I am currently enrolled in is focused on program evaluation. As part of the program, we are to find an article that resonates with us and connect with the author; this is the article that resonated with me.

    Reading the literature on program evaluation took due diligence to truly understand, and with much of program evaluation focused on the benefit of one user group or another, one question has persisted throughout the course: how are program evaluators truly going to make their findings accessible to the target populations of the programs they evaluate, when many of these populations do not necessarily have access to higher forms of education and learning? It was a breath of fresh air to find someone who is aware of this and actively involved in making program evaluation findings more accessible to the “non-evaluator” population. Would you say that the general reaction to this concept within the program evaluation community is a supportive one? That is something I am also curious about. Hope to hear from you, and thank you for your post!

  3. Hi Patricia,

    I am taking an evaluation course for my Master’s in Literacy Education at Queen’s University. Throughout this course we have been learning about the importance of evaluation and how effective and useful it can be when used appropriately.

    During this course we learned about the importance of connecting with stakeholders and helping them gain a sense of ownership over the evaluation process. I feel this connects to your article because if stakeholders do not understand the way data is presented to them, they will not be able to use the information to the best of their ability and improve the program.

    I love that you remind us of the important lesson of keeping things simple. Just because we make something simpler and easier for the average person to access does not mean it is not an accurate and strong representation of the data; it just means the data will be more likely to be used!

    I also appreciate your point that we do not have to reinvent the wheel and find new ways to represent data. I love that you suggest using Excel. I wonder if you have any other recommendations for programs that make it easy to share quantitative data with stakeholders?

    Thanks,
    Melissa

  4. Good afternoon Patricia!
    As a current graduate student completing a course on “Program Design and Evaluation,” I really related to your article. The course has been my first exposure to evaluation, and it has opened my eyes to the process as well as the complex role of an evaluator. Being new to evaluative jargon, I myself felt intimidated by research articles, and it was difficult to navigate and fully comprehend the material being presented. I found that simple handbooks and the articles presented on aea365 were much more digestible and easier to connect with, which in turn made me feel more engaged with the subject. My experience with “simpler” information was similar to the experience your program staff had with your more visual and simplified presentation. Less became more.

    I can only imagine how stakeholders may feel when presented with their final program evaluation reports. If stakeholders are not able to make sense of the information and results, then evaluators are not meeting the users’ needs. Reports need to be clear and accessible in order to be used. In Making Sense of Evaluation: A Handbook for Everyone, Superu (2017) states that a good report will have “clear and transparent explanations of the reasoning used, that are understandable to both non-evaluators and readers without deep content expertise in the subject matter.” I appreciate that you noted that simplification does not equate to “dumbing down” information. Stakeholders will not all have the same background knowledge or expertise regarding evaluation, but that does not mean the information needs to be dumbed down for them. By simplifying findings, users are more likely to connect with the information, build understanding, and use the evaluation results. I feel that your experience presenting both complex and simple reporting techniques really shows the impact of mindfully presenting to the audience.

    -Jasmine Currie

    Reference:
    Superu, Social Policy Evaluation and Research Unit. (2017). Making sense of evaluation: A handbook for everyone. https://dpmc.govt.nz/sites/default/files/2018-03/Evaluation%20Handbook%20Dec%202017.pdf

  5. Hi,
    I am a master’s student in the graduate program at Queen’s University, and I am currently taking a course on program inquiry and evaluation. I read your blog post on keeping things simple when displaying data with interest, as we have been learning about evaluation use.
    In your blog post you explain the lesson you learned about keeping things simple for your intended audience while not dumbing things down either. The lesson you describe is much like the usability of evaluation outputs that Murray Saunders examines in “The use and usability of evaluation outputs: A social practice approach.” According to Saunders (2012), “usability refers to the dimensions of evaluation design, within the power of evaluators to affect, which are likely to inhibit or enhance the chances of evaluation output being used” (p. 433). Saunders uses the social practice dimension to examine evaluation usability, and one reason he considers it valuable is that “it enables the identification of practices of ‘engagement’ (with evaluation outputs, for example) that facilitate the process of establishing what changes in practice flow from the knowledge resources an evaluation might yield” (p. 426). Just as you have described, Saunders states that a key practice is “[r]endering evidence and data sets in ways that the non technical stakeholder or potential user can ‘read’ them, creating narratives with both qualitative and quantitative evidence” (p. 431).

    I like how you responded to your evaluation users and displayed the results using Excel. Your hot tip to “Be ready for people at various levels of quantitative literacy to read your reports: Mix statistical and visual content at different levels of expertise” is something I will remember as I go forward in my evaluation practice. Your other hot tip, “Don’t reinvent the wheel,” also resonates with me because I quite like Excel and feel that its broad functionality is sometimes overlooked. Thank you for a very informative blog post.
    Campion, P., & Robinson, S. (2021, October 27). Displaying expertise for all audiences: Don’t forget the simple things by Patricia Campion. AEA365: A Tip-a-Day by and for Evaluators. https://aea365.org/blog/displaying-expertise-for-all-audiences-dont-forget-the-simple-things-by-patricia-campion/
    Saunders, M. (2012). The use and usability of evaluation outputs: A social practice approach. Evaluation, 18(4), 421–436. https://doi.org/10.1177/1356389012459113

    Sincerely,
    Donna McLaughlin
