DVR TIG Week: Insights-to-Action: An Intentional Process by Kendra Thompson-Dyck and Deven Wisner

Hello there!

We’re Kendra Thompson-Dyck, Ph.D., and Deven Wisner, M.S., of Assessment and Research at the University of Arizona. We want to share an insights-to-action approach we used to communicate findings and gather feedback and buy-in from constituent groups. We think this approach could work for many projects, regardless of topic.

Recently, we set out to understand the lived experiences of students facing unmet basic needs so we could better address system-driven challenges. Our goal was to move beyond the standard summary report and intentionally assemble voices from across campus, including students, faculty, administrators, and staff, to address this critical issue.

Step 1.  Nice visual deliverables. 

Design played an important role in making our mixed-methods reports approachable and engaging. A great first step.

Step 2.  Take it to the people. 

We wanted more than static reports; we wanted to inspire and guide change, which required an additional step in the assessment process. Beyond a carefully designed presentation, we needed to engage decision-makers and those directly connected to students. For us, that translated into facilitated reflection sessions at campus conferences, student affairs offices, the faculty senate, and off-campus community agencies (just to name a few). We created a core slide deck of essential components for all audiences and then adjusted it based on the group and the time allotted. We also sought out venues where folks were already gathered (e.g., weekly team meetings, conferences).

Step 3.  Gather their input.

Beyond tailoring our presentations to each community (on- and off-campus), we built in reflection questions specific to the group – allowing us to do some informal member-checking, elicit additional insights from those positioned to influence change, and create an opportunity for real-time action to address student basic needs. Nothing fancy – we used a Google Doc with basic prompts. Having done similar evaluation and research in the past, we did not assume that just because we built it, people would come. From our perspective, strong visualization, like other steps in the evaluation process, is important. Yet without actively identifying and engaging the folks who need to see and reflect on the insights, a good visual or report can still end up in the proverbial file cabinet. With that in mind, we are glad to share some of our own reflections on the process – including tips, resources, and lessons learned.

Hot Tips

  1. Bring your report or insights to audiences with the agency or power to effect change, so folks can leave your session with specific ways to enact it.
  2. Smaller (or simply more targeted) sessions create space for different communities or groups to provide unique insights and/or provide member-checking on the insights you’ve derived from the evaluation. In other words, these sessions can be an extension of your data collection and validation process.
  3. Design a foundational slide deck to draw upon – this saves time as you tailor your facilitation materials for each audience.

Rad Resources

  1. For virtual or in-person sessions, Google Docs are a great way to collect real-time feedback. You can use headers (like in this template, which you can copy!) to help facilitate sessions and have something tangible to share with the group you’re working with.

Lessons Learned

  1. Good reporting is more than aesthetics – data visualization best practices alone aren’t enough to drive use and action. Intentional reporting means including a level of facilitated reflection, which has been noted as an effective way to improve practice (regardless of domain).
  2. Intentional reporting – really everything we’ve talked about – takes time, funding, and patience. For this to be realistic, it needs to be included in your evaluation planning from the beginning.

The American Evaluation Association is hosting Data Visualization and Reporting (DVR) Week with our colleagues in the DVR Topical Interest Group. The contributions all this week to AEA365 come from DVR TIG members. Do you have questions, concerns, kudos, or content to extend this AEA365 contribution? Please add them in the comments section for this post on the AEA365 webpage so that we may enrich our community of practice. Would you like to submit an AEA365 Tip? Please send a note of interest to AEA365@eval.org. AEA365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. The views and opinions expressed on the AEA365 blog are solely those of the original authors and other contributors. These views and opinions do not necessarily represent those of the American Evaluation Association, and/or any/all contributors to this site.
