AEA365 | A Tip-a-Day by and for Evaluators

TAG | dashboards

My name is Jason Lawrence and I am a Management Fellow on the Economic Development and Intergovernmental Affairs teams for Leon County Government in Florida. I am also a recent graduate of the Master of Public Administration Program at the Askew School of Public Administration and Policy at Florida State University.

Local governments are linchpins of our democracy: when they deliver services well, society functions at its most basic level. But how can citizens gauge how well a specific department, or the local government as a whole, performs in terms of service delivery and effectiveness?

Hot Tip: Statistical dashboards have become one of the most effective ways of doing so. These online tools complement budget documents and annual reports and help residents keep track of what a local government does with taxpayers’ dollars. Dashboards are often presented as infographics built from raw numbers that different departments report to a central office, often at the executive level.
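To make that reporting flow concrete, here is a minimal sketch in Python, using entirely hypothetical departments, metrics, and figures, of how raw numbers reported to a central office might be rolled up into the aggregate totals a dashboard displays:

```python
from collections import defaultdict

# Hypothetical raw figures reported by departments to a central office.
# Each record: (department, metric, value).
reports = [
    ("Police", "arrests", 120), ("Police", "arrests", 95),
    ("Solid Waste", "tons_recycled", 310.5), ("Solid Waste", "tons_recycled", 298.0),
]

def summarize(reports):
    """Roll raw departmental reports up into per-metric totals for display."""
    totals = defaultdict(float)
    for dept, metric, value in reports:
        totals[(dept, metric)] += value
    return dict(totals)

print(summarize(reports))
# {('Police', 'arrests'): 215.0, ('Solid Waste', 'tons_recycled'): 608.5}
```

The dashboard’s infographics would then be drawn from these rolled-up totals rather than from the raw departmental records.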


Source: https://data.cityofnewyork.us/dashboard

Lesson Learned: In local governments, for example, citizens may be interested in how many police arrests were made in the first quarter of the year, or how many tons of recyclable material the solid waste department has processed. While such quantitative performance data is useful for inviting citizen engagement, it often emphasizes service quotas at the expense of service quality. Put more simply, numbers alone cannot properly explain performance: they leave out how citizens really feel about the local government services they receive, and they obscure which citizens have access to those services and which do not.

Hot Tip: Local governments that use, or are considering, a dashboard on their websites should therefore find ways to gather rich qualitative data to enhance performance reporting, whether it is included in the dashboard itself or within the pages of an annual report or budget document. This can be done by conducting surveys at community events or monthly through the main page of a website, and surveys can target a specific department or service.
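As a sketch of what pairing such survey results with the numbers might look like, the snippet below (with hypothetical services, ratings, and comments) keeps residents’ own words alongside an average rating for each service:

```python
from statistics import mean

# Hypothetical monthly website survey responses: (service, rating 1-5, comment).
responses = [
    ("Parks", 4, "Clean and well maintained"),
    ("Parks", 5, "Great summer programs"),
    ("Transit", 2, "Buses often late"),
]

def qualitative_summary(responses):
    """Pair an average rating with residents' own words for each service."""
    grouped = {}
    for service, rating, comment in responses:
        entry = grouped.setdefault(service, {"ratings": [], "comments": []})
        entry["ratings"].append(rating)
        entry["comments"].append(comment)
    return {
        service: {"avg_rating": mean(e["ratings"]), "comments": e["comments"]}
        for service, e in grouped.items()
    }
```

Reporting the comments alongside the average keeps the “how citizens really feel” element from being lost in the aggregate number.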

Adding a “human” element to performance reporting will not only shape the significance citizens place on local government, but could also help these entities better align their priorities with the pulse of their communities.

The American Evaluation Association is celebrating Southeast Evaluation Association (SEA) Affiliate Week with our colleagues in the SEA Affiliate. The contributions all this week to aea365 come from SEA Affiliate members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Veronica Olazabal (Rockefeller Foundation) and Linda Raftree (Kurante) here on considerations for data dashboards. Linda recently organized a Tech Salon on this at The Rockefeller Foundation. A session around dashboards had to include Stephanie Evergreen. We also wanted to hear from others, so we included Shawna Hoffman (MasterCard Foundation) and John DeRiggi (DAI).

Dashboards might seem simple. The private sector has them down; surely it’s not hard for nonprofits to follow suit. Surprisingly, our social sector tech/data savvy colleagues all shared similar challenges! Thirteen tips emerged from the session; below are five.


Hot Tips:

#1: Ask whether you really need a dashboard or if one is even possible

It’s critical to have data dashboard discussions across the organization to understand real needs and expectations. People often say they need a dashboard because they want to make better decisions – but what kind of decisions? What information is needed to make them? Where will information come from? Who will get it?

#2: Define the audience and type of dashboard

A dashboard cannot fulfill everyone’s needs. Most organizations will need several for different levels of decision-making. It’s important to know who will own it, use it, maintain it, and collect the data. Will it be internally or externally facing? Discussing all of this early is a key part of the process.

#3: Work with users to develop your dashboard

Start by clearly identifying the audience and asking what they need. Don’t assume you know — however, don’t assume that they know either! Have a conversation where their and your expertise comes together. Perhaps take the ‘data’ out of the conversation altogether. Ask decision-makers what questions they are trying to answer, what problems they are trying to solve, and go from there.

#4: Don’t underestimate the time and resources a functional dashboard requires

You can’t make a dashboard without data to support it. Nor can you create and launch a dashboard and then move it to autopilot. A dashboard needs constant iteration and ongoing maintenance: the questions being asked may change over time, and the dashboard will need to adjust with them. You’ll also need time to get buy-in for using dashboards.

#5: A dashboard shouldn’t be the only basis for decisions

Like a car dashboard, a data dashboard signals that something is changing, but you still need to look under the hood to see what’s going on. A dashboard should trigger questions and be a launch pad for discussion.



Greetings! I’m Sara Vaca, independent consultant at EvalQuality.com and Creative Advisor of this blog. I started with this post (link) observing where and how evaluation can use a dash of creativity, and now I’m going to share my experience using creativity to better understand evaluation.

After my first AEA conference in Washington, D.C. (October 2013), during my daily stroll, all the words and concepts I had been hearing that week (mixed methods, rubrics, approaches, participation, values, dashboards, etc.) were flying around in my head. I was wondering: there are so many different possibilities (stance, paradigm, approach, methods) for designing an evaluation, and yet they are not clearly visible in evaluation reports…

Suddenly, it all clicked in my mind and I thought: what if you could see many of these evaluators’ decisions on just one page? I know! I will create a “meta-evaluation” dashboard!

Some months later, after much reading and research and many sketches and drafts, I came up with this dashboard, which reflects in a very visual way what are, for me, the 10 major issues of an evaluation:

  1. Complexity
  2. Purpose ranking
  3. Evaluative synthesis thermometer
  4. Participation scan
  5. Sampling decisions
  6. Mixed-methods scan
  7. Core tools
  8. Credible evidence
  9. Evaluation standards
  10. Evaluation outputs.


The dashboard can initially be used to visualize the evaluation methodology of a completed evaluation report, or by evaluators to explain the methodology they have followed. It is also a tool for meta-evaluation and quality assurance. Beyond that, it can be used to visualize an evaluation design prior to its realization, to discuss design options with evaluation commissioners, or to show the design proposed by the commissioner in the Terms of Reference. It could even be used to teach evaluation.

I presented it as a poster at both the European Evaluation Society conference (Dublin) and Evaluation 2014 (Denver), and I am grateful for all the comments and feedback I received, from the people who didn’t understand it at first to those who told me it was inspiring. Special thanks to Michael Scriven, Jennifer Greene, Patricia Rogers, Jane Davidson, Beverly Parsons, Ian Davies, and many others who took the time to look at it and comment on it.

For more information: http://www.evalquality.com/the-meta-evaluative-dashboard/

For reactions and comments: Sara.vaca@EvalQuality.com



Greetings aea365 community! I’m Ann Emery and I’ve been both an external evaluator and an internal evaluator. Today I’d like to share a few of the reasons why I absolutely love internal evaluation.

Lessons Learned: Internal evaluation is a great career option for fans of utilization-focused evaluation. It gives me opportunities to:

  • Meet regularly with Chief Operating Officers and Executive Directors, so evaluation results get put into action after weekly staff meetings instead of after annual reports.
  • Participate on strategic planning committees, where I can make sure that evaluation results get used for long-term planning.

Lessons Learned: Internal evaluators often have an intimate understanding of organizational history, which allows us to:

  • Build an organizational culture of learning where staff is committed to making data-driven decisions.
  • Create a casual, non-threatening atmosphere by simply walking down the hallway to chat face-to-face with our “clients.” I hold my best client meetings in the hallways and in the mailroom.
  • Use our organizational knowledge to plan feasible evaluations that take into account inevitable staff turnover.
  • Tailor dissemination formats to user preferences, like dashboards for one manager and oral presentations for another.
  • Participate in annual retreats and weekly meetings. Data’s always on the agenda.

Lessons Learned: Internal evaluators can build evaluation capacity within their organizations in various ways:

  • I’ve co-taught Excel certification courses to non-evaluators. Spreadsheet skills can help non-evaluators feel more comfortable with evaluation because they take some of the mystery out of data analysis.
  • I’ve also led brown bags about everything from logic models to research design. As a result, I’ve been more of a data “coach,” guiding staff through evaluation rather than making decisions on their behalf.

Hot Tips: Internal evaluators can use their skills to help their organizations in other ways, including:

  • Volunteering at program events. When I served food to child and teen participants at Thanksgiving, my time spent chatting with them helped me design more responsive data collection instruments.
  • Contributing to organization-wide research projects, such as looking for patterns in data across the participants that programs serve each year.
  • Partnering with graduate interns and external evaluators to conduct more in-depth research on key aspects of the organization.

Cool Trick: Eun Kyeng Baek and SeriaShia Chatters wrote about the Risks in Internal Evaluation. When internal evaluators get wrapped inside internal politics, we can partner with external evaluators like consulting firms, independent consultants, and even graduate interns. Outsider perspectives are valuable and keep things transparent.


AEA is celebrating Internal Evaluators TIG Week. The contributions all week come from IE members.


My name is Susan Kistler, the American Evaluation Association’s Executive Director. This week I am excited to welcome AEA’s newest Topical Interest Group (TIG) focusing on Data Visualization and Reporting (DVR). All this week, we’ll be reading aea365 posts from our DVR colleagues and have DVR items highlighted on the Headlines and Resources list (you can view or subscribe here). On Thursday, we’ll host a webinar on Developing Evaluation Reports That Are Useful, User-Friendly, and Used, and stay tuned next Saturday for our first aea365 drawing.

The DVR team is exploring issues that are near and dear to my heart – I always seem to be swimming in data and want to share it with the world, but getting it cleaned, converted, beautified and delivered can be time-consuming and requires skills that I need to improve. Even before the DVR TIG started, we had lots of posts on aea365 focusing on this area. This seemed like the perfect time to dive into the archives and highlight past DVR posts.

Hot Tip – Data Visualization and Reporting posts from the aea365 archives:

Interested in design clarity? See Stephanie Evergreen on Graphic Design, Daniela Schröter on Improving Evaluation Document Clarity, or my own post on Finding a Great Font.

Need tips for increasing the accessibility and readability of your materials? See Jennifer Sullivan Sulewski on Using Universal Design to Make Your Evaluations More Inclusive, Lyn Paleo on Graphic-Based Reports and Graphics for Color-Impaired Readers, and Lija Greenseid on Using a Readability Calculator.

Want to improve and deliver great slides for your next presentation? See John Nash on Creating Outstanding Presentation Slides and Chris Lysy on Slideshare and Slidecasting.

Called on to create a dashboard? Check out Kile Dyer on Dashboards or Veronica Smith on Data Dashboard Design.

Hoping for data visualization inspiration? See Bianca Montrosse on Innovative Data Displays, or my own posts on Data Visualization Part I and Part II.

Looking for data visualization tools? Try Laura Blasi on Sparklines, Nina Potter on Tableau for Data Visualization, or Sue Griffey on Wordle.



My name is Kile Dyer, and I am the Director of Organizational Development and Training at Fox Entertainment Group. One of my interests is pursuing the “holy grail” of the development field: demonstrating ROI. I’ll be sharing a couple of successful practices and great resources that can help you build a scorecard framework for reporting to key stakeholders.

Hot Tip: Tell Them What They Want to Hear, Not Everything You Know: Not being heard? Providing too much information could be costing you buy-in. In my experience, business executives tend not to care about methodology or detailed variable views. They tend to want “dashboards” with aggregate scores on key impact areas.

We experienced the most success and greatest buy-in when using a tiered approach to evaluating and reporting: providing executives with an overall score on key impact areas and holding the details for other stakeholders. For an overview of dashboards, read “Creating Effective Learning Measurement Dashboards” in Training Industry Quarterly, Fall 2009. Without advocating any one company’s services, sample designs and reports can be viewed at: http://bit.ly/evaldashboards
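A minimal sketch of that tiered idea, using hypothetical impact areas and metric values: each impact area’s detailed metrics collapse into the single aggregate score an executive would see, while the detail dictionary stays available for other stakeholders.

```python
# Hypothetical tiered scorecard: executives see one score per impact area;
# the detailed metrics behind it are held for other stakeholders.
detail = {
    "Employee Retention": {"90-day retention": 0.88, "12-month retention": 0.74},
    "Time to Proficiency": {"weeks vs. target": 0.92, "manager sign-off rate": 0.80},
}

def executive_view(detail):
    """Collapse each impact area's metrics into a single aggregate score."""
    return {
        area: round(sum(metrics.values()) / len(metrics), 2)
        for area, metrics in detail.items()
    }

print(executive_view(detail))
# {'Employee Retention': 0.81, 'Time to Proficiency': 0.86}
```

A simple average is used here for illustration; in practice you would weight the underlying metrics according to what each impact area is meant to capture.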

Rad Resource: The Training Measurement Book: Best Practices, Proven Methodologies, and Practical Approaches by Josh Bersin (Pfeiffer, 2008) is an essential read for practitioners seeking to demonstrate ROI on training and development programs. The concepts and frameworks it presents helped me translate my formal knowledge of evaluation into practical applications that resonated with business leaders in my presentations and reports.

This contribution is from the aea365 Daily Tips blog, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org.

