I’m Susan Kistler, AEA’s Executive Director, and this week John Gargani reflected on what evaluation might look like in 10 years (go ahead, click away – it’s worth a read – but come back!).
Lessons [to be] Learned: Looking into a hazy crystal ball at what evaluation might look like 10 years hence, I wanted to build on John’s contributions.
Culturally Competent Evaluation Is the Norm: Living the lessons reflected in AEA’s Statement on Cultural Competence in Evaluation, evaluators will consistently incorporate considerations of culture and context into all evaluation in order to increase the effectiveness and use of evaluation results. They will attend to issues of power, privilege, and access, whether stemming from historical inequities or from newer divides brought about by differential access to information and the capacity to use it.
Funders and the Public Expect Data-Informed Decision-Making: We’re seeing increased expectations of transparency from a technologically adept generation of digital natives. Hand in hand come accountability questions: how were resources allocated, how was money spent, and to what end? Evaluation must be a foundational facet of program planning and execution in order to have ready answers.
Stakeholders Perform Most Analyses: Excel and other spreadsheet applications allow my local barber to work with data in ways once reserved for academics and analysts. More recently, Tableau and the like have put the power of data visualization in everyone’s hands. Over the next ten years, advances in analysis and visualization tools will continue this trend, so that stakeholders can, and will expect to, perform a range of analyses and explorations once reserved for the specially trained.
Many Evaluators Focus on Developing Systems and Building Capacity: Increasingly, evaluators will serve a fundamental role as partners in program planning, creating systems that incorporate measurement and real-time reporting (as John suggested) to drive the data-informed culture. A growing subset of evaluators will focus on building the capacity of program staff to identify data needs, understand the available data, know their limitations in terms of both analysis and interpretation, and leverage external specialized evaluation assistance when needed.
Evaluators Debate (More) the Tradeoffs of Access and Insight: Ultimately, evaluation will be challenged in a tech-savvy era that places analytic power in the hands of constituencies with varying capacities to interpret accurately what they are seeing. The simplification required to render an analysis may limit its value. The next big debate? How to balance access and insight – when are we willing to sacrifice depth of analysis in order to increase access and use? Is such a sacrifice even needed? What tools and strategies can create a both/and rather than an either/or situation?
What do you see happening in the next 10 years? Share here, or add to the discussion on John’s blog.
The above is my own opinion and does not necessarily represent that of AEA. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to email@example.com. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.