AEA365 | A Tip-a-Day by and for Evaluators


Greetings! We’re Guy Sharrock (Catholic Relief Services), Tom Archibald (Virginia Tech), and Jane Buckley (JCBConsulting). Following up on a much earlier aea365 post, “Evaluative Thinking: The ‘Je Ne Sais Quoi’ of Evaluation Capacity Building and Evaluation Practice” (April 29, 2012), we’d like to describe what we are learning from our evaluative thinking (ET) work in Ethiopia and Zambia.

A paradigm shift is taking place in the aid community away from linear models of change to more dynamic, reflective, and responsive models. This requires adaptive management. It necessitates “teams with skills and interests in learning and reflection” and a recognition that “evaluative thinking is indispensable for informed choices.”

We define ET as critical thinking in the context of M&E, motivated by an attitude of inquisitiveness and a belief in the value of evidence. The ET process is summarized in this figure:

[Figure: the evaluative thinking (ET) process]

With Catholic Relief Services in Ethiopia and Zambia, we have organized and led ET capacity-building interventions over three years that take participants through a complete ET process. We work with three audiences: locally-based partners who have daily contact with rural community members, program leaders who oversee technical program management, and country leadership who set the tone for learning and reflection.

Results to date are encouraging. After embedding ET techniques in existing work processes, staff report more substantive and productive dialogue during regular monitoring and reflection meetings. This arises from the improved quality of inquiry, whereby the perspectives of field staff, volunteers, project participants, and program managers can generate new insights to inform program decisions. In turn, this enriches the content of reporting and communication with donors and other key stakeholders.

Hot Tips:

  1. Ensure a safe environment for participants engaged in potentially contentious conversations around assumptions.
  2. Supportive leadership is a prerequisite in the febrile atmosphere of a results- and target-driven culture, which can all too easily crowd out more reflective practice.
  3. Distinguish between questioning and criticizing to encourage debate and transparency.

Lessons Learned:

  1. A trusting relationship with the donor is critical for creating safe spaces for learning.
  2. Take time to listen and to find ways to engage frontline staff in decision-making.
  3. Encourage curiosity by suspending the rush to an easy conclusion and finding tangible ways to manage uncertainty.

Rad Resources:

  1. Adaptive Management: What it means for CSOs: A report written by Michael O’Donnell in 2016 for Bond.
  2. Working with assumptions: Existing and emerging approaches for improved program design, monitoring and evaluation: A December 2016 Special Issue of Evaluation and Program Planning.
  3. Realising the SDGs by reflecting on the way(s) we reason, plan and act: The importance of evaluative thinking: An October 2016 brief from IIED and EvalPartners.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m Tom Archibald, Chief of Party of the USAID/Education and Research in Agriculture project in Senegal and Assistant Professor of Agricultural, Leadership, and Community Education at Virginia Tech. I’ve just arrived in Chicago and am looking forward to another exciting week full of learning, connecting, and fun! Jane Buckley and I are getting ready for our workshop on evaluative thinking (ET), “Evaluative Thinking: Principles and Practices to Promote Exemplary Evaluation.”

What is ET, you might ask? In essence, ET is critical thinking applied to contexts of program planning and evaluation. More specifically, we define ET as:

critical thinking applied in the context of evaluation, motivated by an attitude of inquisitiveness and a belief in the value of evidence, that involves identifying assumptions, posing thoughtful questions, pursuing deeper understanding through reflection and perspective taking, and informing decisions in preparation for action.

And what better place to practice ET than at the annual AEA conference!? Put another way, how ironic would it be if we somehow spent this week together without practicing ET?

To that end, I’ll be playing a little game of ET scavenger hunt this year, and I invite you to play along. The rules are simple. Any time you see or hear ET in action, or hear ET discussed (explicitly or implicitly), whether in a professional development workshop, a conference session, the hallways, or even a social event, tweet it to @aeaweb with the hashtag #evalthink (and include #Eval15 if you can, to tag it to the conference). If you don’t tweet, you can just say out loud, “Oh, that’s evaluative thinking!” People might think you’re odd, but that is OK.

In our workshop, Jane and I offer a list of ideas on how one might “know it when you see it.” To give you some hints to help you with your scavenger hunt, here are some of those ideas:

Things you may hear:

  • Why are we assuming X?
  • How do we know X?
  • How might we be wrong about X?
  • What evidence do we have for X?
  • What is the thinking behind the way we do X?
  • How could we do X better?
  • How does X connect to our intended outcomes?
  • Stakeholder X’s perspective on this might be Y!

Things you may see:

  • More evidence gathering and sharing
  • More feedback (all directions)
  • Reflective conversations among staff, beneficiaries, leadership, etc.
  • More diagrams/models used to illustrate thinking
  • Program evolution
  • More effective staff and programs
  • Greater field staff influence over project decisions

People often look for ways to get the most out of the conference. I propose that an ET scavenger hunt, and taking on an ET mindset while at the conference, can enhance your experience and help you go home with new insights and lessons.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Keiko Kuji-Shikatani, the current chair of the Evaluation Use Topical Interest Group (TIG), one of the original AEA TIGs. The Evaluation Use TIG was born of the interest in evaluation utilization in the 1970s, extending into both theoretical and empirical work on use in the 1980s and 1990s, and into a broader conceptualization of use and influence in the 2000s. The Evaluation Use TIG is committed to understanding and enhancing the use of evaluation in a variety of contexts and to maximizing the positive influence of evaluation through both the evaluation process and the results produced.

Program evaluation began with the desire to seek information that can be utilized to improve the human condition. Use may not be apparent to those who are not internal to an organization since the process of using evaluation requires discussions that may be very sensitive in nature. This week’s AEA365 will examine how Evaluation Use TIG members are striving to support various efforts in diverse and complex contexts.

As an internal evaluator for the Ontario Ministry of Education, I find that utilization of evaluation is the norm in what I do every day in pursuit of reaching every student. The world in which our students are growing up, and in which they will be leaders and learners throughout their lifetimes, is a complex and quickly changing place. To support students so they can be the best that they can be, those in the system need to work smarter and use evaluative thinking to guide every facet of improvement efforts.

Rad Resource: Evaluative thinking is systematic, intentional, and ongoing attention to expected results. It focuses on how results are achieved, what evidence is needed to inform future actions, and how to improve future results. One cannot really discuss evaluation use without Michael Quinn Patton; check out http://www.mcf.org/news/giving-forum/making-evaluation-meaningful.

Our work as internal evaluators involves continually communicating the value of evaluative thinking and guiding developmental evaluation (DE) by modeling the use of evidence to understand more precisely the needs of all students and to monitor and evaluate the progress of improvement efforts.

Hot Tips: Check out how evaluation is used (http://edu.gov.on.ca/eng/teachers/studentsuccess/CCL_SSE_Report.pdf) to inform next steps (https://www.edu.gov.on.ca/eng/teachers/studentsuccess/strategy.html), and what that change can look like (http://edu.gov.on.ca/eng/research/EvidenceOfImprovementStudy.pdf).

In our work, the ongoing involvement of evaluators who are intentionally embedded in program and policy development and implementation teams contributes to modeling evaluative thinking and guiding DE that builds system evaluation capacity. The emphasis is on being a learning organization through evidence-informed, focused improvement planning and implementation.

Hot Tips: Check out how evaluative thinking is embedded in professional learning (http://sim.abel.yorku.ca/) or how it is embedded in improvement planning (http://www.edu.gov.on.ca/eng/policyfunding/memos/september2012/ImprovePlanAssessTool.pdf).

The American Evaluation Association is celebrating Evaluation Use (Eval Use) Topical Interest Group Week. The contributions all this week to aea365 come from our Eval Use TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi, we are Tom Archibald (Assistant Professor and Extension Specialist, Department of Agricultural, Leadership, and Community Education at Virginia Tech) and Guy Sharrock (Senior Technical Advisor for Learning with Catholic Relief Services). We believe one way to integrate and sustain learning in an organization is by intentionally promoting “evaluative thinking.”

Evaluative thinking (ET) is an increasingly popular idea within the field of evaluation. A quick overview of ET is provided in a previous post here. Today, we share some principles and practices for instilling ET in organizations and programs, based on our experiences facilitating ET-promoting workshops with development practitioners in Ethiopia and Zambia.

Lesson Learned: From our research and practice, we identified these guiding principles for promoting ET:

  1. Promoters of ET should be opportunistic about engaging learners in ET processes, building on and maximizing intrinsic motivation. Meet people where they are and in what they are doing.
  2. Promoting ET should incorporate incremental experiences, following the developmental process of “scaffolding.” For example, instead of starting by asking people to question their deeply held beliefs, begin with something less threatening, such as critiquing a newspaper article, and then work up to more advanced ET.
  3. High-level ET is not an innate skill, nor does it depend on any particular educational background; therefore, promoters should offer opportunities for it to be intentionally practiced by all who wish to develop as evaluative thinkers.
  4. Evaluative thinkers must be aware of—and work to overcome—assumptions and belief preservation.
  5. ET should be applied in many settings—program design, monitoring, evaluation, and so on. In order to best learn to think evaluatively, the skill should be applied and practiced in multiple contexts and alongside peers and colleagues.
  6. Old habits and practices die hard. It may take time for ET to infuse existing processes and practices. Be patient and persevere!

Lesson Learned: In addition, we learned that:

  • Interest and buy-in in the effort must be both top-down and bottom-up. From the top, in international development, some funders and large organizations (e.g., the US Agency for International Development) are increasingly supportive of learning-centered and complexity-aware approaches, favoring the promotion of ET.
  • Existing levels and structures of evaluation capacity must be considered; ET can and should fit within and augment those structures.
  • Hierarchical power dynamics and cultural norms, especially around giving and receiving constructive criticism (without getting defensive), must be addressed.

Rad Resource: InterAction and the Centre for Learning on Evaluation and Results for Anglophone Africa have undertaken a study of international NGO ET practices in sub-Saharan Africa. Their report provides some great insights on the enabling factors (at a general, organizational, and individual level) that can help ET, and learning, take hold.

The American Evaluation Association is celebrating Organizational Learning and Evaluation Capacity Building (OL-ECB) Topical Interest Group Week. The contributions all this week to aea365 come from our OL-ECB TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello. I am Karen Widmer, a 4th year doctoral student in the Evaluation program at Claremont Graduate University. I’ve been developing and evaluating systems for performance (business, education, healthcare, and nonprofits) for a long time. I think organizations are a lot like organisms. While each organization is unique, certain conditions help them all grow. I get enthusiastic about designing evaluations that optimize those conditions!

Theme: My master’s research project looked at evaluation-related activities shared by high-performing organizations. For these organizations, evaluation was tied to decision making. Evaluation activity pulled together knowledge about organizational impact, direction, processes, and developments, and this fed the decisions. The challenge for evaluation is to pool the streams of organizational knowledge most relevant for each decision.

Hot Tip:

  • Evaluative thinking identifies the flow of organizational knowledge, which provides decision makers with a point of reference for quality decisions.
  • In technical language, Knowledge Flow may mediate or moderate the relationship between evaluative thinking and decision quality. Moreover, the quality of the decision could be measured by the performance outcomes resulting from it! (See the sketch below.)
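For readers who want to see what “mediate or moderate” would mean empirically, here is a minimal sketch using simulated data; the variable names, effect sizes, and the data itself are hypothetical illustrations, not results from my research:

```python
# Minimal sketch with simulated (hypothetical) data: does Knowledge Flow
# mediate or moderate the evaluative thinking (ET) -> decision quality link?
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 300
et = rng.normal(size=n)                                # evaluative thinking score
kflow = 0.6 * et + rng.normal(size=n)                  # knowledge flow, partly driven by ET
quality = 0.5 * kflow + 0.2 * et + rng.normal(size=n)  # decision quality
df = pd.DataFrame({"et": et, "kflow": kflow, "quality": quality})

# Mediation (Baron & Kenny logic): the ET coefficient should shrink once
# the candidate mediator (knowledge flow) enters the model.
total_effect = smf.ols("quality ~ et", data=df).fit()
direct_effect = smf.ols("quality ~ et + kflow", data=df).fit()
print("total ET effect: ", round(total_effect.params["et"], 2))
print("direct ET effect:", round(direct_effect.params["et"], 2))

# Moderation: a significant ET x knowledge-flow interaction would mean
# knowledge flow changes the *strength* of the ET effect.
moderation = smf.ols("quality ~ et * kflow", data=df).fit()
print("interaction p-value:", round(moderation.pvalues["et:kflow"], 3))
```

Because the simulation builds in mediation but no interaction, the direct effect comes out smaller than the total effect while the interaction term should be non-significant; real data would, of course, be needed to test either claim.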

[Figure: quality decisions]

Cool Trick:

  • Design your evaluation to follow the flow of knowledge throughout the evaluand lifecycle.
  • Document what was learned when tacit knowledge was elicited; when knowledge was discovered, captured, shared, or applied; and when knowledge regarding the status quo was challenged; a simple event log, sketched after this list, can help. (To explore further, look to the work of M. Polanyi, I. Becerra-Fernandez, and C. Argyris and D. Schön.)
  • For the organizations I looked at, these knowledge activities contained the evaluative feedback desired by decision makers. The knowledge generated at these points told decision makers what was going on.
  • For example, tacit perceptions could be drawn out through peer mentoring or a survey; knowledge could be captured on a flipchart or by software; or a team might “discover” knowledge new to the group or challenge knowledge previously undisputed.
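As a concrete illustration of the documentation idea above, here is a minimal sketch of such an event log; the schema, field names, and example entries are hypothetical, not a tool from my study:

```python
# Minimal sketch (hypothetical schema): logging knowledge-flow events so an
# evaluation can trace when knowledge was elicited, discovered, captured,
# shared, applied, or challenged across the evaluand lifecycle.
from dataclasses import dataclass
from datetime import date
from typing import List

EVENT_TYPES = {"elicited", "discovered", "captured", "shared", "applied", "challenged"}

@dataclass
class KnowledgeEvent:
    when: date
    event_type: str              # one of EVENT_TYPES
    source: str                  # e.g., "peer mentoring", "survey", "flipchart"
    summary: str                 # what was learned
    informed_decision: str = ""  # decision this evidence fed, if any

    def __post_init__(self):
        if self.event_type not in EVENT_TYPES:
            raise ValueError(f"unknown event type: {self.event_type}")

log: List[KnowledgeEvent] = [
    KnowledgeEvent(date(2013, 4, 3), "elicited", "peer mentoring",
                   "Frontline staff doubt the intake form captures client need"),
    KnowledgeEvent(date(2013, 4, 10), "challenged", "team meeting",
                   "Assumption that referrals drive enrollment was questioned",
                   informed_decision="Pilot a revised intake process"),
]

# Which events actually fed decisions? This is the evaluative feedback
# decision makers say they want.
for e in log:
    if e.informed_decision:
        print(e.when, e.event_type, "->", e.informed_decision)
```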

Conclusion: Whether captured by design or in a still shot, evaluative thinking can reveal the flow of knowledge critical to decisions about outcomes. Knowledge Flow offers a framework for connecting evaluation with the insights decision makers want for reflection and adaptive response. Let’s talk about it!

Rad Resource: The Criteria for Performance Excellence is a great government publication that links evaluative thinking so closely with decisions about outcomes that you can’t pry them apart.

Rad Resource: A neat quote from Nielsen, Lemire, and Skov in the American Journal of Evaluation (2011) defines evaluation capacity as “…an organization’s ability to bring about, align, and sustain its objectives, structure, processes, culture, human capital, and technology to produce evaluative knowledge [emphasis added] that informs on-going practices and decision-making to improve organizational effectiveness.”

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Hi, we are Tom Archibald and Jane Buckley with the Cornell Office for Research on Evaluation. Among other initiatives, we work in partnership with non-formal educators to build evaluation capacity. We have been exploring the idea of evaluative thinking, which we believe is an essential, yet elusive, ingredient in evaluation capacity building (ECB). Below, we share insights gained through our efforts to understand, describe, measure, and promote evaluative thinking (ET)—not to be confused with the iconic alien!

Lesson Learned: From evaluation

  • Michael Patton, in an interview with Lisa Waldick of the International Development Research Centre (IDRC), defines ET as a willingness to ask: “How do we know what we think we know? … Evaluative thinking is not just limited to evaluation projects…it’s an analytical way of thinking that infuses everything that goes on.”
  • Jean King, in her 2007 New Directions for Evaluation article on developing evaluation capacity through process use, writes “The concept of free-range evaluation captures the ultimate outcome of ECB: evaluative thinking that lives unfettered in an organization.”
  • Evaluative thinkers are not satisfied with simply posing the right questions. According to Preskill and Boyle’s multidisciplinary model of ECB in the American Journal of Evaluation in 2008, they possess an “evaluative affect.”

Lesson Learned: From other fields

Notions related to ET are common in both cognitive research (e.g., evaluativist thinking and metacognition) and education research (e.g., critical thinking), so we searched the literature in those fields and came to define ET as comprising:

  • Thinking skills (e.g., questioning, reflection, decision making, strategizing, and identifying assumptions), and
  • Evaluation attitudes (e.g., desire for the truth, belief in the value of evaluation, belief in the value of evidence, inquisitiveness, and skepticism).

Then, informed by our experience with a multi-year ECB initiative, we identified five macro-level indicators of ET:

  • Posing thoughtful questions
  • Describing and illustrating thinking
  • Active engagement in the pursuit of understanding
  • Seeking alternatives
  • Believing in the value of evaluation

Rad Resource: Towards measuring ET

Based on these indicators, we have begun developing tools (scale, interview protocol, observation protocol) to collect data on ET. They are still under development and have not yet undergone validity and reliability testing, which we hope to accomplish in the coming year. You can access the draft measures here. We value any feedback you can provide us about these tools.
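To make concrete how a scale like this might eventually be scored, here is a minimal sketch; the item-to-indicator mapping and the Likert responses are hypothetical placeholders, not our actual draft measures:

```python
# Minimal sketch (hypothetical items): scoring a draft ET scale by averaging
# 1-5 Likert responses mapped to the five macro-level indicators above.
import pandas as pd

# Hypothetical mapping of survey items to the five ET indicators
indicator_items = {
    "posing_questions": ["q1", "q2"],
    "describing_thinking": ["q3", "q4"],
    "pursuing_understanding": ["q5", "q6"],
    "seeking_alternatives": ["q7", "q8"],
    "valuing_evaluation": ["q9", "q10"],
}

# Two hypothetical respondents
responses = pd.DataFrame(
    [[4, 5, 3, 4, 5, 4, 2, 3, 5, 4],
     [2, 3, 2, 2, 3, 3, 1, 2, 3, 2]],
    columns=[f"q{i}" for i in range(1, 11)],
)

# One subscale score per indicator, plus an overall ET score
scores = pd.DataFrame(
    {name: responses[items].mean(axis=1) for name, items in indicator_items.items()}
)
scores["overall_ET"] = scores.mean(axis=1)
print(scores.round(2))
```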

Rad Resource: Towards promoting ET

One way we promote ET is through The Guide to the Systems Evaluation Protocol, a text that is part of our ECB process. It contains activities and approaches that we feel foster ET, and thus internal evaluation capacity, among the educators with whom we work.

 

Tom and Jane will be offering an AEA Coffee Break Webinar on this topic on May 31st. If you are an AEA member, go here to learn more and register. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

