AEA365 | A Tip-a-Day by and for Evaluators

TAG | developmental evaluation

Hi, I’m Nora F. Murphy, a developmental evaluator and co-founder of TerraLuna Collaborative. Qualitative methods have been a critical component of every developmental evaluation I have been a part of. Over the years I’ve learned a few tricks for making qualitative methods work in a developmental evaluation context.

Hot Tip: Apply systems thinking. When using developmental evaluation to support systems change, it’s important to apply systems thinking. When thinking about the evaluation’s design and methods, I am always asking: Where are we drawing the boundaries in this system? Whose perspectives are we seeking to understand? What are the important inter-relationships to explain? And who benefits from, or is excluded by, the methods that I choose? Qualitative methods can be time- and resource-intensive, and we can’t understand everything about systems change. But it’s important, from both a methodological and an ethical perspective, to be intentional about where we draw the boundaries, whose perspectives we include, and which inter-relationships we explore.

Hot Tip: Practice flexible budgeting. I typically budget for qualitative inquiry but create the space to negotiate the details of that inquiry. In one project I budgeted for qualitative inquiry that would commence six months after the contract was finalized. It was too early to know how the strategy would develop and what qualitative method would be best for learning about the developing strategy. In the end we applied systems thinking and conducted case studies that looked at the developing strategy in three ways: from the perspective of individual educators’ transformation, from the perspective of educators participating in school change, and from the perspective of school leaders leading school change. It would have been impossible to predict that this was the right inquiry for the project at the time the budget was developed.

Hot Tip: Think in layers. The pace of developmental evaluations can be quick, and there is a need for timely data and for spotting patterns as they emerge. But often there is also a need for a deeper look at what is developing, using a method that takes more time. So I think in layers. With the case studies, for example, the first layer was structuring the post-interview memos so they could be used with program developers to spot emergent patterns, framing the memos around pattern-surfacing prompts such as: “I was surprised… A new concept for me was… This reinforced for me… I’m wondering…” The second layer was sharing individual case studies. The third layer was the cross-case analysis that surfaced deeper themes. Throughout, we engaged various groups of stakeholders in the meaning making and pattern spotting.

Rad Resources:

The American Evaluation Association is celebrating Developmental Evaluation Week. The contributions to aea365 all this week come from evaluators who practice developmental evaluation. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m Nan Wehipeihana, a member of the Kinnect Group in New Zealand and co-editor of a forthcoming book on Developmental Evaluation Exemplars. I want to share what we have learned about roles and responsibilities in Developmental Evaluation (DE) by reviewing the reflective practice experiences of DE practitioners.

Hot Tip: Understand the implications of DE being embedded in innovative and complex situations. This means:

  • Evaluative advice is ongoing, iterative, rapid and adaptive.
  • The evaluator can expect to work closely and collaboratively on the development of the innovation as well as the evaluation.
  • The DE evaluator will play a number of roles and innovators will become evaluators.
  • The tools and approaches the DE evaluator will draw on will come from many fields and disciplines.

Hot Tip: Because DE is collaborative, it is fundamentally relationship-based. This makes clarity about roles and responsibilities essential.

Rad Resource: Know and practice 5 critical DE roles and responsibilities:

[Image: five critical DE roles and responsibilities]

Hot Tip: Four practice-based ways of building the credibility and utility of DE:

  1. Identify, develop, and use an inquiry organizing framework.
  2. Layer and align data with the organizing framework.
  3. Time data collection, reporting, and sense-making to meet the needs of key stakeholders.
  4. Engage in values-based collaborative sense-making.

Cool Tricks: Effective and experienced DE practitioners do the following:

  1. Place priority on deep understanding of the context
  2. Appreciate the varying needs of different stakeholders in relation to the innovation
  3. Build and nurture relational trust between the evaluator and the social innovators and funders
  4. Build a deep well of evaluation and methodological experience
  5. Maintain clarity of purpose – supporting innovation development and adaptation


I’m Kate McKegg, Director of The Knowledge Institute Ltd, member of the Kinnect Group, and co-editor of a forthcoming book on Developmental Evaluation Exemplars. I want to share what we have learned about readiness for developmental evaluation (DE) by reviewing the experiences of and lessons from DE practitioners.

DE isn’t appropriate for every situation. So, when we suggest that a client or community undertake a developmental evaluation, we begin by jointly assessing appropriateness and readiness.

Rad Resource: Differentiate appropriate from inappropriate DE situations.

[Image: appropriate vs. inappropriate DE situations]

Hot Tip: Readiness extends to evaluators. Developmental evaluators need a deep and diverse methodological toolkit and the ability to be methodologically agile.

Hot Tip: Be prepared to use multiple methods from different disciplines, contexts and cultures, and to be adept enough to develop and adapt methods and approaches to work better in different contexts.

Hot Tip: Know and practice the three DE dispositions.

  1. Embrace unknowability: be comfortable not knowing in advance a sure destination or a known pathway to tread; acknowledge the risks and go anyway.
  2. Develop an enquiring mindset, where the DE evaluator and others in the innovation team are open to possibilities, multiple perspectives, puzzles and learning.
  3. Be ready to persevere, to begin an unknown journey and stick with it.

Hot Tip: DE is relational – alignment of values is essential

Alignment of values (between the initiative and the DE evaluator) is essential for a DE journey; it’s shared values and trust that create the glue that holds people in relationship with each other.

Hot Tip: Readiness applies to both organizations engaged in innovation and developmental evaluators. Look for readiness alignment.

Rad Resource: Organizational readiness aligned with evaluator readiness

[Image: organizational readiness aligned with evaluator readiness]

Cool Trick: Be honest that DE can be hard, is not appropriate for every situation, requires readiness and perseverance, and sometimes even courage.


My name is Michael Quinn Patton and I am an independent evaluation consultant based in Minnesota but working worldwide. In the last few months I have been editing a book on Developmental Evaluation Exemplars with Kate McKegg and Nan Wehipeihana. (The book will be out in September.) Tomorrow Kate will share what the Developmental Evaluation (DE) cases we’ve reviewed and analyzed reveal about readiness for DE. The following day Nan will share what we’ve learned about developmental evaluator roles and responsibilities. The rest of the week will include reflections from three more developmental evaluators. Today I’m going to introduce the principles of DE that have emerged from this collaborative work with DE practitioners.

Hot Tip: Understand the specific niche of DE. DE provides evaluative information and feedback to social innovators, and their funders and supporters, to inform adaptive development of change initiatives in complex dynamic environments.

Rad Resource: Eight Essential Principles of Developmental Evaluation

  1. Developmental purpose
  2. Evaluation rigor
  3. Utilization focus
  4. Innovation niche
  5. Complexity perspective
  6. Systems thinking
  7. Co-creation
  8. Timely feedback

Hot Tip: The principles are inter-related and mutually reinforcing. The developmental purpose (#1) frames and focuses evaluation rigor (#2), just as rigor informs and sharpens understanding of what’s being developed. Being utilization-focused (#3) requires actively engaging with social innovators as primary intended users and staying attuned to the developmental purpose of the evaluation as the priority. The innovation niche of DE (#4) necessitates understanding the situation and what is developed through the lens of complexity (#5), which in turn requires understanding and applying systems thinking (#6) with timely feedback (#8). Utilization-focused engagement involves collaborative co-creation (#7) of both the innovation and the empirically based evaluation, making the developmental evaluation part of the intervention.


Cool Trick: Work with social innovators, funders, and others involved in social innovation and DE to determine how the principles apply to a particular developmental evaluation. This increases their relevance based on contextual sensitivity and adaptation, while illuminating the practical implications of applying guiding DE principles to all aspects of the evaluation.

Rad Resources:


 


I’m Bethany Laursen, Evaluation Outreach Specialist with the Solid & Hazardous Waste Education Center (SHWEC) at the University of Wisconsin. I’m also principal consultant at Laursen Evaluation and Design, LLC. At SHWEC, I help staff design programs that engage opportunities to achieve our mission. Opportunity hunting requires a form of situation assessment, which has not been widely or deeply discussed in evaluation—especially when it comes to evaluating opportunities in complex, dynamical situations.

Rad Resource: AEA’s EvalTalk and TIG group listservs as peer learning communities.

Through EvalTalk, several colleagues helped me distinguish among three approaches/tools that all claim to be useful in developing programs in complex situations: needs assessment (NA), developmental evaluation (DE), and strengths, weaknesses, opportunities and threats (SWOT) analysis.

Lesson Learned: NA, DE and SWOT are all necessary parts of evaluating complex situations and program responses.

To summarize this discussion so far, we have the following options, where parentheses mean “as a part of”; for example, SWOT(NA) means the NA is conducted as a part of the SWOT:

  1. NA → SWOT → DE
  2. SWOT(NA) → DE
  3. NA → DE(SWOT)
  4. DE(NA, SWOT)

Any of these combinations is logical, although #4 might be difficult without one of the others occurring first. What is not logical is leaving one of the triumvirate out (NA, DE, and SWOT). Here’s why:

SWOT is inherently evaluative: it assigns data a value label (S, W, O, or T) based on the criterion “effect on our organization’s goals.” Clearly, we need data to do a reality-based SWOT, and this is why we must include a needs assessment. But an NA per se is not going to provide enough data, because many clients think an NA is just about external stakeholders’ needs (Os), not internal capacity (Ss and Ws) or larger system realities (often Ts). (If preferred, one could also frame an NA as an ‘asset assessment.’) These external and internal ‘lessons learned’ from our situation should inform developmental program evaluation.

In complex situations, needs assessment is more usefully framed as ongoing situation assessment. This is what I see as the main evaluation task in the Creative Destruction phase of the adaptive cycle. Once we have the lay of the land (situation assessment) and we’ve evaluated the best path to start down (SWOT analysis), we can jump into developmental evaluation of that path. Of course, what we find along the way might cause us to re-assess our situation and strategy, which is why #4 above is a logical choice.

Lesson Learned: Listen to the language your clients are using to identify relevant evaluation approaches and tools. In SHWEC’s case, our connection to the business sector led me to SWOT analysis, strategic planning, and Lean Six Sigma, all of which are evaluative without necessarily marketing themselves as evaluation approaches.

Figure 1: Augmenting a traditional logic model, this is a metaphorical picture of how SHWEC understands our complex, dynamical situation and our potential evaluation questions. (Each sailboat is a staff member.) Next, I had to find evaluation approaches that would fit.




Greetings to the aea365 community. I am Randi Ilyse Roth, Executive Director of Philanthropy for the Otto Bremer Foundation, based in Saint Paul, Minnesota. The Foundation’s mission is to assist people in achieving full economic, civic, and social participation in, and for the betterment of, their communities, in places where there are Bremer Banks.

Not long ago trustees and staff of the Otto Bremer Foundation engaged in a series of learning seminars on evaluation. In order to make the core concepts easily accessible and retrievable, we asked Michael Quinn Patton, who led these seminars, to create a set of basic reference cards. These became the Evaluation Flash Cards presented here, with the idea that a core concept can be revisited in a flash. Thus, each is a single page. Illustrations of the concepts are drawn from Otto Bremer Foundation grants. We hope this resource is useful to other organizations committed to understanding and improving the results of the programs they support.

Rad Resource: There are 25 Evaluation Flash Cards.  Here are examples of the subjects addressed.

  1. Evaluative Thinking
  2. Evaluation Questions
  5. Evaluation vs. Research
  12. Qualitative Evaluation
  18. Developmental Evaluation
  19. The “It” Question
  20. Fidelity or Adaptation
  24. Utilization-Focused Evaluation
  25. Distinguish Different Kinds of Evidence

The full set of flash cards is available here.

Hot Tip:  The Foundation’s Trustees and Staff were involved in the evaluation seminars that led to the evaluation flash cards. So, everyone in the Foundation understands, values, and uses the evaluation flash cards.

Hot Tip: We are not evaluators and we don’t have an evaluation position on our staff. But we need to think evaluatively. Evaluation textbooks are not a usable resource for quick reference when an evaluation issue arises. The flash cards provide rapid access to core evaluation ideas and reinforce our commitment to evaluative thinking, the very first flash card in the set.

Cool Trick: We have posted the Evaluation Flash Cards on our website so that others, including our grantees and partners, can share concepts and language as we work together and think evaluatively in our collaborations.

Rad Resource:  We invite you to use the Evaluation Flash Cards and let us know how you use them.

Here’s the blog post that launched the flash cards.

Clipped from http://www.ottobremer.org/news/ottoblog/april-16-2014/new-evaluation-resource-community-organizations-and-funders

This week, we’re diving into issues of Developmental Evaluation (DE) with contributions from DE practitioners and authors. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi, I’m Nora F. Murphy, co-founder of TerraLuna Collaborative, an evaluation cooperative in Minneapolis, MN, with some thoughts about practicing Developmental Evaluation. Like many, I eagerly read Michael Quinn Patton’s Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use from cover to cover. The approach excited me because it naturally taps into my chosen profession of evaluator and my past experience as a nonprofit program manager. Upon reaching the last page I was left feeling inspired, but also as if I had read a riveting book about learning to ride a bike and still had no idea how to actually ride a bike. So I took a deep breath and jumped right in, carrying Patton’s book with me and consulting it often. What follows are two things I do differently as a developmental evaluation practitioner.

Flexible and responsive reporting. Working as a nonprofit program manager, I experienced what it’s like to get information at the wrong time—such as formative reports presented after all of the next year’s program planning has occurred. Because DE seeks to enhance innovation and support evaluation use, timing is everything. Rather than producing findings at arbitrary time points, we seek to share findings at times that naturally support development. We might share the findings of a relevant survey before a staff planning session, facilitate an activity at a stakeholder retreat about patterns we are observing, or share a PowerPoint report at a board meeting. As opportunities for stakeholder engagement emerge, we revisit and revise our reporting timeline.

Hot Tip: Our evaluation teams often keep a calendar of agencies’ events so we can plan evaluation activities accordingly, allowing us to support development by providing the right information at the right time.

Pay attention to the “it” that is being developed. Oftentimes the “it” (the evaluand) being developed in a developmental evaluation is an approach, a set of strategies, or a collaboration, rather than a clearly defined program. With rapidly developing innovative approaches, it can be challenging to understand where the boundaries around the evaluand are. And once you have it figured out, it might change. For example, if the evaluation is supporting the development of an innovative approach to working with students, should the evaluation focus on what happens in classrooms? In schools? In the district? In the community? And in community-driven or participatory work, who gets to decide? Engagement in the Adaptive Action cycle helps our team focus this line of inquiry.

Hot Tip: It’s helpful to periodically revisit the simple question: What is the “it” that is being developed?

Rad Resource: Adaptive Action: Leveraging Uncertainty in Your Organization by Eoyang and Holladay


I’m Kate McKegg, Director of The Knowledge Institute Ltd and a member of the Kinnect Group with Nan Wehipeihana. We want to share what we have learned about explaining developmental evaluation (DE).

Evaluation isn’t something that our clients or our communities fully understand, and it can create anxiety. So, when we suggest that a client or community undertake a developmental evaluation, this can be extra puzzling for folks.

Rad Resource: We usually begin by reinforcing some key messages about what evaluation is:

[Image: key messages about what evaluation is]

Hot Tip: In our experience, stressing the importance of systematic, well-informed evaluative reasoning is a key step in convincing people that DE is evaluative, and not just some kind of continuous quality improvement process.

Hot Tip:  We explain why we think DE is best suited to their situation, meaning:

  • There is something innovative going on, something is in development and people are exploring, innovating, trying things out and creating something they hope will make a difference
  • The situation is socially and/or technically complex, and rapidly changing.  People are experimenting with new ideas, new ways of doing things, approaches, different relationships and roles – and this is likely to be happening for a while
  • There is a high degree of uncertainty about what is likely to work, in terms of process, practice and outcomes.  Which pathway the initiative might take is not yet clear, i.e., what the future holds is still unknown
  • The situation is emergent, i.e., there are continually emerging questions, challenges, successes and issues for people to deal with in real time.

Hot Tip: Finally, we explain the key features of DE. We typically focus on the following four features:

  • DE has a systems orientation, i.e., understanding a DE challenge systemically involves paying attention to relationships, different perspectives, and boundaries; this orientation is ideally suited to working with complexity and emergence
  • DE involves cycles of learning to inform action using real-time data, as part of an ongoing process of development – probing, venturing, sensing, learning, and re-learning

Rad Resource: Adaptive action and reflection graphic:

[Image: adaptive action and reflection graphic]

  • DE typically has an emergent evaluation design, so that it can respond to changing needs, issues, and challenges as they arise
  • With DE, the evaluator typically becomes part of the team, bringing together evaluative thinking and evidence in ways that support key stakeholders to understand the quality and value of something in real time.

Rad Resource: The Australasian Evaluation Society (AES) Best Evaluation Policy and Systems Award, 2013, was for a Developmental Evaluation we conducted of He Oranga Poutama, a Māori sport and recreation initiative. You can read about it here.



Greetings to the aea365 community. We are Heather Huseby, Deborah Loon, Major Arnel Ruppel, Matt Lasure, Denise Smieja, and Andrea Simonett, representing six agencies in Minneapolis and Saint Paul, Minnesota, that serve homeless youth.

Rad Resource: Through an 18-month developmental evaluation process our six quite distinct agencies collaborated to develop and evaluate nine guiding principles to help youth overcome homelessness.

The principles begin with the perspective that youth experiencing homelessness are on a journey; all of our interactions with these youth are filtered through that journey perspective. This means we must be trauma-informed and non-judgmental, and work to reduce harm. By holding these principles, we can build a trusting relationship that allows us to focus on youths’ strengths and opportunities for positive development. Through all of this, we approach youth as whole beings through a youth-focused collaborative system of support.

The full set of principles, how we developed them, the evidence for their effectiveness, and how we are using them is available here.

Hot Tip:  All collaboration members were involved in every aspect of the developmental evaluation from selecting the focus of inquiry to designing the evaluation, interpreting findings, and finalizing the report.

Hot Tip: We met together every month to keep the momentum of the project going.

Cool Trick: Working together on the developmental evaluation solidified our collaboration, gave us a common focus, and facilitated building trust and mutual understanding, which continue.

Rad Resource:  The developmental evaluation was also a doctoral dissertation by Nora Murphy, University of Minnesota.



Greetings to the aea365 community. We are with the Blandin Foundation, based in Grand Rapids, Minnesota, with the mission of strengthening rural Minnesota communities, especially in the Grand Rapids area. We are Wade Fauth, Vice President, and Allison Ahcan, Director of Communications.

Since 2007, Blandin Foundation has committed itself to building an organization-wide assessment system that contributes to improved performance and adaptation to a changing world.  In 2013, Blandin Foundation engaged evaluation expert and author Michael Q. Patton to take its assessment work to a new level.  Together we explored how the field of “developmental assessment” might strengthen the work of the foundation, and how all of the various assessments at play might work together.

Rad Resource: A simple graphic has proven valuable for understanding and describing what is, admittedly, a complex system. Blandin Foundation’s Mountain of Accountability© summarizes its three levels of accountability and the interconnections among them. The journey to the summit (mission fulfillment) begins in the foothills of basic accountability. From there, the ascent leads up to the middle of the mountain, where more complexity and commitment are involved. The final level leading to the summit, with its holistic and comprehensive panorama, offers no pre-set trail. This is first-ascent territory, where the conditions along the route and what has been learned along the way combine to inform further learning and guide the way to the summit. This is the territory of Developmental Evaluation.

[Image: Blandin Foundation’s Mountain of Accountability graphic]

 

Hot Tip:  The Foundation’s entire senior leadership team was involved in developing the Mountain of Accountability through a developmental evaluation reflective practice process. So, the entire leadership team understands, feels ownership of, and uses the Mountain of Accountability. 

Hot Tip: The Foundation’s Trustees have devoted time to reflective practice framed by the Mountain of Accountability.

Cool Trick: On our website, where the Mountain of Accountability is posted and explained, we invite comments. We would be interested in your reactions and any uses you make of the graphic. Please click here to add your comments.

Rad Resource:  We invite you to use the Mountain of Accountability and let us know how you use it.  It is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

