AEA365 | A Tip-a-Day by and for Evaluators


Hi, I’m Chad Green, program analyst at Loudoun County Public Schools in northern Virginia.  Over the past year I’ve been seeking developmental evaluation (DE) practitioners in school districts throughout the U.S. and abroad. Recently I had the pleasure of interviewing Keiko Kuji-Shikatani (C.E.) who is an educator and internal evaluator with the Ontario Ministry of Education. She also helped launch the Credentialed Evaluator designation process for the Canadian Evaluation Society (CES).

Credentialed Evaluators (currently 394 in total) are committed to continuous professional learning, which, as Keiko explained, is also the focus of DE. More specifically, DE “supports innovation development to guide adaptation to emergent and dynamic realities in complex environments” (Patton, 2010). Keiko believes that DE is well suited to public sector work, in which adaptation and innovation are the norm as services respond to the changing realities of society.

Hot Tips:

  • The best way to introduce DE, whether to program/policy staff or senior leadership, is to be conscious that DE is about learning, and that when properly applied, evaluation capacity building is happening 24/7.
  • DE involves learning as you go, which requires evaluators to engage in systems thinking so they can zoom in and out as they work and continue to co-create innovative solutions to complex challenges.
  • DE is not evaluation light. Developmental evaluators must have a thorough knowledge of evaluation so they can facilitate user-centric use of learning (i.e., a focus on utilization) gained from the DE approach in real time to tackle complex issues.

Keiko prefers to use conventional evaluation tools like logic models to co-construct a theory of change with the team of stakeholders, resulting in a shared understanding of the evolving evaluand. What is unique here is that she insists on describing their ideas in full sentences, much like the clear language used in the AEA Evaluator Competencies, rather than in short phrases, so as to avoid the misunderstandings that arise easily when complexity is the norm in a system as large as hers.

Once the team members feel that the desired changes are plausible, she helps them co-construct the theory of action so that they can collaboratively embed evaluative thinking in the way they work and make the changes feasible. She then takes the team further into what the year looks like to identify (a) the forks in the road where evaluation rigor is fundamental and (b) the appropriate data collection methods, analysis, and user-centric use of data, so that DE, or “learning as we go,” becomes the way the team makes sense of changing circumstances.


The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PreK-12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi there fellow evaluators. We’re Peter Rudiak-Gould and Rochelle Zorzi, evaluation consultants (and obsessive seekers of better ways to do things) at Cathexis Consulting in Toronto, Canada. We’d like to share a simple trick that makes Developmental Evaluation much easier and more productive.

Developmental Evaluation sounds easy. Traditional evaluation is an exhaustively rehearsed dance routine, while Developmental Evaluation is just moving your body to the music. But when you try it, you may find yourself wishing for a choreographer! It’s hard to explain to a client exactly what Developmental Evaluation is. You can’t make any promises about the questions that will be answered or even the methods that will be used, because that will change and grow along with the program. You can’t even tell the program staff how much of their time the process will take. All of this may leave your client (and yourself!) asking: “Wait…what are we doing again?”

We have found that one simple technique can go a long way in bringing some method to the madness. We call it the “Burning Question.”

Cool Trick: Each month, focus the Developmental Evaluation process on a single “Burning Question,” formulated by the program staff with some coaching by the evaluators. The Burning Question is the single most important thing that the program staff need to figure out that month, in order to design or run the program effectively.

Hot Tip: We recommend actually using the phrase “Burning Question,” because it communicates a sense of urgency and relevance.

Once the program staff have decided on a Burning Question for the month, help them to identify some very easy data collection that they can do within the month to begin to answer it. At the next monthly meeting, reflect on what’s been learned about the previous month’s Burning Question through that data collection. Then decide on the next month’s Burning Question and how it will be answered. (Next month’s Burning Question can be the same as the previous month’s Burning Question, if it still feels urgent and needs further investigation).

Lesson learned: The Burning Question technique gives a bit of structure and predictability to the Developmental Evaluation process – program staff won’t know what question they’ll be tackling each month, but at least they know that there’ll be a question! The technique also makes sure that the process focuses on information needs that really matter right now, rather than data collection for its own sake. As trust in the process builds, program staff can start to tackle more ambitious evaluation activities: bigger questions with longer-term data gathering, KPIs, a dashboard… But they won’t do any of this until they’ve gotten a flavor of evidence-based decision-making – and that’s where the Burning Question shines.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

We are Valerie Hutcherson and Rebekah Hudgins, Research and Evaluation Consultants with the Georgia Family Connection Partnership (GaFCP) (gafcp.org). Started in 1991 with 15 communities, Family Connection is the only statewide network of its kind in the nation, with collaboratives in all 159 counties dedicated to the health and well-being of families and communities. Through local collaboratives, partners are brought together to identify critical issues facing the community and to develop and implement strategies to improve outcomes for children and families. GaFCP strongly believes that collaboration and collective effort yield collective impact. Evaluation has always been a significant part of Family Connection, though capacity differs greatly across local collaboratives.

In 2013, GaFCP invited six counties to participate in a cohort focused on early childhood health and education (EC-HEED) using the Developmental Evaluation (DE) framework developed by Michael Quinn Patton (Patton, 2011, Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use). Each county was identified by GaFCP based on need and interest in developing an EC-HEED strategy, and each had the autonomy to identify collaborative partners, programs, and activities to create a strategy tailored to the needs and resources of the county. As evaluators, we recognized that the collaboratives and their strategy formation existed in a complex system with multiple partners and no single model to follow. The DE approach was the best fit for capturing data on the complexity of the collaborative process in developing and implementing their strategies. DE allows for and encourages innovation, which is a cornerstone of the Family Connection Collaborative model. Further, this cohort work gave us, as evaluation consultants, the unique opportunity to implement an evaluation system that recognized that understanding this complexity and innovation was as important as collecting child and family outcome data. With DE, the evaluator’s primary functions are to elucidate the innovation and adaptation processes, track their implications and results, and facilitate ongoing, real-time, data-based decision-making. Using this approach, we were able to engage in and document the decision-making process, the complexity of the relationships among partners, and how those interactions impact the work.

Lessons Learned: Just a few of the lessons we’ve learned are:

  1. Participants using a DE approach may not recognize real-time feedback and evaluation support as “evaluation.” Efforts must be made throughout the project to clarify the role of evaluation as an integral part of the work.
  2. Successful DE in a collaborative setting requires attention to the needs of individual partners and organizations.
  3. The DE evaluator is part anthropologist and thus must be comfortable in the emic-etic (insider-outsider) role, acting as a member of the team while also elucidating the practice and work of the team.

We’re looking forward to October and the Evaluation 2016 annual conference all this week with our colleagues in the Local Arrangements Working Group (LAWG). Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.

I am Kim Leonard, Senior Evaluation Officer at The Oregon Community Foundation. Today I want to share lessons learned from a developmental evaluation we’re undertaking for our five-year arts education grantmaking initiative – Studio to School.

The nature of this Initiative is developmental and rooted in the arts. Creativity, adaptation, and risk are supported. The first phase of the evaluation is focused on understanding and supporting the arts education programming being developed by the project teams funded through Studio to School. We are now approaching the mid-point of the Studio to School Initiative evaluation, and have learned a lot about the benefits and challenges of implementing a developmental evaluation.

Lesson Learned: Taking a developmental evaluation approach has allowed the Research team to adapt the evaluation in response to the evolution of the Initiative. It took us a little while to get used to this approach! We’ve summarized our evaluation on this handout, and find ourselves coming back to it repeatedly to keep us grounded as we plan new evaluation activities.

Lesson Learned: The Research team has worked in an ongoing way to develop rigorous evaluation activities to collect and provide useful information in a feedback loop. Robust reflection is built into the process; debrief meetings are held following each major learning community and evaluation activity to share and document learnings. These often turn into planning sessions for future evaluation and learning community efforts. In addition, the project teams are journaling electronically – quarterly reflections on what they are learning in response to prompts have been one of the most valuable data sources to date. Prompts (like this example) are developed one or two at a time, so that they are as timely and relevant as possible.

Lesson Learned: A key element of the evaluation, and a goal of the Initiative, is to surface and articulate principles of high-quality, sustainable arts education programming. We began developing principles based on the first year’s evaluation findings, and asked project teams to reflect and provide feedback on draft principles at a recent gathering. We were thrilled with how engaged the teams were in this effort. The photo below shows a project team member reviewing feedback provided on sticky notes. Attendees also placed red dots (as seen in the photo) next to those principles that most resonated with their experience. Doing this as a larger group allowed project teams to discuss their feedback and attendees to react to and comment on one another’s feedback.

Rad Resources: In addition to the excellent Developmental Evaluation Exemplars (Patton, McKegg, and Wehipeihana, 2016), we have found the Developmental Evaluation Primer and DE 201: A Practitioner’s Guide to Developmental Evaluation from the McConnell Foundation especially helpful. Additional resources are listed at http://betterevaluation.org/plan/approach/developmental_evaluation.

The American Evaluation Association is celebrating Oregon Community Foundation (OCF) week. The contributions all this week to aea365 come from OCF team members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi, we’re Tosca Bruno-van Vijfeijken (Director of the Syracuse University Transnational NGO Initiative) and Gabrielle Watson (independent evaluator). We engaged a group of practitioners at the 2015 AEA conference to talk about organizational change in International Non-Governmental Organizations (INGOs), and explore a hunch that Developmental Evaluation could help organizations manage change.

Several large INGOs are undergoing significant organizational change. These are complex processes – they’re always disruptive and often painful. The risk of failure is high. Roughly half of all organizational change processes either implode or fizzle out. A common approach is not to build in learning systems at all, but rather to take an “announce, flounder, learn” approach.

Lesson Learned: Most INGOs support change processes in three main ways: (1) external “expert” reviews; (2) CEO-level exchanges with peer organizations; (3) staff-level reviews. It is this last category – where change is actually implemented – that is least developed but where it’s most needed. Successful organizational change hinges on deep culture and mindset change.

AEA Session participants highlighted key challenges:

  • Headquarters and country staff experience change very differently
  • Frequent revisiting of decisions
  • Ineffective communication, which generates uncertainty and anxiety
  • Learning not well supported at country or implementation team level
  • Country teams retain a passive mindset when they should be more assertive
  • Excessive focus on legal and administrative matters; not enough on culture and mindset

Can organizations do better? Might Developmental Evaluation offer useful approaches and tools?

Hot Tip: Developmental Evaluation seems tailor-made for large-scale organizational change processes. It is designed for innovative interventions in complex environments when the optimum approach and end-state are not known or knowable. It involves stakeholder sense-making supported by tailored and evolving evaluative inquiry (often also participatory) to quickly test iterations, track progress, and guide adaptations. It’s designed to evolve along with the intervention itself.

Hot Tips: Session participants shared some good practices:

  • Action learning. Exchanges among implementers increased adaptive capacity and made the emotional experience of change easier
  • Pilot initiatives. Time-bound, with frequent reviews and external support
  • “Guerrilla” roll-out. Hand-picked early adopters sparked “viral” spread of new approaches

Lesson Learned: Our review suggests Developmental Evaluation can address many of the challenges of organizational change, including shifting organizational culture. Iterative participatory learning facilitates adaptations that are appropriate and owned by staff. It adds value by building a learning culture – the ultimate driver of large scale organizational change.

We are curious how many organizations are using Developmental Evaluation for their change processes, and what we can learn from this experience. Add your thoughts to the comments, or write to Tosca or Gabrielle if you have an experience to share.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


We are Nora Murphy and Keith Miller with TerraLuna Collaborative, an evaluation cooperative in the Twin Cities, Minnesota. We feel fortunate to be the evaluation partners on several large developmental evaluations.

One project we are working on seeks to support the inner wellbeing journey of seasoned social entrepreneurs. On a recent conference call, a project team member asked: “How do you know when to use the data to make a change to the program? Isn’t struggle an important part of the individual’s wellbeing journey? If we react too quickly to data and ‘fix’ everything the participant isn’t comfortable with, aren’t we minimizing their opportunities for growth?”

He’s right. I (Nora) shared my perspective that evaluation data is only one source of information that should be used when making a decision. Also important to consider are: 1) our intuition, 2) our accumulated personal and professional wisdom, and 3) the collective wisdom of the group of people seeking to use the evaluation findings.

Hot Tip: Be reflective and identify the source(s) of wisdom you are drawing on.

Reflecting on that conversation, Keith and I realized that my response was rooted in the guiding principles of a three-year partnership with the Minnesota Humanities Center, Omaha Public Schools, and The Sherwood Foundation. The guiding principles are:

  • Build and strengthen relationships;
  • Recognize the power of story and the danger of absence;
  • Learn from and with multiple voices; and
  • Amplify community solutions for change.

These principles guide how we show up as evaluators and how we do our work. Evaluation use happens when there is a foundation of trust–trust in both the results and the evaluators. We’ve learned to build trust by investing in relationships, intentionally including multiple voices, seeking absent narratives, and amplifying community ideas and solutions.

Hot Tip: Be responsive, not reactive.

Patton (2010) suggests that one role of developmental evaluators is to look for and document “forks in the road that move the program in new directions” (p. 150). As developmental evaluators we can facilitate conversations about whether the data should be used immediately because it indicates a fork in the road, or whether it is something to be aware of and track. During these conversations we can also create space for intuition and wisdom.

Lesson Learned: These guiding principles have helped us shape our role as evaluation partners and increase evaluation use. Our partners trust us to engage them in reflective conversations about what the findings mean and how they might be used.

Rad Resource: Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use, Michael Quinn Patton, Guilford (2010).

Rad Resource: Nora F. Murphy and Jennifer Tonko – How Do You Understand the Impact of the Humanities?

The American Evaluation Association is celebrating Evaluation Use (Eval Use) Topical Interest Group Week. The contributions all this week to aea365 come from our Eval Use TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

I am Ricardo Wilson-Grau, an evaluator based in Rio de Janeiro but working internationally. Increasingly, I am called upon to serve as a developmental evaluator, and I have found the concept of the “inquiry framework” (Chapter 8 in Developmental Evaluation[1]) to be invaluable for co-creating developmental evaluation questions and agreeing on how they will be answered. As Michael Quinn Patton says: “Matching evaluation questions to particular situations is the central challenge in developmental evaluation’s situational responsiveness and adaptability…”[2]

DE does not rely on any particular inquiry framework, just as its toolbox is open to a diversity of designs, methods and tools. What is appropriate depends on the innovation challenges a project, program or organization faces at a given point in time. For example, with one client I used a complexity inquiry framework to support the two-month design of a regional peace-building initiative in a continent with a track record of failures in similar attempts. Then, we considered these potential frameworks to support the first stage of implementation: a) Driving innovation with principles, b) Focusing on systems change, c) Fomenting collaboration for innovation, d) Confronting wicked problems and e) Outcome Harvesting.

In light of the developmental challenge this emerging initiative faced, there were sound reasons for using one of these frameworks or a combination of them. The client’s most pressing immediate need, however, was to know, in as close to real time as possible, what observable and verifiable changes it was influencing in actors who could not be predetermined. Thus, they chose Outcome Harvesting.

Hot Tip: Are you in a situation of social innovation that aims to influence changes in behavior writ large — from changes in individual actions to organizational or institutional changes in policies or practices? Do you need concrete evidence of those achievements as they happen, along with an understanding of whether and how the innovative efforts contributed to those changes? If yes and yes, Outcome Harvesting may be a useful inquiry framework for you.

Rad Resources: In this video, I explain the Outcome Harvesting tool in less than three minutes. There you will also find further information.

You can obtain more information about Outcome Harvesting at Better Evaluation.

To explore using the tool with a client, consider this animated PowerPoint slide to support you in operationalizing the six iterative Outcome Harvesting steps.

[1] For more on developmental inquiry frameworks, see Michael Quinn Patton, Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use, Guilford, 2011, Chapter 8.

[2] Ibid, pages 227-228.

The American Evaluation Association is celebrating Developmental Evaluation Week. The contributions all this week to aea365 come from evaluators who do developmental evaluation. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m Charmagne Campbell-Patton, Director of Organizational Learning and Evaluation for Utilization-Focused Evaluation based in Minnesota, and Evaluation & Assessment Specialist for World Savvy, a national education nonprofit that works with educators, schools, and districts to integrate global competence teaching and learning into K-12 classrooms.

World Savvy has staff in Minneapolis, San Francisco, and New York City. We have found reflective practice to be useful in integrating innovative program development, organizational development, and staff development. These three areas of development need to be aligned, occur simultaneously, and be mutually reinforcing. Developmental evaluation both tracks and supports that alignment.

Rad Resource: Model of integrated development using reflective practice.


Hot Tip: Focus the reflective practice on issues that cut across areas of development. Collaboration is a core value of World Savvy in everything we do, so we began by sharing and analyzing positive and negative experiences with collaboration. Other core values that served as the focus for reflective practice included integrity, inclusivity, excellence and (appropriately) learning & reflection.

Hot Tip: Make reflection together a regular practice. Of course, everyone is busy and it is easy to let reflective practice slide. World Savvy has committed to doing it quarterly and making time to do it well.

Hot Tip: Reflective practice involves staff learning and using skills in recounting an incident descriptively, listening attentively, identifying patterns across stories, generating insights and lessons, and identifying actions to take from what is learned. All of this is enhanced with regular practice. It is also an effective way to integrate new staff into the organization’s culture and the meaning of its core values.

Hot Tip: Ask staff to identify what they will share in advance so they come prepared. Even better, have them bring the experiences they will share in writing to contribute to the documentation of reflective practice engagement and learning.

Cool Trick: Begin each session with a review of what emerged in prior sessions to provide a sense of what has been developing.

Cool Trick: Use small groups. If facilitating the session remotely or with more than 10-15 participants, use breakout rooms or small groups as a way to create an environment more conducive to sharing.

Hot Tip: Follow through with action. Reflective practice typically yields insights with actionable implications. Failure of the program or organization to follow through can undermine interest in future reflective practice sessions.

Rad Resources:

Patton, M.Q. (2015). “Reflective practice guidelines,” in Qualitative Research and Evaluation Methods, 4th ed. (Sage), pp. 213-216.

Patton, M.Q. (2011). “Reflective practice for developmental evaluation,” in Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use (Guilford Press), pp. 265-270.

The American Evaluation Association is celebrating Developmental Evaluation Week. The contributions all this week to aea365 come from evaluators who do developmental evaluation. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi, I’m Nora F. Murphy, a developmental evaluator and co-founder of TerraLuna Collaborative. Qualitative Methods have been a critical component of every developmental evaluation I have been a part of. Over the years I’ve learned a few tricks about making qualitative methods work in a developmental evaluation context.

Hot Tip: Apply systems thinking. When using developmental evaluation to support systems change, it’s important to apply systems thinking. When thinking about the evaluation’s design and methods, I am always asking: Where are we drawing the boundaries in this system? Whose perspectives are we seeking to understand? What are the important inter-relationships to explain? And who benefits or is excluded by the methods that I choose? Qualitative methods can be time and resource intensive, and we can’t understand everything about systems change. But it’s important, from a methodological and ethical perspective, to be intentional about where we draw the boundaries, whose perspectives we include, and which inter-relationships we explore.

Hot Tip: Practice flexible budgeting. I typically budget for qualitative inquiry but create the space to negotiate the details of that inquiry. In one project I budgeted for qualitative inquiry that would commence six months after the contract was finalized. It was too early to know how the strategy would develop and what qualitative method would be best for learning about the developing strategy. In the end we applied systems thinking and conducted case studies that looked at the developing strategy in three ways: from the perspective of individual educators’ transformation, from the perspective of educators participating in school change, and from the perspective of school leaders leading school change. It would have been impossible to predict that this was the right inquiry for the project at the time the budget was developed.

Hot Tip: Think in layers. The pace of developmental evaluations can be quick, and there is a need for timely data and for spotting patterns as they emerge. But often there is also a need for a deeper look at what is developing, using a method that takes more time. So I think in layers. With the case studies, for example, the first layer was structuring the post-interview memos so they could be used with program developers to spot emergent patterns, framing the memos around pattern-surfacing prompts such as: “I was surprised… A new concept for me was… This reinforced for me… I’m wondering…” The second layer was sharing individual case studies. The third layer was the cross-analysis that surfaced deeper themes. Throughout, we engaged various groups of stakeholders in the meaning making and pattern spotting.


The American Evaluation Association is celebrating Developmental Evaluation Week. The contributions all this week to aea365 come from evaluators who do developmental evaluation. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m Nan Wehipeihana, a member of the Kinnect Group in New Zealand and co-editor of a forthcoming book on Developmental Evaluation Exemplars. I want to share what we have learned about roles and responsibilities in Developmental Evaluation (DE) by reviewing the reflective practice experiences of DE practitioners.

Hot Tip: Understand the implications of DE being embedded in innovative and complex situations. This means:

  • Evaluative advice is ongoing, iterative, rapid and adaptive.
  • The evaluator can expect to work closely and collaboratively on the development of the innovation as well as the evaluation.
  • The DE evaluator will play a number of roles and innovators will become evaluators.
  • The tools and approaches the DE evaluator will draw on will come from many fields and disciplines.

Hot Tip: Because DE is collaborative, it is fundamentally relationship-based. This makes clarity about roles and responsibilities essential.

Rad Resource: Know and practice 5 critical DE roles and responsibilities:


Hot Tip: Four practice-based ways of building the credibility and utility of DE:

  1. Identify, develop, and use an inquiry organizing framework.
  2. Layer and align data with the organizing framework.
  3. Time data collection, reporting, and sense making to meet the needs of key stakeholders.
  4. Engage in values-based collaborative sense making.

Cool Tricks: Effective and experienced DE practitioners do the following:

  1. Place priority on deep understanding of the context
  2. Appreciate the varying needs of different stakeholders in relation to the innovation
  3. Build and nurture relational trust between the evaluator and the social innovators and funders
  4. Build a deep well of evaluation and methodological experience
  5. Maintain clarity of purpose – supporting innovation development and adaptation

The American Evaluation Association is celebrating Developmental Evaluation Week. The contributions all this week to aea365 come from evaluators who do developmental evaluation. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
