AEA365 | A Tip-a-Day by and for Evaluators

Tag: developmental evaluation

Hi, I’m Chad Green, program analyst at Loudoun County Public Schools in northern Virginia. Over the past year I’ve been seeking developmental evaluation (DE) practitioners in school districts throughout the U.S. and abroad. Recently I had the pleasure of interviewing Keiko Kuji-Shikatani (C.E.), an educator and internal evaluator with the Ontario Ministry of Education. She also helped launch the Credentialed Evaluator designation process for the Canadian Evaluation Society (CES).

Credentialed Evaluators (currently 394 in total) are committed to continuous professional learning, which, as Keiko explained, is also the focus of DE. More specifically, DE “supports innovation development to guide adaptation to emergent and dynamic realities in complex environments” (Patton, 2010). Keiko believes that DE is well suited to public sector work, in which adaptation and innovation are the norm in providing services given the changing realities of society.

Hot Tips:

  • The best way to introduce DE, whether to program/policy staff or senior leadership, is to be conscious that DE is about learning, and that when properly applied, evaluation capacity building is happening 24/7.
  • DE involves learning as you go, which requires evaluators to engage in systems thinking so they can zoom in and out as they work and continue to co-create innovative solutions to complex challenges.
  • DE is not evaluation light. Developmental evaluators must have a thorough knowledge of evaluation so they can facilitate user-centric use of the learning gained from the DE approach (i.e., a focus on utilization) in real time to tackle complex issues.

Keiko prefers to use conventional evaluation tools like logic models to co-construct a theory of change with the team of stakeholders, resulting in a shared understanding of the evolving evaluand. What is unique here is that she insists on describing their ideas in full sentences rather than short phrases, much like the clear language used in the AEA Evaluator Competencies, to avoid the misunderstandings that arise easily when complexity is the norm in huge systems such as hers.

Once the team members feel that the desired changes are plausible, she helps them co-construct the theory of action so that they can collaboratively embed evaluative thinking in the way they work and make the changes feasible. She then takes the team further into what the year looks like to identify (a) the forks in the road where evaluation rigor is fundamental and (b) the appropriate data collection methods, analysis, and user-centric use of data, so that DE or “learning as we go” becomes the way the team makes sense of changing circumstances.

Rad Resources:

The American Evaluation Association is celebrating Ed Eval TIG Week with our colleagues in the PreK-12 Educational Evaluation Topical Interest Group. The contributions all this week to aea365 come from our Ed Eval TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi there fellow evaluators. We’re Peter Rudiak-Gould and Rochelle Zorzi, evaluation consultants (and obsessive seekers of better ways to do things) at Cathexis Consulting in Toronto, Canada. We’d like to share a simple trick that makes Developmental Evaluation much easier and more productive.

Developmental Evaluation sounds easy. Traditional evaluation is an exhaustively rehearsed dance routine, while Developmental Evaluation is just moving your body to the music. But when you try it, you may find yourself wishing for a choreographer! It’s hard to explain to a client exactly what Developmental Evaluation is. You can’t make any promises about the questions that will be answered or even the methods that will be used, because these will change and grow along with the program. You can’t even tell the program staff how much of their time the process will take. All of this may leave your client (and yourself!) asking: “Wait…what are we doing again?”

We have found that one simple technique can go a long way in bringing some method to the madness. We call it the “Burning Question.”

Cool Trick: Each month, focus the Developmental Evaluation process on a single “Burning Question,” formulated by the program staff with some coaching by the evaluators. The Burning Question is the single most important thing that the program staff need to figure out that month, in order to design or run the program effectively.

Hot Tip: We recommend actually using the phrase “Burning Question,” because it communicates a sense of urgency and relevance.

Once the program staff have decided on a Burning Question for the month, help them to identify some very easy data collection that they can do within the month to begin to answer it. At the next monthly meeting, reflect on what’s been learned about the previous month’s Burning Question through that data collection. Then decide on the next month’s Burning Question and how it will be answered. (Next month’s Burning Question can be the same as the previous month’s, if it still feels urgent and needs further investigation.)

Lesson Learned: The Burning Question technique gives a bit of structure and predictability to the Developmental Evaluation process – program staff won’t know what question they’ll be tackling each month, but at least they know that there’ll be a question! The technique also makes sure that the process focuses on information needs that really matter right now, rather than data collection for its own sake. As trust in the process builds, program staff can start to tackle more ambitious evaluation activities: bigger questions with longer-term data gathering, KPIs, a dashboard… But they won’t do any of this until they’ve gotten a flavor of evidence-based decision-making – and that’s where the Burning Question shines.


Hi, I’m Nora F. Murphy, co-founder of TerraLuna Collaborative. I fell in love with principles-focused evaluation when I first practiced it in 2012, as a mother first and then as an evaluator.

In “Principles-Focused Evaluation: The GUIDE”, I share what it was like to make sense of life after my four-year-old son died. Reflecting on my parenting, it was clear that rules didn’t matter—not the number of vegetables he ate, screen minutes he viewed, or vocabulary words he heard. So what did matter? I loved him fiercely. I have no doubt he knew that. I valued his imagination, his laugh, his questions, the spark in his eye. I valued him and what he brought to our family and the world. When I could, I kept him safe. But I couldn’t always keep him safe, and when I couldn’t, I walked with him when he was scared. My son knew he was loved, he was valued, and that I would walk with him when he was scared. That’s what truly mattered. These have become my personal guiding principles.

As I made my way from grieving back to life, I wondered: What is my purpose? Is it possible to find and fulfill my purpose through evaluation? If so, how? Can I reimagine myself as an evaluator who works in alignment with who I am becoming, not who I was? Waking Lumina, my guiding principles for professional engagement, emerged. These are:

  1. Engage heart, mind, and spirit in all aspects of living my life: my relationship with myself, my relationship with others, my work, and the decisions I make.
  2. Make choices that let my light shine more brightly, and engage with others in a way that supports their ability to shine more brightly.
  3. Build and deepen connections between and amongst people, spirit, nature, passion and purpose.
  4. Increase social justice and equity, recognizing my privilege and the opportunities it affords me to create change.
  5. Inspire and be inspired.

Since 2012, I have conducted numerous principles-focused developmental evaluations for social-justice-oriented systems change. I engage my Waking Lumina principles to guide how I approach the work, and I help people in complex systems discover and define their own guiding principles. People I’m working with are ecstatic, relieved, curious, or all of the above when I describe a principles-focused approach. They find that principles allow us to work together while seeing the world as it actually is and people as they are, to bring people together around hard issues without asking for complete agreement or uniformity, and to provide a framework for coherent systems change with room for adaptation. It’s the most human way of practicing evaluation I’ve ever experienced.

Rad Resource: Read more about the first principles-focused developmental evaluation from the 2014 AEA365 DE Week post, Homeless Youth Collaborative on Developmental Evaluation.

Hot Tip: Follow the Waking Lumina blog to learn more about how the Waking Lumina guiding principles play out in life and evaluation: www.wakinglumina.com

Rad Resource: Principles-Focused Evaluation: The GUIDE.

The American Evaluation Association is celebrating Principles-Focused Evaluation (PFE) week. All posts this week are contributed by practitioners of a PFE approach.


Hello! I’m Dani de García, Director of Performance Evaluation, Innovation, and Learning for Social Impact, an international development management consulting firm. We’re working to innovate within the international evaluation space, especially with evaluation approaches. One of our contracts pilots Developmental Evaluation (DE) at the US Agency for International Development (USAID). We’re trying to see if, how, and when DE is feasible and useful for USAID. I’ll use this contract to illustrate some challenges to implementing innovative approaches, and tips we’re learning on how to overcome them.

Challenge: Bureaucracy can stifle innovation.

Hot Tip: Don’t rush into an innovation until you know whether it’s feasible to implement well. For DE, if the activity is unable to adapt based on what we’re finding, it doesn’t make sense for us to use that approach. So, do your due diligence. Figure out what the opportunities and barriers are. Only move forward if the innovation will truly meet the users’ needs and isn’t just innovation for innovation’s sake.

Challenge: Users don’t want to be guinea pigs for new approaches.

Some call this the penguin effect: everyone wants to see another penguin jump off the ledge into the water before following suit.

Hot Tip: Find what relevant examples you can, even if they’re not the exact same sector or innovation. Show what the innovation looks like in a tangible sense. For us, that meant putting together memos detailing options of what DE could look like for their scenario. We highlighted what data collection would look like, who would be involved, and examples of deliverables for each option.

Challenge: New approaches (or more rigorous ones) can be expensive!

Hot Tip: Be upfront about the costs and benefits. There are many times when innovative approaches are not the right solution for users’ needs. Other times, these investments can save a lot of money in the long run. For us, this means turning down teams who are interested in DE but don’t have the resources we believe are necessary to meet their needs. We have found it helpful to reframe DE to highlight its potential contributions to design/implementation elements rather than just the evaluation side of things.

Challenge: Expectations are lofty (and may not be aligned with what you’re offering).

Hot Tip: Get everyone in the same place to talk about what an innovation can and cannot achieve (and be realistic with yourself about what’s feasible). In our case, we hold initial scoping discussions with stakeholders to understand their needs, educate them about DE, and talk explicitly about what DE can and cannot do. Once the DEs are underway, we reinforce this through workshops that seek to get stakeholders on the same page.

To learn more about this and other examples, consider attending the ICCE AEA session on November 11th: 1472: Challenges to adopting innovations in Monitoring, Evaluation, Research and Learning (and potential solutions!).

The American Evaluation Association is celebrating International and Cross-Cultural (ICCE) TIG Week with our colleagues in the International and Cross-Cultural Topical Interest Group. The contributions all this week to aea365 come from our ICCE TIG members.

Hi there, I’m Erin Bock, Director of Research and Evaluation for The Sherwood Foundation in Omaha, Nebraska. I am writing to honor Nora Murphy as a leader in “Next Generation” evaluation.

Why I chose to honor this evaluator:

I met Nora when I was an evaluation newbie. I was struck by her “special sauce” of authenticity, passionate commitment to social justice, and deep love for people of all kinds. She is on the theoretical front lines of applying developmental evaluation to wicked problems using principles as an anchor. I was fortunate to meet her because her approach was a perfect match for the causes that Sherwood cares about. I personally have grown from our relationship.

Contributions to our field:

Nora’s contributions to our field are many…

  • The idea that initiatives can roll out complex processes across multiple programs, agencies, or levels of participation, yet find alignment and forward progress using principles, is our next frontier as a field. Nora has been among the first to attempt this form of developmental evaluation. She won the “Michael Scriven Dissertation Award for Outstanding Contribution to Evaluation Theory, Methodology, or Practice” for this work.
  • Nora and her TerraLuna Collaborative co-founders are changing the way the business of evaluation is run. TerraLuna employees are voted in and they work to “collaboratively heal our communities (and ourselves) and eliminate inequities by helping organizations create the impact to which they aspire.” (TerraLuna website)
  • In partnership with their clients, TerraLuna makes space for local diversity evaluation fellowships. This builds local capacity for evaluative thinking in the communities where collective action is taking place.

Resources:
Ways to learn more about Nora’s work…

The American Evaluation Association is celebrating Labor Day Week in Evaluation: Honoring Evaluation’s Living Pioneers. The contributions this week are tributes to our living evaluation pioneers who have made important contributions to our field and even positive impacts on our careers as evaluators.

We are Valerie Hutcherson and Rebekah Hudgins, Research and Evaluation Consultants with the Georgia Family Connection Partnership (GaFCP) (gafcp.org). Started with 15 communities in 1991, Family Connection is the only statewide network of its kind in the nation, with collaboratives in all 159 counties dedicated to the health and well-being of families and communities. Through local collaboratives, partners are brought together to identify critical issues facing the community and to develop and implement strategies to improve outcomes for children and families. GaFCP strongly believes that collaboration and collective effort yield collective impact. Evaluation has always been a significant part of Family Connection, though capacity within each local collaborative greatly differs.

In 2013, GaFCP invited six counties to participate in a cohort focused on early childhood health and education (EC-HEED) using the Developmental Evaluation (DE) framework developed by Michael Quinn Patton (Patton, 2011, Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use). Each county was identified by GaFCP based on need and interest in developing an EC-HEED strategy, and each had the autonomy to identify collaborative partners, programs, and activities to create a strategy tailored to meet the needs and resources of the county.

As evaluators, we recognized the collaboratives and their strategy formation as existing in a complex system with multiple partners and no single model to follow. The DE approach was the best fit for capturing data on the complexity of the collaborative process in developing and implementing their strategies. DE allows for and encourages innovation, which is a cornerstone of the Family Connection Collaborative model. Further, this cohort work gave us, as evaluation consultants, the unique opportunity to implement an evaluation system that recognized that understanding this complexity and innovation was as important as collecting child and family outcome data. With DE, the evaluator’s primary functions are to elucidate the innovation and adaptation processes, track their implications and results, and facilitate ongoing, real-time, data-based decision-making. Using this approach, we were able to engage in and document the decision-making process, the complexity of the relationships among partners, and how those interactions impact the work.

Lessons Learned: Just a few of the lessons we’ve learned are:

  1. Participants using a DE approach may not recognize real-time feedback and evaluation support as “evaluation”. Efforts must be taken throughout the project to clarify the role of evaluation as an integral part of the work.
  2. Successful DE in a collaborative setting requires attention to the needs of individual partners and organizations.
  3. The DE evaluator is part anthropologist and thus must be comfortable in the emic-etic (insider-outsider) role, acting as a member of the team as well as one involved in elucidating the practice and work of the team.

We’re looking forward to October and the Evaluation 2016 annual conference; the contributions to aea365 all this week come from our colleagues in the Local Arrangements Working Group (LAWG).

I’m Caitlin Blaser Mapitsa, working with CLEAR Anglophone Africa to coordinate the Twende Mbele programme, a collaborative initiative started by the governments of Uganda, Benin, and South Africa to strengthen national M&E systems in the region. The programme is taking an approach of “peer learning and exchange” to achieve this, in response to an overall weak body of knowledge about methods and approaches that are contextually relevant.

Lessons Learned:

Since 2007, the African evaluation community has been grappling with which tools and approaches are best suited to “context-responsive evaluations” in Africa. Thought leaders have engaged on this through various efforts, including a special edition of the African Journal of Evaluation, a Bellagio conference, an AfrEA conference, the Anglophone and Francophone African Dialogues, and recently a stream at the 2015 SAMEA conference.

Throughout these long-standing discussions, practitioners, scholars, civil servants and others have debated the methods and professional expertise that are best placed to respond to the contextual complexities of the region. Themes emerging from the debate include the following:

  • Developmental evaluations are emerging as a relevant tool to help untangle a context marked by decentralized, polycentric power that often reaches beyond traditional public sector institutions.
  • Evaluations can mediate evidence-based decision making among diverse stakeholders, rather than playing an exclusively learning and accountability role, which is more relevant for a context where there is a single organizational decision maker.
  • Action research helps in creating a body of knowledge that is grounded in practice.
  • Not all evidence is equal, and having an awareness of the kind of evidence that is valued, produced, and legitimized in the region will help evaluators ensure they are equipped with methods which recognize this.

Peer learning is an often-overlooked tool for building evaluation capacity. In Anglophone Africa there is still a dearth of research on evaluation capacity topics, and too little empirical evidence and consensus among stakeholders about what works to strengthen the role evaluation could play in bringing about better developmental outcomes.

Twende Mbele works to fill this knowledge gap by building on strengths in the region. We completed a five-month foundation period to prepare for the three-year programme, and we are now formalizing peer learning as an approach that will ensure our systems-strengthening work is appropriate to the regional context and relevant to the needs of collaborating partners.

Rad Resources:

The American Evaluation Association is celebrating Centers for Learning on Evaluation and Results (CLEAR) week. The contributions all this week to aea365 come from members of CLEAR.


I am Kim Leonard, Senior Evaluation Officer at The Oregon Community Foundation. Today I want to share lessons learned from a developmental evaluation we’re undertaking for our five-year arts education grantmaking initiative – Studio to School.

The nature of this Initiative is developmental and rooted in the arts. Creativity, adaptation, and risk are supported. The first phase of the evaluation is focused on understanding and supporting the arts education programming being developed by the project teams funded through Studio to School. We are now approaching the mid-point of the Studio to School Initiative evaluation, and have learned a lot about the benefits and challenges of implementing a developmental evaluation.

Lesson Learned: Taking a developmental evaluation approach has allowed the Research team to adapt the evaluation in response to the evolution of the Initiative. It took us a little while to get used to this approach! We’ve summarized our evaluation on this handout, and find ourselves coming back to it repeatedly to keep us grounded as we plan new evaluation activities.

Lesson Learned: The Research team has worked continuously to develop rigorous evaluation activities that collect and provide useful information in a feedback loop. Robust reflection is built into the process; debrief meetings are held following each major learning community and evaluation activity to share and document learnings. These often turn into planning sessions for future evaluation and learning community efforts. In addition, the project teams are journaling electronically – quarterly reflections on what they are learning in response to prompts have been one of the most valuable data sources to date. Prompts (like this example) are developed one or two at a time, so that they are as timely and relevant as possible.

Lesson Learned: A key element of the evaluation, and a goal of the Initiative, is to surface and articulate principles of high-quality, sustainable arts education programming. We began developing principles based on the first year’s evaluation findings, and asked project teams to reflect and provide feedback on draft principles at a recent gathering. We were thrilled with how engaged the teams were in this effort. The photo below shows a project team member reviewing feedback provided on sticky notes. Attendees also placed red dots (as seen in the photo) next to those principles that most resonated with their experience. Doing this as a larger group allowed project teams to discuss their feedback and attendees to react to and comment on one another’s feedback.

Rad Resources: In addition to the excellent Developmental Evaluation Exemplars (Patton, McKegg, and Wehipeihana, 2016), we have found the Developmental Evaluation Primer and DE 201: A Practitioner’s Guide to Developmental Evaluation from the McConnell Foundation especially helpful. Additional resources are listed at http://betterevaluation.org/plan/approach/developmental_evaluation.

The American Evaluation Association is celebrating Oregon Community Foundation (OCF) week. The contributions all this week to aea365 come from OCF team members.

Hi, we’re Tosca Bruno-van Vijfeijken (Director of the Syracuse University Transnational NGO Initiative) and Gabrielle Watson (independent evaluator). We engaged a group of practitioners at the 2015 AEA conference to talk about organizational change in International Non-Governmental Organizations (INGOs), and explore a hunch that Developmental Evaluation could help organizations manage change.

Several large INGOs are undergoing significant organizational change. These are complex processes – they’re always disruptive and often painful. The risk of failure is high. Roughly half of all organizational change processes either implode or fizzle out. A common approach is not to build in learning systems at all, but rather to take an “announce, flounder, learn” approach.

Lesson Learned: Most INGOs support change processes in three main ways: (1) external “expert” reviews; (2) CEO-level exchanges with peer organizations; (3) staff-level reviews. It is this last category – where change is actually implemented – that is least developed but where it’s most needed. Successful organizational change hinges on deep culture and mindset change.

AEA Session participants highlighted key challenges:

  • Headquarters and country staff experience change very differently
  • Frequent revisiting of decisions
  • Ineffective communication, which generates uncertainty and anxiety
  • Learning not well supported at country or implementation team level
  • Country teams retain a passive mindset when they should be more assertive
  • Excessive focus on legal and administrative matters; not enough on culture and mindset

Can organizations do better? Might Developmental Evaluation offer useful approaches and tools?

Hot Tip: Developmental Evaluation seems tailor-made for large-scale organizational change processes. It is designed for innovative interventions in complex environments when the optimum approach and end-state are not known or knowable. It involves stakeholder sense-making supported by tailored and evolving evaluative inquiry (often also participatory) to quickly test iterations, track progress, and guide adaptations. It’s designed to evolve along with the intervention itself.

Hot Tips: Session participants shared some good practices:

  • Action learning. Exchanges among implementers increased adaptive capacity and made the emotional experience of change easier
  • Pilot initiatives. Time-bound, with frequent reviews and external support
  • “Guerrilla” roll-out. Hand-picked early adopters sparked “viral” spread of new approaches

Lesson Learned: Our review suggests Developmental Evaluation can address many of the challenges of organizational change, including shifting organizational culture. Iterative participatory learning facilitates adaptations that are appropriate and owned by staff. It adds value by building a learning culture – the ultimate driver of large-scale organizational change.

We are curious how many organizations are using Developmental Evaluation for their change processes, and what we can learn from this experience. Add your thoughts to the comments, or write to Tosca or Gabrielle if you have an experience to share.



We are Nora Murphy and Keith Miller with TerraLuna Collaborative, an evaluation cooperative in the Twin Cities, Minnesota. We feel fortunate to be the evaluation partners on several large developmental evaluations.

One project we are working on seeks to support the inner wellbeing journey of seasoned social entrepreneurs. On a recent conference call, a project team member asked: “How do you know when to use the data to make a change to the program? Isn’t struggle an important part of the individual’s wellbeing journey? If we react too quickly to data and ‘fix’ everything the participant isn’t comfortable with, aren’t we minimizing their opportunities for growth?”

He’s right. I (Nora) shared my perspective that evaluation data is only one source of information that should be used when making a decision. Also important to consider are: 1) our intuition, 2) our accumulated personal and professional wisdom, and 3) the collective wisdom of the group of people seeking to use the evaluation findings.

Hot Tip: Be reflective and identify the source(s) of wisdom you are drawing on.

Reflecting on that conversation, Keith and I realized that my response was rooted in the guiding principles of a three-year partnership with the Minnesota Humanities Center, Omaha Public Schools, and The Sherwood Foundation. The guiding principles are:

  • Build and strengthen relationships;
  • Recognize the power of story and the danger of absence;
  • Learn from and with multiple voices; and
  • Amplify community solutions for change.

These principles guide how we show up as evaluators and how we do our work. Evaluation use happens when there is a foundation of trust–trust in both the results and the evaluators. We’ve learned to build trust by investing in relationships, intentionally including multiple voices, seeking absent narratives, and amplifying community ideas and solutions.

Hot Tip: Be responsive, not reactive.

Patton (2010) suggests that one role of developmental evaluators is to look for and document “forks in the road that move the program in new directions” (p. 150). As developmental evaluators we can facilitate conversations about whether the data should be used immediately because it indicates a fork in the road, or whether it is something to be aware of and track. During these conversations we can also create space for intuition and wisdom.

Lesson Learned: These guiding principles have helped us shape our role as evaluation partners and increase evaluation use. Our partners trust us to engage them in reflective conversations about what the findings mean and how they might be used.

Rad Resource: Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use, Michael Quinn Patton, Guilford (2010).

Rad Resource: Nora F. Murphy and Jennifer Tonko – How Do You Understand the Impact of the Humanities?

The American Evaluation Association is celebrating Evaluation Use (Eval Use) Topical Interest Group Week. The contributions all this week to aea365 come from our Eval Use TIG members.
