AEA365 | A Tip-a-Day by and for Evaluators


Hello! I’m Dani de García, Director of Performance Evaluation, Innovation, and Learning for Social Impact, an international development management consulting firm. We’re working to innovate within the international evaluation space, especially with evaluation approaches. One of our contracts pilots Developmental Evaluation (DE) at the US Agency for International Development (USAID). We’re trying to see if, how, and when DE is feasible and useful for USAID. I’ll use this contract to illustrate some challenges to implementing innovative approaches, and tips we’re learning on how to overcome them.

Challenge: Bureaucracy can stifle innovation.

Hot Tip: Don’t rush into an innovation until you know whether it’s feasible to implement well. For DE, if the activity is unable to adapt based on what we’re finding, it doesn’t make sense for us to use that approach. So, do your due diligence. Figure out what the opportunities and barriers are. Only move forward if the innovation will truly meet the users’ needs and isn’t just innovation for innovation’s sake.

Challenge: Users don’t want to be guinea pigs for new approaches.

Some call this the penguin effect: everyone wants to see another penguin jump off the ledge into the water before following suit.

Hot Tip: Find what relevant examples you can, even if they’re not the exact same sector or innovation. Show what the innovation looks like in a tangible sense. For us, that meant putting together memos detailing options of what DE could look like for their scenario. We highlighted what data collection would look like, who would be involved, and examples of deliverables for each option.

Challenge: New approaches (or more rigorous ones) can be expensive!

Hot Tip: Be upfront about the costs and benefits. There are many times when innovative approaches are not the right solution for users’ needs. Other times, these investments can save a lot of money in the long run. For us, this means turning down teams who are interested in DE but don’t have the resources we believe are necessary to meet their needs. We have found it helpful to reframe DE to highlight its potential contributions to design and implementation, rather than just the evaluation side of things.

Challenge: Expectations are lofty (and may not be aligned with what you’re offering).

Hot Tip: Get everyone in the same place to talk about what an innovation can and cannot achieve (and be realistic with yourself about what’s feasible). In our case, we hold initial scoping discussions with stakeholders to understand their needs, educate them about DE, and talk explicitly about what DE can and cannot do. Once the DEs are underway, we reinforce this through workshops that seek to get stakeholders on the same page.

To learn more about this and other examples, consider attending the ICCE AEA session on November 11th: 1472: Challenges to adopting innovations in Monitoring, Evaluation, Research and Learning (and potential solutions!).

The American Evaluation Association is celebrating International and Cross-Cultural (ICCE) TIG Week with our colleagues in the International and Cross-Cultural Topical Interest Group. The contributions all this week to aea365 come from our ICCE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi there, I’m Erin Bock, Director of Research and Evaluation for The Sherwood Foundation in Omaha, Nebraska. I am writing to honor Nora Murphy as a leader in “Next Generation” evaluation.

Why I chose to honor this evaluator:

I met Nora while I was an evaluation newbie. I was struck by her “special sauce” of authenticity, passionate commitment to social justice, and deep love for people of all kinds. She is on the theoretical front lines of applying developmental evaluation to wicked problems, using principles as an anchor. I was fortunate to meet her because her approach was a perfect match for the causes that Sherwood cares about. I personally have grown from our relationship.

Contributions to our field:

Nora’s contributions to our field are many…

  • The idea that initiatives can roll out complex processes across multiple programs, agencies, or levels of participation, yet find alignment and forward progress by using principles, is our next frontier as a field. Nora has been among the first to attempt this form of developmental evaluation. She won the “Michael Scriven Dissertation Award for Outstanding Contribution to Evaluation Theory, Methodology, or Practice” for this work.
  • Nora and her TerraLuna Collaborative co-founders are changing the way the business of evaluation is run. TerraLuna employees are voted in and they work to “collaboratively heal our communities (and ourselves) and eliminate inequities by helping organizations create the impact to which they aspire.” (TerraLuna website)
  • In partnership with their clients, TerraLuna makes space for local diversity evaluation fellowships. This builds local capacity for evaluative thinking in the communities where collective action is taking place.

Resources:
Ways to learn more about Nora’s work…

The American Evaluation Association is celebrating Labor Day Week in Evaluation: Honoring Evaluation’s Living Pioneers. The contributions this week are tributes to our living evaluation pioneers who have made important contributions to our field and even positive impacts on our careers as evaluators. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

We are Valerie Hutcherson and Rebekah Hudgins, Research and Evaluation Consultants with the Georgia Family Connection Partnership (GaFCP) (gafcp.org). Started in 1991 with 15 communities, Family Connection is the only statewide network of its kind in the nation, with collaboratives in all 159 counties dedicated to the health and well-being of families and communities. Through local collaboratives, partners are brought together to identify critical issues facing the community and to develop and implement strategies to improve outcomes for children and families. GaFCP strongly believes that collaboration and collective effort yield collective impact. Evaluation has always been a significant part of Family Connection, though capacity varies greatly across the local collaboratives.

In 2013, GaFCP invited 6 counties to participate in a cohort focused on early childhood health and education (EC-HEED) using the Developmental Evaluation (DE) framework developed by Michael Quinn Patton (Patton, 2011. Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use). Each county was identified by GaFCP based on need and interest in developing an EC-HEED strategy, and each had the autonomy to identify collaborative partners, programs, and activities to create a strategy tailored to the needs and resources of the county. As evaluators, we recognized that the collaboratives and their strategy formation existed in a complex system with multiple partners and no single model to follow. The DE approach was the best fit for capturing data on the complexity of the collaborative process of developing and implementing these strategies. DE allows for and encourages innovation, which is a cornerstone of the Family Connection collaborative model. Further, this cohort work gave us, as evaluation consultants, the unique opportunity to implement an evaluation system that recognized that understanding this complexity and innovation was as important as collecting child and family outcome data. With DE, the evaluator’s primary functions are to elucidate the innovation and adaptation processes, track their implications and results, and facilitate ongoing, real-time, data-based decision making. Using this approach, we were able to engage in and document the decision-making process, the complexity of the relationships among partners, and how those interactions impact the work.

Lessons Learned: Just a few of the lessons we’ve learned are:

  1. Participants using a DE approach may not recognize real-time feedback and evaluation support as “evaluation.” Efforts must be made throughout the project to clarify the role of evaluation as an integral part of the work.
  2. Successful DE in a collaborative setting requires attention to the needs of individual partners and organizations.
  3. The DE evaluator is part anthropologist and thus must be comfortable in the emic-etic (insider-outsider) role, serving as a member of the team as well as one who elucidates the practice and work of the team.

We’re looking forward to October and the Evaluation 2016 annual conference all this week with our colleagues in the Local Arrangements Working Group (LAWG). Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.

I’m Caitlin Blaser Mapitsa, working with CLEAR Anglophone Africa to coordinate the Twende Mbele programme, a collaborative initiative started by the governments of Uganda, Benin, and South Africa to strengthen national M&E systems in the region. The programme is taking an approach of “peer learning and exchange” to achieve this, in response to an overall weak body of knowledge about methods and approaches that are contextually relevant.

Lessons Learned:

Since 2007, the African evaluation community has been grappling with what tools and approaches are best suited to “context-responsive evaluations” in Africa. Thought leaders have engaged on this through various efforts, including a special edition of the African Journal of Evaluation, a Bellagio conference, an AfrEA conference, the Anglophone and Francophone African Dialogues, and most recently a stream in the 2015 SAMEA conference.

Throughout these long-standing discussions, practitioners, scholars, civil servants and others have debated the methods and professional expertise that are best placed to respond to the contextual complexities of the region. Themes emerging from the debate include the following:

  • Developmental evaluations are emerging as a relevant tool to help untangle a context marked by decentralized, polycentric power that often reaches beyond traditional public sector institutions.
  • Allowing evaluations to mediate evidence-based decision making among diverse stakeholders, rather than playing an exclusively learning-and-accountability role, which is more relevant where there is a single organizational decision maker.
  • Action research helps in creating a body of knowledge that is grounded in practice.
  • Not all evidence is equal, and having an awareness of the kind of evidence that is valued, produced, and legitimized in the region will help evaluators ensure they are equipped with methods which recognize this.

Peer learning is an often overlooked tool for building evaluation capacity. In Anglophone Africa there is still a dearth of research on evaluation capacity topics. There is too little empirical evidence and consensus among stakeholders about what works to strengthen the role evaluation could play in bringing about better developmental outcomes.

Twende Mbele works to fill this knowledge gap by building on strengths in the region. We completed a five-month foundation period to prepare for the three-year programme, and we are now formalizing peer learning as an approach that will ensure our systems-strengthening work is appropriate to the regional context and relevant to the needs of collaborating partners.

Rad Resources:

The American Evaluation Association is celebrating Centers for Learning on Evaluation and Results (CLEAR) week. The contributions all this week to aea365 come from members of CLEAR. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I am Kim Leonard, Senior Evaluation Officer at The Oregon Community Foundation. Today I want to share lessons learned from a developmental evaluation we’re undertaking for our five-year arts education grantmaking initiative – Studio to School.

The nature of this Initiative is developmental and rooted in the arts. Creativity, adaptation, and risk are supported. The first phase of the evaluation is focused on understanding and supporting the arts education programming being developed by the project teams funded through Studio to School. We are now approaching the mid-point of the Studio to School Initiative evaluation, and have learned a lot about the benefits and challenges of implementing a developmental evaluation.

Lesson Learned: Taking a developmental evaluation approach has allowed the Research team to adapt the evaluation in response to the evolution of the Initiative. It took us a little while to get used to this approach! We’ve summarized our evaluation on this handout, and find ourselves coming back to it repeatedly to keep us grounded as we plan new evaluation activities.

Lesson Learned: The Research team has worked in an ongoing way to develop rigorous evaluation activities to collect and provide useful information in a feedback loop. Robust reflection is built into the process; debrief meetings are held following each major learning community and evaluation activity to share and document learnings. These often turn into planning sessions for future evaluation and learning community efforts. In addition, the project teams are journaling electronically – quarterly reflections on what they are learning in response to prompts have been one of the most valuable data sources to date. Prompts (like this example) are developed one or two at a time, so that they are as timely and relevant as possible.

Lesson Learned: A key element of the evaluation, and a goal of the Initiative, is to surface and articulate principles of high quality, sustainable arts education programming. We began developing principles based on the first year’s evaluation findings, and asked project teams to reflect and provide feedback on draft principles at a recent gathering. We were thrilled with how engaged the teams were in this effort. The photo below shows a project team member reviewing feedback provided on sticky notes. Attendees also placed red dots (as seen in the photo) next to those principles that most resonated with their experience. Doing this as a larger group allowed project teams to discuss their feedback and attendees to react to and comment on one another’s feedback.

Rad Resources: In addition to the excellent Developmental Evaluation Exemplars (Patton, McKegg, and Wehipeihana, 2016), we have found the Developmental Evaluation Primer and DE 201: A Practitioner’s Guide to Developmental Evaluation from the McConnell Foundation especially helpful. Additional resources are listed at http://betterevaluation.org/plan/approach/developmental_evaluation.

The American Evaluation Association is celebrating Oregon Community Foundation (OCF) week. The contributions all this week to aea365 come from OCF team members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi, we’re Tosca Bruno-van Vijfeijken (Director of the Syracuse University Transnational NGO Initiative) and Gabrielle Watson (independent evaluator). We engaged a group of practitioners at the 2015 AEA conference to talk about organizational change in International Non-Governmental Organizations (INGOs), and explore a hunch that Developmental Evaluation could help organizations manage change.

Several large INGOs are undergoing significant organizational change. These are complex processes – they’re always disruptive and often painful. The risk of failure is high. Roughly half of all organizational change processes either implode or fizzle out. A common approach is not to build in learning systems at all, but rather to take an “announce, flounder, learn” approach.

Lesson Learned: Most INGOs support change processes in three main ways: (1) external “expert” reviews; (2) CEO-level exchanges with peer organizations; and (3) staff-level reviews. It is this last category – where change is actually implemented – that is least developed but where support is most needed. Successful organizational change hinges on deep culture and mindset change.

AEA Session participants highlighted key challenges:

  • Headquarters and country staff experience change very differently
  • Decisions are frequently revisited
  • Communication is ineffective and generates uncertainty and anxiety
  • Learning is not well supported at the country or implementation-team level
  • Country teams retain a passive mindset when they should be more assertive
  • Excessive focus on legal and administrative issues; not enough on culture and mindset

Can organizations do better? Might Developmental Evaluation offer useful approaches and tools?

Hot Tip: Developmental Evaluation seems tailor-made for large-scale organizational change processes. It is designed for innovative interventions in complex environments when the optimum approach and end state are not known or knowable. It involves stakeholder sense-making supported by tailored and evolving evaluative inquiry (often also participatory) to quickly test iterations, track progress, and guide adaptations. It’s designed to evolve along with the intervention itself.

Hot Tips: Session participants shared some good practices:

  • Action learning. Exchanges among implementers increased adaptive capacity and made the emotional experience of change easier
  • Pilot initiatives. Time-bound, with frequent reviews and external support
  • “Guerrilla” roll-out. Hand-picked early adopters sparked “viral” spread of new approaches

Lesson Learned: Our review suggests Developmental Evaluation can address many of the challenges of organizational change, including shifting organizational culture. Iterative participatory learning facilitates adaptations that are appropriate and owned by staff. It adds value by building a learning culture – the ultimate driver of large scale organizational change.

We are curious how many organizations are using Developmental Evaluation for their change processes, and what we can learn from this experience. Add your thoughts to the comments, or write to Tosca or Gabrielle if you have an experience to share.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


We are Nora Murphy and Keith Miller with TerraLuna Collaborative, an evaluation cooperative in the Twin Cities, Minnesota. We feel fortunate to be the evaluation partners on several large developmental evaluations.

One project we are working on seeks to support the inner wellbeing journey of seasoned social entrepreneurs. On a recent conference call, a project team member asked: “How do you know when to use the data to make a change to the program? Isn’t struggle an important part of the individual’s wellbeing journey? If we react too quickly to data and ‘fix’ everything the participant isn’t comfortable with, aren’t we minimizing their opportunities for growth?”

He’s right. I (Nora) shared my perspective that evaluation data is only one source of information that should be used when making a decision. Also important to consider are: 1) our intuition, 2) our accumulated personal and professional wisdom, and 3) the collective wisdom of the group of people seeking to use the evaluation findings.

Hot Tip: Be reflective and identify the source(s) of wisdom you are drawing on.

Reflecting on that conversation, Keith and I realized that my response was rooted in the guiding principles of a three-year partnership with the Minnesota Humanities Center, Omaha Public Schools, and The Sherwood Foundation. The guiding principles are:

  • Build and strengthen relationships;
  • Recognize the power of story and the danger of absence;
  • Learn from and with multiple voices; and
  • Amplify community solutions for change.

These principles guide how we show up as evaluators and how we do our work. Evaluation use happens when there is a foundation of trust–trust in both the results and the evaluators. We’ve learned to build trust by investing in relationships, intentionally including multiple voices, seeking absent narratives, and amplifying community ideas and solutions.

Hot Tip: Be responsive, not reactive.

Patton (2010) suggests that one role of developmental evaluators is to look for and document “forks in the road that move the program in new directions” (p. 150). As developmental evaluators, we can facilitate conversations about whether the data should be used immediately because it indicates a fork in the road, or whether the data is something to be aware of and track. During these conversations we can also create space for intuition and wisdom.

Lesson Learned: These guiding principles have helped us shape our role as evaluation partners and increase evaluation use. Our partners trust us to engage them in reflective conversations about what the findings mean and how they might be used.

Rad Resource: Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use, Michael Quinn Patton, Guilford (2010).

Rad Resource: Nora F. Murphy and Jennifer Tonko – How Do You Understand the Impact of the Humanities?

The American Evaluation Association is celebrating Evaluation Use (Eval Use) Topical Interest Group Week. The contributions all this week to aea365 come from our Eval Use TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

I am Ricardo Wilson-Grau, an evaluator based in Rio de Janeiro but working internationally. Increasingly, I am called upon to serve as a developmental evaluator, and I have found the concept of “inquiry framework” (Chapter 8 in Developmental Evaluation[1]) to be invaluable for co-creating developmental evaluation questions and agreeing how they will be answered. As Michael Quinn Patton says: “Matching evaluation questions to particular situations is the central challenge in developmental evaluation’s situational responsiveness and adaptability…”.[2]

DE does not rely on any particular inquiry framework, just as its toolbox is open to a diversity of designs, methods and tools. What is appropriate depends on the innovation challenges a project, program or organization faces at a given point in time. For example, with one client I used a complexity inquiry framework to support the two-month design of a regional peace-building initiative in a continent with a track record of failures in similar attempts. Then, we considered these potential frameworks to support the first stage of implementation: a) Driving innovation with principles, b) Focusing on systems change, c) Fomenting collaboration for innovation, d) Confronting wicked problems and e) Outcome Harvesting.

In light of the nature of the developmental challenge this emerging initiative faced, there were sound reasons for using one or more of these frameworks, or a combination of them. The client’s most pressing immediate need, however, was to know, in as close to real time as possible, what observable and verifiable changes it was influencing in actors who could not be predetermined. Thus, they chose Outcome Harvesting.

Hot Tip: Are you in a situation of social innovation that aims to influence changes in behavior writ large — from changes in individual actions to organizational or institutional changes in policies or practices? Do you need concrete evidence of those achievements as they happen, along with an understanding of whether and how the innovative efforts contributed to those changes? If yes and yes, Outcome Harvesting may be a useful inquiry framework for you.

Rad Resources: In this video, I explain the Outcome Harvesting tool in less than three minutes. There you will also find further information.

You can obtain more information about Outcome Harvesting at Better Evaluation.

To explore using the tool with a client, consider this animated PowerPoint slide to support you in operationalizing the six iterative Outcome Harvesting steps.

[1] For more on developmental inquiry frameworks, see Michael Quinn Patton, Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use, Guilford, 2011, Chapter 8.

[2] Ibid, pages 227-228.

The American Evaluation Association is celebrating Developmental Evaluation Week. The contributions all this week to aea365 come from evaluators who do developmental evaluation. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m Charmagne Campbell-Patton, Director of Organizational Learning and Evaluation for Utilization-Focused Evaluation based in Minnesota, and Evaluation & Assessment Specialist for World Savvy, a national education nonprofit that works with educators, schools, and districts to integrate global competence teaching and learning into K-12 classrooms.

World Savvy has staff in Minneapolis, San Francisco, and New York City. We have found reflective practice to be useful in integrating innovative program development, organizational development, and staff development. These three areas of development need to be aligned, occur simultaneously, and be mutually reinforcing. Developmental evaluation both tracks and supports that alignment.

Rad Resource: Model of integrated development using reflective practice.


Hot Tip: Focus the reflective practice on issues that cut across areas of development. Collaboration is a core value of World Savvy in everything we do, so we began by sharing and analyzing positive and negative experiences with collaboration. Other core values that served as the focus for reflective practice included integrity, inclusivity, excellence and (appropriately) learning & reflection.

Hot Tip: Make reflection together a regular practice. Of course, everyone is busy and it is easy to let reflective practice slide. World Savvy has committed to doing it quarterly and making time to do it well.

Hot Tip: Reflective practice involves staff learning and using skills in recounting an incident descriptively, listening attentively, identifying patterns across stories, generating insights and lessons, and identifying actions to take from what is learned. All of this is enhanced with regular practice. It is also an effective way to integrate new staff into the organization’s culture and the meaning of its core values.

Hot Tip: Ask staff to identify what they will share in advance so they come prepared. Even better, have them bring the experiences they will share in writing to contribute to the documentation of reflective practice engagement and learning.

Cool Trick: Begin each session with a review of what emerged in prior sessions to provide a sense of what has been developing.

Cool Trick: Use small groups. If facilitating the session remotely or with more than 10-15 participants, use breakout rooms or small groups as a way to create an environment more conducive to sharing.

Hot Tip: Follow through with action. Reflective practice typically yields insights with actionable implications. Failure of the program or organization to follow through can undermine interest in future reflective practice sessions.

Rad Resources:

Patton, M.Q. (2015). “Reflective practice guidelines,” in Qualitative Research and Evaluation Methods, 4th ed. (Sage), pp. 213-216.

Patton, M.Q. (2011). “Reflective practice for developmental evaluation,” in Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use (Guilford Press), pp. 265-270.

The American Evaluation Association is celebrating Developmental Evaluation Week. The contributions all this week to aea365 come from evaluators who do developmental evaluation. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi, I’m Nora F. Murphy, a developmental evaluator and co-founder of TerraLuna Collaborative. Qualitative Methods have been a critical component of every developmental evaluation I have been a part of. Over the years I’ve learned a few tricks about making qualitative methods work in a developmental evaluation context.

Hot Tip: Apply systems thinking. When using developmental evaluation to support systems change, it’s important to apply systems thinking. When thinking about the evaluation’s design and methods, I am always asking: Where are we drawing the boundaries in this system? Whose perspectives are we seeking to understand? What are the important inter-relationships to explain? And who benefits or is excluded by the methods that I choose? Qualitative methods can be time and resource intensive, and we can’t understand everything about systems change. But it’s important, from a methodological and ethical perspective, to be intentional about where we draw the boundaries, whose perspectives we include, and which inter-relationships we explore.

Hot Tip: Practice flexible budgeting. I typically budget for qualitative inquiry but create the space to negotiate the details of that inquiry. In one project I budgeted for qualitative inquiry that would commence six months after the contract was finalized. It was too early to know how the strategy would develop and what qualitative method would be best for learning about it. In the end we applied systems thinking and conducted case studies that looked at the developing strategy in three ways: from the perspective of individual educators’ transformation, from the perspective of educators participating in school change, and from the perspective of school leaders leading school change. It would have been impossible to predict that this was the right inquiry for the project at the time the budget was developed.

Hot Tip: Think in layers. The pace of developmental evaluations can be quick, and there is a need for timely data and for spotting patterns as they emerge. But often there is also a need for a deeper look at what is developing, using a method that takes more time. So I think in layers. With the case studies, for example, we structured the post-interview memos so they could be used with program developers to spot emergent patterns, framing the memos around pattern-surfacing questions such as: “I was surprised… A new concept for me was… This reinforced for me… I’m wondering…” The second layer was sharing individual case studies. The third layer was the cross-case analysis that surfaced deeper themes. Throughout, we engaged various groups of stakeholders in the meaning making and pattern spotting.

Rad Resources:

The American Evaluation Association is celebrating Developmental Evaluation Week. The contributions all this week to aea365 come from evaluators who do developmental evaluation. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

