AEA365 | A Tip-a-Day by and for Evaluators


We are Nora Murphy and Keith Miller with TerraLuna Collaborative, an evaluation cooperative in the Twin Cities, Minnesota. We feel fortunate to be the evaluation partners on several large developmental evaluations.

One project we are working on seeks to support the inner wellbeing journey of seasoned social entrepreneurs. On a recent conference call, a project team member asked: “How do you know when to use the data to make a change to the program? Isn’t struggle an important part of the individual’s wellbeing journey? If we react too quickly to data and ‘fix’ everything the participant isn’t comfortable with, aren’t we minimizing their opportunities for growth?”

He’s right. I (Nora) shared my perspective that evaluation data is only one source of information that should be used when making a decision. Also important to consider are: 1) our intuition, 2) our accumulated personal and professional wisdom, and 3) the collective wisdom of the group of people seeking to use the evaluation findings.

Hot Tip: Be reflective and identify the source(s) of wisdom you are drawing on.

Reflecting on that conversation, Keith and I realized that my response was rooted in the guiding principles of a three-year partnership with the Minnesota Humanities Center, Omaha Public Schools, and The Sherwood Foundation. The guiding principles are:

  • Build and strengthen relationships;
  • Recognize the power of story and the danger of absence;
  • Learn from and with multiple voices; and
  • Amplify community solutions for change.

These principles guide how we show up as evaluators and how we do our work. Evaluation use happens when there is a foundation of trust: trust in both the results and the evaluators. We’ve learned to build trust by investing in relationships, intentionally including multiple voices, seeking absent narratives, and amplifying community ideas and solutions.

Hot Tip: Be responsive, not reactive.

Patton (2010) suggests that one role of developmental evaluators is to look for and document “forks in the road that move the program in new directions” (p. 150). As developmental evaluators we can facilitate conversations about whether the data should be used immediately because it indicates a fork in the road, or whether it is something to be aware of and track. During these conversations we can also create space for intuition and wisdom.

Lesson Learned: These guiding principles have helped us shape our role as evaluation partners and increase evaluation use. Our partners trust us to engage them in reflective conversations about what the findings mean and how they might be used.

Rad Resource: Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use, Michael Quinn Patton, Guilford (2010).

Rad Resource: Nora F. Murphy and Jennifer Tonko – How Do You Understand the Impact of the Humanities?

The American Evaluation Association is celebrating Evaluation Use (Eval Use) Topical Interest Group Week. The contributions all this week to aea365 come from our Eval Use TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Keiko Kuji-Shikatani, the current chair of the Evaluation Use Topical Interest Group (TIG), one of the original AEA TIGs. The Evaluation Use TIG was born of the interest in evaluation utilization in the 1970s, extended into both theoretical and empirical work on use in the 1980s and 1990s, and broadened to a wider conceptualization of use and influence in the 2000s. The Evaluation Use TIG is committed to understanding and enhancing the use of evaluation in a variety of contexts and to maximizing the positive influence of evaluation through both the evaluation process and the results produced.

Program evaluation began with the desire to seek information that can be used to improve the human condition. Use may not be apparent to those who are not internal to an organization, since the process of using evaluation requires discussions that may be very sensitive in nature. This week’s aea365 will examine how Evaluation Use TIG members are striving to support various efforts in diverse and complex contexts.

As an internal evaluator for the Ontario Ministry of Education, I treat utilization of evaluation as the norm in what I do every day in pursuit of reaching every student. The world in which our students are growing up, and in which they will be leaders and learners throughout their lifetimes, is a complex and quickly changing place. To support students so they can be the best that they can be, those in the system need to work smarter and use evaluative thinking to guide every facet of improvement efforts.

Rad Resource: Evaluative thinking is systematic, intentional, and ongoing attention to expected results. It focuses on how results are achieved, what evidence is needed to inform future actions, and how to improve future results. One cannot really discuss evaluation use without Michael Quinn Patton – check out http://www.mcf.org/news/giving-forum/making-evaluation-meaningful.

Our work as internal evaluators involves continually communicating the value of evaluative thinking and guiding developmental evaluation (DE) by modeling the use of evidence to understand more precisely the needs of all students and to monitor and evaluate the progress of improvement efforts.

Hot Tips: Check out how evaluation (http://edu.gov.on.ca/eng/teachers/studentsuccess/CCL_SSE_Report.pdf) is used to inform next steps (https://www.edu.gov.on.ca/eng/teachers/studentsuccess/strategy.html) and what that change can look like (http://edu.gov.on.ca/eng/research/EvidenceOfImprovementStudy.pdf).

In our work, the ongoing involvement of evaluators who are intentionally embedded in program and policy development and implementation teams contributes to modeling evaluative thinking and guiding DE that builds system evaluation capacity. The emphasis is on being a learning organization through evidence-informed, focused improvement planning and implementation.

Hot Tips: Check out how evaluative thinking is embedded in professional learning (http://sim.abel.yorku.ca/) or how it is embedded in improvement planning (http://www.edu.gov.on.ca/eng/policyfunding/memos/september2012/ImprovePlanAssessTool.pdf).


Hi all!  Liz Zadnik here, aea365 Outreach Coordinator and occasional Saturday Contributor.  I wanted to share some insights and reflections I had as the result of a recent EVALTALK discussion thread.  Last month, someone posed the following request:

I’m searching for a “Why Evaluate” article for parents/community members/stakeholders. An article that explains in clear and plain language why organizations evaluate (particularly schools) and evaluation’s potential benefits. Any suggestions?

Rad Resources: Others were kind enough to share resources, including this slideshare deck that moves through some language and reasoning for program evaluation and assessment, along with book recommendations.  There is also a very helpful list from PlainLanguage.gov offering possible replacements for commonly used words.  (Even the headings – “Instead of…” and “Try…” – make the shift seem much more manageable.)
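
To make that “Instead of…/Try…” shift concrete, here are a few illustrative pairs in the same plain-language spirit (my own examples, not quoted from the PlainLanguage.gov list):

  • Instead of “utilize”… try “use”
  • Instead of “methodology”… try “approach”
  • Instead of “disseminate findings”… try “share what we learned”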

Lessons Learned: Making evaluation accessible and understandable requires tapping into an emotional and experiential core.

  • Think about never actually saying “evaluate” or “evaluation.”  It’s OK not to use phrases or terms if they are obstacles for engaging people in the evaluation process.  If “capturing impact,” “painting a picture,” “tracking progress” or any other combination of words works…use it!  It may be helpful to talk with interested or enthusiastic community members about what they think of evaluation and what it means to them.  This helps gain insight into relevant language and framing for future discussions.
  • Have the group brainstorm potential benefits, rather than listing them for them.  Just as you engage community members in discussion of the “how,” ask them what they feel is the “why” of evaluation.  I have heard the most amazing and insightful responses when I have done this with organizations and community members.  Ask the group “What can we do with the information we get from this question/item/approach?” and see what happens!
  • Evaluation is about being responsible and accountable.  For me, program evaluation and assessment is about ethical practice and stewardship of resources.  I have found community members and colleagues receptive when I frame evaluation as a way to make sure we are doing what we say we’re doing – that we are being transparent, accountable, and clear on our expectations and use of funds.

We’d love to hear how others in the aea365 readership are engaging communities in accessible conversations about evaluation.  Share your tips and resources in the comments section!


Hi – I’m Erik Mason, the Curator of Research at the Longmont Museum and Cultural Center, located in Longmont, Colorado, about 35 miles northwest of Downtown Denver. I am not an evaluator – in fact, the word “evaluation” does not appear in my job description.  I have come to believe, however, that evaluation is critical to the success of my work as a museum curator.  Much of that realization is the result of my participation in the Denver Evaluation Network (DEN), a collection of 15 museums across the Denver metro area that have made a commitment to learn about, and do, evaluation on a regular basis.

Only two members of DEN have full-time evaluators on staff. The rest of us are a mix of educators, exhibit developers, administrators, and curators.  Our daily work is filled with school tours, fundraising, label writing, and all the other stuff that goes into making museums fun and interesting places to visit. As a result, evaluation can get short shrift. We fall back to anecdote and what we think we know.

Over the last two years, the members of DEN have been presenting at museum conferences about the work we are doing to bring evaluation to a broader community.  It has been fascinating watching people who always thought evaluation was something scary and hard, and required a large supply of clipboards, realize that it can be done in many ways.

Within my workplace, I have been pleasantly surprised as we have begun incorporating evaluation into more and more of what we do. Data gathered from iPad surveys provides a baseline understanding of our audience demographics and allows us to compare the changes in our audience as our special exhibits change. Evaluation is now a part of the development of all our exhibits. In the course of doing evaluation, I’ve seen attitudes change from “Why are we wasting our time doing this?” to “When are we doing another evaluation?”

Rad Resource: Check out this video of testimonials from members of DEN.

Hot Tip for Evaluation 2014 Attendees: Denver really is the “Mile High City” and you can take home proof of this fact with a short jaunt and a camera. A free shuttle and brief walk away from the Colorado Convention Center is the Colorado State Capitol building, a Neoclassical building that sits at the eastern end of Denver’s Civic Center Park. The Capitol building sits exactly one mile above sea level, and the official marker can be found on the 13th step. The Capitol building is emerging from a multi-year restoration effort with a shiny new coat of gold on its dome, in honor of Colorado’s mining heritage. Free tours of the Colorado Capitol Building are offered Monday-Friday.

We’re looking forward to October and the Evaluation 2014 annual conference all this week with our colleagues in the Local Arrangements Working Group (LAWG). Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to contribute to aea365? Review the contribution guidelines and send your draft post to aea365@eval.org.

I am Humberto Reynoso-Vallejo, a private consultant in health services research. A few years ago, I was part of an exploratory study of Latino caregivers in the Boston area caring for a family member suffering from Alzheimer’s disease. The difficulties facing families coping with the disease have prompted the rise of support groups for diverse population groups. Support groups for racially/ethnically diverse caregivers were scarce, and in the case of Latino caregivers in the Boston area, nonexistent. To respond to this need, I tried to develop a support group for Latinos with the assistance of the Alzheimer’s Association. After several unsuccessful attempts, I conducted a focus group with four caregivers to identify barriers to participation. Findings indicated that caregivers faced a number of issues, including: lack of transportation; lack of available time to take off from other responsibilities; the absence of linguistically appropriate support groups; caring for other family members dealing with an array of health problems (multiple caregiving); and other personal and social stressors.

I designed an alternative and pragmatic support group model, which took the form of a radio program. The “radio support group” directly targeted caregivers’ concerns and aimed to:

a) Disseminate culturally relevant information, largely from the point of view of the caregivers themselves, either as guests on the program or as callers; and,

b) Reduce the sense of isolation that many caregivers feel on a daily basis as a result of their caregiving roles.

I facilitated the radio support group with the participation of caregivers, professionals, and service providers. Four programs were aired, exploring topics such as memory problems, identifying signs of dementia, caregiver needs, and access to services. After each radio program aired, I called the 14 participating caregivers to explore their reactions, and found that the majority of them had not been able to participate. Since the “live” radio support group was not accomplishing its original purpose of disseminating information and reducing caregivers’ sense of isolation, I decided to distribute edited audiotapes of the four programs to all caregivers. Overall, caregivers found the information useful and many established contact with others.

Lessons Learned:

  • This model of intervention, the radio support group, showed that pairing innovation with culturally relevant material is promising.
  • Research and evaluation should adapt to the particular needs and social context of Latino caregivers of family members with Alzheimer’s disease.
  • There is a need for more culturally appropriate types of interventions that mobilize caregivers’ own strengths, values, and resources.


Hi, I am Jindra Cekan, PhD, an independent evaluator with 25 years of international development fieldwork, at www.ValuingVoices.com.

What if we saw our true clients as project participants and wanted the return on investment of projects to be maximally sustained? How would this change how we evaluate, capture, and learn together?

Lesson Learned: Billions of dollars of international development assistance are spent every year, and we do baseline, midterm, and final evaluations on most projects.  We even sometimes evaluate sustainability using OECD’s DAC Criteria for Evaluating Development Assistance: relevance, effectiveness, efficiency, impact, and sustainability.  This is terrific, but deeply insufficient. We rarely ask communities and local NGOs, during or after implementation, what they think about our projects, how best to sustain activities themselves, and how to help them do so.

Also, very rarely do we return 3, 5, or 10 years after projects close and ask participants what is “still standing” that they managed to sustain themselves. How often do we make community members, local NGOs, or national evaluators the leaders of evaluations of the long-term self-sustainability of our projects? Based on my research, 99% of international aid projects are not evaluated for sustainability or impact after project close by anyone, much less by the communities they are designed to serve.

With $1.52 trillion in US and EU foreign aid being programmed for 2014–2020, our industry desperately needs feedback on what communities feel will be sustainable now and which interventions offer the likelihood of positive impact beyond the performance of the project’s planned (log-framed) activities. Shockingly, this does not exist today.

Further, such learning needs to be transparently captured and shared in open-data format for collective learning, especially at the country and implementer level. Creating feedback loops between project participants, national stakeholders, partners, and donors that support self-sustainability will foster true impact.

Hot Tip: We can start in current project evaluations. We need to ask these questions of men, women, youth, elders, and the richer and poorer in communities, as well as of local stakeholders. Ideally we would ask national evaluators to pose (and revise!) questions such as:

  • How valuable have you found the project overall in terms of being able to sustain activities yourselves?
  • How well were project activities transferred to local stakeholders?
      o Who is helping you sustain the project locally once it ends?
  • Which activities do you think you can least maintain yourselves?
      o What should be done to help you?
  • Which activities do you wish the project had supported that would build on your community’s strengths?
  • Was there any result of the project that was surprising or unexpected?
  • What else do we need to learn from you to have greater success in the future?

Rad Resource: OECD DAC Criteria for Evaluating Development Assistance: http://www.oecd.org/dac/evaluation/daccriteriaforevaluatingdevelopmentassistance.htm


My name is Mike Morris and I’m Professor of Psychology at the University of New Haven, where I direct the Master’s Program in Community Psychology. My research focuses on ethical issues in evaluation, and I am an Associate Editor of the American Journal of Evaluation. The best book I’ve ever read for managing my relationships with stakeholders in an evaluation was not written by an evaluator, nor was it written specifically for evaluators.

Rad Resource: Peter Block (2000). Flawless Consulting: A Guide to Getting Your Expertise Used (2nd ed.). San Francisco: Jossey-Bass. http://www.josseybass.com/WileyCDA/WileyTitle/productCd-0787948039.html

2014 Update:  Flawless Consulting is now in its 3rd edition (2011).

Among organizational consultants this book is legendary. Evaluation is, in my view, one form of consultation, so it’s not surprising that Block’s book is relevant to our work. His discussion of such issues as entry/contracting, dealing with resistance, and managing the feedback of results is invaluable. Central to his analysis is the concept of “authenticity,” which means putting into words what you are experiencing with stakeholders as you work with them. It might sound a bit scary at first, but the more you practice it, the more effective at managing these relationships you become. I also believe that Block’s approach to consulting can enhance the ethical quality of evaluations, especially in terms of helping evaluators identify strategies for raising and pursuing ethical issues with stakeholders.

Flawless Consulting is exceedingly well-written. It probably helps that Block does not have a doctoral degree, since writing a dissertation is a process that can extinguish one’s ability to compose a sentence that anyone would be interested in reading. Flawless Consulting gets very positive reviews from my students. I hope you’ll agree with them. 

The American Evaluation Association is celebrating Best of aea365 week. The contributions all this week are reposts of great aea365 blogs from our earlier years. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings colleagues. My moniker is Michael Quinn Patton and I do independent evaluation consulting under the name Utilization-Focused Evaluation, which just happens also to be the title of my main evaluation book, now in its 4th edition. I am a former AEA president. One of the challenges I’ve faced over the years, as many of us do, is making evaluation user-friendly, especially for non-research clients, stakeholders, and audiences. One approach that has worked well for me is using children’s stories. When people come to a meeting to work with or hear from an external evaluator, they may expect to be bored or spoken down to or frightened, but they don’t expect to be read a children’s story. It can be a great ice-breaker to set the tone for interaction.

Hot Tip: I first opened an evaluation meeting with a children’s story when facilitating a stakeholder involvement session with parents and staff for an early childhood/family education program evaluation. The trick is finding the right story for the group you’re working with and the issues that will need to be dealt with in the evaluation.

Rad Resource: Dr. Seuss stories are especially effective. The four short stories in The Sneetches and Other Stories are brief and loaded with evaluation metaphors. “What Was I Scared Of?” is about facing something alien and strange — like evaluation, or an EVALUATOR. “Too Many Daves” is about what happens when you don’t make distinctions and explains why we need to distinguish different types of evaluation. “The Zax” is about what happens when people get stuck in their own perspective and can’t see other points of view or negotiate differences. “The Sneetches” is about hierarchies and status, and can be used to open up discussions of cultural, gender, ethnic, and other stakeholder differences. I use it to tell the story, metaphorically, of the history of the qualitative-quantitative debate.

Hot Tip: Children’s stories are also great training and classroom materials to open up issues, ground those issues in a larger societal and cultural context, and stimulate creativity. Any children’s fairy tale has evaluation messages and implications.

Rad Resource: In the AEA eLibrary I’ve posted a poetic parody entitled “The Snow White Evaluation” that opens a book I did years ago (1982) entitled Practical Evaluation (Sage, pp. 11-13). Download it here: http://ow.ly/1BgHk.

Hot Tip: What we do as evaluators can be hard to explain. International evaluator Roger Miranda has written a children’s book in which a father and his daughter interact around what an evaluator does. Eva is distressed because she has trouble on career day at school describing what her dad, an evaluator, does. It’s beautifully illustrated and creatively written. I now give a copy to all my clients and it opens up wonderful and fun dialogue about what evaluation is and what evaluators do.

Rad Resource: Eva the Evaluator by Roger Miranda. http://evatheevaluator.com/


We are Dayna Albert (Project Coordinator) and Rochelle Zorzi (Editorial Board Co-chair) of the Evaluation Stories Project, an EvalPartners Innovation Challenge recipient. Our project will soon launch an International Call for Evaluation Stories. The purpose is to:

  • Identify and share stories of evaluations that have made a difference
  • Increase the demand for and use of evaluation

Minimal literature exists on the benefits or impacts of evaluation use, particularly from the perspective of evaluation users. Furthermore, most evaluation literature is very academic. Our project will employ a story-telling format in order to better communicate the benefits of evaluation use to evaluation users.

As an international project, one of our challenges is to reach a multilingual audience despite limited translation resources. A second challenge is to explain what we mean by evaluation impact – a concept that turns evaluative thinking on its head and tends to be misconstrued.

Lessons Learned: Anticipate that people may have difficulty ‘getting’ a new concept. Words alone can be inadequate and ambiguous.

Use story to explain new concepts. Here is a story that Chris Lysy helped us develop to explain the concept of evaluation impacts.

(Click here to see the video!)

Hot Tip:

  • Follow up with clients after an evaluation to reflect on and track evaluation impacts.
  • Act now! The Call for Evaluation Stories is a great opportunity to reconnect with a client and explore their interest in participating.

Rad Resources:

  • To reach a multilingual online audience, add Google’s Website Translate plug-in to your site. Albeit imperfect, it provides free and virtually instantaneous website translation. (A minimal sketch of the widget embed appears after the link snippet below.)
  • To translate a blog, paste the following code into a text widget, inserting your blog’s URL where indicated. The code is written for English (en) to French (fr) translation; for English to Spanish translation, replace ‘fr’ with ‘es’ and ‘français’ with ‘español’.

<a href="//translate.google.com/translate?u=http%3A%2F%2FyourblogURL&amp;hl=fr&amp;ie=UTF8&amp;sl=en&amp;tl=fr" title="français"><img src="http://yourblogURL/2010/02/icons-flag-gb.png" alt="français" /></a>
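
For the plug-in route mentioned in the first bullet, Google’s Website Translate widget was embedded with a short HTML/JavaScript snippet. The sketch below is a minimal reconstruction of how that embed generally looked, written from memory rather than copied from Google’s generator, so treat the exact parameters as assumptions:

<!-- Placeholder element the translate widget renders into -->
<div id="google_translate_element"></div>
<script type="text/javascript">
  // Callback that element.js invokes once it has loaded
  function googleTranslateElementInit() {
    new google.translate.TranslateElement(
      {pageLanguage: 'en'},        // source language of your site
      'google_translate_element'   // id of the placeholder div above
    );
  }
</script>
<script type="text/javascript" src="//translate.google.com/translate_a/element.js?cb=googleTranslateElementInit"></script>

Dropping this near the top of a page or into a sidebar widget adds a language picker that translates the whole page in place, with no per-language links to maintain.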

Get Involved: Watch for the launch of the International Call for Evaluation Stories.

Rad Resources: See earlier aea365 posts on evaluation stories for additional information.


My name is Susan Kistler and I am the American Evaluation Association’s Executive Director Emeritus and aea365’s regular Saturday contributor. Today, we’re talking about VOPEs and a unique opportunity for global impact.

Lesson Learned – VOPEs: What’s a VOPE? A VOPE is a Voluntary Organization for Professional Evaluation.  According to the freely downloadable Evaluation and Civil Society: Stakeholders’ Perspectives on National Evaluation Capacity Development, “VOPEs include formally constituted associations or societies, as well as informal networks and communities of practice. Their memberships are open not only to those who conduct evaluations but also to those who commission and utilize evaluations and those engaged in building the evaluation field.”

Rad Resource – EvalPartners: I’ve written before about EvalPartners, their great training and background resources, and their work to map and strengthen collaboration among VOPEs. This month, EvalPartners has announced a new project to strengthen the field of evaluation around the world through thinking outside the box.

Hot Tip – EvalPartners Innovation Challenge: The EvalPartners Innovation Challenge aims to “identify [and implement] innovative ideas to strengthening the demand for and use of evaluation to inform policy making, including in the context and spirit of EvalYear, the international year of evaluation. Ideas can relate to proposed actions at international, national and/or sub-national levels.” It is a small grants program, providing US$15,000 to each of three winning proposals. Eligible candidates are VOPEs and partnerships that include VOPEs, and innovation is defined broadly.

Get Involved: Be sure to read all of the details at http://www.mymande.org/evalpartners/innovation_challenge. What ideas do you have to increase the demand for and use of evaluation in the policy-making arena? Share via comments or discuss with your evaluation colleagues.

