AEA365 | A Tip-a-Day by and for Evaluators

My name is Susan Wolfe and I am a Senior Consultant at CNM Connect, where I provide evaluation and capacity-building services to nonprofit organizations. I am also the incoming chair of the CP TIG. This week, the CP TIG will be presenting aea365 blogs related to working with initiatives to reduce health disparities.

As an evaluator, much of my recent work has been with programs and organizations working to reduce health disparities. While working in this area is very rewarding, it also presents challenges. One challenge is that racism plays a role in creating many health disparities. There is mounting evidence that the everyday stressors African American women encounter contribute to low-birthweight infants and infant mortality. As a white woman participating in these discussions, it is important that I exercise cultural competence and be sensitive to the needs of this population. Issues of race and ethnicity are frequently discussed during focus groups or key informant interviews relevant to health disparities, and having a level of comfort with them is important so as not to discourage dialogue.

Rad Resource: One of the best resources I have found for understanding and developing cross-cultural competency is Kien Lee’s chapter, “Effecting Social Change in Diverse Contexts,” in Scott and Wolfe’s Community Psychology: Foundations for Practice. The chapter includes clear definitions, strategies for developing the competency, and resources with self-exploration and self-development assessment tools.

Another challenge I regularly confront is convincing organizations to truly engage the communities experiencing health disparities and to work WITH them to solve the communities’ problems. Professionals in most organizations are accustomed to the hierarchical relationships established within the helping professions, so convincing them to come to the table as equal partners requires an important paradigm shift.

Hot Tip: Be patient. Large-scale and systems-level changes take time. A series of baby steps will likely be needed, and small wins will hopefully lead to systemic change.

Hot Tip: Coalition evaluation is useful for assisting community collaboratives with their development.

Cool Trick: Some time after the community collaborative has been established, conduct a member survey and a SWOT analysis to provide feedback on member perceptions of areas such as leadership, communication, and impact. Synthesize the information to help the group identify potential areas of growth, as well as areas that require attention, both internally and externally. More effective collaboratives are more likely to have a greater impact on reducing health disparities.

Rad Resource: I often use the Coalition Member Assessment Tool developed by Tom Wolff for the member survey. It includes scales to measure vision, leadership and membership, structure, communication, activities, and outcomes.
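If you tally the member survey in code, a minimal sketch of the synthesis step might look like the following (in Python). The scale names, response data, and the 3.5 cutoff here are hypothetical illustrations, not part of Wolff’s tool; adapt them to your own instrument.

```python
# Minimal sketch: summarizing coalition member-survey results by scale.
# The response data, scale names, and 3.5 cutoff are hypothetical;
# substitute the scales from your own member assessment tool.
import statistics

# Each record is one member's rating per scale on a 1-5 agreement scale.
responses = [
    {"vision": 4, "leadership": 3, "communication": 2, "outcomes": 4},
    {"vision": 5, "leadership": 4, "communication": 3, "outcomes": 3},
    {"vision": 4, "leadership": 2, "communication": 2, "outcomes": 4},
]

CUTOFF = 3.5  # scales averaging below this are flagged for group discussion

for scale in responses[0]:
    mean = statistics.mean(r[scale] for r in responses)
    status = "needs attention" if mean < CUTOFF else "relative strength"
    print(f"{scale:<14} mean = {mean:.2f}  ({status})")
```

Bringing a per-scale summary like this into the SWOT discussion gives members concrete numbers to react to, while leaving the interpretation to the group.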

The American Evaluation Association is celebrating CP TIG Week with our colleagues in the Community Psychology Topical Interest Group. The contributions all week come from CP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello aea365-ers! I’m Sheila B Robinson, Lead Curator and sometimes Saturday contributor. Have you ever visited the AEA Public eLibrary and seen this when you click on a resource?

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

We certainly hope so! Our eLibrary features a wealth of evaluation resources. Are you familiar with Creative Commons and the funny symbols that make up that dashboard-looking icon?

Lesson Learned: Creative Commons licensing allows content creators to share their work under conditions they specify, and opens access to this content to users. 

According to their website, Creative Commons is a nonprofit organization that enables the sharing and use of creativity and knowledge through free legal tools.

Our free, easy-to-use copyright licenses provide a simple, standardized way to give the public permission to share and use your creative work — on conditions of your choice. CC licenses let you easily change your copyright terms from the default of “all rights reserved” to “some rights reserved.”

Creative Commons licenses are not an alternative to copyright. They work alongside copyright and enable you to modify your copyright terms to best suit your needs.

Why is all of this important?

  • Do you want to download and distribute someone else’s content?
  • Do you want to use a picture or icon you found on the internet on a PowerPoint slide for a presentation?
  • Do you want to post someone’s picture or content on your blog*?
  • Did you upload content to share on a website but you don’t want people making money selling your work?
  • Do you want people who use your work to credit you?
  • Do you want to allow people to change the work in any way, or leave it exactly as you created it?

Hot Tip: Familiarize yourself with the 6 different CC licensing options. You don’t have to be an attorney to understand the language. That’s the beauty of CC. Everything is written in “human readable” language.

Cool Trick: Websites such as Flickr.com and search engines such as Google allow users to search for CC licensed content. Look for “advanced search options” or “filter your search” to find these options.

Rad Resource: Back in April, Martha Meacham offered this excellent post on understanding some copyright issues, and also mentioned Creative Commons.

*If you have contributed to aea365 and your post included any picture or graphic, you have likely corresponded with me about CC and copyright. We play by the rules here! If you are interested in writing for aea365 and want to use an image, graphic, video, etc. in your post, you need to know if you have permission to share it in this way!

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

We are Erin Bock of The Sherwood Foundation and Nora Murphy of TerraLuna Collaborative. We feel fortunate to have been partners in developmental evaluations for several years now, each of us acting as an important thought partner and sounding board for the other.

We recently partnered on an evaluation for a community-wide initiative. The Adolescent Health Project, led by the Women’s Fund of Omaha, seeks to address a wicked problem (high STI and teen pregnancy rates) using a systems approach.

Project leadership, in the face of incredible urgency (the county’s STI rates are at epidemic levels), knew that there was a need not only to expand services but to change the way the present system functions. A learning collaborative was created, facilitated by the evaluation team and made up of grantee leadership who had previously been competitors. The learning collaborative is charged with establishing the learning priorities that they, as a group, want to take on. In other words, instead of releasing grant funds and expecting immediate results, the project leaders created space and time for grantees to build trusting relationships.

The foundation and the Women’s Fund of Omaha call their work “squishy” and embrace complexity, but the learning collaborative experience has been an act of faith. It feels risky to create space for trust when there is no objective or completion date tied to it. It is an honor that nonprofits would enter into this risky space with project leadership, and it is an honor to work with evaluation professionals who can hold us steady through the grey area.

Already we’ve seen the benefits of creating this space. The issue of trauma surfaced during the fourth learning collaborative meeting. There was a sense that something deeper is going on for young people, and that to reduce risky behaviors we needed to open ourselves up to those difficult experiences, to become culturally and experientially humble.

Hot Tip: Amongst the rush of evaluation deadlines, create intentional space to build trust with your partners.

This space for trust will ensure that we can transcend the hard boundaries between community organizations and health centers and get real about the issues that drive this problem in our community. Our ability to be real with each other will drive authentic use of the evaluation for real change.

Rad Resource: Not only have service recipients experienced trauma, but so have the professionals working with them. Check out this resource to gauge secondary trauma: http://academy.extensiondlc.net/file.php/1/resources/TMCrisis20CohenSTSScale.pdf

Rad Resource: The upcoming book Developmental Evaluation Exemplars edited by Michael Quinn Patton, Kate McKegg and Nan Wehipeihana has a chapter, written by Nora Murphy, describing the process of convening a learning collaborative.

The American Evaluation Association is celebrating Evaluation Use (Eval Use) Topical Interest Group Week. The contributions all this week to aea365 come from our Eval Use TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

No tags

I am Arnold Love from Toronto, the recent host city of the 2015 Para- and Pan American Games. Toronto also hosted the first Accessibility Innovation Showcase, marking the 25th anniversary of the Americans with Disabilities Act and the 10th anniversary of the Ontarians with Disabilities Act.

My evaluation interests include both sports and accessibility, so I want to share with you a powerful and enjoyable way of increasing evaluation use, called Jane’s Walk. It was a pivotal feature of the Para- and Pan Am Games and the Accessibility Showcase.

Jane’s Walk is named after Jane Jacobs, noted researcher and author of The Death and Life of Great American Cities. Jacobs championed the use of direct observation through “eyes on the street” and direct engagement to understand the “messy and complex systems” that comprise the urban landscape and to mobilize findings into action.

Rad Resource: Jane’s Walk is an informal walking tour. Check out the Jane’s Walk website to find out how walks “get people to tell stories about their communities, explore their cities, and connect with neighbors.”

Hot Tip: Several walks take place at the same time, each on a different theme. Local volunteers organize them based on their interests and expertise. For example, one walk during the Accessibility Innovation Showcase explored ideas to make busy intersections and entry to stores more accessible.

Hot Tip: Invite people of different ages and backgrounds to participate. The informal nature of Jane’s Walk encourages each person to voice their perspectives based on unique experience and insights. This energizes the conversations.

Hot Tip: Evaluators need diverse yet balanced views of the discussion topics. Facilitate this by finding two people with different viewpoints to co-lead each walk.

Hot Tip: Taking notes shuts down the trust and free exchange of ideas that are the hallmarks of a Jane’s Walk. Instead, tweet your notes to yourself and encourage the other walkers to tweet their comments and ideas or share them on social media.

Rad Resource: Adding an incentive can greatly increase use of the findings coming from the Jane’s Walk methodology. Check out how Jane’s Walk partnered with Evergreen CityWorks to offer micro-grants to implement the best ideas (http://janeswalk.org/canada/toronto/grants) with little money, but big results.

Rad Resource: Change Jane’s Walk into a game by geocaching. Hide small items (toys, badges, stories) in locations that fit a specific evaluation theme, such as a coffee shop with an accessible ramp. Then log the coordinates and cache description on http://www.geocaching.com. Use the app to find the cache. It’s fun!

Evaluation 2015 Challenge: Organize a few Jane’s Walks for AEA 2015. It would be a great opportunity to experience the methodology firsthand and get to know Chicago and other AEA members better.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 

We are Joy Kaufman, Associate Professor at Yale University School of Medicine and Director of Program and Service System Evaluation and Evaluation Research, and Andrew Case, Assistant Professor of Psychology at the University of North Carolina at Charlotte. We are pleased that the Evaluation Use TIG asked us to share work we have done in engaging consumers of mental health services in the evaluation process.

With the primary goal of better understanding consumer perspectives on receiving services at the Connecticut Mental Health Center, four consumer researchers were recruited from the clients served at the Center and trained in all aspects of focus group evaluation. The most salient aspect of this evaluation is that it was developed, implemented, and reported by consumers who receive services within the mental health center. Over the past four years, this team has provided feedback regarding many aspects of care at the Center, and their recommendations serve as a blueprint for Center administrators to use in improving the care environment. Perhaps one of the most important outcomes is that this consumer-driven evaluation process is now part of how things are done at the mental health center.

Lessons Learned:

Having consumers of behavioral health services evaluate and report their results to the center where they receive care was profound. In our experience as professional evaluators, leadership and front-line staff, while interested in the results of an evaluation, are often passive recipients of the information. That was not the case in this evaluation: the professionals listened and immediately began reviewing ways to enhance the care experience for consumers.

Having peers lead the evaluation process led service recipients to feel that their voices were heard, a phenomenon that consumers of publicly funded behavioral health services do not often experience.

The Center leadership and clinical supervisors reported that the evaluation had added legitimacy and authenticity because of the central role of the consumer researchers.

As evaluators, we have learned that while true partnership with service recipients may take more time, the results of the evaluation have increased validity, value, and usefulness to the program.

Rad Resources: The Patient-Centered Outcomes Research Institute provides resources, including funding, to further the engagement of consumers in the evaluation of health services.

A first-person account of the evaluation process highlighted above was published in the American Journal of Community Psychology. The paper includes accounts from four stakeholder groups regarding how the project was perceived at the mental health center and the impact of the project on the care environment.

The Focus Group Kit (Morgan & Krueger 1997, Sage Publications) includes a very helpful volume on including community members in focus groups.

The American Evaluation Association is celebrating Evaluation Use (Eval Use) Topical Interest Group Week. The contributions all this week to aea365 come from our Eval Use TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Charmagne Campbell-Patton and I am an independent evaluation consultant. About a year ago, I made the transition from my role as program manager and internal evaluator at an education nonprofit to that of external evaluation consultant. I continued working with my former employer as a client and, in my naiveté, I thought the transition would be relatively straightforward. I figured that since I knew the inner workings of the organization and had strong relationships with most staff members, it would be easy to continue to conduct useful evaluation.

Lessons Learned: My first mistake was failing to recognize and address that as the program manager, I used to be the primary intended user of the evaluation results. When I made the transition to an external consultant, I needed to be much more intentional about designing evaluations that met the needs of the new intended users.

Hot Tip: Be aware of how your position affects use. The personal factor is different in different relationships – internal and external.

Lesson Learned: Process use is different internally and externally. As a staff member, I used to be able to identify opportunities for process use in an ongoing and informal way. As an external consultant, however, I again had to be much more intentional about identifying opportunities and planning for process use.

Hot Tip: External evaluators need to be intentional about seeking opportunities to support evaluative thinking across the organization through more formalized process use.

Cool Trick: One way to engage staff is a reflective practice exercise. Bring staff together to reflect on the question: “What are things you know you should be doing but aren’t?” This question gets people thinking about potential personal barriers to using information. That sets the stage for discussing barriers to evaluation use organizationally. Next identify enabling factors that support and enhance use, and ways to overcome barriers to use.

It’s also worth noting that despite some of the challenges noted above, the transition from internal to external also gave me a new perspective on evaluation use. Once I recognized some of the barriers to use as an external consultant, I was actually able to use my position to promote use more effectively than I did while internal. The added distance gave me some leverage that I lacked as a staff member to call attention to opportunities and challenges to evaluation use across the organization.

Rad Resources: Essentials of Utilization-Focused Evaluation, Michael Quinn Patton, Sage (2012).

Consulting Start-Up and Management, Gail Barrington, Sage (2012).

Using Reflective Practice for Developmental Evaluation, Charmagne Campbell-Patton, AEA365 March 2015.

The American Evaluation Association is celebrating Evaluation Use (Eval Use) Topical Interest Group Week. The contributions all this week to aea365 come from our Eval Use TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


We are Nora Murphy and Keith Miller with TerraLuna Collaborative, an evaluation cooperative in the Twin Cities, Minnesota. We feel fortunate to be the evaluation partners on several large developmental evaluations.

One project we are working on seeks to support the inner wellbeing journey of seasoned social entrepreneurs. On a recent conference call, a project team member asked: “How do you know when to use the data to make a change to the program? Isn’t struggle an important part of the individual’s wellbeing journey? If we react too quickly to data and ‘fix’ everything the participant isn’t comfortable with, aren’t we minimizing their opportunities for growth?”

He’s right. I (Nora) shared my perspective that evaluation data is only one source of information that should be used when making a decision. Also important to consider are: 1) our intuition, 2) our accumulated personal and professional wisdom, and 3) the collective wisdom of the group of people seeking to use the evaluation findings.

Hot Tip: Be reflective and identify the source(s) of wisdom you are drawing on.

Reflecting on that conversation, Keith and I realized that my response was rooted in the guiding principles of a three-year partnership with the Minnesota Humanities Center, Omaha Public Schools, and The Sherwood Foundation. The guiding principles are:

  • Build and strengthen relationships;
  • Recognize the power of story and the danger of absence;
  • Learn from and with multiple voices; and
  • Amplify community solutions for change.

These principles guide how we show up as evaluators and how we do our work. Evaluation use happens when there is a foundation of trust–trust in both the results and the evaluators. We’ve learned to build trust by investing in relationships, intentionally including multiple voices, seeking absent narratives, and amplifying community ideas and solutions.

Hot Tip: Be responsive, not reactive.

Patton (2010) suggests that one role of developmental evaluators is to look for and document “forks in the road that move the program in new directions” (p. 150). As developmental evaluators, we can facilitate conversations about whether the data should be used immediately because it indicates a fork in the road, or whether it is something to be aware of and track. During these conversations we can also create space for intuition and wisdom.

Lesson Learned: These guiding principles have helped us shape our role as evaluation partners and increase evaluation use. Our partners trust us to engage them in reflective conversations about what the findings mean and how they might be used.

Rad Resource: Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use, Michael Quinn Patton, Guilford (2010).

Rad Resource: Nora F. Murphy and Jennifer Tonko – How Do You Understand the Impact of the Humanities?

The American Evaluation Association is celebrating Evaluation Use (Eval Use) Topical Interest Group Week. The contributions all this week to aea365 come from our Eval Use TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

My name is Keiko Kuji-Shikatani, the current chair of the Evaluation Use Topical Interest Group (TIG), one of the original AEA TIGs. The Evaluation Use TIG was born of the interest in evaluation utilization in the 1970s, extending into both theoretical and empirical work on Use in the 1980s and 1990s, and to a broader conceptualization of use and influence in the 2000s. The Evaluation Use TIG is committed to understanding and enhancing the use of evaluation in a variety of contexts and to maximizing the positive influence of evaluation through both the evaluation process and the results produced.

Program evaluation began with the desire to seek information that can be used to improve the human condition. Use may not be apparent to those outside an organization, since the process of using evaluation requires discussions that may be very sensitive in nature. This week’s aea365 will examine how Evaluation Use TIG members are striving to support various efforts in diverse and complex contexts.

As for me, as an internal evaluator for the Ontario Ministry of Education, utilization of evaluation is the norm in what I do every day in pursuit of reaching every student. The world in which our students are growing up, and in which they will be leaders and learners throughout their lifetimes, is a complex and quickly changing place. To support students so they can be the best that they can be, those in the system need to work smarter and use evaluative thinking to guide every facet of improvement efforts.

Rad Resource: Evaluative thinking is systematic, intentional and ongoing attention to expected results. It focuses on how results are achieved, what evidence is needed to inform future actions and how to improve future results. One cannot really discuss Evaluation Use without Michael Quinn Patton – check out (http://www.mcf.org/news/giving-forum/making-evaluation-meaningful).

Our work as internal evaluators involves continually communicating the value of evaluative thinking and guiding developmental evaluation (DE) by modeling the use of evidence to understand more precisely the needs of all students and to monitor and evaluate the progress of improvement efforts.

Hot Tips: Check out how evaluation (http://edu.gov.on.ca/eng/teachers/studentsuccess/CCL_SSE_Report.pdf) is used to inform next steps (https://www.edu.gov.on.ca/eng/teachers/studentsuccess/strategy.html) and what that change can look like (http://edu.gov.on.ca/eng/research/EvidenceOfImprovementStudy.pdf).

In our work, the ongoing involvement of evaluators who are intentionally embedded in program and policy development and implementation teams contributes to modeling evaluative thinking and guiding DE that builds system evaluation capacity. The emphasis is on being a learning organization through evidence-informed, focused improvement planning and implementation.

Hot Tips: Check out how evaluative thinking is embedded in professional learning (http://sim.abel.yorku.ca/) or how evaluative thinking is embedded in improvement planning (http://www.edu.gov.on.ca/eng/policyfunding/memos/september2012/ImprovePlanAssessTool.pdf).

The American Evaluation Association is celebrating Evaluation Use (Eval Use) Topical Interest Group Week. The contributions all this week to aea365 come from our Eval Use TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hi, my name is Jayne Corso and I am the Community Manager for AEA. Evaluation 2015 planning is well underway and, as always, social media will play a big part before, during, and after the conference. Social media is a great way to interact with other evaluators and get the most up-to-date information on Evaluation 2015. Try these tips to get the most out of social media and increase engagement with fellow evaluators.

Pre-conference

Follow the Hashtag: Use #Eval15 to stay connected on Facebook and Twitter. If you search #Eval15 on either site, you will find the latest posts about the event from AEA directly and other evaluators who are talking about Evaluation 2015. You can even join the conversation—add #Eval15 to your Evaluation 2015 posts.

Follow Evaluation 2015 Presenters: You can find a few of your favorite presenters on social media. Follow them before, during, and after the conference to see their Evaluation 2015 highlights and take advantage of this access to ask questions about their session or evaluation work. Below are a few examples:

Stephanie Evergreen: @evergreendata

Ann K. Emery: @AnnKEmery

Sheila Robinson: @SheilaBRobinson

Kirk Knestis: @Knestis

During Conference

Be Active Onsite: Being active on social media during a conference can help with your personal branding, expand your network of like-minded people, and help you get more out of the entire conference experience. Through Twitter and Facebook, and with help from our AEA community, we can share our in-person conference in real time.

While at Evaluation 2015, live-tweet your sessions. This allows you to post bite-sized takeaways from your sessions, share pictures from the event, and make connections while you are in Chicago. Add #Eval15 to your posts so all attendees and non-attendees following Evaluation 2015 can see your post and gain a wider perspective on all events at the conference.

Don’t be afraid to reach out to other people on Twitter or Facebook who are also covering Evaluation 2015! This is where the conference hashtag comes in; just click the hashtag and comment, favorite, and retweet away.

Post-conference

Connect After Evaluation 2015: After Evaluation 2015, AEA will be posting photos and highlights from the event—so stay connected with AEA! You can also stay in touch with fellow conference attendees and follow your favorite presenters. Our goal is to help you extend your social network by the end of Evaluation 2015!

We can’t wait to see you in Chicago!


Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Curator note: Today we feature a special 2-part post. Please scroll down to read both parts. Enjoy!
 

Theresa McGauley-Keaney on Dialogue Education, Part 1: A ‘Lean’ Way to Teach and Learn

Hello, I am Theresa McGauley-Keaney, a Program Monitor and Trainer (PMT) at Commonwealth Medicine, a division of the University of Massachusetts Medical School. With a focus on health policy, financing, and service delivery in the public interest, we conduct a wide variety of program evaluation and research projects. Ours is a fast-paced matrix organization that is applying Lean process-improvement methodology (Lean) to its daily work. Separately, my fellow PMTs and I have been using a training methodology called Dialogue Education™ (DE) to add structure and efficiency to our trainings. In this two-part aea365 post, I’d like to explain how my colleagues and I have been using DE and how well we discovered it complements Lean thinking.

The emphasis of DE is on strong training design in order for learners to experience positive, long-term effects of their learning. Shaped by Dr. Jane Vella and drawn from the work of several experts in the fields of education and psychology, DE is a learning-centered training system that values the experience of adult learners. As one of my colleagues summed it up, “DE takes the core principles of adult learning theory and applies them to the development and design of adult trainings.”


Lesson Learned: As trainers, we like DE for its pragmatism, its flexibility, and its results. I personally like it because it holds both trainers and learners accountable for learning. Through careful design, DE incorporates evaluation into the training on three levels: learning, transfer, and impact. Without learning there will be no transfer, and without transfer there will be no impact. To this point, just one of the many tools DE offers is an Accountability Planner that informs training design by asking:

  • Which objectives do you really want and need to evaluate?
  • What will the learners do that you can evaluate?
  • Based on your objectives, what do you anticipate changing, and what about it are you seeking to evaluate (learning, transfer, or impact)?

The Accountability Planner offers the opportunity to consider how learners will apply their learning in their own context. Anticipating and planning for impact allows us to create outcome measures we can use to document results for stakeholders.

Hot Tip: DE uses Achievement-Based Objectives during the training. Make it simple: have learners write on flip charts or arrange Post-it notes. If learners produce it, it can be assessed.

In Part 2, I will discuss how we use DE synergistically with Lean principles for better quality training as we embark on program evaluation and research projects.

Theresa McGauley-Keaney on Dialogue Education, Part 2: A ‘Lean’ Way to Teach and Learn

Hello again, I am Theresa McGauley-Keaney, a Program Monitor and Trainer (PMT) at the University of Massachusetts Medical School’s Commonwealth Medicine. In Part 1, I introduced you to Dialogue Education (DE), a training methodology with a focus on learning. Now I’d like to describe how we find DE complements Lean.

Lean is a performance improvement methodology based on the premise that less waste provides more value for the customer. It adapts the scientific method to process improvement and is grounded in respect for front line workers. Leadership’s role is to support workers to ensure a streamlined flow of efficiency. Like Lean, DE respects adult learners for their knowledge and experience. The trainer’s role is to support the learner. Neither the trainer nor the student is the focus; both are held accountable for learning. In both Lean and DE, people are considered “problem solvers” and are an important part of the process to reach the intended goal.

Lean’s Eight Wastes:

  1. Defects (errors)
  2. Overproduction (doing more than needed)
  3. Waiting (or delays)
  4. Not utilizing employees (ideas and skills not used)
  5. Transport (unnecessary movement of materials)
  6. Inventory (too much material)
  7. Motion (movement by workers)
  8. Extra Processing (re-dos, re-work)

Lesson Learned: We find DE and Lean synergistic in that DE allows us to evaluate the “learning” in our trainings along the way, enabling us to remediate in real time and avoid re-training (#8). Periodic evaluation intercepts misunderstandings that could lead to passing on bad information (#1). DE utilizes a Learning Needs Resource Assessment that identifies the “required” learning, preventing overproduction (#2) and time wasted teaching people what they already know (#3).

Lesson Learned: My favorite thing about DE is watching people become empowered and motivated by being heard, by instituting their ideas, by being respected, by feeling included and taking ownership (#4).

Lesson Learned: I adapted the Lean A3 report format (named for the A3 paper size) to the DE Design Steps for consistency, familiarity, and collaboration.


Here is an example of how I used DE to standardize a best practice for security committees across several business units. One or two security liaisons per unit attended my training. I explained why developing a security committee is encouraged and how a charter clarifies structure, function and purpose. Using blank templates, liaisons broke into small groups to share, through dialogue, their ideas and suggestions for each other’s charter. Within one hour, they each had a unit-specific draft charter in hand. They verbalized their intention to show their charters to their unit leadership and discuss implementation. Follow-up revealed that most units now have security committees.

Hot Tip: “Never tell people how to do things. Tell them what to do and they will surprise you with their ingenuity.” General George Patton, Jr.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
