AEA365 | A Tip-a-Day by and for Evaluators

CAT | Business, Leadership, and Performance

My name is Kylie Hutchinson. I am an independent evaluation consultant with Community Solutions Planning & Evaluation. In addition to evaluation consulting and capacity building, I tweet at @EvaluationMaven and co-host the monthly evaluation podcast Adventures in Evaluation with my colleague @JamesWCoyle.

When I started out in evaluation 26 years ago, I was focused on being a good methodologist and statistician.  After deciding to work primarily with NGOs I learned the importance of being a good program planner.  Employing a participatory approach required me to become a competent facilitator and consensus-builder.  These days, the increased emphasis on utilization and data visualization is forcing me to upgrade my skills in communications and graphic design.  New developments in mobile data collection are making me improve my technical skills.  A recent foray into development evaluation has taught me the important role that a knowledge manager plays in evaluation. Finally, we are starting to understand evaluation capacity development as a process rather than a product, so now I need expertise in organizational development, change management, and the behavioral sciences.  Whoa.

Don’t get me wrong, I’m not complaining. Every day I wake up and think how lucky I am to have picked such a diverse career as evaluation. But with all these responsibilities on my plate, my toolbox is getting full, and it sometimes keeps me awake at night. How can I manage to be effective at all of these things? Should I worry about being a Jack of all trades, master of none?

Hot Tip:  You don’t have to do it all.  Determine your strengths and outsource your weaknesses. Pick several areas of specialization and ask for assistance with the others.  This help may come in the form of other colleagues or departments.  For example, if you think you need help with change management, sub-contract an organizational development consultant to your team.  If you work in an organization with a communications or graphic design department, don’t forget to call on their expertise when you need it.

Hot Tip:  Take baby steps.  If you want to practice more innovative reporting, don’t assume you have to become an expert in communication strategies overnight. Select one or two new skills you want to develop annually and pick away at those.

Hot Tip: If you can, strategically select evaluations that will expose you to a desired new area, e.g., mobile data collection or a new software tool.

Rad Resource:  Even if you’re not Canadian, the Canadian Evaluation Society’s Competencies for Canadian Evaluation Practice provide a great basis from which to reflect on your skills.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org . aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings!  We are Tom McQuiston (USW Tony Mazzocchi Center) and Tobi Mae Lippin and Kristin Bradley-Bull (New Perspectives Consulting Group).  We have collaborated for over a decade on participatory evaluation and assessment projects for the United Steelworkers (labor union).  And we have grappled mightily with how to complete high-quality data analysis and interpretation in participatory ways.

Hot Tip: Carefully determine up front what degree of full evaluation team participation there will be in data analysis.  Some practical considerations include:  the amount of team time, energy, interest, and analysis expertise that is available; the levels of data analysis being completed; the degree of project focus on team capacity-building; and the project budget and timeline.  How these and other considerations get weighed is, of course, also a product of the values undergirding your work and the project.

Hot Tip: Consider preparing an intermediate data report (a.k.a. “half-baked” report) that streamlines the analysis process for the full team.  Before the full team dives in, we:  review the raw quantitative data; run preliminary cross-tabs and statistical tests; refine the data report content to include only the — to us — most noteworthy data; remove extraneous columns spit out of SPSS; and assemble the tables that should be analyzed together — along with relevant qualitative data — into reasonably-sized thematic chunks for the team.
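The trimming step described above could be sketched in Python rather than SPSS. This is a minimal, hypothetical illustration (all data, column names, and the alpha threshold are invented): run preliminary cross-tabs, test each one, and keep only the noteworthy tables for the team's "half-baked" report.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Illustrative raw survey data (all names and values invented)
df = pd.DataFrame({
    "worksite": ["A", "A", "B", "B", "A", "A", "B", "B"] * 10,
    "trained":  ["yes", "no", "yes", "no", "yes", "no", "yes", "yes"] * 10,
    "uses_ppe": ["yes", "no", "yes", "no", "yes", "no", "yes", "no"] * 10,
})

def noteworthy_crosstabs(df, outcome, predictors, alpha=0.10):
    """Keep only the cross-tabs whose chi-square test clears alpha."""
    keep = {}
    for col in predictors:
        table = pd.crosstab(df[col], df[outcome])
        chi2, p, dof, expected = chi2_contingency(table)
        if p < alpha:
            keep[col] = table
    return keep

tables = noteworthy_crosstabs(df, "uses_ppe", ["worksite", "trained"])
for name, table in tables.items():
    print(f"--- uses_ppe by {name} ---")
    print(table)
```

In this toy data, training is strongly associated with the outcome while worksite is not, so only the `trained` table survives into the team's report.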

Hot Tip: Team time is a precious commodity, so well-planned analysis/interpretation meetings are essential.  Some keys to success include:

  1. Invest in building the capacity of all team members.  We do this through a reciprocal process of us training other team members in, say, reading a frequency or cross-tab table or coding qualitative data and of them training us in the realities of what we are all studying.
  2. Determine time- and complexity-equivalent analyses that sub-teams can work on simultaneously.  Plan to have the full team thoughtfully review sub-team work.
  3. Stay open to shifting in response to the team’s expertise and needs.  An empowered team will guide the process in ever-evolving ways.

Some examples of tools we have developed — yes, you, too, can use Legos™ in your work — can be found at: http://newperspectivesinc.org/resources.

We never fail to have many moments of “a-ha,” “what now” and “wow” in each participatory process.  We wish the same for you.

The American Evaluation Association is celebrating Best of aea365 week. The contributions all this week are reposts of great aea365 blogs from our earlier years.


Hello! I’m Michelle Baron, an Independent Evaluation Strategist. In my work in higher education, I’ve encountered a mixture of evaluation champions and critics. Today I’d like to address the importance of strategic planning in building a culture of evaluation.

Many consider strategic planning an organizational road map: it outlines the organization’s vision and mission, establishes clear and attainable goals and objectives, and develops processes for achieving them. Strategic planning and evaluation go hand in hand in moving the organization and its programs forward to benefit stakeholders. Strategic planning is crucial to the evaluation process: without a road map of criteria, standards, and goals, it is almost impossible to achieve the desired results.

Evaluators have a unique role in helping organizations with both ends of the spectrum: creating a foundation through strategic planning, and then conducting evaluations to examine and monitor progress.

Hot Tip #1: Start at the top. Buy-in from top management for strategic planning is of the utmost importance for its success.

Hot Tip #2: Conduct a SWOT analysis (Strengths, Weaknesses, Opportunities, and Threats) of the entity or its programs/services. Doing so not only enlightens people to a variety of ideas and questions to consider, but can also indicate the level of support for those topics.

Cool Trick: Brainstorming sessions are often an excellent starting point for the organization as a whole or for a smaller group within it. The evaluator or a designated member of the organization can facilitate the discussion by developing questions beforehand that serve as prompts, such as questions about objectives, goals, and resources.

Rad Resource #1: Strategic Planning for Public & Nonprofit Organizations by John Bryson, and related books by the same author, provide the groundwork and tools necessary for organizations to develop and sustain their strategic planning process.

Rad Resource #2: The Fifth Discipline: The Art and Practice of the Learning Organization by Peter Senge helps leaders establish the foundation and philosophy behind strategic planning, and helps them develop their long-term thinking for organizational growth.

With these tools and resources, evaluators will be better prepared to assist organizations with strategic planning, to build support for their evaluations, and to make those evaluations more effective.

The American Evaluation Association is celebrating Business, Leadership and Performance (BLP) TIG Week with our colleagues in the BLP AEA Topical Interest Group. The contributions all this week to aea365 come from our BLP TIG members.


My name is Michelle Paul Heelan, Ph.D. I am an evaluation specialist and organizational behavior consultant with ICF International. In my fifteen years helping private corporations and public agencies track indicators of organizational health, I’ve found that moving toward more sophisticated levels of training evaluation is challenging – but at ICF we’ve identified effective strategies for measuring how learning is applied to on-the-job behavior. This post highlights our approach.

A key challenge in training evaluation is moving beyond organizations’ reliance on participant reactions and knowledge acquisition to assess the impact of training. Training is offered for a purpose beyond learning for learning’s sake, yet we often lack data showing the extent to which that purpose is achieved once participants return to their jobs. Our approach confronts a key question: How do we (as empirically based evaluation experts) gather data that demonstrate the on-the-job impact of training?

Hot Tip #1: The work occurs during the training design phase – Nearly all of the essential steps in our approach happen during training design; if you are acquiring training rather than designing it, these steps must be reverse-engineered.

Hot Tip #2: A structured collaboration among three parties creates the foundation for the evaluation – Evaluation experts, instructional design experts, and organizational stakeholders (e.g., business unit leaders, training/development champions) must identify desired business goals and the employee behaviors hypothesized as necessary to achieve those business goals.  In practice, this is more difficult than it seems.

Hot Tip #3: Evaluation data collection instruments and learning objectives are developed in tandem – We craft learning objectives that, when achieved, can be demonstrated in a concrete, observable manner. During the design phase, we identify the behavioral variables expected to be affected by individuals’ participation for each of the learning objectives.

Hot Tip #4: The behavioral application of learning is best measured by multiple perspectives – For each variable, we create survey items for ratings from multiple perspectives (i.e., participants and at least one other relevant party, such as supervisors or peers). Using multiple perspectives to evaluate behavioral changes over time is an essential component of a robust evaluation methodology. Investigating the extent to which other parties assess a participant’s behavior similarly to their own self-assessment helps illuminate external factors in the organizational environment that affect training results.
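As a toy illustration of the multiple-perspectives idea (the participants, ratings, and threshold below are invented for the example and are not ICF's actual instrument), one might flag participants whose self-ratings diverge sharply from their supervisors' ratings, since those gaps can point to environmental factors worth investigating:

```python
# Paired ratings of a post-training behavior on a 1-5 scale:
# (participant_id, self_rating, supervisor_rating) -- invented data
ratings = [
    ("p1", 4, 4),
    ("p2", 5, 3),
    ("p3", 3, 3),
    ("p4", 5, 2),
]

def rating_gaps(ratings, threshold=2):
    """Flag participants whose self-rating exceeds the supervisor's
    rating by at least the threshold."""
    return [pid for pid, self_r, sup_r in ratings if self_r - sup_r >= threshold]

flagged = rating_gaps(ratings)
print(flagged)  # participants whose self-view diverges sharply
```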

Hot Tip #5: Training goals are paired with evaluation variables to ensure action-oriented results – This method also permits the goals of the training to drive what evaluation variables are measured, thereby maintaining clear linkages between each evaluation variable and specific training content elements.

Benefits of Our Approach:

  • Ensures evaluation is targeted at those business results of strategic importance to stakeholders
  • Isolates the most beneficial adjustments to training based on real-world application
  • Provides leadership with data directly useful for training budget decisions

Rad Resource: Interested in learning more?  Attend my presentation entitled “Essential Steps for Assessing Behavioral Impact of Training in Organizations” with colleagues Heather Johnson and Kate Harker at the upcoming AEA conference – October 19th, 1:00pm – 2:30pm in OakLawn (Multipaper Session 900).

Want to learn more from Michelle and colleagues? They’ll be presenting as part of the Evaluation 2013 Conference Program, October 16-19 in Washington, DC.


We are Nichole Stewart and Laura Pryor and we’d like to share a preview of our presentation at the upcoming AEA 2013 conference. Our session, Performance Management to Program Evaluation: Creating a Complimentary Connection, will use a case study of a Los Angeles-based juvenile offender reentry program to demonstrate how “information and knowledge production” can be coordinated for performance management (PM) and program evaluation (PE).

Lessons Learned: There IS a difference!

Distinguishing between PM and PE has historically presented challenges for program directors and for the public agencies and non-profit organizations that fund them. Programs have to grapple with day-to-day operations while adapting to evolving frameworks for understanding “what works”—from results-based accountability to continuous quality improvement to evidence-based everything. Evaluators are frequently called upon to engage in both PM and PE simultaneously; however, the distinctions between the two are not always clearly understood or articulated in practice.

Lessons Learned: There IS a connection!

Fortunately, several authors have explored the relationship between PM and PE and outlined how PM and PE can complement one another with regard to data collection and analysis:

  • Information complementarity – Use the same data to answer different questions based on different analyses (Kusek and Rist, 2004).
  • Methodical complementarity – Use similar processes and tools to collect and analyze data, ultimately converting data into actionable information (Nielsen and Ejler, 2008).
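A minimal sketch of information complementarity, using invented reentry-program records: the same dataset answers a routine PM monitoring question (how many youth were served?) and a PE outcome question (do program completers fare better?).

```python
# Hypothetical service records: (youth_id, completed_program, rearrested)
records = [
    ("y1", True,  False),
    ("y2", True,  False),
    ("y3", True,  True),
    ("y4", False, True),
    ("y5", False, True),
    ("y6", False, False),
]

# PM question: output monitoring for routine reporting
youth_served = len(records)

# PE question: compare outcomes for completers vs. non-completers
def rearrest_rate(records, completed):
    group = [rearrested for _, c, rearrested in records if c == completed]
    return sum(group) / len(group)

print(youth_served)
print(rearrest_rate(records, True))    # completers
print(rearrest_rate(records, False))   # non-completers
```

The point of the sketch is that no second data collection is needed: the analysis, not the data, changes between PM and PE.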

Rad Resources


Source: Child Trends, Research-to-Results Brief (January 2011)



Want to learn more from Nichole and Laura? They’ll be presenting as part of the Evaluation 2013 Conference Program, October 16-19 in Washington, DC.


Greetings! I am Carla Forrest, a staff member and Lean Six Sigma Black Belt at Sandia National Laboratories, Albuquerque, New Mexico. My professional work involves configuration management of scientific and engineering knowledge and information. My passion, however, lies in using appreciative approaches to improve workplace performance.

Rad Resource

Recently I read “The 5 Languages of Appreciation in the Workplace” by Dr. Gary Chapman and Dr. Paul White. The authors categorize the five appreciative languages as: (1) words of affirmation; (2) quality time; (3) acts of service; (4) tangible gifts; and (5) physical touch. In the workplace, we often overlook the impact that appreciative inquiry and language have on organizational and individual performance. Authentic appreciation, when expressed in the primary appreciative language of the individual, can be a strong motivator, trust builder, and empowering influence, often uplifting the individual and organization into high performance.

Hot Tip

Appreciation is not recognition or reward. The focus of appreciation is intrinsic; the focus of recognition and reward is extrinsic. Organizational reward and recognition programs focus on performance, while appreciation is personally meaningful, focusing on who a person is. The typical “one size fits all” reward and recognition program is usually managerially directed and impersonal, which often breeds skepticism about the genuineness of the leader’s intentions. A further downside of the reward/recognition approach is its cost. Motivating through authentic appreciation has no financial cost, but is truly priceless!

In what ways can leaders apply appreciative approaches to transform relationships, attitudes, and performance in the workplace?




My name is Trina Willard and I am the Principal of Knowledge Advisory Group, a small consulting firm that provides research and evaluation services to nonprofits, government agencies and small businesses.  With nearly two decades of work in the evaluation field under my belt, I’ve had many opportunities to engage leaders in discussions about evaluation and organizational development.  Across all sectors, executives and leadership teams almost always identify benefits to evaluation, but likewise disclose concerns.  Today I’d like to address one of the most common objections I hear:

“But we can’t evaluate!  What if the results are bad?”

Let’s face a stone-cold fact: every organization has at least some room for improvement. Are the leaders you work with wary of measuring results because they fear evaluation may reveal that their organizations aren’t flawless? If so, their organizations are vulnerable to risks that could blindside them when those risks ultimately come to light, perhaps publicly. As evaluators, how do we make the case to clients and colleagues that bad results aren’t actually so…bad? These three strategies have worked well for me in my consulting practice.

Hot Tip #1:  Give an example.  We all have a cautionary tale of an organization that resisted evaluation and paid dearly for it, perhaps from personal experience.  Use it.

Hot Tip #2: Paint the picture for change. Revelations about products, programs, or services that feel negative or problematic are often seen as threats. Reframing them as opportunities, however, can be very valuable in altering the leadership perspective. Committing to an evaluation does not signal a willingness to demonstrate failure; it demonstrates a willingness to always improve.

Hot Tip #3: Challenge the choice of reaction versus action. Be direct and ask: “Do you really want to wait until a problem takes root and jeopardizes your organization’s future before you begin to examine what needs to improve? Or do you want to know now, so you can address it before it’s too late?”

What tips do you use to encourage leaders to face their fears and evaluate, thereby driving organizational decision-making and future success?

Rad Resource:  Analytics at Work: Smarter Decisions, Better Results by Thomas H. Davenport, Jeanne G. Harris and Robert Morison.   Look inside to discover numerous examples of how leaders use data to make decisions, across organizations with diverse circumstances and analytic capabilities.



Hello.  I am Tom Lyzenga, a PhD student at Western Michigan University, and chair of the Business, Leadership, and Performance TIG.  I am also a training evaluator and performance consultant at a major international corporation.  My tip today is about filling in blanks in an impact matrix or model.

Challenge: Sometimes program stakeholders can’t provide a complete impact matrix or model. In some cases, the intervention can be applied uniquely or optionally by each program participant, so we can’t project if or how they will apply it. In other cases, the training intervention has only high-level objectives, and we can’t predict how they will be applied or what specific activities or performance impacts to expect.

Hot Tip: You can apply Rob Brinkerhoff’s Success Case Method to fill in the blanks in the impact matrix AFTER the intervention has occurred. By conducting structured interviews with participants identified as either highly successful or unsuccessful, we can describe how the intervention was applied, what impact it had, and what factors contributed to successful application. We can also describe the relative success of the program in terms of the percentage of successful participants. As a result, stakeholders can get a sense of the intervention’s value and influence the success factors for the next intervention.
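A rough sketch of the case-selection step (names, scores, and cutoffs below are invented for illustration; the method itself typically uses a brief survey followed by interviews with the extreme cases):

```python
# Invented post-training survey scores, one per participant (0-10 scale)
scores = {"ana": 9, "ben": 2, "cai": 7, "dee": 8, "eli": 3, "fay": 5}

def select_cases(scores, success_cutoff=7, n=2):
    """Pick the top-n and bottom-n scorers as interview candidates,
    and compute the share of participants at or above the cutoff."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    success_rate = sum(s >= success_cutoff for s in scores.values()) / len(scores)
    return ranked[:n], ranked[-n:], success_rate

top, bottom, rate = select_cases(scores)
print(top, bottom, rate)
```

The interviews with the `top` and `bottom` groups then supply the "how it was applied" and "what got in the way" detail that fills the empty cells of the matrix, while `rate` gives stakeholders the overall success percentage.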

Rad Resource: The demonstration I am offering, Filling in The Blanks in Models and Matrices with Success Case Methodology, at Evaluation 2013 in Washington, DC is designed to demonstrate the use of the Success Case Method to complete impact matrices and models.

Want to learn more from Tom? He’ll be presenting as part of the Evaluation 2013 Conference Program, October 16-19 in Washington, DC.


Hello! I am Denise Roosendaal, AEA’s new Executive Director.

I am thrilled to have been selected to lead the management team for the American Evaluation Association. I appreciate the significant tenure of AEA Executive Director Emeritus Susan Kistler, and look forward to her continued involvement in the organization in her new role. AEA has a long history of impressive accomplishments and of creating value for its passionate members through programs and services. The potential for continued success is equally significant.

In looking toward the future, I am most compelled by the organization’s strongly held values and how those values will impact the strategy your Board of Directors will create for the next few years. This strategy will certainly be driven by the overarching mission, the landscape of the evaluation profession and practice, and the global impact evaluation can have on society.

At the top of my 90-day plan will be a concerted endeavor to get to know you – the members. What are your concerns? What are your observations of the ever-evolving profession of evaluation? In what aspects of evaluation are you most interested? Where do you see yourself and the organization in five years? If it sounds as if I’m interviewing you, I might be! I would welcome the opportunity to hear from you directly so that I can better understand what the membership needs from its organization.

I have been in association management my entire career. While I enjoy many aspects of this field, I am most passionate about the creation of community within an organization. AEA has truly mastered this quality. The level of commitment of the members and the leadership is impressive. With the TIGs, Affiliates, AEA365, e-learning, and Evaluation 2013, there are so many ways you can engage in the organization in a manner that is friendly, intimate, and thought-provoking.

Get involved: Please drop me an email with your thoughts on these questions or any other topics you wish to address. I look forward to meeting you in Washington, DC in October! Even if you are unable to join us at the conference, I hope to meet you in the coming year.



Howdy!  I’m Tom Ward, and I am a faculty member of the U.S. Army Command and General Staff College at Fort Leavenworth, Kansas.  In that capacity, I teach critical thinking, ethics, contracting, logistics, and “writing to persuade.” My passion, however, is knowledge management, and using KM to improve decision making.  My tip today is about providing project leadership.

We sometimes forget that very competent adults still require leadership when participating in group endeavors. The more they understand the intent of a project and the desired end state of a client, the more they are able to bring their own talents to bear.  This requires not only vision on the part of leaders, but a clear and understandable communication of that vision – preferably in written form.

Hot Tip:

  • Project leaders must understand the environmental context of the project, visualize the steps required to complete it, and communicate that visualization effectively. For example, if the deliverable is a written report or graphic presentation, examples of similar deliverables provide an excellent visualization of “this is where we are going.” “How we will get there” is crucial as well, though it may be part of the design phase of the problem-solving process; even so, clearly describing how “how we get there” will be decided enables unified effort and enhances each individual’s ability to contribute. Leaders who provide clear frameworks for task accomplishment, and then provide the resources required for individual and group success, gain a reputation not only as reliable producers for clients but also as “favorite bosses” of project group members.


