AEA365 | A Tip-a-Day by and for Evaluators

Category: Business, Leadership, and Performance

We are Nichole Stewart and Laura Pryor, and we’d like to share a preview of our presentation at the upcoming AEA 2013 conference. Our session, Performance Management to Program Evaluation: Creating a Complementary Connection, will use a case study of a Los Angeles-based juvenile offender reentry program to demonstrate how “information and knowledge production” can be coordinated for performance management (PM) and program evaluation (PE).

Lessons Learned: There IS a difference!

Distinguishing between PM and PE has historically presented challenges for program directors and for the public agencies and non-profit organizations that fund them. Programs have to grapple with day-to-day operations as well as adapting to evolving frameworks for understanding “what works”—from results-based accountability to continuous quality improvement to evidence-based everything. Evaluators are frequently called upon to engage simultaneously in both PM and PE; however, the distinctions between the tasks are not always clearly understood or articulated in practice.

Lessons Learned: There IS a connection!

Fortunately, several authors have explored the relationship between PM and PE and outlined how PM and PE can complement one another with regard to data collection and analysis:

  • Information complementarity – use the same data to answer different questions based on different analyses (Kusek and Rist, 2004).
  • Methodical complementarity – use similar processes and tools to collect and analyze data and ultimately convert data into actionable information (Nielsen and Ejler, 2008).

Rad Resources

[Image: StewartPryorGraph. Source: Child Trends, Research-to-Results Brief (January 2011)]


The American Evaluation Association is celebrating Business, Leadership and Performance (BLP) TIG Week with our colleagues in the BLP AEA Topical Interest Group. The contributions all this week to aea365 come from our BLP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators. Want to learn more from Nichole and Laura? They’ll be presenting as part of the Evaluation 2013 Conference Program, October 16-19 in Washington, DC.


Greetings! I am Carla Forrest, a staff member and Lean Six Sigma Black Belt at Sandia National Laboratories, Albuquerque, New Mexico. My professional work involves configuration management of scientific and engineering knowledge and information. My passion, however, lies in using appreciative approaches to improve workplace performance.

Rad Resource

Recently I read “The 5 Languages of Appreciation in the Workplace” by Dr. Gary Chapman and Dr. Paul White. The authors categorize the five appreciative languages as: (1) words of affirmation; (2) quality time; (3) acts of service; (4) tangible gifts; and (5) physical touch. In the workplace, we often overlook the impact that appreciative inquiry and language have on organizational and individual performance. Authentic appreciation, when expressed in the primary appreciative language of the individual, can be a strong motivator, trust builder, and empowering influence, often uplifting the individual and organization into high performance.

Hot Tip

Appreciation is not recognition or reward. The focus of appreciation is intrinsic; the focus of recognition and reward is extrinsic. Organizational reward and recognition programs focus on performance, while appreciation is personally meaningful, focusing on who a person is. The typical “one size fits all” reward and recognition program is usually managerially directed and impersonal, often inviting skepticism about the genuineness of the leader’s intentions. The ultimate downside to the reward/recognition approach is the cost involved. Motivating through authentic appreciation has no financial cost, but is truly priceless!

In what ways can leaders apply appreciative approaches to transform relationships, attitudes, and performance in the workplace?

 


 


My name is Trina Willard and I am the Principal of Knowledge Advisory Group, a small consulting firm that provides research and evaluation services to nonprofits, government agencies and small businesses.  With nearly two decades of work in the evaluation field under my belt, I’ve had many opportunities to engage leaders in discussions about evaluation and organizational development.  Across all sectors, executives and leadership teams almost always identify benefits to evaluation, but likewise disclose concerns.  Today I’d like to address one of the most common objections I hear:

“But we can’t evaluate!  What if the results are bad?”

Let’s face a stone-cold fact: every organization has at least some room for improvement. Are the leaders you work with concerned about measuring results because they fear evaluation may reveal that their organizations aren’t flawless? If so, their organizations are vulnerable to risks that could blindside leaders when they ultimately come to light, perhaps publicly. As evaluators, how do we make the case to clients and colleagues that bad results aren’t actually so…bad? These three strategies have worked well for me in my consulting practice.

Hot Tip #1:  Give an example.  We all have a cautionary tale of an organization that resisted evaluation and paid dearly for it, perhaps from personal experience.  Use it.

Hot Tip #2:  Paint the picture for change.  Revelations about products, programs, or services that feel negative or problematic are often seen as threats.  However, reframing that notion to one of opportunity can be very valuable in altering the leadership perspective.  Committing to an evaluation does not suggest a willingness to demonstrate failure; it demonstrates a willingness to always improve.

Hot Tip #3:  Challenge the choice of reaction versus action.  Be direct and ask this question: “Do you really want to wait until a problem takes root, jeopardizing your organization’s future before you begin to examine what needs to be improved? Or do you want to know now, so you can address it before it’s too late?”

What tips do you use to encourage leaders to face their fears and evaluate, thereby driving organizational decision-making and future success?

Rad Resource:  Analytics at Work: Smarter Decisions, Better Results by Thomas H. Davenport, Jeanne G. Harris and Robert Morison.   Look inside to discover numerous examples of how leaders use data to make decisions, across organizations with diverse circumstances and analytic capabilities.

The American Evaluation Association is celebrating Business, Leadership and Performance (BLP) TIG Week with our colleagues in the BLP AEA Topical Interest Group. The contributions all this week to aea365 come from our BLP TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

 


Hello.  I am Tom Lyzenga, a PhD student at Western Michigan University, and chair of the Business, Leadership, and Performance TIG.  I am also a training evaluator and performance consultant at a major international corporation.  My tip today is about filling in blanks in an impact matrix or model.

Challenge: Sometimes program stakeholders can’t provide a complete impact matrix or model.  In some cases, the intervention can be applied uniquely or optionally by each program participant, so we can’t project if or how they will apply it.  In other cases, the training intervention has only high-level objectives and we can’t predict either how they will be applied or what specific activities or performance impacts can be expected.

Hot Tip: You can apply Rob Brinkerhoff’s Success Case Method to fill in the blanks in the impact matrix AFTER the intervention has occurred.  By conducting structured interviews with participants identified as either highly successful or unsuccessful, we can describe how the intervention was applied, what impact it had, and what factors contributed to the success of the application.  We can also describe the relative success of the program in terms of the percentage of successful participants.  As a result, stakeholders can get a sense of the value of the intervention and influence the success factors for the next intervention.
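The case-selection and reporting arithmetic behind this approach can be sketched in a few lines. A minimal sketch: the participant scores, the success threshold, and the function names below are hypothetical illustrations, not part of Brinkerhoff’s published method.

```python
def select_extreme_cases(scores, k=3):
    """Pick the k least and k most successful participants for
    structured interviews, mirroring the Success Case Method's focus
    on extreme cases. `scores` maps participant id to a hypothetical
    post-intervention performance score."""
    ranked = sorted(scores, key=scores.get)  # ascending by score
    return ranked[:k], ranked[-k:]  # (least successful, most successful)


def success_rate(scores, threshold):
    """Share of participants at or above the success threshold,
    expressed as a percentage of all participants."""
    hits = sum(1 for s in scores.values() if s >= threshold)
    return 100.0 * hits / len(scores)


# Hypothetical post-training scores for five participants.
scores = {"P1": 1, "P2": 5, "P3": 3, "P4": 9, "P5": 7}
least, most = select_extreme_cases(scores, k=2)
print(least, most, success_rate(scores, threshold=5))
```

Interviewing both tails, rather than a random sample, is what lets the evaluator describe how success happened and what got in its way.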

Rad Resource: The demonstration I am offering, Filling in The Blanks in Models and Matrices with Success Case Methodology, at Evaluation 2013 in Washington, DC is designed to demonstrate the use of the Success Case Method to complete impact matrices and models.

Want to learn more from Tom? He’ll be presenting as part of the Evaluation 2013 Conference Program, October 16-19 in Washington, DC.


Hello! I am Denise Roosendaal, AEA’s new Executive Director.

I am thrilled to have been selected to lead the management team for the American Evaluation Association. I appreciate the significant tenure of AEA Executive Director Emeritus Susan Kistler, and look forward to her continued involvement in the organization in her new role. The AEA has a long history of impressive accomplishments and of creating value for its passionate members through programs and services. The potential for continued success is equally significant.

In looking toward the future, I am most compelled by the organization’s strongly held values and how those values will impact the strategy your Board of Directors will create for the next few years. This strategy will certainly be driven by the overarching mission, the landscape of the evaluation profession and practice, and the global impact evaluation can have on society.

At the top of my 90-day plan will be a concerted endeavor to get to know you – the members.  What are your concerns? What are your observations of the ever-evolving profession of evaluation? In what aspects of evaluation are you most interested? Where do you see yourself and the organization in five years? If it sounds as if I’m interviewing you, I might be! I would welcome the opportunity to hear from you directly so that I can better understand what the membership needs from its organization.

I have been in association management my entire career. While I enjoy so many aspects of this field, I am most passionate about the creation of community within an organization.  AEA has truly mastered this quality.  The level of commitment of the members and the leadership is impressive. With the TIGs, Affiliates, AEA365, e-learning, Evaluation 2013, there are so many ways you can engage in the organization in a manner that is friendly and intimate and thought-provoking.

Get involved: Please drop me an email with any of your thoughts on these questions or other topics you wish to address.  I look forward to meeting you in Washington DC in October! Even if you are not able to join us at the conference, I hope to meet you in the coming year.



Howdy!  I’m Tom Ward, and I am a faculty member of the U.S. Army Command and General Staff College at Fort Leavenworth, Kansas.  In that capacity, I teach critical thinking, ethics, contracting, logistics, and “writing to persuade.” My passion, however, is knowledge management, and using KM to improve decision making.  My tip today is about providing project leadership.

We sometimes forget that very competent adults still require leadership when participating in group endeavors. The more they understand the intent of a project and the desired end state of a client, the more they are able to bring their own talents to bear.  This requires not only vision on the part of leaders, but a clear and understandable communication of that vision – preferably in written form.

Hot Tip:

  • Project leaders must understand the environmental context of the project, visualize the steps required to complete it, and communicate that visualization effectively.  For example, if the deliverable is a written report or graphic presentation, examples of similar deliverables provide excellent visualization of “this is where we are going.”  “How we will get there” is crucial as well, but may be part of the design phase of the problem-solving process; still, clearly describing “how we will decide how we get there” enables unified effort and enhances the ability of individuals to contribute effectively.  Leaders who provide clear frameworks for task accomplishment, and then provide the resources required for individual and group success, gain a reputation not only as reliable producers for clients but also as “favorite bosses” of project group members.



We are Timothy Guetterman, a doctoral student, and Delwyn Harnisch, a professor, at the College of Education and Human Sciences, University of Nebraska-Lincoln.

Mixed methods approaches can be useful in assessing needs and readiness to learn among professional workshop participants.  Combining qualitative and quantitative methods can enhance triangulation and completeness of findings.  We recently used mixed methods while evaluating a weeklong workshop delivered to medical educators in Kazakhstan and experienced firsthand how mixing can aid evaluation activities.

The international collaboration between teams in the U.S. and Kazakhstan presented challenges that we mitigated through technologies such as email, Skype, and Dropbox.  Surveys administered before, during, and after the workshop through an online tool, Qualtrics, were important to guide implementation, continually assess learning, and understand the participants’ perspectives.

Hot Tips:

  • Guiding Implementation. Mixed methods within the needs and readiness assessment served a formative purpose, helping us tailor the workshop to specific participant needs.  Mixed methods analyses yielded rich details about what participants wanted and needed that would be difficult to anticipate with a quantitative instrument.  Online surveys presented a way to connect with participants early.  Beyond quantitative scales, we asked questions (e.g., “What do you hope to learn?”).  Because data were immediately available, findings guided the workshop implementation.
  • Continually Assess Learning. Throughout the workshop, brief (about one-minute) surveys at the end of each day helped us gauge where participants were and develop the community of learners.  The daily survey solicited brief qualitative responses through items such as “Summarize in a few words the most important point from today” and “What point is still confusing?”  The questions provided valuable information yet took only minutes to complete.
  • Understand the Participants’ Perspectives. In the summative evaluation of the workshop, mixed methods allowed us to obtain participant ratings and gain understanding of what participants learned through open-ended qualitative questions.

Lessons Learned:

  • With the use of these tools, we were able to model in this workshop a process for developing a deep and practical understanding of assessment for learning.  As the program’s leaders share what they learned at their sites, we are beginning to see site-based teacher learning communities emerge.  Each site is using two or three techniques in its own classrooms and then meeting with other colleagues monthly to discuss their experiences and to see what other teachers are doing.
  • The result of this effort is that these teacher learning communities now develop a shared language enabling them to talk to one another about what they are doing.
  • In short, the use of mixed methods allows the team to focus on where the learners are now, where they want to go, and how we can help them get there.


Hello evaluators, my name is Sid Ali and I am Principal Consultant at Research and Evaluation Consulting.  I do much of my work in education and training settings, and this often takes me into the corporate environment.

I have found that there is great benefit to both the evaluator and the client in using tried and tested multi-step methods for evaluation management, especially if the client organization does not have a culture familiar with evaluation methods and use.  These multi-step methods are often used in public health and human services evaluations, but can be easily transferred to the corporate setting with some elbow grease.

Corporate organizations that have primarily used performance measurement to monitor programs require familiarization with the evaluative process.  The US GAO has a nice description of the relationship between evaluation and performance measurement that can help you communicate this to your clients.  This familiarization can take many forms, but preparing and distributing a primer is not the approach I would recommend.  Here’s where the multi-step methods come into play: much of the focus in what I call the “orientation” phase of the evaluation is placed on building relationships with the key players in evaluation management on the corporation’s side.  Understanding the historical context of the organization and the program is crucial at this phase as well.

Multi-step methods for evaluation management also help the evaluator and client by establishing an evaluation activity sequence, or road map, that is shared with the organization in the “orientation” phase, with the caveat that there may be changes to the planned route.  My experience is that with multi-step methods, evaluation activities and results are better understood and become more relevant within the client organization, both during the evaluation and afterward.




Hi, I am Robin T. Kelley and am an internal evaluator at a national nonprofit health organization that is funded by the Centers for Disease Control and Prevention to provide free capacity building assistance to HIV prevention organizations, health departments and their HIV planning groups.

In the HIV/AIDS field, a number of changes are occurring; here are just a few major ones.  In 2010, the U.S. National HIV/AIDS Strategy was released, and all funded entities are now striving to align themselves with its major goals.  Since 2011, when scientific studies showed the effectiveness of adherence to HIV medicine in reducing viral loads, resources have shifted toward biomedical interventions, and the emphasis is now placed more on organizations conducting high-impact HIV prevention.

Lessons Learned:

One key method of building an organization’s ability to manage complex situations, particularly for small organizations that serve vulnerable populations or populations of color, is to strengthen their change management leadership skills.  Research has shown that in times of complexity, such as shifting federal and health priorities, organizations and businesses that serve minorities often shut their doors first, leaving underserved communities abandoned and without services.  To sustain these agencies, evaluators as well as program managers should be agile and flexible in understanding the community’s needs, their resources, and staff strengths as well as weaknesses, in order to best manage the changes.

Hot Tips:

Here are some steps to take and useful tools to address HIV changes and changes in general:

1) First, help the organization conduct an organizational diagnosis.  They must know what they have in order to consider what to change.

2) Second, help the organization conduct an environmental scan or asset mapping of their community to determine if there is still a need for their services.

3) Third, help the organization analyze the data.  Based on the findings, help them do a SWOT analysis (an analysis of their strengths, weaknesses, opportunities, and threats).  Depending on these findings, perhaps there is a way to merge efforts with another organization.

4) Next, help the organization communicate changes to all staff; without constant communication, rumors can fly and morale can sink.

5) Finally, help the organization create a process log so that they can record the number of new service requests and activities and continue to justify their existence.
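The process log in step 5 need not be elaborate. As a minimal sketch in Python (the column names, file layout, and function names here are hypothetical, just one way an organization might do it):

```python
import csv
import datetime


def log_request(path, service, requester, notes=""):
    """Append one service request to a CSV process log.
    Columns: date, service requested, requester, notes."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.date.today().isoformat(), service, requester, notes])


def count_requests(path):
    """Total logged requests, useful when reporting activity levels."""
    with open(path, newline="") as f:
        return sum(1 for _ in csv.reader(f))
```

Even a simple running count like this gives the organization concrete numbers when it must justify its continued existence to funders.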




My name is Ellen Steiner, Director of Market Research and Evaluation at Energy Market Innovations, a research-based consultancy focused on strategic program design and evaluation for the energy efficiency industry – we work to create an energy future that is sustainable for coming generations.

Lessons Learned:

An increasingly common practice…

In energy efficiency program evaluations, telephone surveys are traditionally the mode of choice. However, there are many reasons that evaluators are increasingly interested in pursuing online surveys including the potential for:

(1) lower costs, (2) increased sample sizes, (3) more rapid deployment, and (4) enhanced respondent convenience.

With online surveys, fielding costs are often lower and larger sample sizes can be reached cost-effectively. Larger sample sizes result in greater accuracy and can support increased segmentation of the sample. Online surveys also take less time to be fielded and can be completed at the respondent’s convenience.

Yet be aware…

In contrast, there are still many concerns regarding the validity and reliability of online surveys. Disadvantages of online surveys potentially include:

(1) respondent bias, (2) response rate issues, (3) normative effects, and (4) cognitive effects.

Certain populations are less likely to have Internet access or to respond to an Internet survey, which poses a generalizability threat. Although past research indicates that online response rates are often equal to or slightly higher than those of traditional modes, Internet users are increasingly exposed to online survey solicitations, necessitating that researchers employ creative and effective strategies for garnering participation. In addition, normative and cognitive challenges arise when no trained interviewer is present to clarify and probe, which may lead to less reliable data.

Come talk with us at AEA!

My colleague Jess Chandler and I will be presenting a session at the AEA conference titled “Using Online Surveys and Telephone Surveys for a Commercial Energy Efficiency Program Evaluation: A Mode Effects Experiment,” in which we will discuss the findings from a recent study we conducted comparing online to telephone surveys. We hope you can join us and share your experiences with online surveys!

Hot Tips:

  • Email Address Availability – In our experience, if you do not have email addresses for the majority of the population from which you want to sample, the cost benefits of an internet sample are cancelled out by the time spent seeking out or trying to purchase email addresses.
  • Mode Effects Pilot Studies – Where possible, conduct a pilot study using a randomized design in which two or more samples are drawn from the same population and each sample is given the survey in a different mode. This is a best practice for understanding the potential limitations of an online survey specific to the population under study.
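The random assignment at the heart of such a pilot is straightforward to implement. The sketch below assumes a simple two-mode design; the participant IDs, mode labels, and rating data are hypothetical.

```python
import random
import statistics


def assign_modes(population_ids, modes=("online", "telephone"), seed=42):
    """Randomly assign each sampled respondent to one survey mode.
    Shuffling and then assigning round-robin yields near-equal groups."""
    rng = random.Random(seed)
    ids = list(population_ids)
    rng.shuffle(ids)
    return {pid: modes[i % len(modes)] for i, pid in enumerate(ids)}


def compare_modes(responses):
    """Summarize (mode, rating) pairs as n, mean, and stdev per mode,
    a first descriptive look at whether modes produce different answers."""
    by_mode = {}
    for mode, rating in responses:
        by_mode.setdefault(mode, []).append(rating)
    return {
        mode: {
            "n": len(vals),
            "mean": statistics.mean(vals),
            "stdev": statistics.stdev(vals) if len(vals) > 1 else 0.0,
        }
        for mode, vals in by_mode.items()
    }
```

A formal mode-effects analysis would follow with a significance test on the group means, but even these descriptive summaries can flag large divergences between modes early.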


