AEA365 | A Tip-a-Day by and for Evaluators


I’m Neha Sharma from the CLEAR Global Hub at the World Bank’s Independent Evaluation Group. A key Hub role involves facilitating learning and sharing knowledge about evaluation capacity development. So I often think about how people learn. In this context, I’ve been reading a lot of behavioral science literature, and reflecting on what makes people learn to change behaviors.

Richard Thaler, a University of Chicago economist and behavioral science professor, recently wrote about how he changed his class’s grading scheme to minimize student complaints about “low” grades when he administered difficult tests (tests designed to produce a wider dispersion of grades and so identify “star” students). His trick was to change the denominator of the grading scheme from 100 to 137, meaning that the average student now scored in the 90s rather than the 70s. He achieved his desired results: high dispersion of grades and no student complaints about “low” grades!
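The arithmetic behind the trick is worth making explicit: a student’s fraction of points earned never changes; only the number printed on the exam does. A minimal sketch (the 70% score is a hypothetical example):

```python
# Thaler's grading trick: same relative performance, a bigger-looking number.
# A hypothetical student answers 70% of the questions correctly.
fraction = 0.70

for total in (100, 137):
    points = round(fraction * total)  # the score printed on the exam
    print(f"{points} out of {total} (still {fraction:.0%})")
# → 70 out of 100 (still 70%)
# → 96 out of 137 (still 70%)
```

Out of 137, the same 70% performance shows up as a 96, a number that reads like an A to anyone anchored on a 100-point scale.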

Thaler’s post made me wonder what effect this change in grading scheme had on student learning, and what lessons it carries for communicating tough evaluation results. The relationship between performance and learning holds critical lessons for evaluators – does a 70 disguised as a 90 have an effect on learning?

Like classroom tests, evaluations that are seen as overly harsh or critical are often questioned, and their lessons are underused by the evaluated agency. This doesn’t mean that poor results should not be communicated – they absolutely should – but evaluators need to keep in mind that receiving and then learning from bad performance is not easy when there is a lot at stake – future funding, jobs, professional growth, and political stability. On the other hand, evaluations that reaffirm stakeholder biases are futile too.

This balance between communicating actual performance and encouraging learning may be key to determining evaluation use. If evaluations are to fulfill their learning mission, the “how” of learning is just as relevant as, if not more relevant than, the evaluation itself. Cognitive science research on behavior change could teach us a lot about how to encourage learning through evaluations. For instance, when trying to change behaviors, easy works better than complicated, attractive works better than dull, and social learning works better than learning in isolation. Behavioral science is an interesting field of study for evaluators: it can help us demystify the relationship between evaluation performance and learning.

Rad Resources:

Thaler is one of many behavioral scientists, psychologists, and economists writing about what influences our behavior. Here are more.

The American Evaluation Association is celebrating Centers for Learning on Evaluation and Results (CLEAR) week. The contributions all this week to aea365 come from members of CLEAR. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello! I’m Michelle Baron, an Independent Evaluation Strategist. In my work in higher education, I’ve encountered a mixture of evaluation champions and critics. Today I’d like to address the importance of strategic planning in building a culture of evaluation.

Strategic planning is considered by many to be an organizational road map: it outlines the organizational vision and mission, establishes clear and attainable goals and objectives, and then develops processes for achieving them. Strategic planning and evaluation go hand in hand in moving the organization and its programs forward to benefit its stakeholders. Strategic planning is crucial to the evaluation process: without a road map of criteria, standards, and goals, it is almost impossible to achieve desired success.

Evaluators have a unique role in helping organizations with both ends of the spectrum: creating a foundation through strategic planning, and then conducting evaluations to examine and monitor progress.

Hot Tip #1: Start at the top. Buy-in from top management for strategic planning is of the utmost importance for its success.

Hot Tip #2: Conduct a SWOT analysis (Strengths, Weaknesses, Opportunities, and Threats) of the entity or its programs/services. Doing so not only enlightens people to a variety of ideas and questions to consider, but can also indicate the level of support for those topics.

Cool Trick: Brainstorming sessions are often an excellent starting point for the organization itself or a smaller group within it. The evaluator or a designated member of the organization can facilitate the discussion by developing questions beforehand that serve as prompts, such as those dealing with objectives, goals, and resources.

Rad Resource #1: Strategic Planning for Public & Nonprofit Organizations by John Bryson, and related books by the same author, provide the groundwork and tools necessary for organizations to develop and sustain their strategic planning process.

Rad Resource #2: The Fifth Discipline: The Art and Practice of the Learning Organization by Peter Senge helps leaders establish the foundation and philosophy behind strategic planning, and helps them develop their long-term thinking for organizational growth.

With these tools and resources, evaluators may be better prepared to assist organizations in strategic planning, and may find greater support for, and effectiveness in, the evaluations they conduct for those organizations.

The American Evaluation Association is celebrating Business, Leadership and Performance (BLP) TIG Week with our colleagues in the BLP AEA Topical Interest Group. The contributions all this week to aea365 come from our BLP TIG members.



Hello! Sheila B. Robinson here, guest posting for Susan Kistler, our regular Saturday contributor. I work in PK12 education at Greece Central School District, and in higher education at the University of Rochester’s Warner School of Education. As aea365’s current Lead Volunteer Curator, I’ve had the pleasure of working with a number of groups – American Evaluation Association Topical Interest Groups (AEA TIGs), AEA Affiliates, and other groups that are united by evaluation practice in various contexts.

Hot Tip: Leave no stone unturned! In other words, don’t skip entire weeks. You can learn a lot even when a sponsored week’s group name doesn’t resonate with you. During sponsored weeks, you can read about how evaluators in different contexts from your own have grappled with evaluation challenges, learned something from working in diverse communities, or tried new technologies to enhance their evaluation practice and are now willing to share their experiences with all of us.

Hot Tip: Dig for enticing artifacts! Look for posts with content that transcends the focus of the sponsored week. For example, while I am not an environmental program evaluator, nor do I evaluate extension education programs, I found these two gems during sponsored weeks:

  • In this post, Sara El Choufi shared resources for learning Excel during the Environmental Program Evaluation (EPE TIG) sponsored week.
  • In this post, Melissa Cater shared information on creating a Community of Practice during Extension Education Evaluation (EEE TIG) week.

Lesson Learned: While our sponsored week authors may share evaluation foci with each other, they offer Hot Tips, Cool Tricks, Lessons Learned, and Rad Resources that appeal to, and can be educative for, a broad range of evaluators.


Cool Trick: Get your hands dirty! Sift through the archive and unearth your own gems in sponsored (and non-sponsored!) weeks.

Lesson Learned: Many sponsored weeks have themes that cut across evaluation contexts. In addition to TIG-sponsored weeks, we’ve hosted Cultural Competence Week, Innovative #Eval Week, Video in #Eval Week, AEA affiliate weeks, Bloggers Series Weeks, and Local Area Working Group Weeks, among others.

Rad Resource: History in the making: Check out aea365 and our archive for a list of over 1000 nuggets of evaluation wisdom from hundreds of authors. With about 70 sponsored weeks on aea365, there’s a lot to learn! So, get into comfortable clothes, get your virtual trowel, sieve, and brush and get your read on!






I’m Cheryl Poth, and I am an assistant professor at the Centre for Applied Studies in Measurement and Evaluation in the Department of Educational Psychology, Faculty of Education, at the University of Alberta in Edmonton, Canada. My research focuses on how developmental evaluators build evaluation capacity within their organizations. My use of mixed methods is pragmatically driven: I use it when the research or evaluation question(s) require the integration of both qualitative themes and quantitative measures to generate a more comprehensive understanding. Most recently, my work within research teams has provided the impetus for research and writing about the role of a mixed methods practitioner within such teams.

Lessons Learned:

  • Develop and teach courses. In 2010, I developed (and offered) a doctoral mixed methods research (MMR) course in response to demand from graduate students for opportunities to gain MMR skills. The course was oversubscribed, and at the end of the term we formed a mixed methods reading group, which continues to support students as they work through the research process. I am fortunate to be able to offer this course again this winter, and already it is full!
  • Offer workshops. To help build MMR capacity, I have offered workshops in a variety of locations, most recently at the 9th Annual Qualitative Research Summer Intensive held in Chapel Hill, NC, in late summer, and at the 13th Thinking Qualitatively Workshop Series offered by the International Institute for Qualitative Methodology in Edmonton, AB, in early summer. These workshops remind me that many researchers’ graduate programs required completion of an advanced research methods course that was either qualitatively or quantitatively focused, and of the need to build a community of MM researchers – a community that can exist locally or, using technology, globally! It has been a pleasure to watch new and experienced researchers begin to learn about MMR designs and integration procedures.
  • Join a community. I have begun to find my community of MM researchers through a group currently working on forming the International Association of Mixed Methods, at the International Mixed Methods Conference, and among the mixed methods researchers on Methodspace.




I am Priya Small, an independent public health evaluation consultant, and today I will share lessons I have learned while riding the steep learning curve. We may tend to devalue our early learning experiences as evaluators, but because this is a time of rapid growth, sharing these lessons is worthwhile.

Lesson Learned: Listen more, speak less. Observe more, take fewer notes. Pay especially close attention to how veteran evaluators manage meetings with stakeholders, build relationships, and resolve potential conflicts. Such learning opportunities may be scarce later.

Lesson Learned: If you are waiting for that job opportunity to come through, consider working on a pro bono project. This can offer a valuable opportunity to craft a useful evaluation in your interest area. Pro bono work can provide immediate motivation to sharpen and fine-tune your evaluation skills and network. The best way to perfect your skills is “on-the-job.” Look for committed managers who are open to seeing the value of evaluation. Negotiate a written agreement that clarifies project responsibilities.

Lesson Learned: Compete less and cooperate more. Teamwork has great potential to produce optimal outcomes. Don’t hesitate to support other evaluators by commenting on their blogs, joining discussion groups, and so on. In a very organic manner, this has provided me with opportunities to collaborate.

Hot Tip: Participate in the AEA discussion group and other evaluation discussion groups on LinkedIn. The atmosphere is supportive and collaborative, and the format is efficient.

Join AEA on LinkedIn

Rad Resource: The aea365 blog! It is easy to get “tunnel vision” as we get caught up in our own projects. This can stifle creativity and innovation. Read and reflect on aea365 blog posts regularly to see a panoramic view of the vastness of the evaluation landscape. You might just pick up other “Rad Resources” along the way. Endeavor along with me to never stop learning from others. And I encourage you to reach out by posting your own lessons learned on the aea365 blog!



Hi, my name is Michelle Baron. I am the Associate Director of The Evaluators’ Institute, an evaluation training organization, and the chair of the curating team for aea365.

As a retired Army veteran, I have conducted many evaluations with a wide range of stakeholder support. I have found three techniques to facilitate a well-received evaluation:

Cool Trick #1: Cultivating an environment for teaching and learning helps to put organizations at ease during the evaluation process. When you take away the “I gotcha!” and replace it with valuable instruction that organizations can use for future improvement, you help to build a bridge of trust between you, the evaluator, and the organization. When organizations contact YOU with evaluation ideas for their workplace, you know a good working relationship is blossoming.

Cool Trick #2: Referring organizations to helpful resources (both online and offline) helps to increase their self-sufficiency and foster productive conversations before, during, and after the evaluation. Military websites often have links to regulations and manuals that foster development of criteria and standards for a given topic.

Cool Trick #3: Increasing evaluation capacity by offering evaluation training in a given area (e.g., physical fitness, vehicle licensing) helps the organization to become not only familiar with policies and procedures of a particular content area, but helps them to be proactive and to think evaluatively regardless of whether they’re being formally evaluated.

I hope this Veterans Day brings you more in tune with the needs of your military stakeholders, and that you can approach evaluation with a caring and helpful attitude so stakeholders will see the value in the work and reciprocate accordingly.

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest.


Hello. We are Bill Bickel, Jennifer Iriti, and Julie Meredith. We make up the optimistically named Evaluation for Learning Project (EFL) at the Learning Research and Development Center, University of Pittsburgh. We are writing to share a recent experience we had working with a small, regional foundation interested in learning more from its grant making activities beyond its modest grant reports. Foundation leadership sensed that there was more to be gleaned from grantee experiences, especially about long-term effects of grants, which were not being captured. We were asked to devise a low cost way to help. Our learning protocol tip is what we came up with.

Tip: The following protocol was used with a test set of grantees.

  1. Identify a set of grantees on which to focus the learning exercise. Some common characteristics enriched the learning opportunity (e.g., grants closed for at least three years; some common goals and/or change methods [e.g., reliance on professional development in grantees’ work]; grantee leadership interested in learning).
  2. Develop an informal, retrospective theory of change (ToC) based upon existing grantee documents (e.g., applications and reports, write-ups, organizational descriptions).
  3. “Test” the ToC with grantee leadership and refine it. Debrief grantee leadership on what has happened since the grant closed to document the enacted ToC.
  4. Poll grantee leadership on what the foundation could do to enhance their own and the foundation’s capacities to support future learning.
  5. Analyze individual and cross-grantee data for implications for foundation processes, and write up the results in an accessible brief for all concerned.

EFL’s level of effort was modest. By early accounts, the insights gained about long-term outcomes, capacity-building needs, and recommended changes to the foundation’s application and reporting processes are potentially useful in both the short and longer term. They are being vetted against the working knowledge of foundation leadership as we write. One can imagine many variations on the protocol; our point here is that past organizational experience has much to offer a learning agenda if tapped in even an informal way.

Recommended resources that provide additional insights to support learning in small foundations with big learning agendas follow.

Resource: Marli Melton, Jana Kay Slater, and Wendy Constantine have a useful chapter on ways evaluation can support learning, “Strategies for Smaller Foundations.”

Resource: Though not specifically targeted to small foundations, Michael Patton, John Bare, and Deborah Bonnet offer insights in “Building Strong Foundation-Grantee Relationships,” quite relevant to building learning in such contexts.

Both can be found in: Braverman, M., Slater, J. K., & Constantine, N. (Eds.). (2004). Foundations & Evaluation: Contexts & Practices for Effective Philanthropy. San Francisco: Jossey-Bass.*

*American Evaluation Association members receive 20% off all Jossey-Bass titles when ordered directly from the publisher. Just sign in to the AEA website and select “Publications Discount Codes” from the members-only area for details.

The American Evaluation Association is celebrating evaluation in Not For Profits & Foundations (NPF) week with our colleagues in the NPF Topical Interest Group.  The contributions all this week to AEA365 will come from our NPF members and you may wish to consider subscribing to our weekly headlines and resources list where we’ll be highlighting NPF resources.


Hi!  We are Carrie M. Brown (student) & Angela Walmsley (professor) from Saint Louis University.  Today we will be sharing some tips on having a successful semester-long evaluation experience.

Graduate students can be discouraged from conducting a program evaluation by the belief that it will take too long to complete and by the lack of direct assistance from a supervisor. We decided to tackle this obstacle by creating a one-on-one program evaluation course. Together, we completed a program evaluation within a semester time-frame that investigated an economics education program for 3rd and 4th graders at a children’s museum. The evaluation implemented qualitative and quantitative methods, participant triangulation, and methods triangulation. At the conclusion of the semester, the museum received a final written report. Here, we offer tips on how to create a successful semester-long one-on-one program evaluation course.

Hot Tips: Here is how we structured our course:

  • Weekly meetings: One to two hours each, between professor and student.
  • Evaluation journal articles: Student located one article per week and wrote a one-page summary/critique.
  • Outlines of textbook chapters: Student created an outline, one chapter per week, from Program Evaluation: Methods and Case Studies (Posavac & Carey, 2002).
  • Program evaluation: Weekly progress on the evaluation.
  • E-mail and phone contact as needed.

Hot Tips: Here are a few bits of advice on actually conducting the evaluation itself.

  • Consider programs already in your institution or programs in the community that do not receive or cannot afford evaluations.
  • Be prepared to communicate what you can provide and your time frame to the stakeholders.
  • Contact and meet stakeholders ASAP, and set up a time to visit the program.
  • Make a site visit and learn the program ASAP.
  • Avoid being too ambitious. Choose only one type of evaluation (needs, process, outcome, etc.).
  • Determine a small set of goals for the evaluation.
  • Create a timeline of weekly goals, keeping the end of the semester in mind.
  • Choose a study design that fits the goals and timeline.
  • Work on writing the report throughout the semester and work together to edit in sections.
  • Review several reports to learn how to best format yours.

We found our experience to be both complete and satisfying. Perhaps you will consider a semester-long one-on-one program evaluation course!



My name is Susan Kistler,  and I am AEA’s Executive Director. I contribute each Saturday’s post to aea365.

I have been reading and thinking about learning from failure. For those of us who are data-lovers, who find security in information, it can be a challenge to overcome the tendency to want to collect all possible information, explore all feasible outcomes, before moving in a new direction. While I’ll never be one to dive in without testing the waters, I do want to go swimming a bit more often.

Rad Resource: Cannon and Edmondson’s 2004 paper Failing to Learn and Learning to Fail (Intelligently): How great organizations put failure to work to improve and innovate outlines three processes needed to fail intelligently – and positively:

  1. Identifying Failure: “The key organizational barrier to identifying failure has mostly to do with overcoming the inaccessibility of data that would be necessary to identify failures.” (p. 10)
  2. Analyzing and Discussing Failure: Create an open environment for discussing failures and overcome negative emotions associated with examining one’s own failures.
  3. Experimentation: Put new ideas to the test, gather comparative data, and embrace both those that succeed and those that fail as contributing to the ultimate success of an endeavor.

Key takeaways include:

  • “Most managers underestimate the power of both technical and social barriers to organizational failure.” (p. 3) Technical barriers include stakeholders lacking the know-how to use and analyze data in order to learn; social barriers include rewarding only success.
  • We must pay attention to and learn from small failures to prevent larger failures.
  • “Creating an environment in which people have an incentive, or at least do not have a disincentive, to identify and reveal failures is the job of leadership.” (p. 13)
  • “Conducting an analysis of failure requires a spirit of inquiry and openness, patience, and a tolerance of ambiguity. However, most people admire and are rewarded for decisiveness, efficiency and action rather than for deep reflection and painstaking analysis.” (p. 14)
  • “It is not necessary to make all [stakeholders] experts in experimental methodology, it is more important to know when help is needed from experts with sophisticated skills.” (p. 26)

For me, Cannon and Edmondson reaffirmed the value of formal and informal evaluation and its role in innovation. They made it clear that data-lovers are uniquely positioned to fail intelligently.

Rad Resource: Want to think further about learning from failure? Try Giloth and Gewirtz’s (from the Annie E. Casey Foundation) Philanthropy and Mistakes: An Untapped Resource in The Foundation Review (free online), or Blumenthal’s blog post on four steps to failing well (Fail Small, Fail Publicly, Fail to Win, Fail Proudly), or Pond’s Embracing Micro-failure (Sarajoy is an aea365 contributor, AEA member, and social entrepreneur!).

Note that the above reflects my own opinions and not necessarily that of my employer, AEA.



I am Katye Perry, an Associate Professor at Oklahoma State University in Research, Evaluation, Measurement and Statistics (REMS). I have taught a graduate-level evaluation class for one year shy of twenty years. My students come from multiple disciplines within the College of Education as well as from disciplines across the university. Like most instructors of a sole or introductory evaluation class, and with this mix of students, I have sought to find the right balance between theory and practice while trying not to oversimplify the reality of practice.

Hot Tip: In one part of my lessons, I merge ethics, the Joint Committee’s Program Evaluation Standards, and AEA’s Guiding Principles through ethical dilemmas. Specifically, for my class, these dilemmas are drawn from Newman, D. L., & Brown, R. D. (1996). Applied Ethics in Program Evaluation. Thousand Oaks, CA: Sage (this text is required of my students). However, this is by no means the only source of examples of ethical dilemmas encountered by practicing evaluators; see also the Ethical Challenges section of the American Journal of Evaluation and Morris, M. (2008). Evaluation Ethics for Best Practice: Cases and Commentaries. New York: Guilford Press, to name a few resources. Now, how do I guide my students through this experience?

  1. I make sure my students have already reviewed the Joint Standards for Program Evaluation and AEA’s Guiding Principles; then they are placed in groups of 3-4 students;
  2. They are introduced to Newman and Brown’s text, which presents for some, and reviews for others, the definitions, theories, principles, etc., that can be used to inform decisions when confronted with an ethical dilemma. A unique feature of this text is its vignettes, framework, and flowchart, developed by the authors to guide decision-making. Now, the fun part:
  3. Each group is assigned a vignette and asked how they would resolve the problem.

Almost without fail, the students disregard the standards, principles, theories, framework, etc., and solve the problem based on their own unique experiences. This in turn provides an opportunity for me to use the standards to conduct, where possible, a metaevaluation of the scenarios in the vignettes, and then look to the framework for possible solutions to the dilemma. We really get into some great discussions regarding the best solutions. Next time, I will start with step 3, discuss, and then move to steps 1 and 2 to see how the solutions change.


