AEA365 | A Tip-a-Day by and for Evaluators

Category: Evaluation Managers and Supervisors

Hi everyone! I’m Yvonne M. Watson, a Program Analyst in the United States Environmental Protection Agency’s (EPA’s) Evaluation Support Division (ESD). As chair of the American Evaluation Association’s Environmental Program Evaluation Topical Interest Group, I invite you to learn more about evaluation and environmental issues this week.

Evaluation units across the federal government vary in size, budget, expertise, and mission/mandate. ESD manages evaluation studies, but a significant part of our mission is to build evaluation capacity and establish a culture of program improvement and continuous learning. To build evaluation capacity and a performance management culture, we have experimented with the following activities: 1) an internal competitive proposal solicitation process that provides funding and technical assistance for evaluation studies; 2) the delivery of performance management training; 3) the design of tools and products (e.g., program evaluation guidelines); and 4) supporting internal and external evaluation networking forums. The tips and lessons learned below reflect some of the insights gained over a 12-year journey to build evaluation capacity and a performance management culture at EPA.

Lessons Learned:

  • Champions. It’s challenging to build a “program evaluation” culture without the ongoing support of career executives and the strategic support of political appointees. Not all managers believe “program evaluation is good for you.” However, when you find one who does, don’t let go! Sustain their interest with information about evaluation basics and “nuggets” about evaluation results. (A nugget might be a paragraph describing key findings, outcomes, and how the evaluation is being used.) The best evaluation champion is an educated and equipped champion. Developing these internal champions often takes time, thought, and intentionality, but you’ll find that in the long run, the juice is worth the squeeze!
  • Communication. Crafting the right “message” to communicate the value of program evaluation to senior managers, mid-level managers, and program staff is an art. The right (or wrong) message can turn the tide from “evaluation apprehension” to “evaluation appreciation.” What is your program evaluation message? “Try it, you’ll like it”? If you have one, re-examine it. Is it easily understood? You’ll also need to face the sobering reality that even the best message will sometimes fall on deaf ears.
  • Credibility. Building credibility within your organization occurs one evaluation at a time. It’s important to build a cadre of internal evaluation staff and external evaluators (contractors and consultants) who can demonstrate technical competence and knowledge of the state-of-the-art tools and techniques available to conduct high-quality, rigorous program evaluations. In addition, invest the time and resources needed to develop a portfolio and body of work (evaluation studies) that increases your internal clients’ confidence and demonstrates your organization’s evaluation competence and capacity.

The American Evaluation Association is celebrating Environmental Program Evaluation Week with our colleagues in AEA’s Environmental Program Evaluation Topical Interest Group. All of this week’s aea365 contributions come from our EPE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Frank Meintjies and I work as a consultant in South Africa. While most of my evaluation work has related to poverty reduction initiatives, I have also undertaken evaluation work on HIV and AIDS programmes.

Evaluation and strategic planning are closely linked. Some organisations want to dive into strategic planning without doing a good evaluation first. Doing so would be a mistake; it would be a classic case of ‘more haste, less speed’.

If you evaluate first, you lay a solid foundation on many levels. The review will build a common base of understanding among those involved in crafting the new strategy. Participants will develop (or renew) a shared understanding of what they do; how and why they do it; what progress is being made; and whether the work and objectives are still relevant to a fast-changing context.

Hot Tip: Make time to take a cool look at your organisation, its achievements, capabilities and impact, before plunging into strategic planning. Get information from beneficiaries and data from the field; this will serve as fuel for the creative thinking processes.

Hot Tip: If you are under time pressure to hold a strategic planning session, you may have to opt for a rapid evaluation. Such an evaluation is undertaken within a short time frame, but levels of validity and reliability are high enough to enable program staff to make confident, informed decisions. To learn more about rapid evaluation, see the International Training & Education Center for Health’s resource at http://tinyurl.com/7huxffe.

Lesson Learned: If you aren’t clear what the current state of play is, then you are trying to look into the future from a muddy and clouded vantage point. Having a clear perspective of who and what your organisation is will sharpen your gaze as you examine the future, with all its constraints and possibilities.

Lesson Learned: If you have undertaken a recent evaluation, or if you conduct ongoing monitoring and evaluation, a major evaluation exercise before strategic planning will most likely be unnecessary. However, it is worth revisiting the key points that emerged from such evaluations as a precursor to, or during, the strategic planning activity.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

I am Corinna Souza, an evaluator at evalutechnica. Our small firm works primarily with regional school districts. Lately, I’ve been on the lookout for tools to help my evaluation team work collaboratively. There are four of us working from two different offices, plus our key liaison to the schools with which we are working.

Lesson Learned – What is a wiki: A wiki is a website that allows visitors to collaboratively add to and edit the content of its pages. You have probably heard of Wikipedia, the best-known wiki out there, but there are lots of other wikis as well.

Rad Resource – Packard Foundation’s Goldmine Research Project Wiki: One wiki example is from the Packard Foundation and TCC Group’s Goldmine project. The project’s “objective is to analyze and make meaningful the ‘goldmine’ of organizational effectiveness data from 1300 capacity building projects in order to 1) enable the Foundation and the field to better evaluate OE grants and 2) disseminate lessons learned from the field—all in an engaging and transparent way.” Their wiki includes discussions, what they’ve learned, and lots of tools and resources.

Hot Tip – WikiMatrix: WikiMatrix is a huge database that allows you to compare 134 wiki platforms. There are a lot of options, aimed at different audiences and with different capabilities. We did quite a bit of research there when trying to choose the best platform.

Lesson Learned – Go With Supported Platforms: Ultimately, we ended up on the same platform as the Packard Foundation’s wiki – Wikispaces. We were drawn in by the range of tools, bells, and whistles available elsewhere, but in the end went with Wikispaces because it had the core functionality we needed and, most importantly, two members of our team were already comfortable using it and could help the rest of us along the way.

Rad Resource – Wikispaces: Using Wikispaces, we have, for free, gotten a basic multi-page website, a discussion platform, and a space for collaborative document development. It has good administration tools, so different people can have different permission levels (some can view, some can edit, etc.).

Lesson Learned – History: Our favorite tool on Wikispaces, which seems to be common across wikis, is the history function. We can roll a page (a document that we are co-editing) back to any previous version, and we can see who made changes along the way.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Lisa Dillman and I am a graduate student at UCLA. I served as a session scribe at Evaluation 2010, and attended session number 529: Taking Control of Your Evaluation Career. I chose to attend this skill building workshop because I have both a research and personal interest in how people come to acquire the skill set necessary to conduct evaluations.

Lessons Learned: Ann Maxwell of the United States Department of Health and Human Services and George Grob of the Center for Public Program Evaluation offered great advice, tips, and resources for evaluators at all stages in their careers. Some of their specific recommendations include:

  • Each year, set a goal to learn three things very well (particularly if you aren’t receiving professional development support in your current position). For example, you could choose sampling techniques, meta-evaluation, and a content area relevant to your work as areas of focus. Investigate not only what you need to improve, but how you are going to accomplish it, and formulate a plan. Imagine what your skill set will be after 5, 10, 15 years or more!
  • Don’t overwork! Remember that you were hired for your creativity and problem-solving skills. Spending too much time mentally laboring over solutions can be detrimental—answers will come to you when you least expect them.
  • It is important for those who supervise evaluators to realize that ongoing training of evaluators is critical—investing in people is a way to enhance evaluation impact.
  • Keep interested in your career—it is of great value to you and others!

Great Resources: The presenters have shared several tools from the workshop in the AEA eLibrary. Among the downloads are an Individual Development Plan, a Self Appraisal for Program Evaluation Staff, and several samples of charts detailing core evaluator competencies.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Hello, my name is Anthony Kim, and I am a doctoral candidate in Policy, Organization, Measurement, and Evaluation at U.C. Berkeley’s Graduate School of Education. Prior to Berkeley, I worked at an education nonprofit as a program manager who had to fill a dual role as internal evaluator of my program. As might be expected, my objectives as a program manager did not coincide with my objectives as an internal evaluator.

Below are some tips on how to manage this type of situation:

Hot Tip #1: Don’t let stakeholders manage the process: As a program manager, it is essential to maintain a good relationship with key stakeholders to ensure the viability and success of a program. However, the presence of multiple stakeholders with competing interests can paralyze the evaluation process for program managers. While stakeholders should feel a sense of ownership in the evaluation process, the program manager/internal evaluator must not allow this sense of ownership to turn into a sense of entitlement in setting the future direction of the program.

Hot Tip #2: Assign managers to evaluate programs they are not directly affiliated with: Oftentimes, a tight budget will force organizations to forgo hiring independent evaluators and to rely instead on program managers in a dual role. Organizations can mitigate the resulting conflicts of interest by assigning managers to evaluate programs they are not directly affiliated with. As a side benefit, this type of cross-evaluation allows managers to learn about programs they are less familiar with.

Hot Tip #3: Don’t shortchange the evaluation process: Wholesale program change is disruptive, involves the retraining and hiring/firing of key program staff members, and is in general highly work-intensive for the program manager. As a dual-role program manager/internal evaluator, it may be tempting to conduct a cursory evaluation, and as a result leave your program largely unchanged.

However, shortchanging the evaluation process is a myopic approach. An honest, comprehensive evaluation offers an opportunity to leverage a program for maximum impact. In my case, I managed a program that served at-risk school children, and there was a very real cost to any program shortcomings. Ultimately, it is important to remember that your program serves a certain constituency, and that settling into a “comfort zone” may be detrimental to your program goals.

This contribution is from the aea365 Tip-a-Day Alerts, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org. Want to learn more from Anthony? He’ll be presenting as part of the Evaluation 2010 Conference Program, November 10-13 in San Antonio, Texas.


My name is Michelle Jay and I am an Assistant Professor at the University of South Carolina. I am an independent evaluator and also an evaluation consultant with Evaluation, Assessment and Policy Connections (EvAP) in the School of Education at UNC-Chapel Hill. Currently, Rita O’Sullivan and I serve as Directors of AEA’s Graduate Education Diversity Internship (GEDI) program.

Lessons Learned: A few years ago, EvAP served as the external evaluator for a federally funded Gaining Early Awareness and Readiness for Undergraduate Programs (GEAR UP) statewide grant housed at the University of North Carolina (UNC) General Administration. Part of our work involved assisting project coordinators in 20 North Carolina counties to collect the student-level data required for their Annual Performance Review reports, as well as for program monitoring, assessment, and improvement. For various reasons, project coordinators experienced numerous difficulties in obtaining the necessary data from their Student Information Management Systems (SIMS) administrators at both the school and district levels. As collaborative evaluators, we viewed the SIMS administrators not only as “keepers of the keys” to the “data kingdom,” but also as potentially vested program stakeholders whose input and “buy-in” had not yet been sought.

Consequently, in an effort to “think outside the box,” the EvAP team seized an opportunity to foster better relationships between our program coordinators and their SIMS administrators. We discovered that the administrators often attended an annual conference for school personnel. The EvAP team sought permission to attend the conference, where we sponsored a boxed luncheon for the SIMS administrators. During the lunch, we provided them with an overview of the GEAR UP program and its goals, described our role as the evaluators, and explained in detail how they could contribute to the success of their districts’ programs by providing the important data needed by their district’s program coordinator.

The effects of the luncheon were immediate. Program coordinators who had previously experienced difficulty getting data had it on their desks later that week. Over the course of the year, the quality and quantity of the data the EvAP team obtained from the coordinators increased dramatically. We were extremely pleased that the collaborative evaluation strategies that guided our work had served us well in an unanticipated fashion.

Hot Tip: The data needs of the programs we serve as evaluators can sometimes seem daunting. In this case, we learned that fixing “the problem” was less a data-related matter than it was a “marketing” issue. SIMS administrators, and other keepers-of-the-data, have multiple responsibilities and are under tremendous pressure to serve multiple constituencies. Sometimes, getting their support and cooperation is merely a matter of making sure they are aware of your particular program, the kinds of data you require, and the frequency of your needs. Oh, and letting them know that they are appreciated doesn’t hurt either.

This contribution is from the aea365 Daily Tips blog, by and for evaluators, from the American Evaluation Association. Please consider contributing – send a note of interest to aea365@eval.org.

