AEA365 | A Tip-a-Day by and for Evaluators


Hello from Debi Lang with the Massachusetts Area Health Education Center Network (MassAHEC) at the University of Massachusetts Medical School’s Center for Health Policy and Research. I last published an aea365 post on how evaluation and program staff collaborated to establish a competency-based model for a range of MassAHEC Health Careers Promotion and Preparation (HCPP) programs. The current post focuses on the importance of learning objectives as part of program design and evaluation, with some tips and resources on how to write clear objectives.

The AHEC HCPP model consists of 5 core competencies with learning goals that apply across a range of HCPP programs (see the chart below).

[Chart: Core Competencies and their Learning Goals]

Each of the programs has written learning objectives that define specific knowledge, skills, and attitudes students will learn by participating in these programs. Learning objectives are important because they:

  • document the knowledge, skills, attitudes/behaviors students should be able to demonstrate after completing the program;
  • encourage good program design by guiding the use of appropriate class activities, materials, and assessments;
  • tell students what they can expect to learn/become competent in by participating in the program; and
  • help measure students’ learning.

Below are some of the learning objectives from one HCPP program and their connection to the competencies listed above:

[Table: sample learning objectives from one HCPP program, mapped to the core competencies above]

Hot Tips: Here are some recommendations for writing learning objectives.

  • Think of learning objectives as outcomes. What will students know/be able to do once they complete the program? Start with the phrase: “At the end of this program, students will…”
  • Be careful not to write learning objectives as a description of the activities or tasks students will experience during the program.
  • Make sure student learning assessments are based on the learning objectives.

Rad Resource: “Bloom’s Taxonomy” is a framework based on 6 levels of knowledge (cognition) that progress from simple to more complex. When writing learning objectives, use the keywords associated with the knowledge level you expect students to achieve.
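To make this concrete, here is a minimal sketch in Python (the verb lists and the helper function are illustrative choices of my own, not an official part of the taxonomy) that pairs each of the six levels of the revised taxonomy with sample action verbs and flags objectives that open with hard-to-observe verbs such as "understand" or "know":

```python
# A minimal sketch: the six levels of the revised Bloom's taxonomy,
# each paired with a few illustrative action verbs. The verb lists
# here are examples, not an official or exhaustive set.
BLOOM_VERBS = {
    "Remember":   ["define", "list", "recall", "identify"],
    "Understand": ["describe", "explain", "summarize", "classify"],
    "Apply":      ["demonstrate", "use", "implement", "solve"],
    "Analyze":    ["compare", "differentiate", "organize", "examine"],
    "Evaluate":   ["justify", "critique", "appraise", "defend"],
    "Create":     ["design", "construct", "develop", "formulate"],
}

# Verbs that are hard to observe or measure directly.
VAGUE_VERBS = {"understand", "know", "learn", "appreciate"}

def check_objective(objective: str) -> str:
    """Classify the verb that opens a learning objective, or warn about it."""
    first_word = objective.strip().lower().split()[0]
    if first_word in VAGUE_VERBS:
        return f"'{first_word}' is hard to measure; pick an observable verb."
    for level, verbs in BLOOM_VERBS.items():
        if first_word in verbs:
            return f"Bloom level: {level}"
    return "Verb not in these sample lists; check it against the taxonomy."

# "At the end of this program, students will..."
print(check_objective("describe three health careers"))  # Bloom level: Understand
print(check_objective("understand health careers"))      # warning
```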

To be continued…

Program-specific learning objectives that connect to one or more core competencies can help measure student learning in order to report program outcomes from a competency perspective on a local and state level. In a future post, I’ll discuss how learning objectives are used in an evaluation method called the retrospective pre-post, along with ways to analyze data collected using this design feature.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


I’m Chithra Adams, a program evaluator at the Human Development Institute, University of Kentucky. I explore how design principles and research can be applied to program evaluation.

Rad Idea – prototyping: A key part of the design process is prototyping, a process by which ideas and concepts are tested for technical and social feasibility. Prototypes are physical manifestations of an idea, ranging from the expression of an early concept to an almost complete product. Depending on the context, prototyping can be used in several ways (Stappers, 2010):

  • Prototypes evoke a focused discussion in a team because the phenomenon is ‘on the table’.
  • Prototypes allow testing of a hypothesis.
  • Prototypes confront theories, because instantiating one typically forces those involved to consider several overlapping perspectives/theories/frames.
  • Prototypes confront the world, because the theory is not hidden in abstraction.
  • A prototype can change the world because, in interventions, it allows people to experience a situation that did not exist before.

Lessons Learned in prototyping evaluation products and processes: I use prototyping to test new ideas and to involve the client in the evaluation process. Prototyping can be used to test ideas for both products (reports, briefs) and processes (client interaction, stakeholder involvement). Either way, the client/stakeholder should understand that the prototype is only a draft and that further refinements will be made. In some cases, prototyping with a client/stakeholder is simply not feasible; in those cases, the idea or product can be tested with someone outside the project. Sometimes I try out the first few iterations with other evaluators and test the final versions with clients.

The timing of the prototyping process is critical: it should happen when the client has time to provide feedback for further refinement. You cannot prototype an evaluation report a week before it is due! The key to prototyping is rapidly testing ideas and getting feedback. Be receptive to that feedback, though receptivity does not mean every piece of feedback will be incorporated into the next iteration; use expert judgment to decide which feedback to adopt.

Rad Resources:

Sanders and Stappers (2014) describe various processes designers use to gain insights and test ideas with consumers.

While this blog post describes prototyping in the context of web development, it offers a broad overview of the nuts and bolts of the practice.

The American Evaluation Association is celebrating Program Design TIG Week with our colleagues in the Program Design Topical Interest Group. The contributions all this week to aea365 come from our Program Design TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

I’m Alvin Yapp, an evaluator with the Edmonton Oliver Primary Care Network (EOPCN), a team of doctors and other health providers, such as nurses, dietitians, kinesiologists, mental health workers, and pharmacists. We work as a team to provide coordinated primary health care to patients. I lead the evaluation efforts at EOPCN. In this post, I share my experiences in using evaluation to inform the design of primary health care programs.

I have found evaluation to be a powerful tool for program design: evaluators can help create a framework for gathering evidence that informs program development. Evaluators can positively influence program design through the following activities:

  1. Identify important indicators to measure the program’s success. Teams that design programs without clear indicators of success, and a plan to measure those indicators, will have a difficult time identifying problems and fixing them.

  2. Embed evaluation processes into the processes of the program; evaluation is a part of the program. If the evaluation processes are a key part of the program and clearly articulated from the beginning, staff will be much more engaged with evaluation activities (i.e., providing feedback) and the quality of the data will be higher.

  3. Provide periodic check-ins on indicators to support ongoing development. These need not take the form of a weekly written report; they can be regular team huddles around what the data is currently showing and informal conversations about possible problems and solutions (see the sketch after this list).

  4. Support evidence-based program changes. Use the evidence from evaluation activities to inform and justify changes to the program.
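To illustrate what a lightweight check-in might look like, here is a hypothetical sketch in Python (the indicator, the numbers, and the field names are invented for the example) that turns weekly program records into a huddle-ready summary:

```python
from statistics import mean

# Hypothetical weekly records for one indicator: completion of scheduled
# patient follow-ups. In practice these would come from the program's data.
weeks = [
    {"week": 1, "scheduled": 40, "completed": 31},
    {"week": 2, "scheduled": 38, "completed": 25},
    {"week": 3, "scheduled": 42, "completed": 36},
]

def completion_rate(record: dict) -> float:
    """Share of scheduled follow-ups completed in a given week."""
    return record["completed"] / record["scheduled"]

rates = [completion_rate(w) for w in weeks]
print(f"Average completion rate: {mean(rates):.0%}")
for w, rate in zip(weeks, rates):
    flag = "  <-- discuss at huddle" if rate < 0.70 else ""
    print(f"Week {w['week']}: {rate:.0%}{flag}")
```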

Hot Tip: Spend some time instilling a culture of evaluation in the team/organization. If the team understands that you are there to support the design and success of their program, they will be much more open to identifying areas where the program can improve. That said, do not overburden them with evaluation activities; they still have a program to design and implement!

Lesson Learned in Evaluating Developing Programs: Program development does not happen in a straight line. Be nimble. Plan as much as you can, and plan for those plans to go wrong.


I’m Marti Frank, a researcher and program evaluator from Portland, Oregon. I’ve found the hardest part of a project is making recommendations that resonate with my client, and I’ve been working on an approach to developing and communicating recommendations that’s rooted in non-violent communication.

Lessons Learned: Evaluation clients are naturally attached to their programs. Finding non-threatening ways to draw attention to program design issues and inspire action is a challenging skill for evaluators to master.

Hot Tip: Use a non-violent communication approach to develop and present program design recommendations.

Non-violent communication (NVC) dates back to the 1960s, and people have found it useful in incredibly diverse contexts, from peace-making to parenting. NVC focuses on three aspects of communication: self-empathy (an awareness of one’s inner experiences), empathy (understanding the feelings and emotions of others), and honest self-expression (expressing oneself authentically in a way that inspires compassion from others). The goal is to foster open, honest communication and avoid communication that blocks compassion or alienates people. My eight-year-old son is learning NVC at school as a way to mediate disputes. Why not use it in evaluation, too?

In my interpretation, NVC is a three-step process that frames a conversation.

Observations → consequences → requests

In the evaluation context:

  • Observations and consequences are both evidence-based; they differ only in their causal relationship, and the evaluator’s challenge is to distinguish between them.
  • Observations can be any information that helps us describe the program and its context. This approach to thinking about and reporting on evaluation findings has helped me make a place for anecdotal or otherwise-hard-to-chart data.
  • Consequences are what result from the observed conditions. In my formulation, consequences are usually the social condition we want to explain, or change.
  • Requests are recommendations: what we think needs to happen to move from the consequences at hand to our ideal state.
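To make the chain concrete, here is a hypothetical sketch in Python (the structure and field names are my own, not part of NVC) that keeps each request tied to the observations and consequences that support it:

```python
from dataclasses import dataclass

@dataclass
class NVCFinding:
    """One observations -> consequences -> request chain.

    A hypothetical structure for organizing evaluation findings so that
    every request (recommendation) stays linked to its evidence.
    """
    observations: list[str]  # evidence describing the program and its context
    consequences: list[str]  # what results from the observed conditions
    request: str             # the recommendation that follows from the above

# Invented example values, for illustration only.
finding = NVCFinding(
    observations=["Intake forms are available only in English."],
    consequences=["Non-English-speaking families rarely complete enrollment."],
    request="Translate intake forms into the most common home languages.",
)
print(finding.request)
```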

Lessons Learned: I find the benefits of the NVC approach are that it:

  • Forces us to think about data in a narrative format. This means there’s no shying away from questions of causality.
  • Makes room for anecdotes and other one-off pieces of data that strike us intuitively as important but which otherwise may not find a home in an evaluation report structured around the statistically significant results of data collection activities.  
  • Contextualizes recommendations so that, by the time we get to them in the report or presentation, we’ve brought the audience along on our chain of logical reasoning.
  • Can help tie the particular findings from any single evaluation back to the organization’s theory of change, supporting or adding nuance to the theory.

Rad Resource: Getting to Yes is a classic handbook on negotiation and communication with ties to the NVC approach.


My name is Alexey Kuzmin, and I am the Director of the Process Consulting Company, based in Moscow, Russia (since 1992). We specialize in program evaluation and evaluation-related services such as evaluation training and the design and implementation of monitoring and evaluation systems. Clients often invite us to participate in program design.

Lesson Learned: Values influence program design to a great extent but rarely become an explicit part of the program description.

Cool Trick: Suggest that your clients make program values an explicit part of their program design. Doing so will stimulate an in-depth and important discussion among the key stakeholders.

Hot Tips: Remember that if you decide to do so, you need to be ready to:

  • Raise stakeholders’ awareness of the role and importance of values in program design.
  • Help develop and describe program values.
  • Explore both organizational and social values.
  • Identify real values as opposed to declared values.
  • Consider multiple value perspectives and cultural contexts.
  • Respond to values-related evaluation questions.

Rad Resources:

  • Whitman’s article in Nonprofit Management & Leadership sheds light on the social values that underpin philanthropic foundations.
  • My colleague Irina Efremova-Garth and I reported on our experiences with including values in program design at the AEA15 conference. Our slides are available via the AEA Public eLibrary.


Hello! I am Terence Fitzgerald, Senior Director of Program Design & Evaluation at International Justice Mission (IJM), a nonprofit organization that works to protect people from violent crime. I lead a team of staff that provides leadership and technical support to IJM on program design, monitoring, evaluation, and research. My team’s remit is to “bring evidence to bear for the mission” so that leaders at the executive, portfolio, and project levels can make informed decisions, and so that IJM can realize its vision and fulfill its mission. I want to share lessons that my team has learned from working on program design inside a mission-driven organization.

Lessons Learned:

  • Demystify program design: Before we engage with a team on program design, we ask for and listen to their views and experiences of it, and, where necessary, we explain concepts, processes, definitions, benefits, and challenges. We meet the teams where they are. I have found that a one-hour session on designing a personal program around “being more healthy” can generate lots of input and allow for discussion of many core design issues – different types of impact; relationships between resources, activities, and results; milestones and indicators; and assumptions and risks. Even with inexperienced staff, it can be quite easy to convey key concepts, get enthusiastic buy-in, and then build a more solid design.
  • Progressively elaborate the design: As we work with teams on a design, my team creates draft logic models and other design artifacts based on our understanding of what they have told us. We give those back to the team to review our work, correct any erroneous content, note any unresolved issues, and secure buy-in for further elaboration. We raise concerns where we see threats to the design’s plausibility or feasibility, and we help teams understand the causes of our concerns. We raise unresolved concerns to decision-makers.
  • Facilitate change to designs: Teams tend to revere, and thus be reluctant to change, the designs into which they have put time and energy. My team reinforces that designs are meant to change based on the team’s experiences and environmental changes. We explain and, where necessary, guide teams through IJM’s program change management process.

Rad Resources:

  • The OECD DAC criteria for evaluating development assistance are very useful for bringing evaluation into the program design process. My team uses the criteria to inform design discussions that often lead to clarifications on program priorities, inclusions and exclusions, and other design elements.


Hi! We’re Angelina Lopez (NYC Department of Education’s iZone) and Chi Yan Lam (Queen’s University) from the Program Design TIG. The PD-TIG was founded to provide a forum to explore the theory and practice of program design. Our TIG is proud to sponsor this week’s AEA365 posts. The contributors this week each work at the intersection of evaluation and design. You’ll hear how program design has shaped the way they approach evaluation to enhance evaluation use and program impact.

Hot Tip: 2016 is the Year of Evaluation + Design. At #Eval16, AEA’s annual conference, the concept of design will be explored through the lens of program design, evaluation design, and information design.

Hot Tip: While the field of design has traditionally been associated with the creation and development of tangible products (e.g., industrial design of consumer products; architecture of public spaces), designers are increasingly transcending fields and applying their craft in the social, business, and public sectors.

Rad Resource: The design and social innovation community is actively exploring how design methods, processes, and mindsets may be applied to solving complex social issues. Toolkits including the Design Kit by IDEO.org, the Bootcamp Bootleg by the d.school at Stanford University, and the Development Impact & You toolkit by Nesta UK share how to facilitate design methods and activities to explore the needs of programs’ end users and to think creatively about potential solutions (i.e., programs, products, and services).

Lesson Learned: “Design Thinking” has become a trendy buzzword and is often misconstrued as a clear, linear process. In reality, an integrated approach to program design is messy and requires continuous alignment between people and organizations. Additionally, many of these so-called “innovative” methods and processes are not new. We’ve found alignment with participatory and developmental evaluation frameworks, and believe evaluative thinking skills can add value to design approaches.
