AEA365 | A Tip-a-Day by and for Evaluators


We are Debi Lang and Sharon Grundel, with the Massachusetts Area Health Education Center Network (MassAHEC) at the University of Massachusetts Medical School’s Center for Health Policy and Research. MassAHEC supports health career exploration programs for diverse youth. Through four community-based AHEC Centers and 19 chapters of HOSA – Future Health Professionals (a national student-led membership organization), over 1,000 students statewide participate annually in experiential and academically enriching curricula.

Given the range of long-established program activities that align with differing community needs, we faced the challenge of developing an evaluation approach that could be standardized across all the youth programs, allowing us to effectively ‘tell our story’ and position ourselves to seek additional funding.

Lessons Learned:

  • Establishing uniformity – Initially, we considered developing a uniform curriculum across youth programs, as has been done with other MassAHEC projects. However, the Centers expressed concern that this would compromise their responsiveness to locally based health career training needs and the successful activities they had worked hard to develop. This meaningful dialogue led to defining five core competency areas applicable to all MassAHEC youth programs and the HOSA chapters.
  • Acknowledging uniqueness – After reaching consensus with AHEC staff on core competencies and aligned goals, we met with the Youth Program Coordinators to define a set of measurable, flexible learning objectives for each goal, reflecting the unique activities of their individual programs. The opportunity to listen to each other as the Coordinators described their individual programs, and to collectively design the learning objectives, was powerful. Not only did the experience cement staff investment in tying program activities to these competencies, goals, and objectives, but all agreed there is value in ongoing collaboration, especially for new staff.
  • Collaborating toward a shared purpose – This process of developing youth program goals, objectives, and assessment tools demonstrated that evaluators and program staff can successfully work together toward a common purpose: gathering evidence of how the various youth programs positively affect participating students. Going forward, student learning will be assessed using pre- and post-tests based on program-specific learning objectives. We hope the aggregated data will demonstrate how MassAHEC youth programs affect high school students’ knowledge of, interest in, and ability to pursue education and careers in the health professions.
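To illustrate the aggregation step described above, here is a minimal, hypothetical sketch of how pre/post change scores might be rolled up by learning objective across programs. The program names, objective labels, and scores are invented for illustration only, not actual MassAHEC data.

```python
# Hypothetical sketch: aggregating pre/post assessment scores across programs.
# All records below are illustrative, not real program data.

from statistics import mean

# Each record: (program, learning objective, pre-test score, post-test score)
# on a 0-100 scale.
responses = [
    ("Program A", "health careers knowledge", 55, 80),
    ("Program A", "health careers knowledge", 60, 75),
    ("Program B", "health careers knowledge", 50, 70),
    ("Program B", "interest in health professions", 65, 85),
]

# Group change scores (post minus pre) by learning objective.
changes = {}
for program, objective, pre, post in responses:
    changes.setdefault(objective, []).append(post - pre)

# Report the mean change per objective across all programs.
for objective, deltas in sorted(changes.items()):
    print(f"{objective}: mean change = {mean(deltas):+.1f} points (n={len(deltas)})")
```

A real analysis would of course also consider response rates, matched pairs, and whether scores are comparable across programs; this sketch shows only the grouping logic.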

Rad Resources:

AEA365! The Youth-Focused Evaluation posts on AEA365 in September 2013 were extremely helpful in providing examples of existing evidence-based assessment tools, e.g., the locally developed youth assessment efforts described by the Conservation Corps of Minnesota and Iowa, and the Youth Experience Survey used by Good Shepherd Services. We also discovered the Rural Youth Development Evaluation Toolkit, which modeled a pre- and post-assessment of youth involved in community projects.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Hello from Debi Lang with the Massachusetts Area Health Education Center Network (MassAHEC) at the University of Massachusetts Medical School’s Center for Health Policy and Research. I last published an aea365 post on how evaluation and program staff collaborated to establish a competency-based model for a range of MassAHEC Health Careers Promotion and Preparation (HCPP) programs. The current post focuses on the importance of learning objectives as part of program design and evaluation, with some tips and resources on how to write clear objectives.

The AHEC HCPP model consists of five core competencies with learning goals that apply across a range of HCPP programs (see the chart below).

[Chart: the five core competencies and their associated learning goals]

Each of the programs has written learning objectives that define specific knowledge, skills, and attitudes students will learn by participating in these programs. Learning objectives are important because they:

  • document the knowledge, skills, attitudes/behaviors students should be able to demonstrate after completing the program;
  • encourage good program design by guiding the use of appropriate class activities, materials, and assessments;
  • tell students what they can expect to learn/become competent in by participating in the program; and
  • help measure students’ learning.

Below are some of the learning objectives from one HCPP program and their connection to the competencies listed above:

[Chart: learning objectives from one HCPP program mapped to the core competencies]

Hot Tips: Here are some recommendations for writing learning objectives.

  • Think of learning objectives as outcomes. What will students know/be able to do once they complete the program? Start with the phrase: “At the end of this program, students will…”
  • Be careful not to write learning objectives as a description of the activities or tasks students will experience during the program.
  • Make sure student learning assessments are based on the learning objectives.

Rad Resource: “Bloom’s Taxonomy” is a framework based on six levels of knowledge (cognition) that progress from simple to more complex. When writing learning objectives, use the keywords associated with the knowledge level you expect students to achieve.
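One way to keep those keywords handy is a simple lookup from taxonomy level to sample verbs. The level names below follow the revised Bloom’s taxonomy; the verb lists are common examples, not an exhaustive or official set.

```python
# Illustrative lookup: revised Bloom's taxonomy levels (simple to complex)
# paired with sample verbs often used when drafting learning objectives.
# Verb lists are examples only, not an official or exhaustive set.
blooms_verbs = {
    "remember":   ["define", "list", "recall"],
    "understand": ["describe", "explain", "summarize"],
    "apply":      ["demonstrate", "use", "solve"],
    "analyze":    ["compare", "differentiate", "examine"],
    "evaluate":   ["assess", "justify", "critique"],
    "create":     ["design", "develop", "formulate"],
}

# Example: pick a verb for an objective targeting the "apply" level.
print("At the end of this program, students will", blooms_verbs["apply"][0], "...")
```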

To be continued…

Program-specific learning objectives that connect to one or more core competencies can help measure student learning, so that program outcomes can be reported from a competency perspective at the local and state levels. In a future post, I’ll discuss how learning objectives are used in an evaluation method called the retrospective pre-post, along with ways to analyze data collected using this design.
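As a preview of the basic idea, in a retrospective pre-post design each student rates the same item twice at the end of the program: how they were “before” and how they are “now.” A minimal, hypothetical sketch of the paired-difference arithmetic, using invented ratings:

```python
# Hypothetical sketch of a retrospective pre-post analysis: at program end,
# each student rates a skill "before" the program and "now" on the same item.
# The ratings below are illustrative only.

from statistics import mean

# One (retrospective "before" rating, "now" rating) pair per student, 1-5 scale.
ratings = [(2, 4), (1, 3), (3, 5), (2, 3)]

# Paired difference per student: "now" minus retrospective "before".
gains = [now - before for before, now in ratings]

print(f"mean self-reported gain: {mean(gains):.2f} on a 1-5 scale")
print(f"students reporting improvement: {sum(g > 0 for g in gains)}/{len(gains)}")
```

The real analysis options are the subject of the future post mentioned above; this sketch only shows how the paired "before"/"now" responses line up.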

