HPEER TIG Week: Mapping Assessments to Licensing Exam Content Areas Utilizing the Model for Collaborative Evaluation by Davina M. DeVries

Greetings, fellow evaluators! I am Davina M. DeVries, Learning and Development Manager for the University of South Florida College of Pharmacy (COP). In my spare time, I am a PhD student in the Measurement, Research, and Evaluation program, focusing on educational program evaluation.

The COP asked me to develop and evaluate a system for tracking learning outcomes across the four-year program by mapping exam questions to specific accreditation standards. Getting the system organized correctly required several iterations of the tagging process before I found the most efficient approach. I would like to describe how I applied the Model for Collaborative Evaluation (MCE) in a rapid-fire evaluation. I chose the MCE to give stakeholders ownership of the process and to use real-time feedback to make the necessary corrections.

I asked the faculty how they currently use the mapping system, what corrections needed to be made, and how they would use the data. To aid in the process and act as subject matter experts, fourth-year students on their academic rotations were selected to assist with the tagging.

Professional programs are tasked with preparing students to excel in their chosen field, entry to which is guarded by licensure exams. The COP uses a secure testing database to enter exam questions and then tag them to our program's various standards. Faculty tend to overlook this step or to bypass the existing correct categories when tagging questions. Using the MCE, I was able to identify key stakeholders, use open communication to find where the process had broken down, and then use successive rapid-fire evaluations to determine a new process for updating the question database. With each iteration, the approach students used to complete their task was refined.

Lessons Learned:

  1. The MCE is a stakeholder involvement approach that allows for an open dialogue that supports determination of the project’s scope. Stakeholders can actively participate, take ownership of the outcomes, and use those outcomes in real time.
  2. The perception that the system needed to be 100% correct before it could be used again was incorrect. This process is a gradual change that improves over time.
  3. Faculty are busy, so recruiting students to assist helps both the faculty and the students. Fourth-year students are studying for the licensure exam and get a boost from working on this project.
  4. Allow those assisting to find their own way of completing the required tasks. No two students want to review questions in the same manner.
  5. Choose one course at a time and work diligently to review all questions.
  6. Train faculty to take charge once their course’s questions are completely tagged. The faculty must have ownership of the database.

The American Evaluation Association is celebrating Health Professions Education Evaluation and Research (HPEER) TIG Week with our colleagues in the Health Professions Education Evaluation and Research Topical Interest Group. The contributions all this week to aea365 come from our HPEER TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Comments:

  1. Hi Sheila,
    I am a PME student at Queen’s University and a Dental Hygiene Educator. As part of the course I am taking on program evaluation, we have been asked to find an article that interests us in the evaluation community. Your article was a great example of collaborative evaluation as described in our readings from Shulha and Cousins (1997). You epitomized the partnership between the evaluator and the program stakeholders, which helped to form a win-win for fourth-year students. As faculty became too busy to participate in the mapping exercise, fourth-year students had the unintended benefit of reviewing questions to ready themselves for the licensure exam.
    The collaborative evaluation has other unintended benefits, such as organizational learning. This was illustrated by training faculty to take ownership of the continual development of the database and by demonstrating that its accuracy will improve with use.
    Lastly, it was interesting to me that actual questions are mapped to show that professional standards are met. In Canada, our national licensure exam is set up this way, but educational programs for dental hygiene map curriculum themes/standards to course learning outcomes. In the course maps, we identify how these learning outcomes are tested (e.g., a practical, a case study, or a multiple-choice test).
    Thank you for sharing collaborative evaluation in practice; it showed an evaluator and stakeholders working well together and ultimately promoting evaluation use.
    Reference: Shulha, L., & Cousins, B. (1997). Evaluation use: Theory, research and practice since 1986. Evaluation Practice, 18, 195-208.
