AEA365 | A Tip-a-Day by and for Evaluators

TAG | self-assessment

Hi there! We’re Sally Laskey, Outreach Director at the National Sexual Violence Resource Center; Kris Bein, Resource Sharing Project Assistant Coordinator at the Iowa Coalition Against Sexual Assault; and Liz Zadnik, Capacity Building Specialist with the New Jersey Coalition Against Sexual Assault. We’ve worked together to strengthen a self-assessment tool for advocates working with survivors of trauma at dual (domestic violence and sexual assault) or multi-service organizations.

Sally and Kris have worked closely for years developing training techniques for the tool, as well as assisting organizations as part of the Sexual Assault Demonstration Initiative (SADI). The assessment was developed to support organizational efforts to enhance services and to identify strengths and gaps in:

  • Services to survivors of sexual violence
  • Relationships in the community
  • Confidence in responding to survivors of sexual violence
  • Skills for responding to survivors of sexual violence

Lessons Learned: The depth and meaning of the self-assessment are closely tied to the larger organizational context. In working with folks across the country, Sally and Kris have noted a number of trends and approaches:

  • “Assessment” = listening and understanding. Professionals who were initially skeptical or hesitant about the process quickly recognized how the questions dug deep and explored neglected areas of practice and development.
  • Try not to focus on what’s wrong, but rather on strengths and leadership.
  • At the same time, the gaps or needs that surface help set a clear path for change. The assessment is also designed to help strengthen services, which requires some shifts in practice.

Meanwhile, Liz has been working on tools that build on a role-specific assessment for preventionists working with communities to end sexual violence. The compendiums are informed by core competencies and qualities for this group of professionals and are designed to prepare and support them in ever-evolving work. Like the self-assessment for advocates, this tool will be a strengths-based exercise for professional development and self-reflection.

Hot Tip: When providing training, connect the assessment with practitioners’ established skill set to reduce resistance and confusion. The skill of listening has been a great bridge in connecting advocates with the assessment.

Rad Resource: Check out SADI’s other resources for capturing community voices and strengthening services in multi-service organizations.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


Greetings. I am Christine Johnson, Director of Transformation and Quality Improvement for the Patient Centered Medical Home Initiative (PCMHI) at the University of Massachusetts Medical School’s Center for Health Policy and Research. Today’s post shares how self-assessment medical home transformation tools can help primary care practices evaluate themselves throughout the transformation process while also providing data for the evaluation team.

Medical home transformation involves multiple stakeholders: health insurance payers, practices, policy makers, and those providing transformation technical assistance. When practices complete the same tool and then share their results, stakeholders can literally be on the same page, with a shared understanding of what has been done, what still needs to be done, and where gaps remain. Self-assessment tools can be tailored to your project’s goals, or standardized tools can be used ‘as is’.

Hot Tips:

  • Use tools to monitor progress and design technical assistance. Results from the transformation tools not only help practices track and monitor their redesign; they also allow technical assistance and practice staff to discuss differences in how they perceive the change effort and serve as a key resource for designing further technical assistance.
  • Utilize health insurance payers as stakeholders. Payers can see progress that is often intangible and can support practices in building the necessary foundation that will eventually lead to clinical performance improvement.
  • Administer self-assessment tools multiple times throughout a project to highlight small, but encouraging, changes.

Lessons Learned: Self-assessment tools can:

  • Establish a practice’s baseline
  • Enable practices to understand where they are in their transformation compared to other practices
  • Guide and structure practices’ transformation, particularly if the tool tracks both actual and expected project status over time
  • Allow technical assistance staff to step in early to support practices that are struggling in their transformation

Hot Tip: Save yourself and the practices the time of developing and testing a new tool. Take a look at the growing number of tools already available (see links below) rather than creating your own.

Hot Tip: Once practices have some experience using a self-assessment tool, ask practices that are finding the tool useful and are successfully accomplishing their PCMH transformation to present to the other practices via a conference call or webinar.

There are no “perfect” online assessments, but our team suggests:

  • Qualis Site landing page
  • Transformation practice self-assessment tool
  • Medical Home Index
  • TransforMED MHIQ

Rad Resource: Measuring Medical Homes, available at http://www.medicalhomeimprovement.org/knowledge/practices.html

The American Evaluation Association is celebrating Massachusetts Patient-Centered Medical Home Initiative (PCMHI) week. All of this week’s aea365 contributions come from members who work with the initiative. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

Our names are Wendy Viola, Lindsey Patterson, Mary Gray, and Ashley Boal, and we are doctoral students in the Applied Social and Community Psychology program at Portland State University. This winter, we took a course in Program Evaluation from Dr. Katherine McDonald. We’d like to share three aspects of the seminar that made it especially useful and informative for us.

  1. Classroom Environment. The format of the course encouraged open and interactive dialogue among students and the instructor. The atmosphere was conversational and informal, giving students space to work through sticky issues and raise honest questions without fear of judgment. Regular course activities let us explore creative approaches to program evaluation and develop exercises that we brought to class for other students. For example, Dr. McDonald incorporated program evaluation exercises, such as Patton’s activities for breaking the ice with stakeholders and Stufflebeam’s (2001) “Program Evaluation Self-Assessment Instrument,” into our class sessions.

Hot Tip: Engage students by facilitating an open and interactive environment that fosters discussion and creativity.

  2. Course Content. The course covered both evaluation practice and theory, including the historical and philosophical underpinnings of evaluation theories. Because it is not possible to gain full expertise in the theory and practice of program evaluation in a 10-week course, Dr. McDonald provided a wealth of resources for us to peruse on our own time and refer back to as we begin working on evaluations more independently.

Hot Tip: Provide students with templates, examples, and additional references for the activities and topics covered so they have access to the resources they will need once the course is over.

  3. Applications. One of the most valuable aspects of the course was its emphasis on applying theory to the real world. During the course, we developed and received extensive feedback on logic models, data collection and analysis matrices, and written and oral evaluation proposals. We also participated in a “career day” in which Dr. McDonald arranged for a panel of evaluators working in a variety of contexts to meet with our class and discuss careers in evaluation.

Hot Tip: Allow students to practice skills they will need in the real world and expose them to the diverse career opportunities in the world of program evaluation.

Our seminar only scratched the surface of program evaluation, but these features of the course gave us a strong foundation in the field and sparked excitement about our futures in evaluation.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.


My name is Dr. Larry Bohannon, and I am an assistant professor in the Elementary, Early Childhood and Special Education Department at Southeast Missouri State University in Cape Girardeau, Missouri.

The Partnership for 21st Century Skills (2006) encourages schools, districts and states to advocate the infusion of 21st century skills into education. It also provides tools and resources to help facilitate and drive change.

One of the leading deficits the Partnership identifies among high school graduates is Professionalism/Work Ethic, defined as demonstrating personal accountability and effective work habits. Marzano (2001) also believes that students need to recognize that good work habits improve achievement, citing effort reinforcement and recognition as one of nine instructional strategies for improving achievement. Pre-service teachers are taught how to write lesson plans that contain formative and summative evaluations, but as educators we have traditionally not taught them to write self-assessments, nor have we taught them how to recognize students’ effort, except for those students who excelled.

McTighe and O’Connor (2005) state that assessments and grading should focus on how well, not when, the student mastered the designated knowledge and skill. Why not add a formal self-assessment in the elementary grades so students can begin looking at how much effort they put forth to master the objective? If effort and achievement match, the student is well on the way to success; if they do not, this is the time to help students see that effort and achievement work hand in hand. Recognizing effort, or the lack of it, will help students learn “how” to attain goals. Learning to set goals is great, but learning which work habits are needed to reach those goals is the ultimate task.

Hot Tip: The self-assessment may include questions to help students determine whether they followed the directions, put forth their best effort, and enjoyed the lesson, as well as what more they want to learn. If applicable, the self-assessment can also include questions about how well students worked together or independently.

Hot Tip: The self-assessment for younger students needs to be at their reading level and may need to be read aloud. “Smiley, sad, and neutral faces” work well with younger students. Keep the number of questions small, and have students circle or color the face they select.

Hot Tip: The self-assessment for older students may use “Yes, No, Maybe” and include a few questions to check comprehension.

Casner-Lotto, J., et al. (2006). Are they really ready to work? 4-16. http://www.heartland.org/custom/semod_policybot/pdf/20154.pdf

Marzano, R., Pickering, D., & Pollock, J. (2001). Classroom instruction that works: Research-based strategies for increasing student achievement, 49.

McTighe, J., & O’Connor, K. (2005). Seven practices for effective learning. Educational Leadership, 63(3), 10-17.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
