My name is Stanley Capela, and I am currently Vice President for Quality Management and Corporate Compliance Officer for HeartShare Human Services of New York, a $140 million multi-service organization.
As with most government-funded organizations, we have to show that we are compliant with regulations and, at the same time, meet certain performance metrics. As a result, I am confronted with how to create a quality assurance system that meets performance metrics and incorporates a quality improvement process. Using graphs, we identified a series of deficiencies and the sites with poor performance. We then drilled down further to identify the areas that state auditors cited as repeat deficiencies. With this information, we developed a series of trainings focused on those deficiencies, and as a result we reduced repeat deficiencies in our developmental disabilities programs. The key was presenting the data graphically in a way that let us pinpoint the specific sites with the problem and develop a plan to improve performance.
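For readers who want to try this kind of drill-down themselves, here is a minimal sketch of how repeat deficiencies by site might be tabulated. This is not HeartShare's actual system: the file name, column names (site, deficiency_code, audit_year), and the "cited in more than one audit cycle" rule are illustrative assumptions.

```python
# Minimal sketch: flag repeat deficiencies by site from audit-citation data.
# The data layout and repeat-deficiency rule here are illustrative assumptions.
import pandas as pd

# Hypothetical audit-citation data: one row per citation.
findings = pd.read_csv("audit_findings.csv")  # columns: site, deficiency_code, audit_year

# Count how many distinct audit years each site was cited for each deficiency.
citations = (
    findings.groupby(["site", "deficiency_code"])["audit_year"]
    .nunique()
    .reset_index(name="audit_cycles_cited")
)

# A deficiency cited in more than one audit cycle counts as a repeat.
repeats = citations[citations["audit_cycles_cited"] > 1]

# Rank sites by number of repeat deficiencies to target training.
training_targets = (
    repeats.groupby("site")["deficiency_code"]
    .count()
    .sort_values(ascending=False)
)
print(training_targets)
```

The resulting table is the kind of thing that can be graphed by site to decide where training effort should go first.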
Hot Tip: When setting up an internal monitoring system, we focus on and prioritize the areas in which the program must be compliant with government agencies. We select five to ten items and develop performance metrics. For our child welfare programs, we focused on areas such as adoption finalizations, AWOLs, client contacts, service plan timeliness, and length of stay. Next, we set up a dashboard with appropriate charts; convene the leadership team; review reports; identify challenges; develop interventions; and review progress after three months. After reviewing the data, we pinpoint which sites fail to meet targets. Over time, the program sees improvement and realizes that data utilization can lead to positive change.
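The "pinpoint which sites fail to meet targets" step can be as simple as comparing each site's metric values against agreed targets. The sketch below is one way to do that; the metric names, target values, and column layout are illustrative assumptions, not the actual dashboard.

```python
# Minimal sketch: flag sites that miss targets on selected metrics.
# Metric names, targets, and columns are illustrative assumptions.
import pandas as pd

# Hypothetical quarterly performance data: one row per site per metric.
performance = pd.read_csv("quarterly_metrics.csv")  # columns: site, metric, value

# Example targets for a few child welfare metrics (all "higher is better" here);
# real targets would come from contracts and regulations.
targets = {
    "adoption_finalizations": 10,    # at least 10 per quarter
    "client_contact_rate": 0.95,     # at least 95% of required contacts made
    "service_plan_timeliness": 0.90, # at least 90% of plans completed on time
}

report = performance.copy()
report["target"] = report["metric"].map(targets)
report["meets_target"] = report["value"] >= report["target"]

# Sites with any metric below target become the leadership team's agenda.
below_target = report[~report["meets_target"]].sort_values(["metric", "value"])
print(below_target[["site", "metric", "value", "target"]])
```

A flag table like this feeds naturally into the dashboard charts and gives the leadership team a short, concrete list to review each quarter.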
Lessons Learned: One major problem with this approach is that when you focus on too many areas, you get bogged down and accomplish little or no improvement. Make sure everyone clearly understands that you are working as a team and that the goal is not to "get" anyone. Too often, program directors focus on placing blame rather than dealing with the problem. The key is program staff owning the data and recognizing that there are successes as well as challenges. In other words, perceptions can make a difference in how you approach quality assurance and performance measurement as you create a quality improvement culture. The other major issue is making sure the facilitator and the individual preparing the data are independent of the program.
Rad Resources: Quality Evaluation Template: How to Develop a Utilization Focused Evaluation System Incorporating QI and QA Systems by Stan Capela.
Council on Accreditation – look at the Performance Quality Improvement (PQI) standard.
The Council on Quality and Leadership and its Personal Outcome Measures (POMs) method.
The American Evaluation Association is celebrating Internal Evaluation (IE) Topical Interest Group Week. The contributions all this week to aea365 come from our IE TIG members. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit an aea365 Tip? Please send a note of interest to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.