AEA365 | A Tip-a-Day by and for Evaluators

Greetings from beautiful Boise! We are Rakesh Mohan and Bryon Welch from the Idaho legislature’s Office of Performance Evaluations.

Last February, the Idaho legislature asked us to evaluate the state’s new system for processing Medicaid claims. Legislators had received many constituent complaints that claims from Medicaid providers were being denied, delayed, or inaccurately processed. Legislators were beginning to question whether the new system would ever perform as intended.

The $106 million system went live in July 2010 and immediately began experiencing problems. At the time of our review, over 23,000 providers were enrolled in the system, which was processing about 150,000 claims each week.

Lessons Learned: Our review found that problems with processing provider claims were the result of unclear contract requirements, a lack of system readiness, and most importantly, the absence of adequate end user participation. Less than one percent of total providers were selected for a pilot test, but neither the state administrators nor the contractor knew how many claims were actually pilot tested. Further, only about 50 percent of the providers were enrolled when the system went live.

Hot Tip: If you are ever asked to evaluate the implementation of a large IT system that is experiencing problems, make sure you examine end user involvement in the system’s design and implementation. Too often end user feedback is underappreciated, not used, or completely ignored.

Lessons Not Learned: Nearly ten years ago, Idaho attempted to implement a similar IT system to track student information for K-12 public schools. After spending about $24 million, the project was terminated due to undelivered promises and a lack of buy-in from end users. Unfortunately, lessons identified in our evaluation of the failed student information system were apparently not learned by the people responsible for this new Medicaid claims processing system.

Hot Tip: Because the success of an IT system depends on end user buy-in, ask the following questions when evaluating the implementation of large IT systems:

1. Are end users clearly identified?
2. Are end user needs identified and incorporated into system objectives?
3. Do vendors clearly specify how their solutions/products will address system objectives and end user needs?
4. Is there a clear method for two-way communication between system managers and end users with technical expertise?
5. Is there a clear method for regularly updating end users on changes and progress?

The American Evaluation Association is celebrating GOV TIG Week with our colleagues in the Government Evaluation AEA Topical Interest Group. The contributions all this week to aea365 come from our GOV TIG members, and you can learn more about their work via the Government TIG sessions at AEA’s annual conference. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.

· ·

My name is Ariana Brooks and I am the Director of Evaluation, Research and Planning for HeartShare Human Services.

Lesson Learned: When I started as an internal evaluator, my supervisor, Stan Capela, stressed one main point: evaluation does not solve management problems. My initial reaction was that it made sense, and I remembered similar issues being discussed in graduate school. But I did not fully grasp its meaning until I was performing my job responsibilities. Specifically, each report was producing similar results. At first I was naively shocked at the level of resistance from some managers. We were well versed in Patton’s Utilization-Focused approach, so we focused on providing meaningful reports, but resistance persisted even though we repeatedly told managers that the “numbers don’t lie.”

Lesson Learned: As a social psychologist, I reflected on various theories that helped explain their behavior. Of course, people interpret stimuli based on their own perspective. People are motivated to preserve a positive sense of self and are more resistant to counterattitudinal messages, especially if they are highly invested in the issue (e.g., their job). So it made sense that when an internal audit illustrates that a program’s deficiencies have more to do with supervision or program administration, the findings can be hard for management to swallow.

Although it is frustrating when management’s resistance to change reduces the utility of evaluation work, it is fascinating to see how the theories I studied play out in an organization. Borrowing from evaluation and social psychology theories, here are some tips that helped me understand and combat resistance:

  • Hot Tip: Think about the source of the message, or evaluation results. The source should be respected, seen as having expertise, trusted, and viewed as an in-group member (someone also invested in the program or in a similar role).
  • Rad Resource: The appreciative inquiry approach to evaluating programs has been met with great success. Managers are more willing to be involved and to use evaluation results when the results carry a more positive tone. Focusing on management’s strengths to overcome program challenges has proved to be a more useful approach. A great resource online is: http://appreciativeinquiry.case.edu/
  • Hot Tip: Avoid any language that seems targeted at certain individuals, roles, or positions. Make the responsibility of overcoming challenges a group effort that includes the evaluator.
  • Hot Tip: Take a sign of defensiveness as a positive. Often it is a sign that staff are truly invested in the program and their work. Directing this energy toward more productive ends can be a struggle, but it is rewarding in the long run.

· ·

My name is Stan Capela. I am the Corporate Compliance Officer for HeartShare Human Services, as well as the current chair of the American Evaluation Association’s Government Topical Interest Group (GOV TIG). The purpose of this aea365 post is to talk about corporate compliance and its relationship to program evaluation. It is also an opportunity to highlight significant issues relevant to the Government TIG. This is the first post in a week-long conversation on evaluation and government.

Lessons Learned: Several years ago, New York State created the Office of the Medicaid Inspector General (OMIG) to reduce fraud in Medicaid-funded programs. The statute focused on the need for all Medicaid-funded programs to establish eight anti-fraud elements. All organizations must have:

1. a corporate compliance policy;
2. corporate compliance program oversight;
3. education and training;
4. effective confidential communication;
5. enforcement of compliance standards;
6. auditing and monitoring of compliance activities;
7. detection and response; and
8. whistleblower provisions and protections.

The key is to ensure that systems are in place to provide ongoing monitoring of programs, educate staff on the code of conduct, ensure appropriate governance, and encourage staff to be cognizant of fraudulent activities and to report such activities.

Although internal program evaluators conduct ongoing evaluations, the corporate compliance role places greater emphasis on orchestrating all evaluation activities in a way that reduces fraud, as well as risk to the organization. There is also an emphasis on making sure the corporate compliance officer reports to the governing board and to the CEO and president.

Lessons Learned: New York State has placed a great deal of emphasis on OMIG. The agency offers a wide range of webinars and tools. One very useful tool is a checklist for assessing an organization’s corporate compliance plan, available through OMIG’s compliance alerts at its website: www.omig.ny.gov. As a result of these changes, organizations will place greater emphasis on having individuals with program evaluation responsibilities take on these tasks as part of their normal workload. This role also reinforces the importance of ethics as part of the evaluator’s responsibilities, since one task focuses on ensuring appropriate ethical conduct throughout the organization.

Hot Tip: Finally, as GOV TIG Chair, I encourage you to attend our business meeting at the AEA annual conference on Thursday, November 3rd, at 8 am in Huntington B, where you will be inspired by David Bernstein, who will reflect on methods to make evaluations more useful and long-lasting for research sponsors and stakeholders. If you want to learn more about the TIG or want to play a more active role, contact me at stan.capela@heartshare.org.

· ·

Hi. My name is Chris Camillo, and I am an auditor and consultant on international child labor and education issues. As part of my auditing work, I visit rural development projects in Africa and Latin America to assess the quality of their GPRA performance data, their compliance with program requirements, and their learning environments for beneficiaries.

My Hot Tips are recommendations for improving monitoring systems from an auditor’s perspective.

Hot Tip 1: When designing a project for a rural environment, thoroughly assess potential barriers to efficient monitoring. In many countries that I’ve visited, heavy seasonal rains, rugged terrain, unpaved roads, strikes and inadequate transportation result in significant delays in data collection and reporting from target communities. A monitoring plan that relies on volunteer data collectors making frequent visits on foot to sites that are located many miles apart would be too challenging to implement under these circumstances.

Hot Tip 2: Make the monitoring system robust by requiring thorough documentation of all data collected and by requiring periodic data audits that validate the accuracy and reliability of performance numbers against the source documentation. Use automated controls whenever possible to help prevent errors in data collection, data entry, and reporting.
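To make the idea of automated controls concrete, here is a minimal sketch in Python of the kind of validation check a monitoring system might run on each incoming record. The field names, date format, and enrollment cap are hypothetical illustrations, not drawn from any particular project.

    # A minimal sketch of an automated validation control, assuming each
    # monitoring record is a dict with hypothetical field names.
    from datetime import date, datetime

    REQUIRED_FIELDS = ("site_id", "visit_date", "beneficiaries_served")

    def validate_record(record, enrollment_cap):
        """Return a list of audit flags for one monitoring record."""
        flags = []
        # Completeness: every required field must be present and non-empty.
        for field in REQUIRED_FIELDS:
            if record.get(field) in (None, ""):
                flags.append(f"missing field: {field}")
        # Range: reported counts cannot exceed documented enrollment.
        count = record.get("beneficiaries_served")
        if isinstance(count, int) and not 0 <= count <= enrollment_cap:
            flags.append(f"beneficiaries_served out of range: {count}")
        # Timeliness: visit dates cannot fall in the future.
        visit = record.get("visit_date")
        if visit:
            try:
                if datetime.strptime(visit, "%Y-%m-%d").date() > date.today():
                    flags.append(f"visit_date in the future: {visit}")
            except ValueError:
                flags.append(f"unparseable visit_date: {visit}")
        return flags

    # Example: an empty list means the record raised no audit flags.
    record = {"site_id": "S-014", "visit_date": "2011-10-02",
              "beneficiaries_served": 38}
    print(validate_record(record, enrollment_cap=40))

Checks like these catch entry errors at the point of collection rather than months later during a data audit.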

Hot Tip 3: In addition to training, consider providing performance-based compensation or incentives to employees and volunteers to ensure the accuracy and timeliness of data collection, transmission and reporting.

Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. Would you like to submit a Tip? Send it to aea365@eval.org. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.
