Greetings from beautiful Boise! We are Rakesh Mohan and Bryon Welch from the Idaho legislature’s Office of Performance Evaluations.
Last February, the Idaho legislature asked us to evaluate the state’s new system for processing Medicaid claims. Legislators had received many constituent complaints that claims from Medicaid providers were being denied, delayed, or inaccurately processed. Legislators were beginning to question whether the new system would ever perform as intended.
The $106 million system went live in July 2010 and immediately began experiencing problems. At the time of our review, over 23,000 providers were enrolled in the system, which was processing about 150,000 claims each week.
Lessons Learned: Our review found that problems with processing provider claims were the result of unclear contract requirements, a lack of system readiness, and, most importantly, the absence of adequate end user participation. Fewer than one percent of providers were selected for a pilot test, but neither the state administrators nor the contractor knew how many claims were actually pilot tested. Further, only about 50 percent of providers were enrolled when the system went live.
Hot Tip: If you are ever asked to evaluate the implementation of a large IT system that is experiencing problems, be sure to examine end user involvement in the system’s design and implementation. Too often, end user feedback is underappreciated, left unused, or ignored altogether.
Lessons Not Learned: Nearly ten years ago, Idaho attempted to implement a similar IT system to track student information for K-12 public schools. After spending about $24 million, the project was terminated because of undelivered promises and a lack of buy-in from end users. Unfortunately, lessons identified in our evaluation of the failed student information system were apparently not learned by those responsible for the new Medicaid claims processing system.
Hot Tip: Because the success of an IT system depends on end user buy-in, ask the following questions when evaluating the implementation of large IT systems:
1. Are end users clearly identified?
2. Are end user needs identified and incorporated into system objectives?
3. Do vendors clearly specify how their solutions/products will address system objectives and end user needs?
4. Is there a clear method for two-way communication between system managers and end users with technical expertise?
5. Is there a clear method for regularly updating end users on changes and progress?
The American Evaluation Association is celebrating GOV TIG Week with our colleagues in the Government Evaluation AEA Topical Interest Group. All contributions this week to aea365 come from GOV TIG members, and you can learn more about their work via the Government TIG sessions at AEA’s annual conference. Do you have questions, concerns, kudos, or content to extend this aea365 contribution? Please add them in the comments section for this post on the aea365 webpage so that we may enrich our community of practice. aea365 is sponsored by the American Evaluation Association and provides a Tip-a-Day by and for evaluators.