What is going on with 5010 and ICD10… I guess not much

Not too many organizations seem to be unduly concerned about the impending 5010 conversion, which is now less than two and a half years away. Or so it seems, at least, from the actions being taken in the industry. I have been hearing a lot about how worried people are regarding the lack of time they might have for changing such a complex portfolio of applications, but not many seem to be taking actions commensurate with their concerns.

We have seen quite a lot of semi-structured exercises taking place, either using internal staff or leveraging high-end consulting organizations, but they are primarily limited to very high-level analysis of what is going to be impacted. While an initial high-level assessment is not a bad idea at all, in my opinion we should be way past that stage by now. A seventy-page PowerPoint deck highlighting the twenty core areas that are going to be impacted would have been a good idea in March 2009 but may not be sufficient in October 2009.

Let’s try to put things in perspective:

1) First of all, the mandate date of 1st Jan, 2012 is misleading. The date one really needs to be concerned about is 1st Jan, 2011. That is when organizations are supposed to be ready to test compatibility with their trading partners; the year after that is meant to be focused on testing rather than first-time implementation. So all we really have is around 15 months, and in some cases (such as the Blues) even less than that, because of the association’s requirement of being prepared by 1st July, 2010. Barely 9 months away.

2) Second, the tactical approach that most organizations are considering (i.e., using a step-down conversion on the inbound 5010 docs and then propagating the resultant 4010 doc through the downstream applications without changing them) is not a bad idea at all, but it definitely has its limitations. The most glaring one is that the shelf life of this solution does not extend much beyond 1st October, 2013, when the ICD10 mandate takes hold. I say so because 4010 cannot support ICD10: if we keep down-converting the inbound 5010s, the propagated 4010s will have to carry down-converted ICD9 codes, and that defeats the whole purpose of going the ICD10 route. Bye, bye granularity. Bye, bye reduced payouts. Bye, bye increased quality of care. After that it might as well be a mandate pushed down our throats courtesy of CMS.
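To make the granularity argument concrete, here is a minimal sketch of why the round trip is lossy. The mappings below are simplified stand-ins invented for illustration, not official GEM crosswalks: several distinct ICD10 codes collapse onto a single ICD9 ancestor, so once a 5010 doc has been stepped down, the clinical distinction cannot be recovered.

```python
# Illustrative, simplified mappings only -- NOT official GEM files.
# The point: many-to-one mapping means down-conversion destroys detail.
ICD10_TO_ICD9 = {
    "E10.21": "250.41",  # distinct ICD10 diabetes/kidney codes...
    "E10.22": "250.41",  # ...all collapsing onto one
    "E10.29": "250.41",  # coarse ICD9 bucket
}

def down_convert(icd10_code: str) -> str:
    """Step-down: collapse an ICD10 code to its coarser ICD9 ancestor."""
    return ICD10_TO_ICD9[icd10_code]

# Three distinct clinical pictures survive as exactly one code downstream:
surviving = {down_convert(c) for c in ICD10_TO_ICD9}
print(surviving)  # one code left -- the added granularity is gone
```

There is no inverse function here: given only "250.41", no downstream application can tell which of the three original codes it started from.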

3) Third, the step-down approach itself is not as simple as some people assume. Obviously it is relatively easy to down-convert a 5010 to a 4010 (notice the word ‘relatively’: the conversion is not entirely straightforward, just simpler than up-conversion), but what happens to the attributes that are new in 5010 and are expected to feed decision-making in the downstream applications? If they are dropped during down-conversion (since 4010 cannot carry them) and therefore never used downstream, is the organization still in compliance with the mandate? Even if one stores the deleted information in some kind of interim repository, what will be the performance impact on core transaction processing if the applications now have to access that repository to get the additional data? And even a simple modification to a core application to fetch the additional data from the interim repository would call for all sorts of regression testing; wouldn’t that defeat the whole concept of not touching the downstream applications? Also, how does one handle 3rd-party apps? Vendors will offer either a 4010-compliant app or a 5010-compliant app. They are not going to offer an in-between app that lets end users configure an interim repository as the source of additional information while maintaining compliance with 4010 standards.
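The split-and-stash pattern being debated above can be sketched in a few lines. This is a toy under loud assumptions: the attribute names and the claim control number key are hypothetical, the "interim repository" is just a dict, and real X12 parsing is elided entirely. What it shows is the architectural cost: every downstream consumer that needs the 5010-only data has to make the extra repository round trip the text warns about.

```python
# Hypothetical names for attributes that exist in 5010 but not 4010.
NEW_IN_5010 = {"present_on_admission", "billing_provider_npi"}

interim_repo: dict = {}  # stand-in for a real durable interim data store

def step_down(doc_5010: dict) -> dict:
    """Split a parsed 5010 doc into a 4010-shaped doc plus stashed extras."""
    extras = {k: v for k, v in doc_5010.items() if k in NEW_IN_5010}
    doc_4010 = {k: v for k, v in doc_5010.items() if k not in NEW_IN_5010}
    # Key the stash by a transaction identifier so downstream apps can rejoin.
    interim_repo[doc_4010["claim_control_number"]] = extras
    return doc_4010

def enrich(doc_4010: dict) -> dict:
    """Downstream lookup: the extra round trip every consumer now pays."""
    extras = interim_repo.get(doc_4010["claim_control_number"], {})
    return {**doc_4010, **extras}
```

Note that `enrich` is exactly the "simple modification to the core application" discussed above, and it is anything but free: it changes the downstream app, which triggers the regression-testing question.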

So, the bottom line is that even if one is thinking about using the interim tactical approach (of the down-conversion variety), one must not be complacent about time frames. There are many considerations even in implementing the interim solution, and it is definitely not going to be the endgame. So my recommendation is to start work on the interim solution immediately, and when I say ‘work’, I mean a heck of a lot more than the PowerPoints. I mean identifying the attributes required to support 5010-specific mandates in the downstream apps. I mean identifying the code sets that are going to require the additional data and designing an approach for those code sets to get the new data elements. I mean designing a fool-proof store-and-forward methodology that can support batch as well as real-time transaction processing and is not a resource hog that eats up all your spare processing time. By the way, does anybody have any spare processing time anyway? I did not think so.
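The store-and-forward requirement above can be sketched as a small buffer that serves both paths: real-time transactions are forwarded immediately (falling back to the queue on failure), batch transactions are queued outright, and the queue is drained during whatever spare processing windows exist. This is a minimal illustration, not a production design; the class name and the use of an in-memory queue (rather than durable storage) are my assumptions.

```python
import queue

class StoreAndForward:
    """Minimal sketch: queue what can't go now, forward it later."""

    def __init__(self, forward):
        self.forward = forward          # downstream delivery callable
        self.pending = queue.Queue()    # stand-in for durable storage

    def submit_realtime(self, txn):
        # Real-time path: attempt immediate delivery, queue on failure
        # so nothing is lost.
        try:
            self.forward(txn)
        except Exception:
            self.pending.put(txn)

    def submit_batch(self, txns):
        # Batch path: everything goes straight to the queue.
        for txn in txns:
            self.pending.put(txn)

    def drain(self):
        # Run during spare processing windows (if any exist!).
        while not self.pending.empty():
            self.forward(self.pending.get())
```

The design choice worth noting is that both paths converge on one queue, so a single drain loop handles retries and batch alike; in a real system that queue would have to be durable, and the drain throttled so it does not become the very resource hog the text cautions against.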