"Technology and claims"—these terms go together much like beer and pretzels or death and taxes. I guess it depends on your perspective … and experience.
The common thinking today is that technology has been a boon to the insurance industry and to claims management in particular. Automation has virtually remade the claims industry, and few would disagree that this progress has mostly been for the better. But technology can be a double-edged sword, both a blessing and a curse. Like fire, one of the original technologies, it must be used properly and safely or disaster can occur.
Let's look at some advantages and drawbacks from a 10,000-foot level.
Advantages of Technology
The advantages are many. Here are a few.
Greater Speed, Efficiency, and Power
This is a hands-down improvement over the days of football-field-sized rooms filled with large, noisy mainframes chugging out data cards. Today, systems can pore over terabytes of data to discern fraud potential in seemingly unrelated files. Reserve updates, payments, medical bill repricing, and financial reconciliation tasks can all be performed simultaneously with great speed and accuracy.
Improved Data Availability and Analysis
This, too, is a no-brainer. Far more data is now available at one's fingertips, both through well-designed databases and via the Internet. In years past, obtaining the same information was tedious and time-consuming. Today's claim systems have also greatly increased the ability of the claims professional and risk manager to analyze vast numbers of claims, identify problem areas, and take corrective action on both a pre- and post-loss basis.
Fraud detection, early intervention in potentially disastrous workers compensation claims, litigation management modules, and other specialty areas are examples of this greater analytical capability. Expert systems that estimate ultimate aggregate reserves based on multiple factors help predict costs much more accurately and help claim executives and risk managers alike reduce unwelcome surprises.
Reduced manpower is an often-cited "benefit" of increased automation. This is both true and false, but it is beside the point. Data entry personnel are, in some cases, being replaced by Internet-based applications. But the real issue is the intelligent use of new applications and innovations in systems design, such as Straight Through Processing (STP), Service Oriented Architecture (SOA), and other advances that have improved system-to-system communication and interaction.
Take the interesting debate over whether to use strictly Internet-based First Report of Injury software or intelligent call centers staffed with nurse case managers. Both methods are decided improvements on a cumbersome problem: how to report the claim to the right parties most effectively and efficiently and get immediate action with the fewest errors possible. In the claims business, time is definitely money, and either way the bottom line is better use of claims personnel, which may or may not involve a lower Full Time Equivalent (FTE) count.
These improved applications are brought to bear on an intelligently designed business process. They include:
Well-designed claims administration software
Business process management software
Specifically designed portals
Other carefully designed applications aimed at improving the core claims management process
Celent, an IT think tank, recently estimated that implementation of these applications has the potential to improve the property/casualty industry's combined ratio by up to 7 points. Now that is cost savings.
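To put a 7-point swing in perspective, here is a minimal sketch of the combined-ratio arithmetic. The ratio itself (incurred losses plus expenses, divided by earned premium) is standard; the dollar figures below are purely hypothetical.

```python
def combined_ratio(incurred_losses, expenses, earned_premium):
    """Combined ratio as a percentage: (losses + expenses) / premium * 100.
    Above 100 means an underwriting loss; below 100, an underwriting profit."""
    return (incurred_losses + expenses) / earned_premium * 100.0

# Hypothetical insurer figures (in $ millions)
premium = 500.0
losses = 360.0
expenses = 150.0

before = combined_ratio(losses, expenses, premium)  # 102.0: losing money on underwriting
after = before - 7.0                                # 95.0: a 7-point improvement flips it

print(f"Before: {before:.1f}, After: {after:.1f}")
```

On these assumed numbers, a 7-point improvement is the difference between paying out more than you collect and a healthy underwriting margin, which is why the estimate is worth taking seriously.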
Disadvantages of Technology
Sounds great, doesn't it? Let's look at the drawbacks now.
One doesn't have to wait long to hear about some system project going way over budget, failing to deliver on its promises, and ending up harder to operate than the system it replaced. What is the root cause? Poor project management? No planning? Failure to accurately gauge what the users really need?
Unfortunately, this happens all the time, and the associated costs are staggering. A project is estimated at $X; then the scope shifts ("scope creep"), timelines drag out, expectations rise and fall, and the project ultimately runs drastically over budget (sometimes 2 to 3 times the estimate) and finishes months beyond its anticipated signoff date.
This can be seen graphically in the so-called Project Management Triangle, where each side represents a constraint: scope, time, and cost. One side of the triangle cannot be changed without affecting the others. A further refinement separates product "quality" or "performance" from scope and treats quality as a fourth constraint.
Many times, technology is simply misapplied. For example, why build an involved database management system with all the "bells and whistles" when a combination of Excel, Access, and some preliminary thought might suffice? Or why try to build an elaborate "artificial intelligence" reserving system to replace experienced claims personnel? I personally saw a system developed in the mid-1990s intended to do just that. When all of the variables were plugged into the AI expert reserving system, all that came out was nonsensical gibberish. It turned out there were far more variables than the system's architects had counted on in their design.
This reminded me of the old Star Trek episode in which an "expert system" was built to replace human involvement (Captain Kirk) and ended up blowing away everything in its path, enemy and friend alike: a prime-time example of technology run amok.
The pace of obsolescence is apparent to anyone looking to buy a new laptop or desktop computer … or a risk/claims information system. The speed at which improvements and changes occur in the IT arena is amazing. The bad news is that many systems developed in the 1970s, 1980s, and even the 1990s are running on outdated hardware and software.
The advent of the Internet has widened the gap between the old mainframe-based "legacy" systems and the newer, sleeker, faster systems coming out today. Much of the older software and hardware does not work well with newer versions, and expensive connections (called "interfaces") have to be built to bridge the gap until the older systems can be phased out. But replacing legacy systems is very expensive, so it must be done carefully.
For example, "ripping the cover" off an old legacy system, with its antiquated plugs and code written by people long retired, will pose some knotty problems. However, the opportunity cost of not replacing antiquated systems, measured in lost productivity, speed, efficiency, and systematized business processes, can be even greater.
So how do you keep the blessing and avoid the curse? A few recommendations:
Map your business processes carefully.
Determine where opportunity best exists to effectively blend people with technology improvements.
Understand the Project Management Triangle lessons: prioritize needs, manage the scope, watch the time. If this is done, cost will be minimized and quality enhanced.
Select technology vendors based on priorities, not just the "sizzle."
Make changes, not for the sake of change itself, but because of a carefully thought out vision of where you want to go.
Opinions expressed in Expert Commentary articles are those of the author and are not necessarily held by the author's employer or IRMI.
Expert Commentary articles and other IRMI Online content do not purport to provide legal, accounting, or other professional advice or opinion.
If such advice is needed, consult with your attorney, accountant, or other qualified adviser.