By: Keith Peterson, Ph.D., Vice President, Advanced Analytics and Consulting, Mitchell International, Inc.
Claim executives are tuning in to the power of analytics to identify new operational efficiencies, control losses and improve customers’ claim experience. To date, though, most organizations are in the early stages of testing analytical tools and determining how best to invest in these new capabilities. This paper describes the use of predictive analytics to improve the adjudication of automobile injury claims and makes recommendations on how to get started developing an analytical capability.
Insurers know that efficient claims handling is a moving target. Rising medical costs, continual change in regulations and mandates, an aging adjuster workforce, customers with higher service expectations and savvy fraudsters all contribute to a challenging environment for resolving claims quickly and fairly. Furthermore, claims operations play different strategic roles across insurers. To some, claims is part of the supply chain, with a focus on determining the appropriate benefits payable and ensuring expedient payouts. To others, it is a differentiator in the fight for new customers: concierge services, lifetime guarantees and online support tools are offered to help ensure a positive claimant experience. For all, though, the claims process is under pressure to respond effectively to the unrelenting rise in costs, challenging even the most effective companies to balance financial performance and customer satisfaction.
Faced with these moving targets around claim costs and customer management, there is strong interest in whether and how analytics can help manage claims performance. Many claim executives want greater transparency into what drives operating costs and more quantitative data about what factors determine claim outcomes. Claims analytics are a powerful way for insurers to gain visibility into their cost drivers and predict the impact of their strategic and operational choices.
Analytics have assumed a prominent role in many industries, and the potential in insurance claims is self-evident. Analytics are used to fly airplanes and manage billions of dollars in investment decisions every day, often performing better than humans. They are table stakes in the underwriting and customer marketing functions. Given that insurance is fundamentally a data-driven business, and that claims typically account for 70% or more of an insurer's costs, improving claims resolution with analytics holds the potential to deliver millions of dollars in savings per year.
Analytics may be applied to guide strategy, inform operating decisions and improve the claims process. Each of these focal points requires a different level of data aggregation and analysis: strategic analysis takes place at the business unit level to guide budget allocation and manage financial performance; operating decisions are made at the regional or claim office level; and claims process applications require access to claim-level data.
A typical claims process includes many decision points (Figure 1). Good decisions at each point increase the chance of resolving the claim quickly, fairly and accurately. Poor choices, on the other hand, create greater risk of excessive loss payouts, excessive time required to resolve claims, undetected hard and soft fraud, and unhappy customers.
These decision points are where analytics can be applied to help improve performance. For instance, they can be used to provide the adjuster greater insight into the context of the claim, the claimant and providers. They can also be used to automate certain types of routine business rules—such as when to forward claims for Independent Medical Exams or to Special Investigative Units.
Figure 1. Claim process and decision points
Decision Analytics Framework
We use the phrase “Decision Analytics” to describe a framework for action-oriented use of data to improve outcomes over each of the claims process decision points. This framework starts with the selective collection of data that can be mapped to specific uses and a reporting and analysis infrastructure that can scale as demand increases.
There are several dimensions of data relevant to claims analysis (see Table 1). These data include information derived internally from transaction and customer databases as well as external data accessed through partners or vendors. The priority is not to capture as much data as possible but to create a full view of the claim from multiple perspectives. Each of these dimensions provides value in explaining and predicting claim outcomes.
Claims data are the centerpiece of an analytics strategy. Careful attention must be paid to preparing claims data for analysis because they are sourced from billing and administrative systems and may vary widely in quality and usability. While most analysts will say that "more data is better" as a start, they will also tell you that they want quality data, and they often spend the majority of their time preparing it. For analytical purposes, good data provide:
- Maximum coverage of the target population (all possible claims by region and coverage type),
- Currency (the most recent data possible to reflect current conditions and to capture the most recent activity of claimants and providers), and
- Accuracy (data that is minimally biased).
Compared to relying on statistical samples or compiled information, claims data from internal systems have key advantages: their large volume makes very granular analysis possible and affords the opportunity to study low-incidence events not picked up by surveys or other research methods. For example, an analyst can look for emerging patterns of fraud before they become widespread. Overall, claims data are the key to a wide array of analyses: describing medical care utilization and costs, understanding treatment and billing patterns, and assessing the prevalence of injury types, potential fraud and claim outcomes.
Claims data also have limitations. Most important, misreporting and data errors can significantly impact accuracy. Claims analysts focus in particular on distinguishing outliers that are data entry errors, which overstate true costs and should be removed from analysis, from outliers that represent potential fraud or over-treatment. Conversely, incorrect or under-reported diagnoses can lead to underestimation of costs. Fortunately, while claims data are never perfect, modern analytical tools have come a long way toward preparing "noisy" data for use in modeling applications.
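The distinction between "drop it" and "review it" matters in practice. A minimal sketch of the idea, using the standard interquartile-range rule to flag extreme claim payments for analyst review rather than silently discarding them (all figures are hypothetical):

```python
# Flag outlier claim payments for analyst review using the interquartile-range
# (IQR) rule. Flagged values may be data entry errors (which inflate cost
# estimates) or genuine high-cost claims worth a fraud/over-treatment look,
# so the sketch flags rather than deletes. All payment data are hypothetical.
import statistics

def iqr_fences(values, k=3.0):
    """Return (lower, upper) fences; k=3.0 targets only extreme outliers."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    return q1 - k * iqr, q3 + k * iqr

def flag_outliers(payments):
    lower, upper = iqr_fences(payments)
    return [(amt, "review") if amt < lower or amt > upper else (amt, "ok")
            for amt in payments]

payments = [850, 920, 1010, 1100, 975, 890, 30, 1040, 25000]  # two suspect values
flagged = [amt for amt, status in flag_outliers(payments) if status == "review"]
print(flagged)  # -> [30, 25000]
```

In a real pipeline the fence multiplier and the review workflow would be tuned per coverage and injury type; the point is that outlier handling is a triage step, not a delete step.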
Reporting and Analytics
There is a broad spectrum of analytical tools available to claims organizations (Table 2). The purpose of these tools is to deliver data-driven insight, uncover patterns in data that are hidden to the human eye, detect outliers and understand the relationships that predict claim outcomes. These tools combine to address four core needs: 1) reporting; 2) knowledge discovery; 3) prediction; and 4) optimization. Claims organizations first need access to performance management reporting and ad hoc query capabilities; a solid reporting capability is the foundation of an analytical environment. The main focus of reporting is typically summarizing recent history to monitor trends in performance. Nearly all insurers manage to KPIs, and the most successful benchmark their performance to assess competitiveness.
Knowledge discovery includes guided and un-guided analysis of data using an array of techniques from cross-tabulation and correlation to clustering and multivariate modeling tools. The focus is uncovering and quantifying patterns that relate to predicting claims outcomes. These patterns have several uses:
- Anticipating future events to guide proactive responses
- Helping to explain cost and performance drivers from the wide array of claim data sources
- Evaluating and quantifying performance trends in terms of degree, consistency and seasonality
Optimization refers to a class of advanced analytical methods that determine the best tradeoffs among competing objectives given a set of business constraints. For example, a claims executive may wish to find the best mix of product specialists to employ, by region and office, to minimize losses while maintaining a prescribed level of customer satisfaction. Optimization enables decision makers to ask "What is the best course of action?" and to assess quantitatively and financially the risk of a particular decision.
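The staffing-mix example above can be sketched as a small constrained search: enumerate candidate mixes, discard those that miss the capacity or satisfaction constraint, and keep the cheapest survivor. Production systems use linear or integer programming solvers for this; the brute-force version below just illustrates the structure of the problem. All costs, capacities and satisfaction effects are hypothetical.

```python
# Sketch of constrained optimization for a staffing mix: minimize annual cost
# subject to (a) enough handling capacity for the claim volume and (b) a
# minimum capacity-weighted customer satisfaction. All figures hypothetical.
from itertools import product

COST = {"generalist": 70_000, "specialist": 95_000}       # annual cost per adjuster
CAPACITY = {"generalist": 600, "specialist": 450}         # claims handled per year
SATISFACTION = {"generalist": 0.80, "specialist": 0.92}   # avg claimant satisfaction

def best_mix(claim_volume, satisfaction_target, max_staff=40):
    best = None  # (cost, generalists, specialists)
    for g, s in product(range(max_staff + 1), repeat=2):
        capacity = g * CAPACITY["generalist"] + s * CAPACITY["specialist"]
        if capacity < claim_volume:
            continue  # infeasible: not enough handling capacity
        sat = (g * CAPACITY["generalist"] * SATISFACTION["generalist"]
               + s * CAPACITY["specialist"] * SATISFACTION["specialist"]) / capacity
        if sat < satisfaction_target:
            continue  # infeasible: satisfaction constraint violated
        cost = g * COST["generalist"] + s * COST["specialist"]
        if best is None or cost < best[0]:
            best = (cost, g, s)
    return best

cost, generalists, specialists = best_mix(claim_volume=10_000, satisfaction_target=0.85)
print(cost, generalists, specialists)  # -> 1650000 10 10
```

The same skeleton extends to more dimensions (regions, offices, product lines); past a few dimensions, enumeration gives way to a proper solver, but the objective/constraint framing is identical.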
Claims Management Applications
Claims handling is one of the most significant functions of the insurer: it directly impacts cost and is one of the few opportunities to influence the customer experience. Good claims management includes matching claims to the best resources, handling claims expediently, and controlling loss payouts associated with fraud.
Claims analytics can impact the process in several ways: by helping to anticipate risk, providing a more consistent handling approach and helping to ensure fair settlements. Perhaps most importantly, predictive analytics can help insurers move from a retrospective view of their claims to an action-oriented strategy based on predicted claim outcomes. Table 3 summarizes several claims management applications.
Auto medical claims fraud is one of the most pressing problems facing every insurer. The IRC estimates that fraud contributes $5 to $7 billion in excess payments to auto insurance claims. The NICB reported a 24% increase in claims flagged as suspicious from 2008 to 2010. Once hard or soft fraud is suspected, insurers begin more intensive claim reviews: running credit checks, referring the claim for an independent medical exam, forwarding the claim to the SIU or denying the claim outright. However, fraud or buildup can take months to identify, and these delays reduce the opportunity to mitigate outcomes proactively.
Insurers developing their predictive capabilities should include fraud scoring. Scoring claims for fraud potential earlier in the claims lifecycle opens up opportunities to proactively mitigate the fraud by developing the most appropriate handling strategy. This might, for example, be a more systematic escalation procedure that intensifies the resources on the claim as more suspicious indicators appear in the system. In addition, fraudsters are very good at learning to exploit an insurer’s business rules for fraud detection. A further advantage of a proactive scoring system is to target providers or networks of claim participants that demonstrate suspicious behavior and put them on notice—possibly sending them a pre-emptive EOB or placing them on a watch list.
A comprehensive analytical approach to fraud will incorporate a combination of techniques, including predictive scoring, database pattern recognition, and outlier analysis. Newer techniques include text mining of claim documentation and correspondence as well as network analysis of the links between providers, attorneys and body shops to detect organized fraud.
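Whatever the underlying modeling technique, the operational output is usually a score tied to a handling tier. A minimal sketch of that scoring-and-escalation pattern follows; the indicators, weights and thresholds are hypothetical, and a production system would derive them from a fitted predictive model rather than hand-set them:

```python
# Minimal sketch of indicator-based fraud scoring with tiered escalation.
# In practice the weights come from a predictive model trained on adjudicated
# claims; the indicators, weights, and thresholds below are hypothetical.
WEIGHTS = {
    "attorney_retained_early": 0.20,
    "soft_tissue_injury_only": 0.15,
    "prior_claims_history":    0.25,
    "provider_on_watch_list":  0.30,
    "claim_soon_after_policy": 0.10,
}

def fraud_score(indicators):
    """Sum the weights of the indicators present on the claim (0.0 to 1.0)."""
    return sum(WEIGHTS[i] for i in indicators if i in WEIGHTS)

def handling_tier(score):
    """Map a score to an escalation tier; thresholds are illustrative."""
    if score >= 0.60:
        return "refer to SIU"
    if score >= 0.35:
        return "enhanced review (e.g., independent medical exam)"
    return "standard handling"

claim = ["soft_tissue_injury_only", "prior_claims_history", "provider_on_watch_list"]
score = fraud_score(claim)
print(round(score, 2), handling_tier(score))
```

Because the score is recomputed as new indicators post to the claim system, the tier can escalate over the claim lifecycle, which is the systematic escalation procedure described above.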
While fraud analysis deals with loss costs, claim handling represents loss adjustment expense. Properly assigning the best mix of resources to a claim at First Notice of Loss is key to claims handling efficiency. When high-risk claims are inappropriately assigned, for example to an inexperienced or over-taxed adjuster, efficiency suffers through subsequent claim re-assignment, weak negotiations and poor customer communications. These factors may ultimately result in higher payouts and wasted resources.
Another compelling case for introducing more analytics into the claims process concerns the aging out of experienced adjusters across the industry. Mahoney and Bishop2 cite research suggesting a gap of 84,000 adjusters by 2014. While analytics will never replace the adjuster, they can be important decision support tools, helping less experienced adjusters focus on the relevant information, automating routine decision tasks and providing guidance in a consistent manner.
Predictive analytics can be used to help identify risky claims that are likely to incur higher payouts given claim severity or complexity. These claims can be prioritized in the queue or assigned to the adjuster(s) best suited to handle them. For example, a strike team of more skilled and experienced adjusters might be assembled to handle claims with the greatest predicted exposure. More typical claims continue to be handled in the same way, and low-risk claims might be assigned to outsourced resources or prioritized for fast-track settlement.
More complex claims are difficult to prioritize and often take the adjuster significant time to research and form a handling strategy. Predictive models are useful in that they can deliver to the adjuster not just a risk score but a list of the key diagnostic indicators that drove that score. This claim “tear sheet” can save the adjuster time and effort in understanding the claim structure.
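A tear sheet of this kind is straightforward to produce from any additive scoring model: report the total score plus the largest individual contributions. A hypothetical sketch (feature names and weights are illustrative, not from a fitted model):

```python
# Hypothetical sketch of a claim "tear sheet": a severity risk score plus the
# top contributing indicators, so the adjuster sees *why* the claim scored
# high. Feature names and weights are illustrative only.
FEATURE_WEIGHTS = {
    "surgery_recommended": 0.9,
    "attorney_involved":   0.6,
    "multiple_providers":  0.4,
    "lost_wages_claimed":  0.3,
    "rental_vehicle":      0.1,
}

def tear_sheet(claim_features, top_n=3):
    """Return the claim's additive risk score and its top-n score drivers."""
    contributions = {f: FEATURE_WEIGHTS.get(f, 0.0) for f in claim_features}
    score = sum(contributions.values())
    drivers = sorted(contributions, key=contributions.get, reverse=True)[:top_n]
    return {"risk_score": round(score, 2), "key_drivers": drivers}

sheet = tear_sheet(["attorney_involved", "surgery_recommended", "rental_vehicle"])
print(sheet)
```

With a non-additive model the same effect is achieved with per-prediction attribution methods, but the adjuster-facing output (score plus ranked drivers) is the same.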
Strategic Planning Applications
Every insurer has an operating model for their claims organization that guides budget, structure, policy, and practice. Some operating models emerge from careful research and planning. Others evolve by default over time. Either way, once a successful model is in place, the key challenge is to identify higher-order improvements and optimize performance. Achieving this next level of success requires more complex decision making that can be enhanced through the use of analytics.
Strategic planning is about using claims data from your business and the broader industry to establish effective performance management systems and optimize operating models. By using more predictive measures of claim outcomes, insurers can test and deploy new strategies to their advantage. The following table outlines a common set of analytical applications and their benefits. In these cases, predictive scoring is combined with a broader array of descriptive analysis and optimization tools to analyze a situation and recommend a course of action.
Most successful insurers benchmark their operations against industry standards and relevant peer group performance. And as cost pressures continue in the industry, many players are re-emphasizing continuous quality improvement initiatives. By evaluating performance against external benchmarks, leaders identify key areas of competitive differentiation to strengthen, as well as strategic concerns such as excessive payouts relative to strategic peers.
Analytical techniques for benchmarking are relatively simple; the major challenge in developing effective metrics is access to a robust set of industry or peer group data for comparison. Research sampling methods are possible, but these are typically limited in coverage and expensive to maintain. Mitchell International, by virtue of its strong presence with most major carriers, maintains a normed database of key cost indicators for the auto casualty industry. This unmatched data asset can be leveraged by contributors for standard and custom benchmark report development.
In claims, key benchmarks are typically cost-driven, focusing on the quality of claim handling processes, time to resolution and trends in settlement values over time. Relevant comparisons include breakdowns by state, provider type, coverage and the most common diagnostic and treatment types.
Benchmark comparisons can provide actionable information. Mitchell provides benchmark comparison studies at a regional and provider level. In a recent client study of New York, outlier analysis highlighted several counties with average costs for back surgery triple that for the rest of the industry. Drill down analysis highlighted a cluster of claims associated with a facility employing renowned specialist surgeons. These analyses helped the insurer target opportunities to negotiate preferred rates with the providers.
Benchmarking can also be extended to assess performance across regions, offices or teams. For example, regression analysis can be used to determine appropriate branch goaling objectives with respect to productivity or quality of claim resolution. The benchmark data can be used to guide the determination of fair goals and establish consistent, quantitative support for their usage.
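The regression idea for branch goaling can be sketched in a few lines: fit closures per adjuster against branch caseload, then set each branch's goal at its predicted value, so goals account for workload differences rather than applying one flat target. The branch figures below are hypothetical.

```python
# Sketch of regression-based branch goaling: ordinary least squares of claims
# closed per adjuster on branch caseload, with each branch's goal set at its
# predicted value. Branch figures are hypothetical.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

caseloads = [120, 150, 180, 210, 240]   # open claims per adjuster, by branch
closures  = [95, 110, 128, 140, 158]    # claims closed per adjuster, by branch
a, b = fit_line(caseloads, closures)

def branch_goal(caseload):
    """Workload-adjusted closure goal for a branch with the given caseload."""
    return round(a + b * caseload)

print(branch_goal(165))  # -> 118
```

A branch beating its predicted value is outperforming its workload-adjusted peers; one falling well below it is a candidate for review. Real goaling models would add more covariates (claim mix, region, staffing tenure), but the mechanism is the same.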
Determining how to allocate budget, establishing the right operational model, and evaluating potential claims handling changes are critical in a competitive marketplace. These decisions are often complex—requiring tradeoffs among many competing demands.
In recent years, several technology trends have converged to facilitate model-based planning using advanced prediction, machine learning and optimization methodologies. These approaches are an effective decision support tool for asking complex strategic questions with multiple tradeoffs and varying constraints. For example, an insurer may want to determine the best claims office network and staffing mix that minimizes administrative costs while maintaining a target level of claims performance and customer satisfaction. Or, a claims officer may want to estimate whether switching from a generalist adjuster model to a product specialist model will yield a cost advantage given the insurer's current risk pool.
This class of strategic planning tools requires models specific to the insurer's needs, analytical data marts, and flexible reporting; thus, they are usually custom-designed. In the past couple of years, most of the major database vendors began incorporating these analytical tools into their database software, making it far more cost-effective to deploy such systems. The key issue today is understanding what strategic questions will be asked of the system and using those objectives to drive data sourcing and model development.
As costs continue to rise and insurance marketing becomes more competitive, insurers continually examine how their claims adjudication processes can achieve greater efficiencies as well as deliver competitive differentiation through customer service. An investment in analytical capabilities provides advantages ranging from optimizing core claim processes to gaining strategic insight into industry dynamics.
Deploying analytics can be a challenge. There are several new expenses including staffing, data, and technology costs. While it’s possible to store massive amounts of data, more data doesn’t always mean better decisions. In fact, more data without a plan can often lull organizations into endless analyses without clear outcomes.
An organization planning an analytics implementation should define a clear set of business needs and objectives that can be addressed using analytics. Promising areas will include those where:
- There is a high, continuous volume of rule-based decisions that can be automated
- There is a lack of insight into what drives outcomes and where to take action
- Consistent policies or decision criteria should be applied, such as compliance and governance
- Significant decisions require balancing multiple complex tradeoffs
- Emerging trends have a significant impact on an organization before it has a chance to respond
Once needs are articulated, a blueprint can be drawn to define how analytics can be applied and what assets (data, technology, people) are required. Successful analytical implementations inevitably lead to a rapid increase in demand and thus infrastructure must be scalable. The blueprint should include a directional plan on how to create more scale for increased use of data (hardware/software) and more scale to meet increased organizational demands for information (automation or staffing).
Last, the blueprint should be translated into a 12-24 month roadmap identifying how applications will be prioritized and rolled out. In most cases, investing moderately and building on a series of small wins will generate greater corporate buy-in and help balance investment requirements.