



Applying the Pareto Rule in Workers' Compensation

Feb 04, 2016
As previously run in WorkCompWire

Many decision makers use the "80/20 rule," the observation that roughly 80% of effects come from 20% of causes, to determine how to manage their time, prioritize initiatives, and allocate resources. The idea behind the 80/20 rule dates back to the 19th century and is associated with economist Vilfredo Pareto's "Pareto Principle": Pareto noted that 20% of the population in Italy owned 80% of the land. Today, this idea that a small part of the population is responsible for a large part of the outcomes appears in all sorts of business management processes, including as a general rule of thumb in Six Sigma.

In workers' compensation, we commonly see the 80/20 rule applied to intervention efforts: 80% of injured worker costs come from 20% of injured workers. Therefore, for maximum efficacy in cost reduction and outcome improvement, focusing intervention efforts on this smaller population of injured workers makes the most sense.

How Does the Pareto Rule Apply to Workers’ Compensation Claims?

Applying the 80/20 rule to workers' compensation claims, adjustors could randomly select 20% of claims to intervene on, but they'd only be managing 20% of costs. Worse, they may end up focusing their efforts on claims that would resolve without intervention. In practice, everyone uses at least some sort of prioritization method. Claims adjustors learn to look at two different claims and make a judgment about which one will require more effort. If they correctly focus on claims that are twice as expensive as the average claim, and they intervene on only 20%, they'll capture 33% of long-term costs: the selected claims contribute 0.4 units of cost against 0.8 units from the remaining 80%, or one-third of the total. This is already a big improvement over choosing "at random."
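The arithmetic behind these capture rates can be checked directly. This is a minimal sketch of the calculation, not any adjuster workflow; it simply assumes the selected claims each cost some multiple of the average remaining claim:

```python
def share_captured(selected_fraction, relative_cost):
    """Fraction of total cost captured when `selected_fraction` of claims
    each cost `relative_cost` times the average of the remaining claims."""
    selected = selected_fraction * relative_cost
    remainder = (1 - selected_fraction) * 1.0
    return selected / (selected + remainder)

# Selecting 20% of claims at random captures 20% of costs.
print(round(share_captured(0.20, 1.0), 2))
# Selecting 20% that are twice as expensive as average captures 33%.
print(round(share_captured(0.20, 2.0), 2))
```

The same function shows why better targeting pays off: raising `relative_cost` (picking more expensive claims) raises the captured share without intervening on any more claims.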

Another method is to use "triggers" that flag certain claims when particular criteria associated with high-risk claims have been identified. A commonly used trigger in workers' compensation is an injured worker who receives opioid analgesics from more than one prescriber. Our data indicates that just under 20% of injured workers who are prescribed medications in the first year of their claim receive opioid analgesics from multiple prescribers. If an adjustor focuses on these claims, they will capture approximately 44% of long-term pharmacy costs. This is another improvement over the 33%, and it shows that capturing data can help an adjustor prioritize claim activities. However, in this example, we are still far from capturing 80% of costs by working with 20% of claims.
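A multiple-prescriber trigger like the one described can be sketched in a few lines. The record layout below (`claim_id`, `drug_class`, `prescriber_id`) is invented for illustration and is not any particular pharmacy data feed:

```python
from collections import defaultdict

# Hypothetical prescription records; field names are illustrative only.
prescriptions = [
    {"claim_id": "A", "drug_class": "opioid", "prescriber_id": "P1"},
    {"claim_id": "A", "drug_class": "opioid", "prescriber_id": "P2"},
    {"claim_id": "B", "drug_class": "opioid", "prescriber_id": "P1"},
    {"claim_id": "B", "drug_class": "NSAID",  "prescriber_id": "P3"},
]

# Collect the distinct prescribers of opioids for each claim.
opioid_prescribers = defaultdict(set)
for rx in prescriptions:
    if rx["drug_class"] == "opioid":
        opioid_prescribers[rx["claim_id"]].add(rx["prescriber_id"])

# The trigger fires for claims with opioids from more than one prescriber.
flagged = {claim for claim, docs in opioid_prescribers.items() if len(docs) > 1}
print(flagged)  # claim A: opioids from two different prescribers
```

Claim B is not flagged because its second prescription, though from a different prescriber, is not an opioid.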

One way to get closer is through predictive analytics. Predictive analytics is the process of building statistical models with data to make better decisions about the future. For example, in workers' compensation claims management, analysts measure and weigh data about past claim outcomes to predict how similar claims may turn out in the future. By building a statistical model, analysts calculate the level of risk associated with different types of injuries, behaviors, prescribers, and demographics. By combining all of these risk factors in the same model, decision makers can incorporate a lot of information into a single risk score, or level of risk, for each claim.
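As a toy illustration of folding several risk factors into a single score: the factors, weights, and intercept below are invented for this sketch (a real model would estimate them from historical claims, and this is not Helios's model):

```python
import math

# Invented weights for illustrative risk factors.
WEIGHTS = {
    "opioid_multiple_prescribers": 1.2,
    "back_injury": 0.8,
    "age_over_55": 0.4,
}

def risk_score(claim_factors):
    """Combine the factors present on a claim into a weighted sum, then
    squash it through a logistic function to get a 0-to-1 risk score."""
    total = sum(WEIGHTS[f] for f in claim_factors if f in WEIGHTS)
    return 1 / (1 + math.exp(-(total - 1.0)))  # -1.0 is an invented intercept

print(round(risk_score({"back_injury"}), 2))
print(round(risk_score({"opioid_multiple_prescribers", "back_injury"}), 2))
```

The point of the single score is exactly what the paragraph describes: many heterogeneous signals collapse into one number that claims can be ranked by.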

Developing a Predictive Analytics Model

Much talk on predictive analytics focuses on the data. Interest in "big data" has grown steadily since 2010, with many organizations understanding that collecting data can lead to better decisions. But less focus has been placed on the other part of predictive analytics: analyzing the data with the appropriate statistical model. Analysts often use as much data as they can, but they must also choose the right method to analyze it. This choice is the "structure" of the statistical model. The best algorithms use the most appropriate model for the data available.

The Pareto Rule plays a large role in determining which structure is most appropriate for a statistical model. The Pareto Rule implies that you should focus on the 20% of causes that drive most of the effects. Unfortunately, despite the ubiquity of the Pareto Principle, most standard statistical methods are not equipped to deal with 80/20, or "skewed," distributions. Instead, statistical tools like correlation, standard deviation, and linear regression assume a "normal" distribution, the standard "bell curve" often seen in statistics textbooks, where half of the population is "above average" and half is "below average."

An analyst who uses symmetric tools, like the bell curve, may be missing the point of the Pareto Rule. Analysts in workers' compensation don't need to focus on injured workers in the lowest quintile (bottom 20%), the second quintile (20%-40%), or even the third quintile (40%-60%). There is, however, generally significant emphasis on the highest quintile (top 20%).

Similarly, this skewed distribution is the reason that we sometimes look at median values instead of mean values. The mean is the mathematical "average," and the median is the "50th percentile." In other words, the median has half the distribution above its value and half below it. For example, if you have five injured workers with annual pharmacy costs of $100, $200, $300, $400, and $4,000, the median is $300. But the mean is much higher at $1,000. This is because the single injured worker with $4,000 in pharmacy costs is skewing the distribution toward the high end. That single injured worker is the 20% of claims that makes up 80% of the costs. Therefore, we need a model that will highlight such high-risk claims, like the $4,000 claim in this example.
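The mean/median example from the text can be computed directly with Python's standard library:

```python
from statistics import mean, median

# The five annual pharmacy costs from the example.
costs = [100, 200, 300, 400, 4000]

print(median(costs))            # 300  -- half the claims above, half below
print(mean(costs))              # 1000 -- pulled upward by the $4,000 claim
print(costs[-1] / sum(costs))   # 0.8  -- one claim (20%) is 80% of total cost
```

The gap between the mean ($1,000) and the median ($300) is itself a quick diagnostic for skew: in a symmetric distribution the two would roughly coincide.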

There are different ways to do this. One way is for analysts to use logistic models instead of linear regression models. With a logistic model, an analyst could, for example, assign all claims in the top 20% of costs to a "high cost" category and everyone else to a "low cost" category. This model would identify what causes a claim to land in the high-cost or low-cost category, without having to worry about whether a low-cost claim is $100 or $200.
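The labeling step that feeds such a logistic model can be sketched as follows; the costs are invented, and fitting the actual logistic regression (e.g. with a library such as scikit-learn) is omitted here:

```python
# Hypothetical long-term claim costs, one per claim.
costs = [100, 150, 200, 250, 300, 350, 400, 500, 3000, 5000]

# 80th-percentile cutoff: the cost at which the top 20% of claims begin.
threshold = sorted(costs)[int(0.8 * len(costs))]

# Binary labels for a logistic model: 1 = "high cost", 0 = "low cost".
# Whether a low-cost claim is $100 or $200 no longer matters.
labels = [1 if c >= threshold else 0 for c in costs]
print(threshold, labels)
```

With these 0/1 labels as the outcome, a logistic model estimates each claim's probability of being high cost, which is exactly the kind of ranking the skewed distribution calls for.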

Getting Closer to the Pareto Rule

The predictive model that Helios uses demonstrates how predictive analytics is the most efficient method for applying the Pareto Rule. Through validation studies, we have found the accuracy of our model is such that the 20% of claims we recommend for triage and intervention make up 75% of long-term pharmacy costs. This means that by intervening on only 20% of new claims in 2016, we can be confident that those claims will account for 75% of long-term pharmacy costs through 2019. This is significantly greater than the 20% ("at random"), 33% (finding claims that are twice as expensive), or 44% (triggers) methods mentioned earlier.

[Table: Pareto Rule comparison of cost captured by each prioritization method]

In-depth analysis of workers’ compensation claims can guide analysts to a more efficient way of understanding claims. By finding where and how concerns are clustered, statisticians can learn which tools are appropriate to build a model that can handle the Pareto Rule. Payers can use this insight to accurately and effectively prioritize claim management activities, including intervention efforts, ultimately leading to a better outcome, both clinically and financially.
