The events of September 11, 2001, have forced insurance companies to take a new look at how they are managing not only their exposure to man-made hazards, but also their exposure to natural disasters, such as earthquakes and hurricanes. What many companies have found is that their present approach to managing their overall aggregate exposure is deficient in many ways.
What can companies do to become more aggressive and effective in managing their exposure to earthquakes and other natural disasters? In the past, most companies have concentrated their efforts on controlling their overall accumulations within some (usually fairly large) geographic area and, to a lesser extent, on underwriting new business. Their ability to do this has been greatly enhanced through the development of catastrophe modeling software.
Most primary and reinsurance companies are now using catastrophe modeling software to manage their accumulations. In addition, many of the reinsurers and some of the primary companies are also using the models to underwrite business.
Based on the number of companies using software, it would appear on the surface that the insurance industry is being very aggressive in managing this exposure. However, as demonstrated by the September 11 losses, managing the accumulation over a larger geographic area is not sufficient if the risks are concentrated in a very small area.
The problem is that many companies are not using the software to underwrite business, and those that do often fail to coordinate it with their accumulation programs. The net result is that they are not selectively screening business to optimize the use of their available capacity while avoiding concentrations of risk in a small area.
How can companies improve their catastrophe management programs? First, the objective of every catastrophe management program should be to optimize the risk/return relationship. Insurance 101 will tell you that the way to achieve this objective is to properly identify which accounts should be written or declined (new and renewal), to price accounts appropriately, and to maintain a geographically diversified portfolio. Fortunately, the catastrophe modeling software products currently being introduced provide the tools needed to underwrite and price accounts while maintaining a diversified portfolio.
Some software products, such as EQECAT's WorldCat Enterprise™ software, provide underwriters with the ability to look at the key underwriting and pricing factors for an individual account, fulfilling the basic underwriting requirements. However, this alone does not achieve the overall objective of optimizing the risk/return relationship. To do this, a company needs to determine the impact of the individual risk on the overall book of business.
Fortunately, the more advanced software products can also provide this information by accounting for the correlation between risks. Highly correlated risks are, by definition, similar in nature and in close proximity to one another. This is very important because risks that are highly correlated with the overall portfolio will contribute significantly more to the portfolio's exposure and/or volatility than risks that are not highly correlated. Therefore, this correlation needs to be reflected in the underwriting and pricing of the individual risk, as well as in the accumulation of exposure. Since correlation is so important to the management of accumulations, it is essential that every company clearly understand how its model accounts for it, because a very robust model is required to do so properly.
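The effect of correlation on marginal volatility can be seen with a little variance arithmetic. The sketch below uses hypothetical figures (the standard deviations and correlations are illustrative, not drawn from any actual model) to show how much more annual-loss volatility a highly correlated account adds to a book than an uncorrelated one of the same size.

```python
import math

# Illustrative sketch: marginal volatility added by a candidate account.
# All figures below are hypothetical assumptions, not model output.
portfolio_std = 10.0e6   # std dev of annual loss for the existing book
new_risk_std = 2.0e6     # std dev of annual loss for the candidate account

def combined_std(port_std, risk_std, rho):
    """Std dev of the combined book: sqrt(s_p^2 + s_r^2 + 2*rho*s_p*s_r)."""
    return math.sqrt(port_std**2 + risk_std**2 + 2 * rho * port_std * risk_std)

for rho in (0.0, 0.9):
    added = combined_std(portfolio_std, new_risk_std, rho) - portfolio_std
    print(f"correlation {rho:.1f}: marginal volatility added = {added / 1e6:.2f}M")
```

With these numbers, the uncorrelated account adds roughly 0.2M of volatility while the highly correlated one adds roughly 1.8M, which is why the same account can deserve a very different price depending on the portfolio it joins.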
The risk measures used to underwrite the individual risk and to analyze the accumulations are the same. They include the expected annual loss, standard deviation, coefficient of variation (COV), calculated rate on line, the 100-, 250-, and 500-year loss estimates, and others. Most of these risk measures are the same ones currently being used by most reinsurers and some primary companies to underwrite and price accounts (e.g., annual loss, standard deviation, and 100-year loss). However, some of them, such as the COV and calculated rate on line, are newer and provide unique insights into the exposure.
The COV is a relative measure of the volatility of the risk (the more volatile it is, the riskier it is). The calculated rate on line is the technical price for the exposure, taking into account the loss cost, volatility, and expense factor. Using this type of information, a company can effectively underwrite and price any risk, taking into consideration its individual merits as well as its impact on the overall portfolio.
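These measures can be sketched from a simulated annual-loss sample. The figures below are hypothetical, and the loading formula (loss cost plus a volatility load, grossed up for expenses) is an assumed stand-in for any vendor's proprietary pricing calculation; it is here only to make the relationship between the measures concrete.

```python
import statistics

# Hypothetical annual-loss sample for one account (10 simulated years).
# All inputs are illustrative assumptions.
annual_losses = [0, 0, 0, 50_000, 0, 0, 250_000, 0, 0, 1_200_000]
limit = 5_000_000        # policy limit
vol_load = 0.25          # assumed volatility loading on the std dev
expense_factor = 0.30    # assumed expense ratio

eal = statistics.mean(annual_losses)     # expected annual loss (loss cost)
sd = statistics.pstdev(annual_losses)    # std dev of annual loss
cov = sd / eal                           # coefficient of variation: volatility per unit of loss cost
technical_premium = (eal + vol_load * sd) * (1 + expense_factor)
rate_on_line = technical_premium / limit  # technical price as a fraction of limit

print(f"EAL {eal:,.0f}  std dev {sd:,.0f}  COV {cov:.2f}  rate on line {rate_on_line:.2%}")
```

A high COV flags an account whose losses are dominated by rare, severe years, which is exactly the profile that a pure loss-cost price would understate.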
While the above approach is a vast improvement over prior underwriting and accumulation management approaches, still more advanced portfolio optimization techniques are available to companies on a consulting basis. EQECAT's Exceedance Probability Leverage Analysis (EPLA) methodology, for example, helps companies achieve their overall objective by evaluating each account on both its own individual characteristics and its relative impact on the overall portfolio, similar to what was discussed above.
The difference with this approach is that it enables companies to include more variables in the analysis, such as risk/reward (potential loss versus premium) measures. The methodology produces an Exceedance Probability Leverage Curve (EPLC) for each account. The EPLC provides the account's leverage quotient at different levels of exceedance probability. A high leverage quotient means that the risk-adjusted return for the account is low.
Using tools like EPLA, companies can determine how much each risk leverages the overall portfolio, whether the revenue is appropriate given that leverage, and which accounts should be eliminated, reviewed, or retained.
The key element to keep in mind when using any of the advanced portfolio optimization techniques currently available is the robustness of the underlying model; this cannot be emphasized enough. To be truly effective in achieving the intended result requires a very technically advanced catastrophe model.
The following is an actual analysis that was conducted using the EPLA methodology. For the purpose of this analysis, the corporate portfolio optimization objectives were defined as follows:
> Expected return on capital not less than 15 percent
> 100-year loss to premium ratio not more than 10 percent
> 250-year loss to premium ratio not more than 20 percent
> 500-year loss to premium ratio not more than 30 percent
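Objectives of this form reduce to a simple set of threshold checks once the modeled losses are in hand. The sketch below encodes the four stated objectives as a screening function; the premium, capital, profit, and loss figures fed to it are hypothetical, and in practice the 100-, 250-, and 500-year losses would come from the catastrophe model's EP curve.

```python
# Illustrative check of a portfolio against the four stated objectives.
# All input figures are hypothetical assumptions.
def meets_objectives(premium, capital, expected_profit, losses):
    """losses: dict mapping return period in years to the modeled loss."""
    caps = {100: 0.10, 250: 0.20, 500: 0.30}   # max loss-to-premium ratios
    if expected_profit / capital < 0.15:        # expected return on capital >= 15%
        return False
    return all(losses[rp] / premium <= cap for rp, cap in caps.items())

ok = meets_objectives(
    premium=10_000_000,
    capital=40_000_000,
    expected_profit=6_500_000,
    losses={100: 900_000, 250: 1_800_000, 500: 2_700_000},
)
print(ok)
```

The same function applied before and after a candidate account is bound shows immediately whether the account pushes any of the four constraints past its limit.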
The analysis looked at each risk to determine how much it was leveraging the overall portfolio, whether the revenue was appropriate given the leverage factor for the account, and whether it should be eliminated, reviewed, or retained in order to achieve the stated objectives. The methodology employed in the analysis essentially made a value judgment on the relative merits of each account to determine which ones should be retained and which eliminated. The analysis also identified areas where additional exposure could be written with minimal impact on the overall exposure.
The following table contains the results of the analysis:
[Table: # of Accounts/Risks | Return on Capital]
Obviously, not every company will achieve results as dramatic as those shown above. However, every company incorporating one or more of the portfolio optimization techniques should benefit and move closer to achieving the ultimate objective of optimizing the risk/return relationship.
Opinions expressed in Expert Commentary articles are those of the author and are not necessarily held by the author's employer or IRMI. Expert Commentary articles and other IRMI Online content do not purport to provide legal, accounting, or other professional advice or opinion. If such advice is needed, consult with your attorney, accountant, or other qualified adviser.