How much will health insurance cost next year? The predictions have keenly reflected the political affiliations of the prognosticators. Democrats are still clinging to the possibility of premium reductions. Republicans are having difficulty hiding their glee over the possibility, especially in Ohio, of rate increases of 70% and higher.
The first test was the release, last week, of rates for Covered California, the individual health insurance exchange for the Golden State. Though the exchange is designed to offer up to 13 insurers per area, no area will have more than six insurance companies participating. Some areas will have only three insurers! Major national insurers such as Aetna, Cigna, Humana, and UnitedHealthcare will not be available through Covered California. Anthem Blue Cross, a division of California’s WellPoint, and Kaiser Permanente, also based in California, will be offered through the exchange.
So how are the numbers? Not bad. Paul Markovich, president of Blue Shield of California, is quoted as saying that the final rates will reflect an average increase of only 13% above individual policies currently available.
The release of these rates was met with a huge sigh of relief from the Democrats. The possibility of a 40-year-old purchasing coverage in Los Angeles for $250 per month meant that the Patient Protection and Affordable Care Act (PPACA) might work. Sure, San Diego will be 20% higher and San Francisco 40%-50% more, but it could have been worse. A lot worse.
Luckily for the Democrats, Krugman, and their other apologists, Washington and the national press have been too consumed by the recent outbreak of scandals – the IRS, Benghazi, and the press surveillance – and the tornado in Oklahoma to look at hard numbers. A chart like the one above takes real time and effort to deconstruct and understand. Who has time for that? Hell, I consider myself lucky that you are still reading this.
Please take a few moments away from Coasterville on Facebook and look at the above grid.
Forget who is missing among the top three choices Californians will be offered and look instead at who is there. Nine of the fifteen options are HMO products. Though many of us have had excellent experiences with Kaiser Permanente here in Cleveland, even their representatives are quick to point out that an HMO, even theirs, isn’t right for everybody. Two out of the top three options in Los Angeles are HMOs, the type of insurer that is best suited to meet the PPACA’s requirements. Are two out of three Californians currently choosing to be covered by HMOs? I doubt it.
The PPACA mandates that men and women pay the same price for insurance and that, among other things, maternity is covered the same as any other medical condition. Those rules are already in effect in California. The maternity coverage was instituted last year. The insured in Ohio will see a bump in their rates just from those two changes. Young healthy males will be the most affected by this change.
The rates on the chart are for the Silver Plan, a policy designed to cover 70% of the insured’s health costs. Regular readers know that the PPACA mandates four levels of coverage: Platinum, Gold, Silver, and Bronze. We are told to celebrate average rates of $242 - $351 per month for tier 3 coverage. We would freak out in Ohio.

Residents of Cuyahoga County pay the highest health insurance rates in Ohio. Medical Mutual of Ohio, based in downtown Cleveland, offers a High Deductible Health Plan (HDHP) that you can pair with a Health Savings Account. The $2,500 deductible policy is $139.10 per month for a healthy 40-year-old man. The commonly purchased options with higher deductibles are a whole lot less.
Those policies will be a lot higher in 2014.
The above grid is a basket of apples and oranges. The new individual rates aren’t compared to current individual policies. That would look awful. Instead, the rates are shown against the more volatile small group premiums.
We won’t have a grid like this in Ohio anytime soon. But when we do, the numbers will be great, just as long as you don’t look too closely. DAVE www.bcandb.com
Last week I was invited to give an introduction to googleVis at Lancaster University. This time I decided to use the R package slidify for my talk. Slidify, like knitr, is built on Markdown and makes it very easy to create beautiful HTML5 presentations.
Separating content from layout is always a good idea. Markup languages such as TeX/LaTeX or HTML are built on this principle. Ramnath Vaidyanathan has done a fantastic job with slidify, as it is very straightforward to create presentations with R. There are a couple of advantages compared to traditional presentation software packages:
In the past I have used knitr in combination with pandoc to generate a slidy presentation. However, with slidify I can do all of this directly in R. Better still, Ramnath provides a choice of different layout frameworks and syntax highlighting options. Finally, to top it all, publishing the slides on GitHub took only one more R statement: publish('mages', 'Introduction_to_googleVis').
I will give a half-day tutorial on googleVis with Diego de Castillo at useR2013! in Albacete on 9 July 2013. I hope to see some of you there.
Of course, if you have been paying attention, you know that the PPACA has nothing to do with the access to health care or its price. Instead, the law is about the access and pricing of insurance. We were going to eliminate the scourge of 40 million uninsureds and make the premiums more reasonable.
Today’s post is a quick status report on those two goals.
* * *
Nobody wants to be a political football. It is nobody’s goal to be the hot potato. So think what it must be like to be on the Ohio High Risk Pool policy. This is the interim program the federal government created to serve as a bridge until the PPACA becomes fully functional at the end of this year. To qualify you had to have significant medical issues and to have been uninsured for over six months.
As noted in June 2010, Medical Mutual of Ohio won the contract to administer this underfunded and poorly designed program. Since then we have had the federal government try to change the rules and even attempt to throw some of our unhealthiest Ohioans off the program.
Access to this bridge was blocked months ago. With funding running out, the program was closed to new enrollees. Today’s news was as inevitable as it was unwelcome. The federal government will take over the Ohio High Risk Pool at the end of next month. As of July 1st, all Ohio members will be transferred to the federally run Pre-Existing Condition Insurance Plan (PCIP). Coverage? Price? Networks? Who knows? All of this information will be officially released sometime in the next five weeks or so.
If you have a significant heart condition or stage 4 cancer and have relied on the high risk pool this last year, you might be concerned about this change. But don’t worry, as my friend and fellow agent Dave R. noted, “The people that underfunded this are the same people that decide funding for the unaffordable health care act. They refused to give the States the additional funds to keep the program in place for another five months”.
If we really cared about providing access (insurance) to the unhealthiest amongst us, we would be ready to provide the funds necessary to get the job done. Here’s a hint – it is going to take a lot of cash.
* * *
This was an EXCELLENT piece in Time. Which is why nobody read it. Time and its like-minded media partners are about as frequently consulted on the news of today as the New York Sun of “Yes, Virginia” fame.
The above comment was posted by one of my readers, a local librarian. When challenged by other readers, he noted, “Nobody read it. That issue never moved from it's (sic) spot on the shelf until I placed the next unread issue of Time in its place.” I wish he was wrong. I wish dozens of visitors to his branch had read Steven Brill’s Bitter Pill: Why Medical Bills Are Killing Us, a special edition of Time magazine, before someone “accidentally” took it home.
But it didn’t matter.
Kathleen Sebelius, the Secretary of Health and Human Services (HHS), read and, more importantly, responded to Brill’s well-researched report. On May 8th Sebelius and the Centers for Medicare and Medicaid Services (CMS) released the actual charges for the 100 most commonly performed inpatient procedures. CMS even admitted that the document dump was in part due to Time.
Shedding light on the incomprehensible and often indefensible pricing structure of our nation’s hospitals was a public service. The national news covered it extensively. Local TV stations and newspapers combed the data for the specific pricing of hospitals in their service areas. Even online publications like the AOL Patch covered this news. Did you read the report? Probably not. Did your Congressman/Congresswoman read it? Maybe. But I will bet that the legislative aides have now read Brill and are now familiar with the term “chargemaster” and what that means to you.
The links are all here. It is up to you. Do you want to be just another patron at that unnamed library or do you want to know the numbers? Neither Health Care nor Insurance will ever be affordable until we understand and begin to control costs.
Our two stated goals – access and affordability – remain largely untouched and unsolved. That leaves us, for the moment, 0 for 2. DAVE www.bcandb.com
Here are some life insurance tips from our consumer advocacy department, which gets a lot of insurance questions:
With long-term products such as whole life insurance, annuities, or long-term care insurance, it can be difficult to know which product or company is the best. Companies often give an estimate of how they expect the products to perform, but realistically, only time will tell which company and product will perform the best. Don’t fall for high teaser interest rates or low-ball premiums that are adjustable.
Our agency receives a lot of complaints from people whose universal and other whole life policies are underfunded and become too expensive to maintain. It’s important to realize that when you buy a whole life product, you’re actually buying a schedule of mortality rates.
No matter how young you are when you buy the policy, as you age, your mortality charges will increase so you’ll need to pay more to keep the policy in effect. Review your annual statements and illustrations to stay on top of how your policy is performing so that you can make sure that you pay enough premiums to keep the policy in effect.
And then there's this issue: A lot of people who have life insurance policies don’t tell the beneficiaries that the policy exists. As a result, the beneficiaries don’t collect on the policies when the policyholder dies. (Partly as a result of this lack of communication, there’s an estimated $200 million in unclaimed life insurance benefits in the U.S.) Also, life insurance policies often lapse when a dying or disabled person quits paying the bills. Remember to keep your life insurance beneficiaries informed so that they can make a claim on the policy after you’re gone and so that they can make sure the policy doesn’t lapse beforehand.
I was trained as a mathematician, and it was only last year, when I attended the Royal Statistical Society conference and met many statisticians, that I understood how different the two groups are.
In mathematics you often start with some axioms, things you assume to be true, and these axioms are then the basis from which new theory is derived. In statistics, or more generally in science, you start with a theory, or better a hypothesis, and try to disprove it. And if you can't disprove it, you accept it until you have other evidence. Or to phrase it like Karl R. Popper: you can only be proven wrong.
Now, why do I mention this? I have met many mathematicians who talk about the beauty of mathematics, and I agree: a mathematical concept, theorem or proof can indeed be beautiful. However, when you work in applied mathematics, and in particular when you use mathematics to build models, there is a danger that you stick to the beautiful idea and ignore reality. Remember the financial crisis?
For example, it might be handy to assume that your data follow a normal distribution, e.g. to make the calculations easier. However, if the data tell you otherwise, then be bold and ruthless and change your model. As strange as it might sound, it has to be your aim to prove a model doesn't work in order to use it successfully.
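To make that concrete, here is a small base-R sketch (the data are simulated, so purely illustrative): the convenient normality assumption is tested, fails, and the model is changed to a lognormal one.

```r
# Illustrative only: simulate right-skewed data, so normality clearly fails.
set.seed(42)
x <- rlnorm(200, meanlog = 0, sdlog = 1)   # lognormal, not normal

shapiro.test(x)$p.value        # tiny p-value: the normality assumption is rejected
shapiro.test(log(x))$p.value   # on the log scale the assumption is tenable

# Be bold and ruthless: model log(x) as normal instead of x itself
fit <- c(meanlog = mean(log(x)), sdlog = sd(log(x)))
fit
```

The point is not the particular test, but the willingness to discard the convenient model once the data contradict it.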
Remember Pythagoras? He believed in beautiful integers and the realisation that the square root of two was not a fraction of two integers caused a big crisis.
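The classical argument that so troubled the Pythagoreans is a short proof by contradiction:

```latex
% Suppose \sqrt{2} were a fraction of two integers in lowest terms.
\sqrt{2} = \frac{p}{q}
  \;\Rightarrow\; p^2 = 2q^2
  \;\Rightarrow\; p \text{ is even, say } p = 2k
  \;\Rightarrow\; 4k^2 = 2q^2
  \;\Rightarrow\; q^2 = 2k^2
  \;\Rightarrow\; q \text{ is even as well,}
```

which contradicts the assumption that p/q was in lowest terms; hence the square root of two cannot be a fraction of two integers.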
I would argue that we need mathematics to do statistics and statistics to do science. The developments over the last 350 years really demonstrate the success of the scientific method. Of course some ideas had to go: the earth can no longer be regarded as the centre of our solar system - instead it appears more like a little pale blue dot.
Diggle and Chetwynd, from Lancaster University, published a nice little book that gives a good introduction to statistics and the scientific method. Two quotes from the book stuck in my mind (pages 1 & 2):
A scientific theory cannot be proved in the rigorous sense of a mathematical theorem. But it can be falsified, meaning that we can conceive of an experimental or observational study that would show the theory to be false. ... The American physicist Richard Feynman memorably said that 'theory' was just a fancy name for a guess. If observation is inconsistent with theory then the theory, however elegant, has to go. Nature cannot be fooled.
A lot of people are wondering how health care reform will affect premiums for health insurance, including those plans that will be sold through the new Washington Healthplanfinder, our state's health insurance exchange.
The insurers selling plans inside and outside the Exchange have filed their proposed rates with our office. These rates include those for small businesses and for individuals buying insurance coverage on their own.
From what we've seen so far, we're pleasantly surprised. Many people will see rates similar to what they're paying now, or in some cases, lower -- and with substantially better benefits. While we have a lot of work to do in reviewing these proposals, and the final rates could change, we're definitely not seeing the huge rate increases that some insurers had predicted.
In most cases, the benefits are substantially better, particularly for the individual market. Today, most individual health plans don't cover prescription drugs or maternity care. When the Affordable Care Act takes full effect in January, they'll have to cover those things.
Also, in many cases, deductibles in the new plans are much lower than today and the new plans include approximately $500 worth of free preventive services, such as a wellness visit, some immunizations, cancer screenings, etc.
Finally, federal subsidies will help with the costs. If you earn less than $45,960 (or $94,200 for a family of four) you may qualify for a federal subsidy -- in the form of tax credits -- to help you pay your premium. You can get an estimate of that subsidy at the Washington Healthplanfinder site.
We're recruiting to fill an exempt position for the deputy insurance commissioner in charge of our Company Supervision Division.
The successful applicant will manage a wide variety of situations and influence the course of insurance affairs at the state, national and international levels.
The position is responsible for the financial and market examination and supervision of all Washington-organized insuring entities and all other insuring entities licensed to do business in this state. The position’s mission is to protect insurance consumers, the public generally, and the state’s economy by ensuring the safety and soundness of insuring entities, and to ensure that they comply with applicable law.
In addition, the position has broad statutory discretion and specific statutory authority involving the registration/licensing, operation, supervision, receivership, liquidation, and merger of insuring entities. For more specifics, duties, salary, and more please see the full job listing.
Q: "My insurer is supposed to take my monthly payment out of my checking account, but didn't do it for three months. Now they want me to pay for three months all at once. Can they do that?"
A: Sorry to tell you, but the short answer is yes, since they provided coverage for those three months. But since they erred by not taking the payments on time, it's worth asking if they'll allow you to pay in installments, so you don't get hit with a triple bill all at once.
When you know that a monthly insurance premium is supposed to come out of your bank account and you notice that it doesn't, don't wait and hope that that means you get free insurance. Call your agent or insurer and find out what's going on.
Over the last year I worked with two colleagues of mine on the subject of inflation and claims inflation in particular. I didn't expect it to be such a challenging topic, but we ended up with more questions than answers. The key question and biggest challenge is to define what inflation, or indeed claims inflation actually is and how to measure it. We published a summary of our thoughts and findings in this month's issue of The Actuary.
Last year's discussion about the differences between the retail price index (RPI) and consumer price index (CPI) in the UK only exemplified the challenge. The economist Tim Harford illustrated the differences between the RPI and CPI with a simple example of price changes for a shirt and blouse in his Radio 4 programme More or Less. The radio podcast is still available from the BBC. Start listening after about 18 minutes into the show.
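The arithmetic behind Harford's two-garment example is easy to reproduce. The gap comes from the averaging formula: parts of the RPI average price relatives arithmetically, while the CPI uses a geometric mean. The numbers below are my own illustration, not the ones from the programme:

```r
# One price rises 10%, the other falls 10% (invented figures for illustration).
rel <- c(1.10, 0.90)                  # price relatives for the two garments

rpi_style <- mean(rel)                # arithmetic mean (Carli, used in parts of the RPI)
cpi_style <- exp(mean(log(rel)))      # geometric mean (Jevons, used in the CPI)

rpi_style   # 1.000: the RPI-style index reports no change
cpi_style   # ~0.995: the CPI-style index reports a small fall
```

Since a geometric mean can never exceed the arithmetic mean, the CPI-style figure sits at or below the RPI-style one, which is one reason the two indices drift apart (the so-called formula effect).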
The presentation by the two attorney/consultants was only 45 minutes long. It just felt like four hours. They were on firm ground when they stayed on the tax and law side of the Patient Protection and Affordable Care Act (PPACA). Sure, they got lost in the details and seemed compelled to flash every detail at their audience, but what they lacked in presentation skills was more than made up for by the depth of their knowledge. That was true as long as they didn’t venture into my area. They were lost when it came to the insurance part of the new law. Couple their confusion with their PowerPoint skills and you have the perfect recipe for an agitated audience.
The young business owner seated next to me opined that none of this might matter much if the next administration reverses the law. I laughed and reminded her of the Republican’s love for the PPACA. We both just shook our heads. She isn’t old enough to remember when Democrats and Republicans actually worked together in Washington. She would never believe that Otto von Bismarck once remarked that “Politics is the art of the possible”.
There appears to be plenty of politics in Washington, but nothing seems possible.
This blog has discussed the numerous shortcomings of the PPACA for over four years, a full year before the law was even passed. But pretending that last year’s Supreme Court decision or the November 2012 election didn’t happen is not productive. We can’t all be as unproductive as the Republicans of the House of Representatives:
House Speaker John Boehner said Thursday that next week’s vote to repeal the health reform law is being held to provide new lawmakers a chance to vote on it. “We’ve got 70 new members who have not had an opportunity to vote on the president’s health care law,” Boehner said. “Frankly, they’ve been asking for an opportunity to vote on it.”
That doesn’t mean that nothing is getting done and that no one has reached across the aisle in an attempt to make the PPACA work. It simply means that you have to look a lot closer to home if you are hoping to find anything positive.
The Ohio chapters of the National Association of Health Underwriters (NAHU) held their annual Day at the Statehouse last week. This is the trade group that represents health insurance agents. We were in Columbus to hear from some of the legislators who are the most involved in our area and from Lieutenant Governor Mary Taylor, who also serves as the director of the Ohio Department of Insurance. Our afternoon was taken up with appointments with various legislators. I was scheduled to meet with State Senator Scott Oelslager and the effervescent Minority Whip of the Senate, Nina Turner, who represents me in Columbus.
Some of the states that have both Republican governors and Republican-led legislatures have chosen to fight the PPACA. Still! We in Ohio are fortunate to have realists in Columbus. And though these leaders have taken a lot of heat from members of their own party, they continue to work through the process even though they disagree with the PPACA. Our first speaker was Representative Barbara Sears, the sponsor of the recently signed H.B. 3. As Majority Floor Leader of the Ohio House, Representative Sears shepherded the rules that would establish the training and responsibilities of both agents and navigators in the new exchange system. This is legislation that had bi-partisan support. While some states are satisfied with turmoil and the possibility of the PPACA dying under its own weight, Ohio is taking some of the steps necessary to make it work.
Lieutenant Governor Taylor’s presentation also gave me hope. Regardless of what she personally thinks about the PPACA, Ms. Taylor is committed to doing her job as director of the department of insurance. A big part of that job is reviewing every policy that will be allowed to participate in the Ohio exchange. That exchange is supposed to be available October 1st. As of last week ZERO policies have been presented for review and approval. No insurer wants to go first. She is already making plans for the onslaught she is anticipating in mid-June. If there is going to be a problem with the exchanges, it won’t be because of Mary Taylor.
Regular readers of this blog know that I don’t spend a lot of time complimenting our elected representatives as a whole and even less on Republicans. I certainly did not vote for Governor Kasich and disagree with most of his positions on major issues. But you have to admire any leader, Democrat or Republican, who puts his state’s needs first.
My first appointment was with Senator Oelslager from Stark County. I was there to talk about other ways to help our clients get through the coming transition. None of my issues had anything to do with me, personally. We needed to talk to the legislators about S.B. 9 which cleans up the old, but soon to be unneeded, Ohio Open Enrollment Program. We also wanted to alert the legislators that Ohio still defines full-time as 25 hours per week while the PPACA uses 30. You get the idea. Cleaning up these and other seemingly small conflicts will save our clients big headaches in the future. So that was the purpose of my meeting with Senator Oelslager.
We spent the first ten minutes talking about the Kennedys. Republican Oelslager was inspired to go into public service by John and Robert Kennedy the way countless young men were inspired to pick up a guitar after seeing the Beatles on the Ed Sullivan Show. I found him to be engaging and well-informed.
My only disappointment of the day was that Senator Turner was called into a meeting. I instead met with her bright and energetic Legislative Aide, Adam Warren. Mr. Warren was an excellent substitute. I hope to have more conversations with him in the future.
So what I have learned is that politics is still the art of the possible. We just have to limit our discussion to the actions of our state legislators. It’s a start.
We're recruiting for a senior market analyst position at our Tumwater, Wash. office.
This position is responsible for conducting market analysis of regulated entities under the direction of the Chief Market Analyst. This position protects consumers' interests and promotes a healthy business environment in this state by providing regulatory oversight of market interactions between consumers and insurance carriers.
For more specifics, duties, salary, timeline, etc., please see the full job listing.
I am delighted to announce that the programme and abstracts for the first R in Insurance conference at Cass Business School in London, 15 July 2013, have been published.
The conference committee received strong abstracts from academia and the industry, covering:
Pricing
Reserving
Data mining
Capital modelling
Automated reporting
Catastrophe modelling
High-performance computing
Software development management
Register by the end of May to get the early bird booking fee.
We gratefully acknowledge the sponsorship of Mango Solutions and CYBAEA, without whom the event wouldn't be possible.
Department of Actuarial Science & Statistics, Heriot-Watt University
The well-known CreditRisk+ model of portfolio credit risk is often
described as "an actuarial model". Conditional on independent
gamma-distributed economic factors, credit losses in fixed time
periods are conditionally independent Poisson events. Exposures are
usually discretised into a finite number of exposure bands. This leads
to a reasonably tractable model that can be represented in terms of
compound sums.
We will review the structure of the model and then show how it can be
easily implemented in R. We focus on computing the portfolio loss
distribution using Fourier inversion techniques and deriving measures
of tail risk. We will also discuss the calibration of the model.
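To give a flavour of the Fourier-inversion step, here is a minimal base-R sketch with invented numbers (my own illustration of the general technique, not the speaker's code): conditional on the factors, losses are compound Poisson, so the severity probability generating function can be exponentiated on the frequency grid and inverted with `fft`.

```r
# Toy compound Poisson loss distribution via FFT inversion (invented numbers).
lambda <- 5                        # expected number of loss events
sev    <- c(0.5, 0.3, 0.2)         # severity pmf on exposure bands 1, 2, 3
n      <- 256                      # transform length, large enough to ignore wrap-around

p <- numeric(n); p[2:4] <- sev     # severity pmf on 0..(n-1); index 1 is a loss of 0
pgf_sev  <- fft(p)                         # severity pgf evaluated on the unit circle
pgf_loss <- exp(lambda * (pgf_sev - 1))    # pgf of the compound Poisson sum
loss_pmf <- Re(fft(pgf_loss, inverse = TRUE)) / n

sum(loss_pmf)                      # ~1: a proper probability distribution
sum((0:(n - 1)) * loss_pmf)        # mean ~ lambda * E[severity] = 5 * 1.7 = 8.5
sum(loss_pmf[(0:(n - 1)) > 20])    # a tail probability, e.g. P(loss > 20)
```

The full CreditRisk+ model mixes such conditional distributions over the gamma-distributed factors, but the inversion mechanics are the same.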
State space models offer much flexibility in dealing with general time
series and regression problems. A Bayesian approach means that expert
judgement can be used in their formulation and they offer the benefit
of allowing the modeller to use information available at any time
period to pre-empt the effects of expected changes or increased
uncertainty in forecasts rather than being limited by more classical
approaches. This makes them valuable for many applications and they
are considered here for the calculation of actuarial reserves.
In this talk, a state space model using various growth curves for
modelling claims developments is presented. These curves are used to
model logarithm and inverse transformed cumulative claims as well as
development patterns. An advantage of the state space modelling
procedure is that standard outputs of the model are parametric
ultimate claims forecast distributions for states and observations. The
parameters used in the state matrix are obtained from non-linear
regression of curves from the claims triangle.
Intervention techniques allow the modeller to quickly assess the
effects of new information before subsequent observations are
obtained. The model can also be used as a tool for pre-empting the
effects of potentially large claim events on the business class or
increased uncertainty in the underwriting environment.
This technique is compared with outputs from the chain ladder method.
The models are created using R, a rich statistical analysis
environment which also provides a framework for creating state space
models as well as allowing the user to create custom algorithms.
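For readers who have not met the chain ladder baseline mentioned above, the whole method fits in a few lines of base R (the triangle below is invented; in practice one would reach for the ChainLadder package):

```r
# Toy cumulative claims triangle: rows are origin years, columns development years.
tri <- matrix(c(100, 150, 175,
                110, 165,  NA,
                120,  NA,  NA), nrow = 3, byrow = TRUE)

# Volume-weighted development factors f_j = sum(C[, j+1]) / sum(C[, j])
f <- sapply(seq_len(ncol(tri) - 1), function(j) {
  keep <- !is.na(tri[, j + 1])
  sum(tri[keep, j + 1]) / sum(tri[keep, j])
})

# Project each origin year to ultimate with the remaining development factors
ult <- sapply(seq_len(nrow(tri)), function(i) {
  last <- max(which(!is.na(tri[i, ])))
  tri[i, last] * prod(f[seq_len(ncol(tri) - 1) >= last])
})
```

The reserve is then the projected ultimate minus the latest observed cumulative claims; the state space models discussed in the talk aim to improve on exactly this deterministic baseline.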
The recent Double Chain Ladder (DCL) by Martínez-Miranda, Nielsen and
Verrall (2012) has demonstrated how the classical chain ladder
technique can be broken down into its components. It was shown that
DCL works under a wide array of stochastic assumptions on the nature
and dependency structure of payments. Under certain model assumptions
and via one particular estimation technique, it is possible to
interpret the classical chain ladder method as a model of the observed
number of counts with a built-in delay function from when a claim is
reported until it is paid. Under the DCL framework it is possible to
gain a deeper understanding of the fundamental drivers of the claims
development than is possible with the basic chain ladder
technique. One example is the case when expert knowledge is available
and one would like to incorporate it into the statistical
analysis. This can be done in a surprisingly simple way within the
double chain ladder framework.
In this talk we present a new package in R to analyse run-off
triangles in the double chain ladder framework. The package, which is
expected to be launched in July 2013, contains several functions to
assist the user along the full reserving exercise. Using specific
functions in the package the user will be able to load the data into R
from Excel spreadsheets, make the necessary manipulations on the data,
generate plots to visualize and gain intuition about the data, break
down classical chain ladder under the DCL model, visualize the
underlying delay function and the inflation, introduce expert
knowledge about the severity inflation, the zero-claims etc. The
package also contains data examples and has been documented to make
the analysis accessible to a wide audience, including practitioners,
academic researchers, and undergraduate, master's and PhD students.
Using the package the user will be able to reproduce the
methodology of the recent papers by Martínez-Miranda, Nielsen, Nielsen
and Verrall (2011), Martínez-Miranda, Nielsen and Verrall (2012,
2013), Martínez-Miranda, Nielsen and Wüthrich (2012) and
Martínez-Miranda, Nielsen, Verrall and Wüthrich (2013).
References:
Martínez-Miranda, M.D., Nielsen, B., Nielsen, J.P. and Verrall, R. (2011) “Cash flow simulation for a model of outstanding liabilities based on claim amounts and claim numbers”. Astin Bulletin, 41/1, 107-129.
Martínez-Miranda, M.D., Nielsen, J.P. and Verrall, R. (2012) “Double Chain Ladder”. Astin Bulletin, 42/1, 59-76.
Martínez-Miranda, M.D., Nielsen, J.P. and Verrall, R. (2013) “Double Chain Ladder and Bornhuetter-Ferguson”. North American Actuarial Journal.
Martínez-Miranda, M.D., Nielsen, J.P. and Wüthrich, M.V. (2012) “Statistical modelling and forecasting in Non-life insurance”. SORT-Statistics and Operations Research Transactions 36 (2) July-December 2012, 195-218.
Martínez-Miranda, M.D., Nielsen, J.P., Verrall, R. and Wüthrich, M.V. (2013) “Double Chain Ladder, Claims Development Inflation and Zero Claims”. Scandinavian Actuarial Journal.
I consider a practical approach, based on R code, to the methodology
for the one-year view reserve risk described by [1]. The idea is to
extend the re-reserving algorithm outside the chain ladder model (see
[2]), introducing a proper algorithm that works directly on the
underlying GLM model defined for the ultimate view, and updated with
the simulated payments after 1 year. In addition, the R code gives the
option to change the regression structure, distribution in the
exponential family and link function of the ultimate-view reserve risk
(see [3] and [4]) in order to permit a better understanding and
evaluation of the model error, as required by Solvency 2 (see [5]).
References
Ohlsson et al. (2008) – The one-year non life insurance risk [ASTIN Colloquia 2008]
Merz, Wüthrich (2008) – Modelling CDR for Solvency purposes [CAS E-Forum, Fall 2008, 542-568]
Gigante, Sigalotti (2005) – Model Risk In Claims Reserving with GLM [Giornale Istituto Italiano degli Attuari LXVIII, n. 1-2, pp. 55-87, 0390-5780]
The R statistical system [3] could be a very powerful tool to price
contracts in the business of insurance. As of 2013, several packages
already exist that can aid pricing actuaries in their activity. This
presentation will show how standard R code enhanced by ad-hoc
packages can provide sound actuarial solutions for real business.
A first example could be pricing life contingent coverages for life
insurance business. A few examples performed with the aid of the
lifecontingencies package [5] will show how R can easily be used to
perform standard pricing and reserving for life insurance.
A second set of examples will show how GLM estimation capabilities of
R statistical environment can be used to perform standard pricing of
personal lines general insurance coverages. Examples will be taken
from [4] working paper.
The last set of example briefly show an application of actuar [2] and
fitdistrplus [1] packages to price non-proportional reinsurance
coverage for a Motor Third Party Liability portfolio.
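As a flavour of the second set of examples, here is a minimal base-R sketch of a personal-lines frequency GLM; all variable names, factor levels and relativities below are invented for illustration:

```r
# Hypothetical personal-lines frequency model; all names and figures invented.
set.seed(1)
n <- 20000
policies <- data.frame(
  exposure = runif(n, 0.1, 1),                       # policy-years in force
  age_band = factor(sample(c("18-25", "26-40", "41-65"), n, replace = TRUE)),
  region   = factor(sample(c("rural", "urban"), n, replace = TRUE))
)
# Simulate claim counts from known multiplicative relativities
eta <- log(0.20) + 0.5 * (policies$age_band == "18-25") +
       0.3 * (policies$region == "urban") + log(policies$exposure)
policies$claims <- rpois(n, exp(eta))

# Standard pricing GLM: Poisson frequency with exposure as an offset
freq_fit <- glm(claims ~ age_band + region, family = poisson(),
                offset = log(exposure), data = policies)
round(exp(coef(freq_fit)), 3)   # multiplicative rating relativities
```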
References
Marie Laure Delignette-Muller, Regis Pouillot, Jean-Baptiste Denis, and Christophe Dutang. fitdistrplus: help to fit of a parametric distribution to non-censored or censored data, 2012. R package version 1.0-0.
Christophe Dutang, Vincent Goulet, and Mathieu Pigeon. actuar: An R package for actuarial science. Journal of Statistical Software, 25(7):38, 2008.
R Development Core Team. R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria, 2012. ISBN 3-900051-07-0.
Keywords: Mortality modelling; Lee-Carter model; socio-economic circumstances; cause of death; ggplot2; gnm; forecast.
It is well known that mortality rates and life expectancy vary across
socio-economic subpopulations of a country. Higher socio-economic
groups - whether defined by educational attainment, occupation, income
or area deprivation - have lower mortality rates and longer lives than
lower socio-economic groups. In many cases, high socio-economic
subpopulations also experience faster rates of improvement in
mortality. These socio-economic differences pose important challenges
when designing public policies for tackling social inequalities, as
well as when managing the longevity risk in pension funds and annuity
portfolios. Successfully addressing these social and financial
challenges requires the best possible understanding of what has
happened historically and what is likely to occur in the future. A key
step in this direction is to investigate how individual causes of
death differ between the different socio-economic subgroups of the
population.
In this talk we illustrate how R can be used in the analysis of recent
trends in mortality by cause of death and socio-economic
stratification, using mortality data for England split by
socio-economic circumstances. More specifically, we demonstrate how
existing R packages can be used in the preliminary analysis and
visualisation of mortality data (ggplot2) and in the modelling (gnm)
and projection (forecast) of mortality trends employing
multi-population extensions of the popular Lee-Carter mortality model.
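For readers unfamiliar with the model, here is a minimal base-R sketch of the Lee-Carter decomposition log m(x,t) = a_x + b_x k_t, estimated by SVD on simulated rates; the talk itself uses gnm for the full Poisson maximum-likelihood fit, and every figure below is invented:

```r
# Minimal base-R sketch of the Lee-Carter model log m(x,t) = a_x + b_x * k_t,
# estimated here by SVD of centred log death rates; the gnm package fits the
# Poisson maximum-likelihood version used in the talk. All rates are simulated.
set.seed(42)
ages  <- 60:89
years <- 1990:2010
true_a <- seq(-6, -2, length.out = length(ages))   # log baseline rates by age
true_b <- rep(1 / length(ages), length(ages))      # age sensitivities, sum to 1
true_k <- seq(5, -5, length.out = length(years))   # declining mortality index
log_m  <- outer(true_a, rep(1, length(years))) + outer(true_b, true_k) +
          matrix(rnorm(length(ages) * length(years), sd = 0.02),
                 length(ages), length(years))

a_x <- rowMeans(log_m)                   # estimated age pattern
s   <- svd(log_m - a_x)                  # rank-1 fit of the centred surface
b_x <- s$u[, 1] / sum(s$u[, 1])          # normalise so that sum(b_x) == 1
k_t <- s$d[1] * s$v[, 1] * sum(s$u[, 1]) # estimated period index
```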
References
Hyndman, R. J, 2013. forecast: Forecasting functions for time series. R package version 4.03.
Turner, H., Firth, D., 2012. Generalized nonlinear models in R: an overview of the gnm package. R Package Version 1.0-6.
Wickham, H., 2009. ggplot2: elegant graphics for data analysis. Springer New York.
Insurance can greatly benefit from adopting the R platform and leading
companies are already reaping the rewards. We will show one example
from non-life insurance pricing which will cover both technical
implementation and business change, and we will share information on
the commercial benefits obtained. By using a specific example we can
keep the presentation concrete and the benefits real; however, the
applicability of the approach is general and we will touch on this in
the discussion.
There are many advantages of R. We will focus on two. First, R is
finely balanced to allow exploratory data analysis and interactive
model development while also being a platform for statistical
computing and data mining. As we will show, this is key for
productivity and a building block for (bit-perfect) reproducible
models.
Second, it is comprehensive in the sense that most approaches to
statistics and data mining are included in the tool or its contributed
packages. Among other benefits, this allows you to easily run multiple
model types on your data, ensuring compatibility with classic and
often robust approaches while at the same time taking advantage of the
latest developments and emerging industry standards.
Non-life insurance pricing is a well-known and well-established
process and yet still a critical business issue. The standard for
tariff analysis is generalised linear models. We first show how to
develop such a model in R, including model selection and
validation. We touch upon how to deploy the model (both scoring using
the model and updating the model itself) while ensuring the results
remain validated and reproducible.
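To make the scoring step concrete, here is a hedged sketch with an invented toy tariff (all names, data and the file location are made up): persist the fitted model as a versioned artefact and score new business by reloading exactly that artefact, so results stay reproducible:

```r
# Fit a toy tariff GLM on invented data, persist it as a versioned artefact,
# then score a renewal quote by reloading that exact artefact.
set.seed(7)
train <- data.frame(
  exposure      = runif(500, 0.5, 1),
  vehicle_group = factor(sample(1:3, 500, replace = TRUE))
)
train$claims <- rpois(500, 0.1 * train$exposure * as.integer(train$vehicle_group))

tariff <- glm(claims ~ vehicle_group, family = poisson(),
              offset = log(exposure), data = train)

model_path <- file.path(tempdir(), "tariff_v1.rds")
saveRDS(tariff, model_path)        # put this file under version control

# Scoring: expected claim frequency for one unit of exposure
new_quote <- data.frame(exposure = 1, vehicle_group = factor(2, levels = 1:3))
predict(readRDS(model_path), newdata = new_quote, type = "response")
```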
Next we show how easy it is to extend the model to more complex
techniques. In the interest of time we jump over intermediate
approaches and go straight to ensemble models, which are arguably the
state of the art for high-performance models.
We are in no way advocating wholesale abandonment of classical
approaches for modern techniques, “black-box” or otherwise. Rather, we
propose that you make use of both: continuity and understanding
tempered with the results from the latest up-to-date methods. In the
final part we cover some of these business issues to show how other
insurers resolved them and what commercial benefits resulted. Examples
include using the advanced models to restrict the validity domain of
the classical approach (risk we do not understand and will not
insure) and using them to create derived variables, such as
interaction variables, to extend the domain of the GLM (understanding
complex risk).
Most actuarial departments in the non-life insurance industry use
Excel/VBA as their computation engine. Industry-leading bespoke
modelling software, such as Igloo and ReMetrica, relies on Excel/VBA
for data inputs and reporting. This talk points out the typical
problems that arise from using Excel/VBA in capital modelling and
how these issues can be overcome with a combination of R and a proper
version control system. Issues covered include:
Keeping track of links
Keeping track of different versions of input data, model code and outputs
Support for multiple users
Trickiness of updates (e.g. range adjustments for a new underwriting year)
According to various sources, the insurance sector is plagued by
fraudulent claims: in the UK alone, total undetected general insurance
claims fraud is estimated at £1.9 billion per annum. This adds around
6% (or £44 a year), on average, to the insurance premiums paid by all
policyholders (Research Brief, 2009, Association of British
Insurers).
R offers powerful analytical functions to detect fraudulent
claims. They range from network analysis, typically used to monitor
fraudulent motor claims, to text analytics.
The presentation aims to:
Offer a brief overview of the R packages that can be used for
fraudulent-claims analytics (e.g. how network analytics can be used
to spot fraud).
Illustrate the analytical pipeline components required to detect
potentially fraudulent claims using text analytics. One of the
components illustrated will be the use of the LIWC (Linguistic
Inquiry and Word Count) dictionary.
Link claims with the general insurance process to show the benefits
obtained through a wider usage of analytics.
Please note that currently we plan to illustrate the above using dummy
data, as any insurance company is reluctant to "loan" their data for
analysis.
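In that spirit, here is a toy sketch of dictionary-based text scoring on dummy claim narratives; the cue list below is invented for illustration, since the real LIWC lexicon is licensed separately:

```r
# Toy dictionary-based text scoring in the spirit of LIWC; the claim texts
# and the cue-word list are dummy data invented for this sketch.
claims <- c("I honestly cannot remember exactly what happened that night",
            "The vehicle was hit from behind at the junction")
deception_cues <- c("honestly", "cannot", "never", "nothing", "nobody")

score <- function(text, dict) {
  words <- tolower(unlist(strsplit(text, "[^[:alpha:]]+")))
  sum(words %in% dict) / length(words)    # share of cue words in the narrative
}
sapply(claims, score, dict = deception_cues)
```

A real pipeline would feed such per-category scores, alongside network features, into a downstream classifier.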
1Department of Computer Science, Royal Holloway, University of London, Egham, Surrey, TW20 0EX, U.K., 2Department of Mathematical Sciences, University of Essex, Wivenhoe Park, Colchester, CO4 3SQ, U.K., 3Department of Biological Sciences, University of Essex, Wivenhoe Park, Colchester, CO4 3SQ, U.K.
Cloud computing is increasingly being used by the scientific
community. For example, in bioinformatics this has been largely driven
by the rapid increase in the size of omic (genomic,
transcriptomic, ...) data sets (Stein, 2010). This rapid increase in
data size is not unique to that field and is a surprisingly general
feature of data analysis. This type of computing is particularly
useful for workflows where one needs to execute a complicated
analysis (e.g. a large R script) in a trivially parallel fashion over
a large data set. Within insurance, possible applications for such
high-throughput calculations include
time-series analyses that require extensive parameter sweeps, or
VaR calculations for a portfolio of a large number of varied financial instruments (Kim et al., 2009).
Much of the emphasis in cloud computing has been on the use of
Infrastructure as a Service platforms, such as Amazon's EC2 service,
where the user gets direct access to the console of the virtual
machines (VMs), and on MapReduce frameworks, in particular Hadoop (Yoo
and Sim, 2011). An alternative is to use a Platform as a Service
(PaaS) infrastructure, where access to the VMs is programmatic. Other
PaaS clouds exist, notably the Google App Engine, but these are limited
owing to a conservative approach to allowing libraries on the App Engine.
A PaaS interface can offer certain advantages over the other
approaches. In particular, it is more straightforward to design
interfaces to software packages such as R. In the case of Azure,
another advantage is that Microsoft Research has provided a set of C#
libraries called the Generic Worker which allow easy scaling of VMs.
We have developed software that makes use of these libraries to run R
scripts to analyse a particular data set approximately 1 TB in
total size, though decomposed into a number of much smaller
units. This analysis provides an exemplar for running multiple R jobs in
parallel with each other on the Azure platform and for making use of its
mass storage facilities. We believe that this workflow is a very
common one and is applicable to any number of different areas where R
is employed. We will discuss an early generalisation, which we have dubbed
GWydiR, to run any R script on Azure in this fashion, with the goal
of providing as simple a method as possible for users to scale up
their R jobs.
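The underlying pattern is generic; a minimal local stand-in for it, using base R's parallel package in place of the Azure Generic Worker (all data and function names invented), might look like:

```r
# Trivially parallel workflow: the same analysis function is applied to many
# independent chunks of a larger data set, here on a 2-worker local cluster
# standing in for a pool of cloud VMs.
library(parallel)

set.seed(123)
chunks  <- split(rnorm(10000), rep(1:10, each = 1000))  # stand-in data units
analyse <- function(x) c(mean = mean(x), sd = sd(x))    # the per-chunk "R script"

cl <- makeCluster(2)                       # on Azure: scaled-out worker VMs
results <- parLapply(cl, chunks, analyse)  # one result per chunk
stopCluster(cl)
```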
References
Stein, L. D. (2010, January). The case for cloud computing in genome informatics. Genome biology 11(5), 207.
Kim, H., Chaudhari, S., Parashar, M. and Marty, C. (2009). Online Risk Analytics on the Cloud. 9th IEEE/ACM International Symposium on Cluster Computing and the Grid (CCGRID '09), 484-489. DOI: 10.1109/CCGRID.2009.82
Yoo, D. and Sim, K.-M. (2011). A comparative review of job scheduling for MapReduce. 2011 IEEE International Conference on Cloud Computing and Intelligence Systems (CCIS), 353-358. DOI: 10.1109/CCIS.2011.6045089
The talk will provide R code to show how to automate the presentation of
key charts, tables and reports.
This will be in the context of providing information to general
insurance professionals who are mainly non-actuarial. The typical
audience is underwriters and claims managers. The goal here is to
impart maximum clarity to the information while also making the
production task easy and flexible.
The code used will essentially comprise existing package material: the
intellectual added value provided here lies in the collation of this
material into a useful bundle of value to analytical practitioners.
The talk will also compare and contrast the process with current
alternatives used in the industry and discuss ideas for future
development to assist actuaries in their roles within general insurance.
Our talk will focus on the massive potential for R in the London
Insurance Market, our practical experiences of using it with our
insurance clients and the main obstacles R faces in gaining wider
acceptance and usage in the London Market. All of the talk is grounded
in practical experience of applying R to real-world problems and draws
on the presenter's personal experience.
There are three distinct sections to the talk:
Why R is useful in the London market
Personal experiences of using R in real-world problems
Practical barriers to using R in Insurance
Since the first part of the talk will be well-understood by most
attendees, this will be the briefest, but will offer our perspective
based on the model development and modelling projects we deliver
across a wide range of Lloyd’s and London Market clients.
The second part will discuss different applications of R we have found
useful, how they have been implemented and what value they have added
to the client. This part of the talk will use examples of how R has
been successfully used in pricing, reporting and in producing Lloyd’s
returns.
The third part of the talk is likely to prompt the most discussion;
here we will discuss the barriers R encounters in Insurance and how
these might be overcome. There is little doubt that, while seasoned R
users believe strongly in its abilities, R has not yet reached a high
level of market penetration. We hope that this talk will stimulate
debate within the audience about overcoming these obstacles so that R
can achieve wider recognition throughout the Insurance industry.
Catastrophe (cat) models are used to estimate loss distributions from
natural hazards like tropical cyclones, floods, or earthquakes. They
integrate multiple disciplines such as meteorology, climatology,
hydrology, structural engineering, statistics, software engineering
and actuarial sciences.
The ever increasing complexity of these models, the need for model
transparency, as well as the desire to integrate models with diverse
APIs have led us to develop an open source web-based cat model engine
based on R using Shiny.
By using R, users can easily create custom analytics and integrate
auxiliary data from any data source, while being able to probe
underlying model assumptions, perform sensitivity analyses and
investigate all components of the cat model. We will demo our software
and speak about the various technology components.
Head of Exposure Management and Reinsurance, Lloyd's
In 2005, a group of nerds in Lloyd's (with one honorary member from outside) started a group called R Souls (say it fast and you'll get the joke).
They met every Friday to make the most of the fish and chips and swap stories about R, learning from one another and becoming ever more proficient in the amazingly stable, flexible and exciting tool that is R.
From these humble beginnings R is now embedded in many of Lloyd's core functions from benchmarking and reporting to catastrophe modelling.
My talk will give a short history of this turbulent and emotional journey including some tips on how to work with IT departments, and convince others to move from planet Excel to the 21st century.
Insurance Commissioner Mike Kreidler has scheduled a hearing for 10 a.m. on May 9, 2013, in Olympia to consider whether he should approve or deny the proposed reorganization of Washington-based Washington Dental Service (WDS).
Here's a summary of the proposal, including background, history, and a brief explanation of the hearings process and what we look at. If the proposal is approved, WDS would become a subsidiary under a new holding company system. WDS would later change its corporate name to Delta Dental of Washington.
To view filed documents and information about the hearing process, go to Washington Dental Service #13-0115. (Scroll down a bit after clicking on that link.) Those documents include the notice of hearing, the reorganization plan, board resolutions, organizational charts, and other requests for transactions filed in this proceeding.
The hearing is open to the public. Any interested parties may submit letters of support or concerns or objections and/or may participate in the hearing by appearing in person or by telephone at no charge. For street address or directions on dialing in by phone (as well as more background on the proposal), please see the hearing order.