Insurers need to revise their view on standard actuarial models. Dr Ben Zehnwirth, managing director of Insureware, shows how
What kind of technologies are currently being developed to meet insurer modelling challenges?
Assessing reserving, risk capital and pricing requires a statistical model, grounded in time series analysis, that links past behaviour with future expectations in a natural way. The optimal statistical models – which we provide – offer a narrative about the data and connect with forecast assumptions about the future.
Standard actuarial methods applied to loss reserving triangles are, to put it bluntly, statistical drivel. Any technology built on link ratios, whether statistical regressions or some deterministic calculation, exposes companies to a high level of model specification risk.
The technology we have developed over the last 15 years for property and casualty (P&C) long-tail liabilities is well suited to solving the modelling issues the insurance industry faces. Clients have used our software to estimate loss reserves, price portfolios and risk capital transfers.
We do not provide methods. Instead, our technology provides a modelling framework in which the statistically optimal model is identified from the data. As part of mitigating model risk we require that assumptions carried by the model are supported by the data. We have also developed technology to test the assumptions made by link ratio-based methods and can easily show they have no relation to the data.
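One way to see what "testing the assumptions of link ratio-based methods" can mean in practice: a link ratio implicitly fits a line through the origin, y = b·x, between cumulative losses at successive development periods. Fitting y = a + b·x instead lets the data say whether that zero-intercept assumption holds. The sketch below is purely illustrative – hypothetical numbers and an ordinary least squares fit, not Insureware's actual technology:

```python
import numpy as np

# Hypothetical cumulative losses for one development-period transition:
# x = cumulative at development year j, y = cumulative at year j+1,
# one pair per accident year (illustrative numbers, not real data).
x = np.array([1000., 1200., 1100., 1300., 1250., 1400.])
y = np.array([1500., 1650., 1700., 1850., 1900., 2050.])

# A link ratio method implicitly assumes y = b*x (a line through the origin).
# Fitting y = a + b*x instead lets the data decide whether an intercept is needed.
X = np.column_stack([np.ones_like(x), x])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
a, b = beta

# Residual variance and the standard error of the intercept
resid = y - X @ beta
n, p = X.shape
s2 = resid @ resid / (n - p)
cov = s2 * np.linalg.inv(X.T @ X)
t_intercept = a / np.sqrt(cov[0, 0])

print(f"intercept a = {a:.1f}, slope b = {b:.3f}, t(intercept) = {t_intercept:.2f}")
```

Comparing the intercept's t-statistic against the usual critical values indicates whether a pure ratio (no intercept) is an adequate description of the transition; a similar check can be run for trends in the residuals.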
Cyber risk seems to be an ever growing theme in the market. How are insurers coping?
With the development of the Internet of Things, where so many devices have not only internet access but also the potential to be controlled remotely, this is understandably an area of grave concern.
In this area, the application of blockchain to prevent unauthorised modification of devices – ensuring the integrity of access through more than just passwords – could provide significant benefit. Insurers face challenges not only in assessing risk exposure, but also in judging whether any technology purporting to fix the issues actually addresses the underlying problem.
Are technologies such as big data, blockchain and AI any closer to making a difference for insurers?
In my opinion, not yet – and perhaps this is for the better (for big data and AI anyway). Artificial intelligence and big data modelling can certainly provide useful insights – but more at the learning/education level, providing feedback into the market, than in enabling insurers to price cohorts better. After all, discriminatory policies can very easily be introduced by unidentified biases in big data or AI-generated models. These tools should always be supplementary, working alongside actuaries and analysts – not replacing them.
Embedding risk culture into the business process is an area that many feel could improve. Are risk managers keeping pace with technology?
Risk managers need to understand the fundamental risks the business is exposed to. We focus on long-tail liability risk – often the largest risk component for any company writing a significant P&C insurance portfolio. In our interactions with clients, we typically find that companies do not assess this risk component appropriately: they rely on assumptions taken from the industry or other sources rather than deriving the calculations from the long-tail liability lines they actually write.
With long-tail lines in particular, the risk assessment needs to be split between inherent volatility, parameter uncertainty, and overall model risk. Models derived from standard actuarial methods cannot do this.
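The first two components of that split can be made concrete with a toy calculation. In any regression-type model, the predictive variance of a forecast separates into process (inherent) variance plus parameter uncertainty; model risk is whatever neither term captures. The sketch below uses a simple log-linear decay fit on hypothetical incremental payments – an illustration of the decomposition, not Insureware's model:

```python
import numpy as np

# Hypothetical incremental payments decaying over development periods
# (illustrative numbers only).
t = np.arange(8, dtype=float)
pay = np.array([900., 760., 610., 500., 420., 340., 280., 230.])
ylog = np.log(pay)

# Fit log(payment) = a + b*t by ordinary least squares.
X = np.column_stack([np.ones_like(t), t])
beta, *_ = np.linalg.lstsq(X, ylog, rcond=None)
resid = ylog - X @ beta
n, p = X.shape
s2 = resid @ resid / (n - p)          # process (inherent) variance, log scale
cov = s2 * np.linalg.inv(X.T @ X)     # parameter covariance matrix

x_next = np.array([1.0, 8.0])         # design row for the next period
param_var = x_next @ cov @ x_next     # parameter-uncertainty component
total_var = s2 + param_var            # predictive variance on the log scale

print(f"process variance:          {s2:.5f}")
print(f"parameter uncertainty:     {param_var:.5f}")
print(f"total predictive variance: {total_var:.5f}")
```

Note that model risk – the possibility that the log-linear form itself is wrong – sits outside this calculation entirely, which is exactly why it needs separate treatment.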
There are many areas of risk exposure. What is the one you think actuaries are most likely to overlook?
Model specification risk. The traditional, ratio-based actuarial methods cannot describe changes in the calendar time direction. As a result, practitioners of these methods assume that a single average calendar trend applies to both past and future projections – without ever actually measuring what the trends are. This exposes the analyses to model specification risk in a way that is very hard to quantify without actually modelling the calendar trends.
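Calendar trends can in fact be measured directly: cells on the same diagonal of a triangle (accident year i plus development period j) share a calendar period, so regressing log incremental payments on both the development period and the calendar period makes the calendar trend an explicit, estimated quantity. The sketch below uses a tiny hypothetical triangle and a deliberately simplified regression (one intercept, one trend per direction) – an illustration of the idea, not a full trend model:

```python
import numpy as np

# Hypothetical incremental loss triangle (rows = accident years,
# cols = development periods); illustrative numbers only.
tri = np.array([
    [100.0, 84.8, 71.9, 60.98],
    [106.0, 89.9, 76.2, np.nan],
    [112.4, 95.3, np.nan, np.nan],
    [119.1, np.nan, np.nan, np.nan],
])

# Regress log(payment) on development period j and calendar period k = i + j.
# The coefficient on k is a direct measurement of the calendar trend that
# ratio-based methods leave implicit and unmeasured.
dev, cal, ylog = [], [], []
for i in range(4):
    for j in range(4):
        if not np.isnan(tri[i, j]):
            dev.append(float(j))       # development period
            cal.append(float(i + j))   # calendar period (diagonal)
            ylog.append(np.log(tri[i, j]))

X = np.column_stack([np.ones(len(ylog)), dev, cal])
beta, *_ = np.linalg.lstsq(X, np.array(ylog), rcond=None)
print(f"development trend: {beta[1]:+.3f} per period (log scale)")
print(f"calendar trend:    {beta[2]:+.3f} per period (log scale)")
```

With these made-up numbers, the fit recovers a decay of roughly exp(-0.22) ≈ 0.80 per development period and calendar inflation of roughly exp(0.058) ≈ 6% per period – trends a pure ratio method would never surface.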
Are actuaries finally moving away from Excel yet?
Sadly not. Even if actuaries only used our flagship software for data management, they would find a vast improvement over Excel. For our upcoming release we have developed tools that can interrogate data warehouses according to well-specified rules – totally eliminating errors in data calculations and related formulas.