Verne Global

Finance | HPC

10 January 2019

With HPC no longer at a premium, insurance companies should pay in

Written by Shane Richmond (Guest)

Shane Richmond is a freelance technology writer and former Technology Editor of The Daily Telegraph. You can follow him at @shanerichmond

With growth in the high performance computing (HPC) market expected to continue at pace, one of the industries that could certainly benefit from improved performance and lower costs is insurance. Analyst firm Market Research Engine was quick out of the gates with a new report while 2019 was just a few days old. On January 8 it released a forecast for the HPC market, predicting that the market will exceed $45 billion by 2023.

That's not really a surprise to anyone watching the sector. In November last year, Research and Markets forecast a market size of $49.3 billion by 2023, while in March 2018 Mordor Intelligence predicted $43.97 billion by 2023. The numbers are pretty consistent, and the reports show a sector that is growing overall, with some segments likely to expand especially quickly.

One of those areas is insurance. Ironically for an industry that has been gathering data and using it to make predictions for decades, the insurance sector has lagged behind in digital innovation. Is it now time for the industry to embrace HPC? And what might it gain from doing so?

A 2017 survey by Willis Towers Watson found that 74 per cent of global insurers felt that their industry trailed its peers in the financial services sector. The company said insurers planned to focus on "developing data and analytic tools [...] with many carriers claiming to have already made substantial progress in this area".

According to US actuarial and consulting firm Milliman, in the mid-1980s insurers were modelling risk and return using seven scenarios. That gradually increased to 50 and was around 1,000 a decade ago. That might not sound like much, but as Milliman says, that still involves millions of data points: "In fact, a 1,000-scenario model with reserves and capital based on 1,000 paths at each valuation point for a 30-year monthly projection requires the cash flows for each policy to be projected 360 million times."
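To see where that 360 million figure comes from, the arithmetic is simply outer scenarios multiplied by nested paths multiplied by monthly valuation points. A minimal Python sketch of the calculation (the variable names are ours, purely for illustration):

scenarios = 1_000            # outer scenarios in the stochastic model
nested_paths = 1_000         # paths re-run for reserves and capital at each valuation point
valuation_points = 30 * 12   # monthly valuation points over a 30-year projection

projections_per_policy = scenarios * nested_paths * valuation_points
print(f"{projections_per_policy:,} cash-flow projections per policy")  # 360,000,000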

When the Milliman piece was written, insurers were being urged to embrace HPC, but it was still expensive. Since then, the price has come down sufficiently that the insurance industry can now do far more complex analyses. For example, a California startup called KatRisk recently used the Titan supercomputer at the Oak Ridge National Laboratory in Tennessee to model flood risk. The firm's aim was to evaluate flood risk on a city block scale - a far more precise assessment than had previously been attempted.

"The KatRisk team focused on combining hydrology models, describing how much water will flow, with computationally intensive hydraulic models that calculate water depth," said Inside HPC. "In this way, the company could predict not only the probability of a flood in a given area but also how bad it might be—an important analysis for the insurance industry."

As climate change progresses, the ability to model complex systems such as extreme weather patterns and the behaviour of flood water is going to be increasingly important for governments and planners of all kinds, but the applications for the insurance sector are particularly clear.

Risk assessments like this will be aided by the proliferation of sensors. We're already seeing more data being gathered on individual health, movements of traffic and people, as well as things like energy use and crop health, thanks to the wide deployment of sensors. By 2025 we're expected to be well into the era of the "trillion sensor" economy, in which even more information will be collected, from more than 100 billion Internet of Things devices, each containing a dozen or more sensors.

The insurance industry as a whole is about to see an explosion in useful data. With sophisticated analysis, it can be used to create new products. For example, increasingly accurate information about weather and crops has helped to grow micro-insurance for farmers in the developing world, opening up products that simply weren't available before.

It isn't just price and capacity that make HPC a better option for insurance risk analysis. Speed is important as well. Companies need to be able to make quick decisions if they are to manage risk effectively, and that only becomes more important as insurance products become more complex.

For the analysis to be useful, it is not enough to interpret the results well; the model also has to be set up well from the outset. Furthermore, it's important to use the right data, and the vast amount of data available makes this problem more challenging. Historical data isn't always a good representation of the future, as habits and behaviour can change. On the other hand, some new kinds of data come with a learning curve, because there are no historical patterns from which to judge their value.

As the industry progresses, more of this analysis will be passed off to artificial intelligence, which will be able to continually analyse things like health data to draw out significant factors that were previously unknown.

Aside from getting the data analysis right, insurers must be aware of the risks that come with handling personal data. Health data, for example, can be very sensitive, and recent legislation, such as GDPR in Europe, has set down strict rules for how this data should be handled. Companies need to ensure they are only using data for purposes to which the customer has consented.

The benefits clearly outweigh the risks, however, and it seems certain that HPC, big data and analytics will be a crucial element in the digital transformation of the insurance business.
