Meet us in Amsterdam to discover how HPC is transforming weather simulation

Earth Sciences HPC

As we at Verne Global prepare to attend the Meteorological Technology Expo in Amsterdam later this month, it occurred to me that as a society we have an extraordinary number of superstitious methods for predicting the weather, from cows lying down meaning that it’s going to rain (though there’s actually no truth in that) to “red sky at night, shepherd’s delight” (which might actually have a little truth in it).

The fact that there are so many of these emphasises, first, how important the weather is to our daily lives, and second, that for a long time superstitions were all we had to go on.

Thanks to vastly improved means of measurement, and to analysis infrastructure in the form of high performance computing (HPC), weather forecasting has improved immeasurably in recent decades. Many of us can remember when the weather forecast was unreliable beyond the next 24 hours – the Great Storm of 1987 being a notorious example – but that hasn’t been the case for years. Experts say that accuracy is increasing by around one day every decade: today’s five-day forecasts are almost as accurate as two-day forecasts were three decades ago.

In the most general terms, accurate forecasts rely on three things: the availability of data, reliable models and simulations of weather trends, and industrial-scale HPC to process it all – which is why we will be showcasing our innovative hpcDIRECT platform to the global meteorological community in Amsterdam.

Getting it right is vitally important. In rural parts of the world, the livelihood of farmers can be shaped by significant weather events. Entire industries, for example the travel industry, can have their fortunes altered in ways that have a noticeable effect on the bottom line. More warning of significant weather changes allows people to make better plans.

On top of that, there are the public health and safety benefits of being able to predict extreme weather events. Even a few extra hours of warning of a hurricane can be a matter of life and death. And finally, as climate change alters long-standing weather patterns, our ability to predict medium-to-long-term changes will become more significant.

More than 11,000 weather stations worldwide measure a range of factors every hour, from temperature to air pressure, and relay this to state-run and private forecasters, where it is combined with data from ships, weather balloons, satellites and aircraft and then run through computer models.

These models require vast amounts of data so that they can quantify uncertainty. The models will inevitably contain errors, but so will the observations, so it is not possible to pin down a single “state zero” from which the forecast starts. Instead, most weather models start from a range of plausible initial states and forecast from each of them. The output is a corresponding range of predictions, and the goal is to make that range as narrow as possible while still containing what actually happens. The modern weather forecast, then, is not the result of one simulation, but of many.
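As a toy illustration of this ensemble approach – using a made-up temperature “model” and invented numbers, not any real forecasting system – here is a minimal sketch in Python. The idea is simply: perturb the observed starting state many times, run the model from each perturbed state, and report the resulting range.

```python
import random

def step(temp, hours):
    # Toy "model": temperature relaxes gently toward a 15 °C mean.
    for _ in range(hours):
        temp += 0.1 * (15.0 - temp)
    return temp

def ensemble_forecast(observed, obs_error, members, hours, seed=0):
    """Run the toy model from many perturbed starting points and
    return the range of outcomes (min, mean, max)."""
    rng = random.Random(seed)
    outcomes = [
        step(observed + rng.gauss(0, obs_error), hours)
        for _ in range(members)
    ]
    mean = sum(outcomes) / len(outcomes)
    return min(outcomes), mean, max(outcomes)

# 50-member ensemble: the observation is 12 °C, known only to ±0.5 °C.
lo, mean, hi = ensemble_forecast(observed=12.0, obs_error=0.5,
                                 members=50, hours=24)
print(f"24h forecast: {lo:.2f}–{hi:.2f} °C (ensemble mean {mean:.2f})")
```

The forecast is reported not as a single number but as the spread of the ensemble – exactly the “range of predictions” described above, just at cartoon scale.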

But accuracy depends to a large extent on the size of the area to which the forecast applies. In the early 1980s the ‘spatial resolution’ of weather forecasts was around 200 square kilometres. Within that area, a lot of different things could be happening: if the forecast predicted rain, large parts of the region might still stay dry. Today, the spatial resolution of many forecasts is around nine square kilometres.

At a recent talk at the Platform for Advanced Scientific Computing Conference (PASC18), held in Basel in July, Dr Nils P Wedi of the European Centre for Medium-Range Weather Forecasts (ECMWF) said that the ECMWF is aiming for a resolution of one square kilometre, but that it will take more than a decade to get there.

The increase in spatial resolution has followed the rise in computing power. At higher resolutions more data is needed, and processing it all in a useful timeframe is a challenge. A forecast at nine-square-kilometre resolution requires around 48 terabytes of data; at 1.25 square kilometres, around 1.8 petabytes will be needed.
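As a back-of-envelope check on those figures – assuming, as a simplification, that data volume grows with the number of grid cells, i.e. with the square of the horizontal resolution ratio – the two numbers can be related directly:

```python
def scale_data_volume(base_tb, base_res_km, target_res_km):
    """Estimate data volume if it grows with the number of grid cells,
    i.e. with the square of the horizontal resolution ratio."""
    return base_tb * (base_res_km / target_res_km) ** 2

# Start from the 48 TB quoted for 9 km and refine to 1.25 km.
estimate_tb = scale_data_volume(base_tb=48, base_res_km=9, target_res_km=1.25)
print(f"~{estimate_tb / 1000:.1f} PB")
```

The square law puts the estimate in the same ballpark as the 1.8 petabytes quoted above; the gap reflects factors this crude scaling ignores, such as changes to vertical levels, timesteps and compression.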

At the moment, Dr Wedi said, ECMWF is archiving around one petabyte of data every week. He referred to a 2018 study from the Federal Institute of Meteorology and Climatology in Zurich that ran a kilometre-scale Earth system simulation using 4,888 GPUs.

More and more weather companies are partnering with specialist HPC providers like Verne Global so that the compute they need is available when they need it. The volume of compute required to achieve accurate forecasting is ever increasing, and HPC has become the engine room of weather forecasting. In 1959, the Met Office’s computer, Meteor, was capable of doing 30,000 calculations a second. The Cray supercomputer the Met Office operates today is capable of over 14,000 trillion arithmetic operations per second – that’s roughly 2 million calculations per second for every man, woman and child on the planet. Embracing provider HPC services enables these crucial organisations to further their efforts and improve their results.
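The per-person figure is simple division; a quick sanity check, assuming a world population of roughly seven billion:

```python
ops_per_second = 14_000 * 10**12   # 14,000 trillion operations per second
world_population = 7 * 10**9       # roughly seven billion people (assumption)

per_person = ops_per_second / world_population
print(f"{per_person:,.0f} calculations per second per person")
```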

This won’t just lead to more accurate forecasts; it will also bring new services as data starts to get connected. Your sat-nav might route you away from a particular bridge in high winds or let you know that a road is flooded, for example, while theme parks might give customers real time information about weather trends so that they can choose which rides they go on and when.

It might even be possible to determine once and for all whether cows really do know when rain is coming.

If you’re in Amsterdam for the Meteorological Technology Expo please do come along to our stand (5023) to hear how hpcDIRECT can support your compute demands. I look forward to meeting you.

Written by Spencer Lamb


Spencer is Verne Global's Director of Research and heads up our high performance computing work with European research and scientific organisations. He is also a member of the European Technology Platform for High Performance Computing (ETP4HPC).

Related blogs

HPC's role in seismological study and advanced earthquake analysis

Researchers have typically taken an empirical approach to earthquake study, but as high performance computing (HPC) becomes more prevalent, traditional methods of seismological study are making way for a new paradigm of earthquake analysis based on high-granularity models.


Welcome to Centro Epson Meteo

It’s been a while since I wrote a blog, so I thought there was no better opportunity to pick up where I left off than to write something about my recent trip to the Meteorological Technology World Expo in Amsterdam. This was also where we announced our latest customer win – Centro Epson Meteo, one of Europe’s most innovative meteorological forecasting organisations, which I am delighted to say chose Verne Global for its high performance computing (HPC) requirements.


How HPC is underpinning amazing advances in land, sea and space observation

In previous Verne Global blogs we’ve explored how HPC is being used throughout industry to make cars both faster and safer, to discover new materials and to advance bioinformatics. HPC has made many equally important contributions to the science of understanding our earth and the solar system around us, and it's an understanding that’s become increasingly important in the age of anthropogenic climate change.

