Verne Global

HPC | Manufacturing

14 June 2018

HPC is unlocking the 'data gusher' in Oil and Gas research

Written by Nick Dale

Nick is Senior Director at Verne Global and leads our work across HPC and specifically its implementation within discrete and process manufacturing. He is based in our London headquarters.

An ‘oil gusher’, or ‘blowout’, is the name for the phenomenon you’ve seen in photos and film clips, when a drill strikes oil and it sprays out of the top of the well. It was common in the early 20th century but is now quite rare, thanks to pressure control equipment. In today’s oil and gas industry, however, data is the modern gusher: it sprays out in an uncontrolled fashion, a sign that something good is going on, but it remains hard to bring under control.

Chevron’s expanded Tengiz oil field in Kazakhstan is due to start production in 2022 and, when it does, it will gather data from around one million sensors. This is partly because sensors are getting cheaper and more powerful; it would almost be foolish not to include them. Data on everything from temperature to pressure can be considered alongside video footage and 3D seismic charts, and cheap, ready access to High Performance Computing (HPC) makes analysing it all easier than ever. Since we launched hpcDIRECT, our bare metal HPC platform for industrial applications, one of the sectors that has shown the most interest is oil and gas, and it's easy to see why.

Companies in the oil and gas sector are under increasing cost pressure, which means existing deposits must be worked more efficiently and new ones must be found more cheaply. Better use of data is one way to keep a lid on costs.

The problem is that companies don’t always use the data they have. Al Walker, the chief executive of Anadarko Petroleum, told a 2017 conference: “I have terabytes and terabytes of seismic data, and I might use five per cent of that.” A 2017 World Economic Forum white paper said that a third of oil and gas companies were investing in big data and analytics, but only 13 per cent were using those insights to drive their approach towards the market and competitors.

It’s easy to get overwhelmed, as with the gusher. McKinsey, the consultancy, recommends dealing with this by “thinking big, piloting small, scaling fast”. Each stage is key: you need a project large enough to be worth doing, but your pilot must be small enough to be manageable. Once you’ve proved it works, rapid scaling is the way to make it pay off. One example is ‘virtual drilling’. Royal Dutch Shell says it saved almost $10m on a well drilled into a shale deposit in Argentina by using real-time data analysis to control the speed and pressure of drilling.
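To make the ‘virtual drilling’ idea a little more concrete, here is a minimal Python sketch of the kind of feedback loop the article describes: telemetry arrives each cycle and the drilling speed is nudged to keep downhole pressure near a target band. Every name and number here (read_sensors, TARGET_PRESSURE and so on) is a hypothetical illustration, not a description of Shell’s actual system.

```python
# Minimal sketch of a real-time drilling feedback loop (hypothetical values).
# Sensor readings arrive each cycle; drill speed is adjusted to keep downhole
# pressure near a target band. An illustration, not a real control system.

import random
import time

TARGET_PRESSURE = 5000.0   # psi, hypothetical setpoint
TOLERANCE = 250.0          # acceptable deviation before we react
SPEED_STEP = 5.0           # rpm adjustment per control cycle

def read_sensors():
    """Stand-in for a live telemetry feed from downhole sensors."""
    return {"pressure_psi": random.gauss(TARGET_PRESSURE, 400),
            "drill_rpm": 120.0}

def control_loop(cycles=10):
    drill_rpm = 120.0
    for _ in range(cycles):
        reading = read_sensors()
        error = reading["pressure_psi"] - TARGET_PRESSURE
        if error > TOLERANCE:
            drill_rpm -= SPEED_STEP      # pressure too high: slow down
        elif error < -TOLERANCE:
            drill_rpm += SPEED_STEP      # pressure low: safe to speed up
        print(f"pressure={reading['pressure_psi']:.0f} psi -> rpm={drill_rpm:.0f}")
        time.sleep(0.1)                  # in reality, paced by the telemetry rate

if __name__ == "__main__":
    control_loop()
```

In practice the value comes from scale: thousands of such signals analysed together, with the models tuned per well, rather than a single threshold rule.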

Another application for real-time data is in preventing failure. Rockwell Automation uses sensor data to monitor equipment hundreds of miles away and predict failures before they happen. The company says that the failure of one pump on an offshore rig can cost up to $300,000 per day, so preventing it from failing in the first place is a significant achievement.
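As a rough illustration of how that kind of predictive monitoring can work, the sketch below learns a vibration baseline for a pump from a ‘known healthy’ period and flags readings that drift well above it. The data, column names and thresholds are invented for the example; production systems combine many signals and far more sophisticated models.

```python
# Toy predictive-maintenance check: learn a vibration baseline from a "known
# healthy" period, then flag readings that drift well above it. All data,
# column names and thresholds here are invented purely for illustration.

import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

# Simulated hourly vibration readings (mm/s): 200 healthy hours, then a fault
# that gradually raises vibration over the final 40 hours.
healthy = rng.normal(2.0, 0.1, 200)
failing = rng.normal(2.0, 0.1, 40) + np.linspace(0.0, 1.5, 40)
readings = pd.DataFrame({"vibration_mm_s": np.concatenate([healthy, failing])})

# Baseline statistics from the first week of (assumed healthy) operation.
train = readings["vibration_mm_s"].iloc[:168]
threshold = train.mean() + 4 * train.std()

readings["alert"] = readings["vibration_mm_s"] > threshold

if readings["alert"].any():
    first = int(readings.index[readings["alert"]][0])
    print(f"Vibration first breached the alert threshold at hour {first}")
else:
    print("No anomalies flagged")
```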

It isn’t that long ago that it was simply impossible to process this data with anything like the speed needed for it to be useful. And today’s datasets are even larger: as I wrote last year, we’re talking about petabytes rather than terabytes. However, the availability of HPC and cloud computing is a game-changer. Without it, the oil and gas industry would be neglecting a key tool for the challenges it faces.

First, the complexity of modern business is growing, and the oil and gas sector is no exception. Digitisation, automation and analytics are vital for managing that complexity. Second, the industry is facing an enormous loss of knowledge as the baby boomer generation nears retirement. New recruits cannot replace all the expertise leaving the industry, so machines will have to pick up some of the slack.

Third, the need for impeccable health and safety and environmental protection is paramount. Increased automation can mean fewer staff needing to be in situations where a hazard might arise. Virtual reality simulations of oil rigs, for example, make it possible to train staff without them needing to set foot on a rig until they’re ready. And the kinds of predictive analytics and sensors mentioned above can deliver early warning of the sort of failures that result in injuries and environmental catastrophes.

The role of HPC in oil and gas and geoscience research is something I expect to hear a lot about over the next couple of weeks, following the 80th EAGE Conference and Exhibition, which is currently taking place in Copenhagen. We're talking to a number of clients in this sector, and it's fascinating to see how HPC is helping them do their research faster, smarter and safer.

