HPC and AI collusion – Rumours from the trade show floor



Recently I’ve garnered much of my blog inspiration from industry events. February has been no exception, and I benefited from a fascinating day earlier in the month at the HPC and Big Data conference in London, just a stone’s throw from the Houses of Parliament. Here are some of my observations from my discussions there...

At the conference (at which our CTO Tate Cantrell spoke - image above) I spotted delegates from Ofcom, the UK communications regulator. In its own words: “Ofcom is the communications regulator in the UK. They regulate the TV, radio and video-on-demand sectors, fixed-line telecoms (phones), mobiles and postal services, plus the airwaves over which wireless devices operate.” None of which is HPC, big data, or the data centers that support them.

By chance, on separate occasions, I chatted with two of the Ofcom delegates and they both had a similar explanation for their attendance: they were familiarising themselves with the HPC industry and researching whether there are facets of it that would benefit from policy or regulation. For sure, there are aspects of data privacy and artificial intelligence (AI) ethics that could benefit from non-commercial oversight, but I suspect HPC and big data technology is better left alone.

By coincidence, this month Elon Musk quit the board of OpenAI, the research group he co-founded in 2015 to investigate the ethics of AI. OpenAI said the decision had been taken to avoid any conflict of interest as Tesla, Elon’s electric car company, becomes more focused on AI. Surprisingly for an AI developer, he has been one of the technology’s most vocal critics, stressing its potential harms. Before founding OpenAI, Elon described AI as “humanity’s biggest existential threat”, and in 2017 he said that the United Nations needed to act to prevent a killer robot arms race. I suspect that using some of the brightest minds in the business is a better approach than repurposing a dusty government regulator – I could be wrong.

As interesting as policy can be, I found another area of government involvement far more stimulating. A combination of pure science research grants from both the UK and USA funded a very bright team from Imperial College London (ICL) to investigate the fuel efficiency of commercial aircraft. Holding a pilot’s license and spending many a fair-weather weekend guiding a sailplane around the skies made this subject even more intriguing.

The project aims to improve jet fuel efficiency by more accurately modeling the rear-most rotor blades of the engine. Yes, yes, I know they have already been modeled. However, since the beginning of modern Computational Fluid Dynamics (CFD) in the 70s and 80s, many assumptions were required to simplify the computation to match the available computers. One such assumption was that all the jet engine blades operate in laminar or near-laminar airflow, which is a stretch, as the rough calculation below suggests.
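For a feel of why laminar flow is a stretch, consider the Reynolds number of the airflow over a blade. The sketch below uses my own ballpark figures, not numbers from the ICL talk:

```python
# Rough Reynolds number for airflow over a jet engine blade.
# All inputs are assumed, ballpark values for illustration only.
rho = 1.2      # air density, kg/m^3 (roughly sea level)
v = 250.0      # flow speed over the blade, m/s (assumed)
chord = 0.08   # blade chord length, m (assumed)
mu = 1.8e-5    # dynamic viscosity of air, kg/(m*s)

reynolds = rho * v * chord / mu  # Re = rho * v * L / mu
print(f"Re ~ {reynolds:,.0f}")   # about 1.3 million
```

Surface flows typically begin transitioning to turbulence somewhere around a Reynolds number of a few hundred thousand, so a value in the millions sits well outside the comfortably laminar range.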

A modern bypass jet engine has a large set of blades at the front pushing volumes of air around the more conventional jet engine inside it. Perhaps this first bypass rotor can be modeled accurately with a laminar flow model, but the remainder of the internal jet engine most likely can’t. The ICL team has evolved their CFD formulation of the Navier-Stokes equations to improve their modeling of turbulent air on today’s HPC clusters, improving the predicted efficiency of the jet engine blades. I can already feel the burden of my international travel carbon footprint easing!
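For reference, the usual starting point for such solvers is the incompressible form of the Navier-Stokes equations shown below (the talk didn’t detail the team’s exact formulation or turbulence closure, so take this as the textbook version):

```latex
% Incompressible Navier--Stokes: momentum balance plus mass conservation.
\frac{\partial \mathbf{u}}{\partial t}
  + (\mathbf{u}\cdot\nabla)\mathbf{u}
  = -\frac{1}{\rho}\nabla p + \nu\,\nabla^{2}\mathbf{u},
\qquad
\nabla\cdot\mathbf{u} = 0
```

The hard part is turbulence: the nonlinear convection term couples scales of motion far smaller than any affordable grid, so every practical solver has to model, rather than fully resolve, some of that behaviour.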

In a footnote to their talk, they described a potentially “leap-frogging” technology combination. With some of the new HPC and GPU resources becoming available to ICL, they want to up their game yet again. The CFD limitations often result from the homogeneous modeling grid needing to be very fine to match reality. Such a fine grid multiplies the necessary compute power and computation time, so the grid spacing is often compromised to match the available compute resources.
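To put numbers on that, here is a back-of-envelope sketch of how refinement costs compound on a uniform 3D grid; the baseline cell count and core-hour figures are assumed purely for illustration:

```python
# Back-of-envelope cost of refining a homogeneous 3D CFD grid.
# Halving the grid spacing gives 2**3 = 8x the cells, and the CFL stability
# condition roughly halves the usable time step, so each refinement level
# costs on the order of 16x the compute. Baseline figures are assumed.
base_cells = 10_000_000   # assumed starting cell count
base_core_hours = 5_000   # assumed cost of the baseline run

for halvings in range(4):
    cells = base_cells * 8 ** halvings
    core_hours = base_core_hours * 16 ** halvings
    print(f"spacing halved {halvings}x: {cells:>15,} cells, "
          f"~{core_hours:>14,} core-hours")
```

Three halvings of the spacing and you are already looking at thousands of times the original compute bill, which is why grid spacing gets compromised in practice.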

Their inspiration is to take a test environment and model it using abundant CFD computational resources until it provides accurate results. They then take the homogeneous grid and evolve it using Machine Learning / Deep Neural Network techniques to find a non-linear, non-homogeneous grid that correlates strongly with the original CFD results. Now they can use the condensed, more efficient grid to rapidly develop the solution with their available computational resources, and finally confirm the results using the original fine grid model. If only I were so innovative, I wouldn’t be sat in seat 36A crossing another ocean every few weeks!
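Here is a minimal 1D toy of that idea, not the ICL method: a steep tanh profile stands in for the fine-grid CFD solution, and a simple Nelder-Mead optimiser stands in for the deep network that learns where to place the coarse, non-homogeneous nodes:

```python
# Toy 1D analogue of the workflow described above (illustrative only):
# learn a coarse, non-homogeneous grid whose interpolation reproduces a
# fine-grid "reference solution".
import numpy as np
from scipy.optimize import minimize

# 1. "Fine CFD" reference: a boundary-layer-like profile on a dense grid.
x_fine = np.linspace(0.0, 1.0, 2001)
u_fine = np.tanh(50.0 * x_fine)  # steep gradient near x = 0

N_COARSE = 12  # interior nodes of the coarse grid

def coarse_grid(params):
    """Map unconstrained parameters to sorted interior nodes in (0, 1)."""
    interior = np.sort(1.0 / (1.0 + np.exp(-params)))  # sigmoid keeps nodes in (0, 1)
    return np.concatenate(([0.0], interior, [1.0]))

def reconstruction_error(params):
    """RMS mismatch between the fine solution and its coarse interpolant."""
    grid = coarse_grid(params)
    u_coarse = np.interp(grid, x_fine, u_fine)   # sample the reference on coarse nodes
    u_recon = np.interp(x_fine, grid, u_coarse)  # interpolate back onto the fine grid
    return np.sqrt(np.mean((u_recon - u_fine) ** 2))

# 2. "Learning" step: optimise the node placement against the reference.
rng = np.random.default_rng(0)
result = minimize(reconstruction_error, rng.normal(size=N_COARSE),
                  method="Nelder-Mead", options={"maxiter": 20000})

# 3. Compare against a homogeneous grid with the same number of nodes.
uniform_interior = np.linspace(0.0, 1.0, N_COARSE + 2)[1:-1]
uniform_params = np.log(uniform_interior / (1.0 - uniform_interior))
print(f"uniform grid RMS error: {reconstruction_error(uniform_params):.5f}")
print(f"learned grid RMS error: {result.fun:.5f}")
print("learned nodes:", np.round(coarse_grid(result.x), 3))  # cluster near x = 0
```

On this toy problem the learned nodes bunch up around the steep gradient, which is exactly the behaviour you would want from a non-homogeneous CFD grid; the real pipeline then re-validates the cheap-grid solution against the original fine model.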

This collusion between traditional HPC applications and Machine Learning (ML) is going to accelerate the impact of AI way beyond its current data mining, natural language and machine vision hotspots. It’s just as well that we have ample industrial computing facilities at our Icelandic data center to perform this HPC/ML collusion without busting your budget. Join our most recent AI client win, Analytic Engineering, and come and see how all AI training roads lead to Iceland!

Let’s chat at GTC Silicon Valley in San Jose in March - bob.fletcher@verneglobal.com


Written by Bob Fletcher


Bob, a veteran of the telecommunications and technology industries, is Verne Global's VP of Strategy. He has a keen interest in HPC and the continuing evolution of AI and deep neural networks (DNN). He's based in Boston, Massachusetts.
