Deep Learning at Scale

AI Tech Trends

Deep learning is a current hot topic in HPC, and I am sure it will be one of the key themes at SC18. Artificial intelligence, machine learning, and deep learning are often discussed together, though in reality machine learning is a type of AI, and deep learning is a subset of machine learning. In this article, we will define deep learning and its industrial applications and, more specifically, the benefits of scale - from big data to high-performance computing - in implementing deep learning successfully.

Below is an illustration from a USoft blog titled “The significant difference between Artificial Intelligence, Machine Learning and Deep Learning”, published in February of this year, that shows the evolution, complete with a timeline. (Personally, I started programming AI in the late 1980s at a U.S. Department of Energy installation.)

So, what is Deep Learning?

Simply put, what makes deep learning “deep” is layers - how many layers of processing a model passes data through, and learns from, to reach its output. This depth is often measured by the length of the Credit Assignment Path (CAP), the chain of transformations from input to ultimate output. A good description of deep learning - with an excellent illustration below - comes from Favio Vásquez (12/21/2017):

Deep learning is a specific subfield of machine learning, a new take on learning representations from data which puts an emphasis on learning successive “layers” of increasingly meaningful representations. Deep learning allows computational models that are composed of multiple processing layers to learn representations of data with multiple levels of abstraction. These layered representations are learned via models called “neural networks”, structured in literal layers stacked one after the other.
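The idea of “layers stacked one after the other” can be sketched in a few lines of code. The following is a minimal, illustrative sketch (not from the article) of a three-layer feed-forward network in plain NumPy; the layer sizes and random weights are assumptions chosen for demonstration, and the network is untrained:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Simple non-linearity applied between layers
    return np.maximum(0.0, x)

# Three stacked layers: each transforms the previous layer's output
# into a new, successively more abstract representation.
W1, b1 = rng.standard_normal((4, 8)), np.zeros(8)
W2, b2 = rng.standard_normal((8, 8)), np.zeros(8)
W3, b3 = rng.standard_normal((8, 2)), np.zeros(2)

def forward(x):
    h1 = relu(x @ W1 + b1)   # layer 1: raw input -> first representation
    h2 = relu(h1 @ W2 + b2)  # layer 2: refines layer 1's representation
    return h2 @ W3 + b3      # layer 3: final output (e.g., two scores)

x = rng.standard_normal((1, 4))  # one input with 4 features
print(forward(x).shape)          # (1, 2)
```

In a real network, training adjusts the weight matrices so that each layer learns a representation useful to the layers above it; adding more such layers is what makes the model “deep”.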

Applications for Deep Learning

Deep learning applications and their potential benefits/advancements are wide-ranging. Vartul Mittal provided a good list in his “Top 15 Deep Learning applications that will rule the world in 2018 and beyond” (10/3/2017). In it, Mittal lists opportunities - some industrial - including:

  • Transportation: Using self-driving cars as an example, including the “need to teach a computer to take over key parts (or all) of driving…”
  • Healthcare: From diagnostics to monitoring apps and personalised medicine, “AI is completely reshaping life sciences, healthcare, and medicine as an industry.” On brain cancer detection with neural networks specifically, he adds, “spotting invasive brain cancer cells during surgery is difficult, in part because of the effects of lighting in operating rooms. They found that using neural networks in conjunction with Raman spectroscopy during operations allows them to detect the cancerous cells easier and reduce residual cancer post-operation.”
  • Finance: Futures markets are but one example that provides a great test case for neural networks and deep learning. The sheer volume of financial transactions worldwide provides more than enough data to leverage deep learning for business benefit.
  • Advertising: “…has been used by both publishers and advertisers to increase the relevancy of their ads and boost the return on investment of their advertising campaigns,” leading to “data-driven predictive advertising.”

There are, of course, many more applied examples of deep learning benefiting business intelligence, operations, and decision-making.

Why Scale Helps in Deep Learning

The short answer to why scale matters in deep learning: taking complex data and producing accurate output rapidly. HPC plays a special role because of its ability to handle this big, complex data quickly, while deep learning provides analysis of greater sophistication and accuracy than other data analytics methods. Samsung’s Trending Tech article titled “Understanding the Business Potential of Deep Learning Technology” (7/2/2018) states:

"Though neural networks have existed for decades, the adoption of deep neural networks has grown dramatically within only the past few years. That’s in part due to the recent availability of extremely large data sets, thanks to the widespread use of the Internet and crowdsourcing services. If one is working with a relatively small data set, there is little difference in the performance between deep neural networks and traditional algorithms or smaller neural networks. However, as the amount of data increases, the performance of those traditional systems will eventually plateau. With a large data set, the performance capabilities and accuracy of deep neural networks begin to skyrocket."

Extremely large data sets, performance, accuracy skyrocketing: compute resources at scale run deep learning algorithms more effectively. With the explosive growth of available data, deep learning has great potential to be used more widely in industrial solutions.
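To make the role of scale concrete, here is a hypothetical sketch of data parallelism, a standard way HPC systems spread deep learning work across many workers. Each worker computes a gradient on its own shard of the data, and averaging the shard gradients reproduces the gradient over the full data set. The toy linear model and all names here are assumptions for illustration, not anything from the article:

```python
import numpy as np

rng = np.random.default_rng(1)

def gradient(w, X, y):
    # Gradient of mean squared error for a linear model y ≈ X @ w
    return 2.0 * X.T @ (X @ w - y) / len(X)

w = np.zeros(3)
X = rng.standard_normal((1000, 3))
y = X @ np.array([1.0, -2.0, 0.5])

# Serial: one machine computes the gradient over the full data set
g_full = gradient(w, X, y)

# "At scale": 4 workers each handle 250 samples, then average
shards = np.split(np.arange(1000), 4)
g_parallel = np.mean([gradient(w, X[s], y[s]) for s in shards], axis=0)

print(np.allclose(g_full, g_parallel))  # True: same result, 1/4 the data per worker
```

The mathematics is unchanged; the wall-clock time per step shrinks roughly with the number of workers, which is exactly the advantage HPC brings to training on extremely large data sets.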


In HPC, scale and deep learning go hand in hand. The volume of data available now has grown exponentially compared to even a few years ago, as has the sophistication of the analysis that can be performed on that data. Deep learning is just that - deeper, more iterative, and more sophisticated. As such, scale - from handling huge amounts of data to processing (analysing) it - is critical to both accuracy and the time from input to output, and HPC is the ideal solution provider.

Written by Brendan McGinty (Guest)


Brendan McGinty is Director of Industry for the National Center for Supercomputing Applications (NCSA), University of Illinois at Urbana-Champaign.

