Deep learning for autonomous cars



Since Sebastian Thrun and his team used machine learning to win the DARPA Grand Challenge in 2005, machine learning and deep learning have been integral to the development of autonomous vehicle technology. Great progress is being made, but complex questions remain. My latest blog looks at these issues.

The first and most popular approach to deep learning technology in autonomous vehicles is the traditional “robotics” or “mediated perception” approach, which means using computer vision, LiDAR and other components that rely on deep learning technology to parse an image into relevant objects (or “features”), such as pedestrians or other cars. The information from these disparate systems is then assembled by the car’s processing unit, where driving decisions are made according to a system of pre-programmed rules. Companies such as Uber and Google’s Waymo have pursued this “traditional” approach, using deep learning in the computer vision and sensor system to perform tasks like locating lanes and pedestrians, recognising and understanding road signs, and monitoring the vehicle’s blind spots.
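As a rough illustration of this two-stage design (the object labels, distances and thresholds below are invented for the sketch, not drawn from any real autonomy stack), the mediated-perception approach amounts to learned perception modules emitting labelled objects, which a hand-written, rule-based planner then reasons over:

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str         # e.g. "pedestrian", "car", "stop_sign"
    distance_m: float  # estimated distance ahead of the vehicle

def perceive(sensor_frame):
    """Stand-in for the deep-learning perception stage: in a real system,
    CNN-based vision and LiDAR processing would produce these detections."""
    return [DetectedObject(**obj) for obj in sensor_frame]

def plan(objects):
    """Rule-based decision stage: pre-programmed rules over parsed objects."""
    for obj in objects:
        if obj.label == "pedestrian" and obj.distance_m < 20:
            return "brake"
        if obj.label == "stop_sign" and obj.distance_m < 30:
            return "brake"
    return "cruise"

frame = [{"label": "pedestrian", "distance_m": 12.0}]
print(plan(perceive(frame)))  # -> brake
```

The appeal of this structure is that each stage can be inspected and tested in isolation; the weakness, as discussed below, is that the rules themselves must be written and maintained by hand.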

Although this method has been widely adopted, it’s not without its limitations. For example, sensor input may be degraded when rain or mist gets on sensor lenses, thereby reducing the quality of the data returning to the system. LiDAR systems in particular may be vulnerable to poor weather, sometimes falsely perceiving rain or snow as approaching objects to be avoided. Black ice and other road conditions further complicate the safe operation of these systems.

Meanwhile, other companies are approaching vehicle autonomy from a purely deep learning perspective. One of these companies, Drive.ai, has been particularly successful with this deep-learning-first approach to vehicle autonomy. Instead of a pre-programmed, rules-based approach, Drive.ai is training a deep learning system to devise its own decision-making capability based on the scenarios that it encounters. By encouraging vehicles to learn on their own, the team at Drive.ai hopes to develop a system that can handle the “edge cases” in autonomous driving — the unexpected experiences that make driving unpredictable, and often stymie pre-programmed intelligence systems. In 2017, Drive.ai raised $50 million (and then another $15 million), added renowned machine learning researcher Andrew Ng to its board, and signed a partnership deal with Lyft to start testing self-driving cars alongside the ride-sharing service. Other companies working on similar technologies include Hungarian firm AImotive (which recently opened an office in Mountain View), and deep learning leader Nvidia, which is bringing its accrued expertise to develop the Nvidia PilotNet solution.
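By contrast, the deep-learning-first approach collapses perception and planning into a single learned mapping from raw sensor input to a control output, with no intermediate object list or hand-written rules. The toy model below is only a sketch of that shape: a single linear layer stands in for a deep convolutional network such as Nvidia’s PilotNet, and the image size, weights and bounded steering output are all invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "end-to-end" model: raw pixels in, steering command out.
# A real system would train a deep convolutional network on logged
# human driving data; here one random linear layer stands in for it.
IMG_PIXELS = 64 * 64
weights = rng.normal(scale=0.01, size=IMG_PIXELS)

def steering_angle(image):
    """Map a flattened camera frame directly to a steering command.
    tanh bounds the output to [-1, 1] (full left to full right)."""
    return float(np.tanh(image.ravel() @ weights))

frame = rng.random((64, 64))       # stand-in camera frame
angle = steering_angle(frame)
assert -1.0 <= angle <= 1.0
```

The point of the sketch is architectural: because the whole mapping is learned, there are no explicit rules to edit — which is exactly what gives rise to the “black box” concern discussed next.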

Proponents of the “deep-learning-centric” model consider the endeavor something like teaching a teenager to drive, though the challenges are probably rather greater than that. More importantly, this deep-learning-first model presents the “black box” problem. With all of the intelligence of these systems occurring deep within a convolutional neural network, understanding and adjusting their behaviour can be a difficult, opaque process for developers. In a situation where human lives are at stake, this opacity and lack of control can be very hard to accept.

Though the majority of companies are following one of these two approaches, still other methods of providing vehicle autonomy exist. An MIT spin-off called ISEE, operating out of the Boston lab space The Engine, is eschewing both of the models mentioned above in favor of developing an intelligent “common sense” for cars, based on deep-learning-enabled observation of human experience and interaction. By developing algorithms to study how humans behave with each other, and in the physical world around them, the company hopes to develop a model that can anticipate and improvise in reaction to the subtler aspects of driving, based on its accumulated knowledge of human social nuance.

It remains to be seen how exactly deep learning will find its optimal implementation in vehicle autonomy. Is it just a tool for vision and perception, or should it provide the cognitive “heart and soul” of our vehicles as well? While the answer to this fundamental debate is still unclear, the benefits of autonomous vehicle technology become more apparent by the day. Not only can they help keep the roads safer, but good autonomous vehicles will also help the disabled regain their independence and self-confidence, and may even make safer, more efficient, cheaper-to-operate ships a reality. As a provider of cost-effective, 100% renewable HPC compute power, including our new hpcDIRECT solution, Verne Global greatly values its role in making the age of vehicle autonomy a safe and satisfying reality by delivering HPC and deep learning capabilities to our clients worldwide.


Written by Nick Dale


Nick is Senior Director at Verne Global and leads our work across HPC and specifically its implementation within discrete and process manufacturing. He is based in our London headquarters.

