Where are all these AI chips coming from?

Blink, and the silicon industry changes. I thought we knew what kinds of microprocessor we needed, and the silicon chip makers were just getting on with making them faster, cheaper and better. But now, I’m seeing reports of a whole bunch of so-called “AI chips”. So what’s happening, and why?

The birth of AI chips is a strange thing, because they are not coming from the usual silicon houses. Instead they are coming from cloud providers like Google and Alibaba, telecoms firm Huawei, and tiny start-ups like GraphCore. IBM is making AI chips, after it had seemed to be scaling down its silicon work. And even Intel, when it got involved, did so by buying a start-up called Nervana and then teaming up with Facebook.

The boom has also arrived just as system developments in the data center and elsewhere seemed to be providing everything AI needed, with conventional hardware evolving to meet further demands.

AI and machine learning have always been a software discipline. The first generations of computers automated calculations, and as they grew more powerful, software running on them began to simulate something closer to the way human brains work: IBM’s Deep Blue chess-playing computer, for instance, and more recently the Watson system.

A new type of processor did come along: the “graphics processing unit” (GPU), which performed image processing fast through direct access to memory. Gaming made these chips widespread and cheap; then it turned out they were also great for AI and machine learning workloads.

Most cloud-based high-performance computing and AI services are heavy users of GPUs, and NVIDIA has ridden this boom to become a giant. But apparently GPUs are not enough. Why do we need these new, even more specialised chips?

The basic answer, according to Andy Patrizio at Ars Technica, is that artificial intelligence, and the pattern recognition that forms a large part of it, doesn’t need the kind of precision mathematics that CPUs (and GPUs) have evolved to perform.

In his own words: “AI for self-driving cars doesn’t use deterministic physics to determine the path of other things in its environment. It’s merely using previous experience to say this other car is here traveling this way, and all other times I observed such a vehicle, it travelled this way.”

AI calculations only need “single precision” calculations, not sums carried to the final decimal point: “So while CPUs and GPUs can both do it very well, they are in fact overkill for the task. A single-precision chip can do the work and do it in a much smaller, lower power footprint.”
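
To make that concrete, here is a minimal sketch of my own (Python with NumPy; the array sizes and values are arbitrary illustrations, not from Patrizio’s piece) comparing the same matrix-vector multiply, the workhorse of neural-network inference, at double, single and half precision:

```python
import numpy as np

# A toy illustration: the same matrix-vector multiply (the core operation
# in neural-network inference) carried out at three different precisions.
rng = np.random.default_rng(0)
weights = rng.standard_normal((256, 256))   # stand-in for a learned weight matrix
activations = rng.standard_normal(256)      # stand-in for an input vector

out64 = weights.astype(np.float64) @ activations.astype(np.float64)  # double precision
out32 = weights.astype(np.float32) @ activations.astype(np.float32)  # single precision
out16 = weights.astype(np.float16) @ activations.astype(np.float16)  # half precision

# Relative difference from the double-precision result, measured over the
# whole output vector. For inference-style workloads the difference is small,
# which is why accelerators favour narrow number formats and spend the saved
# silicon area and power on more parallel arithmetic units instead.
err32 = np.linalg.norm(out32 - out64) / np.linalg.norm(out64)
err16 = np.linalg.norm(out16 - out64) / np.linalg.norm(out64)
print(f"float32 relative error: {err32:.1e}")
print(f"float16 relative error: {err16:.1e}")
```

The trade-off is the whole story in miniature: each step down in precision barely moves the answer for this kind of workload, but it halves the bits stored and moved per number, which is exactly the saving the new chips are designed to exploit.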

That’s why Google has TPUs, Intel and Facebook have Nervana, and all the other AI chips are appearing. But this is not simple.

The processors created by the cloud giants will be deployed within their own data centers, and may not be visible to partners and customers (except perhaps through programmatic interfaces). The other chips will find their way into a range of different places, from specialist processors for intelligent vehicles to tiny AI edge processors for devices like VR headsets.

There will still be a huge mass of x86 chips out there. But a wave of change is coming…


Written by Peter Judge (Guest)


Peter Judge is the Global Editor at Datacenter Dynamics. His main interests are networking, security, mobility and cloud. You can follow Peter at: @judgecorp

