Verne Global

Data Center | Tech Trends

23 November 2017

Is the Edge Over Hyped?

Written by Peter Judge (Guest)

Peter Judge is the Global Editor at Datacenter Dynamics. His main interests are networking, security, mobility and cloud. You can follow Peter at: @judgecorp

Edge computing is the hottest thing in digital infrastructure right now. So much is said about it that you might be forgiven for thinking the Edge is about to take over from the Cloud. Needless to say, it’s not as simple as that.

The concept of the Edge came about because applications are emerging that need data and processing to be very close to the devices that use them. Streaming media is an obvious example, but emerging areas such as the Internet of Things (IoT), connected cars, augmented reality (AR) and virtual reality (VR) are expected to add to the demand.

The reasoning is straightforward: in the IoT, many billions of sensors will be connected to the Internet. The data they gather will be used to monitor and control the world’s infrastructure, effectively forming a nervous system that makes our environment more efficient and responsive.

The IoT could generate zettabytes of data, we are told. If it were all transferred back to central cloud services, it would swamp our backhaul networks, and the round-trip delay would make IoT applications unworkable. So these applications will need micro-facilities close to users and devices.

I’ve been sceptical of this: IoT sensors tend to be things like temperature sensors, whose data output is actually quite low. At the same time, it’s been well established that centralised facilities have massive economies of scale compared with small localised facilities.

Surely, the ultra-low latency that the Edge promises can only be delivered at a significant cost premium over central cloud facilities. How many distributed applications are really delivering enough benefit to pay for that? At DCD, we wanted to find out the truth behind this, so we ran an Edge Summit at our San Francisco event earlier this year. The response told us that the Edge is definitely real.

Take connected cars. They’ll generate a huge volume of data - especially once applications start to need video feeds. And whether those applications are driving the car or assisting human drivers, they will have to respond in real time - meaning in less than 5 milliseconds. Augmented reality and virtual reality will make the same demand on latency, because of the human sense of balance. If the images sent back to a VR headset lag more than 5ms behind other sensory input, the wearer will be physically sick.
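To see why a 5ms budget forces compute close to the user, here is a rough back-of-envelope sketch in Python (not from the article). It assumes signals travel through fibre at roughly 200,000 km/s, about two-thirds of the speed of light, and ignores processing, queuing and radio-access delays entirely.

# Rough back-of-envelope: propagation delay alone, against a 5 ms round-trip budget.
# Assumes ~200,000 km/s in fibre (about 2/3 of c); processing, queuing and
# radio-access delays are ignored, so real-world figures would be worse.

FIBRE_SPEED_KM_PER_MS = 200.0  # ~200 km of fibre per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Propagation-only round-trip time to a facility distance_km away."""
    return 2 * distance_km / FIBRE_SPEED_KM_PER_MS

LATENCY_BUDGET_MS = 5.0

for distance_km in (10, 100, 400, 1000, 5000):
    rtt = round_trip_ms(distance_km)
    verdict = "within" if rtt <= LATENCY_BUDGET_MS else "exceeds"
    print(f"{distance_km:>5} km away: {rtt:6.2f} ms round trip ({verdict} the 5 ms budget)")

Even with those generous assumptions, propagation alone uses up the whole budget once the facility is a few hundred kilometres away, before a single packet has been processed - which is exactly why these applications need micro-facilities nearby.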

Network suppliers are gearing up to make their infrastructure meet the demands of the edge. And people building IT equipment are figuring out ways to miniaturise their kit without losing efficiency. The eventual result could change the balance of power between two parts of the industry.

“There will be more capital spend on the edge in the next few years than has been spent in the entire history of telecoms,” said Alan Bock, VP of corporate development at Crown Castle, a firm which manages thousands of US cell towers. “It’s crazy amounts of money.”

So it’s not idle hype. But it’s not a replacement for the cloud either. These are new applications embellishing a network which will always need efficient centralised facilities. If anything, the Edge will fuel greater demands on the Cloud. Clearly, not every byte of data from the IoT will make it back to the remote centers where data is efficiently processed and stored. But archives and reports will be needed - creating a whole new category of cloud data.

