Verne Global

Data Center | Tech Trends

4 January 2018

Getting Edgy for No Reason

Written by Doug Mohney (Guest)

Doug is a 20 year veteran of the ICT industry and has been writing about technical issues within telecommunications and data centers since 1995. You can follow Doug at: @DougonIPComm

I've lost count of how many times the data center has gone out of style. When PCs came into popularity around the mid-90s, data centers first became passé as PCs displaced desktop terminals and walks to the IT window to pick up printouts. The 2017-2018 "Data Centers are out" wave is tied to a telco-driven push for edge computing that bucks larger waves of IoT, big data/analytics, and machine learning.

Edge computing translates into putting one or more (presumably) latency-sensitive processes as close to the end-user as possible, with compute power positioned outside of a traditional data center to do what needs to be done for faster response time. Since computation is done locally, there is less travel time for data moving up and results moving back - assuming there's enough local computational power to get the task done as fast as a typical server farm might.
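To make that trade-off concrete, here is a back-of-envelope sketch in Python comparing response times for a distant centralised facility versus a nearby edge node. The latency and processing figures are illustrative assumptions, not measurements, and the function name is mine, not anything from a real deployment.

    # Back-of-envelope comparison of edge vs. centralised response time.
    # All figures below are illustrative assumptions, not measurements.

    def response_time_ms(network_rtt_ms, processing_ms):
        """Total response time: network round trip plus local compute time."""
        return network_rtt_ms + processing_ms

    # Assumed numbers: a big server farm far away vs. a modest rack at the tower.
    central = response_time_ms(network_rtt_ms=40.0, processing_ms=5.0)
    edge = response_time_ms(network_rtt_ms=2.0, processing_ms=15.0)

    print(f"Centralised: {central:.0f} ms, Edge: {edge:.0f} ms")
    # Edge only wins if its smaller hardware doesn't give back the latency saved in transit.

Under these made-up numbers the edge node comes out ahead, but the margin shrinks quickly if the local hardware is much slower than the server farm it replaces.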

Phone companies have fallen in love with the concept of edge computing for a couple of reasons. As they move out legacy equipment from their wireline and wireless networks, there's a lot of power and physical space being freed up in central offices and cell towers. Rather than leaving the real estate vacant, dropping in a high-density rack or two of servers creates opportunities to increase performance and, more importantly, offer new revenue-generating services.

Having spent about a decade figuring out how to get rid of old wireline phone equipment that sucked up power and took up floors of buildings, plus removing lashups of wireless gear from older generations and/or mergers, phone companies such as AT&T are doing an about-face and want to put gear back into the field. It must be a pretty good economic case, because we're talking more truck rolls and more inventory to distribute resources outside of a data center.

The big problem with edge is that scale matters. One of IoT's fundamental principles is taking data from lots of distributed things and aggregating it all in one place across days, weeks, months, and years, building up history. Large amounts of information need to be stored, analysed and archived. You can't stick the resources of a proper data center into every cell phone tower, nor would you want to for power and maintenance considerations alone.

Big data analysis goes hand-in-hand with machine learning, AI, or whatever the marketing people are calling it this week. Machine learning tools are specialised and expensive. They are not designed to be distributed resources, because their value comes from aggregation and from handling that complexity in one place. Heavy-lifting tasks and any type of storage will remain the domain of the data center, not jobs for edge computing.

Similarly, cloud services ranging from basic voice through on-demand office tools will remain in the data center. Response times are good enough today that people pay for such services without complaint and without needing them to be an extra couple of milliseconds faster. It will be a hard sell for most cloud service providers to move out of centralised facilities.

History hasn't been kind to edge computing concepts. AOL floated the idea of distributed data centers as being a "solution" to its scale problems a few years ago. The company never got past the prototype stage. While new ideas continue to pop up to disrupt the idea of centralised resources, faster bandwidth and the need to scale continue to keep data centers fresh and relevant.


