You might think that the issue of air purity in data centers is done and dusted, so to speak. We all know how to handle electronics, and the dust and pollutants in air are well understood. However, it turns out that air contamination is a live issue, and attempts to clean up other parts of the data center environment could have unintended consequences for the atmosphere inside the facility.
Since Sebastian Thrun and his team used machine learning to win the DARPA Grand Challenge in 2005, machine learning and deep learning have been an integral part of developing autonomous vehicle technology. Great progress is being made, but complex questions remain. My latest blog looks at these issues.
For years, data centers have been haunted by the threat of power outages and the associated costs of such events. This situation is getting worse: the most recent numbers, from a 2016 report by the Ponemon Institute, indicate that the average cost of a data center outage rose from $505,500 in 2010 to over $740,000 in 2015, while the maximum cost increased from $1.0 million to $2.4 million. How can next-generation energy storage solutions help?
For some time now, I’ve been trying to talk more about “digital infrastructure” than “data centers”. That’s because the connections that link data centers, their users and other resources such as power are just as important as the servers and infrastructure inside the buildings. When it comes to the “Edge”, new and exciting opportunities could open up for telecommunications providers...
In previous Verne Global blogs we’ve explored how HPC is being used throughout industry to make cars both faster and safer, to discover new materials and to advance bioinformatics. HPC has made many equally important contributions to the science of understanding our Earth and the solar system around us, an understanding that’s become increasingly important in the age of anthropogenic climate change.
The direction of travel for the industry should be away from tightly controlled cooling towards a more relaxed approach.
In July last year, we wrote about Iceland’s sizeable renewable resources, and the philosophy of responsible entrepreneurialism, specifically as it applies to an integrated geothermal industrial park and the energy-intensive industries that utilise such an abundant and sustainable power profile.
Late in 2017 I told our senior leadership team that I wanted to make our partners part of our customer success programme. As with everything “Verne Global”, the support was unwavering and I was asked to outline what that might look like for 2018. Having just drawn up my goals for this year, it’s far easier to describe exactly how that’s going to happen...
Recently I’ve garnered much of my blog inspiration from industry events. February has been no exception and I benefited from a fascinating day earlier in the month at the HPC and Big Data conference in London, just a stone’s throw from the Houses of Parliament. Here are some of my observations from my discussions there...
The costs of renewable energy continue to fall at a remarkable rate in markets across the globe. This opens up new buying opportunities for energy-intensive companies that want to meet sustainability goals or – in an increasing number of instances – simply lock in cost-effective power prices through long-term contracts.
Data center providers will have welcomed the recent announcement that the NHS has approved the storage of patient data outside the UK. This could remove a barrier to the development of international colocation and cloud services for health and research data, and free organisations from the requirement to store patient data in their own country.
In my previous blogs I've highlighted how high performance computing (HPC) has become a powerful tool aiding automobile design. HPC has been particularly important in the realm of crash test simulation, and this blog focuses on the rapid improvements being made in this field.
Researchers have typically taken an empirical approach to earthquake study, but as high performance computing (HPC) becomes more prevalent, traditional methods of seismological study are making way for a new paradigm of earthquake analysis based on high-granularity models.