In an era where generative AI and deep learning models are swiftly ascending to prominence, we are on the cusp of a transformative upheaval in data center demand. The growing interest in advanced AI technologies is not just a trend, but a harbinger of a new age, necessitating a profound re-evaluation of our data center infrastructure, its energy consumption, and its environmental footprint.
The digital landscape is evolving at an unprecedented pace, with generative AI at its heart. Unlike traditional AI, generative AI pushes the boundaries of creativity and innovation, generating new, original content from images to videos to conversational responses. This leap in capability requires a substantial computational backbone, involving models with billions of parameters that necessitate rapid data flow for training and execution. Such complexity not only amplifies the demand for advanced data center infrastructure, but also significantly increases the energy footprint of these operations.
The surge in AI applications, coupled with a critical climate crisis, underscores the urgency for data centers to accommodate AI’s hefty compute requirements. Forecasts by industry experts, such as Savills and the Dell’Oro Group, project a need for approximately 3,000 new data center facilities in Europe by 2025 and anticipate global data center capital expenditure to exceed US$500 billion by 2027 – primarily driven by AI infrastructure demands.
In response, data center operators are spearheading innovations to optimize infrastructure efficiency. New technologies such as liquid cooling are emerging as drivers of transformation, potentially reducing a data center’s energy consumption by addressing its demanding cooling needs. Additionally, structural adaptations to accommodate heavy AI computing cabinets and strategic server rack placements to enhance bandwidth capacity between servers exemplify the ongoing efforts to meet AI’s formidable demands.
However, infrastructure optimization is only part of the equation. The transition towards renewable energy sources is critical in mitigating the environmental impact of data centers. With most applications not being latency-sensitive, there exists a significant opportunity to locate data centers in regions abundant in renewable energy. Iceland, with its 100% renewable energy supply and cool climate, exemplifies the ideal location, enabling data centers to leverage free-air cooling and further reduce their environmental footprint.
Looking forward, the trajectory of generative AI and related technologies points to an exponential increase in demand for data center services. This growth is not without challenges, as metro markets across Europe grapple with outdated power infrastructures and energy crises, further complicated by global conflicts. The advancing complexity of AI systems will inevitably heighten the energy demands of data centers, emphasizing the importance of prioritizing facilities equipped to handle these computational loads while harnessing sustainable energy sources.
At Verne Global, we are meeting this challenge head-on. Our data centers are meticulously designed to optimize the efficiency of AI and machine learning applications, reducing energy costs and minimizing environmental impact. By balancing power, cooling requirements, security, and high-speed access, we ensure our facilities are not only capable of hosting the most demanding generative AI datasets but also advancing artificial intelligence in a manner that is sustainable for our planet.
As we navigate this exciting juncture in technological advancement and sustainability, Verne Global is at the forefront of this revolution. We are committed to meeting the escalating demands of AI with innovative, efficient, and environmentally responsible solutions. Our vision is clear: to power the progression of artificial intelligence and its boundless possibilities, while steadfastly championing the cause of environmental stewardship. It is a truly exciting moment for the data center industry as we usher in a new era of innovation, efficiency, and sustainability in the digital domain.
In the era of generative AI, escalating energy demands of data centers have become a pressing concern. Training AI models is highly energy intensive. According to MIT’s Lincoln Laboratory, the GPUs that trained ChatGPT’s precursor, GPT-3, are estimated to have consumed 1,300 megawatt-hours of electricity – nearly the amount of electricity consumed by 1,450 average U.S. households per month. Yet, amidst this escalating demand, a significant opportunity remains untapped—the recovery and repurposing of waste heat generated by data centers.
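The comparison above is easy to sanity-check. A minimal calculation, assuming an average U.S. household draws roughly 900 kWh of electricity per month (an assumption close to published EIA averages, not a figure from this article):

```python
# Sanity-check of the GPT-3 training-energy comparison.
# Assumption (not from the article): ~900 kWh per U.S. household per month.
TRAINING_ENERGY_MWH = 1_300
HOUSEHOLD_KWH_PER_MONTH = 900

households = TRAINING_ENERGY_MWH * 1_000 / HOUSEHOLD_KWH_PER_MONTH
print(f"Equivalent to ~{households:.0f} household-months of electricity")
```

The result lands at roughly 1,440 household-months, consistent with the ~1,450 figure quoted above.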
Nearly 97% of the electrical energy consumed by data centers could be harnessed in the form of heat. This surplus heat, when redirected towards heating nearby buildings, providing hot water, or supporting industrial processes, can drastically enhance energy efficiency and mitigate carbon emissions. However, this potential often remains unexploited, with waste heat either released into the atmosphere or inefficiently managed.
Heat recovery technologies
Repurposing data center waste heat involves intricate processes and advanced technologies. The primary heat sources within data centers include servers, storage devices, networking equipment, and the operational systems themselves.
Once heat is captured, it is transferred to a heat exchange system, using heat pumps or other heating systems to raise the temperature of the waste heat to the desired level. For instance, the heat may need to be increased to 70 degrees Celsius (°C) or more to make it suitable for district heating.
The recovered heat then finds its way through dedicated distribution networks. These networks, comprising pipes or other infrastructure, facilitate transporting the heat to end-users, whether for district heating, agricultural purposes, or industrial needs.
How we do it in Finland
At our data center campus, The Air, situated in Helsinki, we have been able to put a heat recovery system into action. Utilizing air cooling methods, we extract excess heat from the hot return air and transfer it to water. The water is then heated from around 30°C to a temperature suitable for the district heating company, typically exceeding 90°C. For every 1MW of IT power allocated to servers, The Air generates 1.3MW of heat. Remarkably, the system incurs minimal energy losses, just a few percent, showcasing its efficiency.
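Those figures imply a simple energy balance. A rough sketch, under the assumption that essentially all IT power is recovered as waste heat and the extra output is supplied by heat-pump compressor work (an interpretation for illustration, not a stated specification of The Air's system):

```python
# Simplified energy balance for the quoted 1 MW -> 1.3 MW heat recovery.
# Assumption: server waste heat ~= IT power; the remainder is compressor work.
it_power_mw = 1.0                                # recovered server waste heat
heat_delivered_mw = 1.3                          # heat exported to the network
pump_work_mw = heat_delivered_mw - it_power_mw   # electrical input, ~0.3 MW

# Coefficient of performance: heat delivered per unit of electricity used.
cop = heat_delivered_mw / pump_work_mw
print(f"Implied heat-pump COP: {cop:.1f}")
```

A COP above 4 would be plausible for this kind of temperature lift, which is consistent with the "minimal energy losses" described above.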
The presence of an existing district heating network greatly facilitates the integration of The Air’s surplus heat. The pre-existing pipeline is conveniently situated near the data center, simplifying the connection process; in other cases, a customized strategy is necessary to incorporate excess heat into the heating network. The seamless integration into Helsinki’s district heating network is pivotal to this success as we are not reliant on a single business or application – the district heating network is an integral part of a city-wide energy solution with consistent demand.
Challenges in implementation
While the benefits of heat recovery are evident, challenges persist. Economic viability stands as a significant concern, especially in regions lacking subsidies or supportive policies. Infrastructure costs, initial investments, and operational expenses often pose barriers to implementation.
Moreover, geographical location matters, impacting proximity to end-users and necessitating costly infrastructure for heat transportation. Fluctuating electricity prices pose additional hurdles, demanding innovative solutions to optimize resource allocation.
Policy as a catalyst for change
Public policies, through incentives and mandates, wield considerable influence in promoting sustainable practices. Countries like Finland and France have implemented subsidies and tax incentives for heat recovery. However, mandates without supporting infrastructure can prove counterproductive, emphasizing the importance of a cohesive approach. The European Union’s revised Energy Efficiency Directive marks a significant step toward data center sustainability, aiming for decarbonized district heating by 2050. This directive sets the stage for greater industry-wide sustainability accountability.
Just as important are voluntary initiatives where companies take accountability into their own hands. For some time now, Verne Global has been voluntarily tracking its sustainability metrics, which include its scope 1, 2 and 3 greenhouse gas emissions, as well as setting tough targets for Power Usage Effectiveness (PUE), Water Usage Effectiveness (WUE) and Carbon Usage Effectiveness (CUE). The push and pull of public policy and voluntary accountability will enable greater adoption of more sustainable practices industry-wide.
The Road Ahead
Harnessing waste heat from data centers presents a promising pathway towards sustainable energy solutions. While challenges persist, innovative technologies and supportive policies offer a roadmap for maximizing the potential of waste heat recovery, paving the way for a more environmentally conscious data infrastructure.
As the global dialogue on sustainability gains momentum, collaborations between industry, policymakers, and innovators will be pivotal in shaping a more sustainable future for data centers worldwide. The journey towards sustainability demands concerted efforts and collective action, and leveraging waste heat represents a crucial stride towards that goal.
The world is at a critical juncture in its battle against climate change. As the consequences of carbon emissions become increasingly apparent, the urgency to transition toward sustainable energy sources has never been more pronounced.
The green energy transition is boosting investments in renewables worldwide. For instance, a Norwegian company is investing in green steel production, based on green hydrogen, in Finland, a decision driven in part by negative perceptions of wind power among Norwegians. This example sheds light on the intricate relationship between public opinion, geographical factors and the urgency to adopt renewable energy sources.
To navigate toward a greener and more sustainable future, let’s explore six emerging green energy sources, their advantages, disadvantages, carbon footprint and why they matter right now.
1. Wind Energy:
Wind energy is a remarkable source of renewable power. In Finland alone, wind power capacity increased by 75% and attracted over 2.9 billion in investment in 2022. Furthermore, Finland’s wide open spaces enabled wind power to generate over 14% of the country’s electricity consumption in 2022, with no end to growth in sight.
On a life-cycle basis, onshore wind emits 11 gCO2 equivalent per kWh of electricity produced, while offshore wind emits 12g. The footprint includes emissions from the construction, operation and decommissioning, where construction emissions are highly dependent on chosen materials.
Pros: Lowest carbon footprint among all energy types. Reduces the need for energy imports, promoting energy independence and bolstering the security of energy supply. An abundant resource, with technology advances steadily improving efficiency.
Cons: Noise and visual impact on the natural landscape. Turbines require large, open areas, often in remote locations, making large component deliveries challenging. Also, when there is no wind, no power is produced, so reliance on a single source may not be sufficient. Additionally, wind turbines should be analyzed for their impact on bird and bat populations, and locations should be chosen carefully to avoid disturbing wildlife.
2. Hydroelectric Energy:
Hydroelectric energy, also known as hydropower, generates electricity by using turbines and generators to harness the energy of flowing or falling water. Hydropower is a low-carbon source, emitting 24 gCO2 per kWh on a life-cycle basis. The emissions stem mostly from the construction of reservoirs, hydroelectric dams, tunnels, and other necessary infrastructure.
Pros: Hydro energy is usually stable and cost-effective, can be adjusted to meet demand, and has low indirect emissions.
Cons: High upfront capital costs and resource-intensive construction are required. Dams can disturb fish and other wildlife. What’s more, the reservoirs created by dams can flood large areas of land previously used for other purposes. In addition, efficient hydroelectric energy utilization can be challenging in warmer climates prone to droughts. Nonetheless, hydropower holds the potential to contribute to the development of remote communities, particularly in areas with abundant water resources.
3. Solar Energy:
Solar energy investment is surging and is expected to overtake investment in oil production for the first time in 2023. Solar energy is not dependent on other energy sources because sunlight is converted directly into electricity using solar cells or photovoltaic (PV) panels.
The upfront cost of a commercial solar system is between $1.54 and $1.56 per watt, with a typical system in the U.S. costing approximately $325,000. Other costs to consider include the storage space required to host the energy system and the solar panels, as well as the land needed for the storage buildings.
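The quoted per-watt price and total system cost together imply a system size. A back-of-envelope check using only the figures above:

```python
# Implied size of a $325,000 commercial solar system at $1.54-$1.56/W.
cost_usd = 325_000
price_low, price_high = 1.54, 1.56   # quoted dollars per watt

size_high_kw = cost_usd / price_low / 1_000   # cheaper $/W -> larger array
size_low_kw = cost_usd / price_high / 1_000
print(f"Implied system size: {size_low_kw:.0f}-{size_high_kw:.0f} kW")
```

So the quoted price corresponds to roughly a 210 kW array.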
Solar energy emits between 38 and 48 gCO2 per kWh on a life-cycle basis. The carbon footprint varies depending on the specific study, but the median value for rooftop solar is 41 gCO2 equivalent per kilowatt-hour of electricity produced.
Pros: Readily available source, with costs historically going down.
Cons: Contingent on sunny weather and has seasonal variations in solar radiation, particularly in northern countries. Large installations require lots of land. Dependent on rare minerals. Carbon footprint significantly larger than wind or hydropower.
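The life-cycle figures quoted so far for wind, hydro, and solar can be lined up directly. A small comparison using only the numbers given above (gCO2-equivalent per kWh):

```python
# Life-cycle emission figures quoted in the text, in gCO2-eq per kWh.
EMISSIONS = {
    "onshore wind": 11,
    "offshore wind": 12,
    "hydropower": 24,
    "rooftop solar (median)": 41,
}

baseline = EMISSIONS["onshore wind"]
for source, grams in sorted(EMISSIONS.items(), key=lambda kv: kv[1]):
    print(f"{source:24s} {grams:3d} g/kWh  ({grams / baseline:.1f}x onshore wind)")
```

On these figures, rooftop solar's footprint is nearly four times that of onshore wind, which is the comparison the solar "Cons" entry above alludes to.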
4. Green Hydrogen:
Green hydrogen is an emerging green energy source that has gained attention because its only by-product when used is water vapor. It is a form of hydrogen gas produced through a process called electrolysis, which uses renewable energy sources to power the reaction. It’s labeled “green” because no carbon dioxide is emitted during production, and hydrogen itself emits no CO2 when combusted. To put things into perspective, green hydrogen-powered vehicles achieve double the fuel economy of gasoline-powered vehicles. Due to its many advantages, Finland is expected to produce 10% of the European Union’s green hydrogen in 2030. The U.S. is also investing, with a $52.5 million clean hydrogen initiative already underway.
Pros: Enables the exploitation of intermittent green energy sources.
Cons: Production comes with high upfront costs, requiring storage infrastructure and costly raw materials. Green hydrogen is highly flammable, creating additional safety-related costs. The ecosystem is still evolving.
5. Fusion Energy:
Fusion energy is a promising clean energy source offering zero carbon emissions and energy security. It is based on nuclear fusion, in which light atomic nuclei combine to form a heavier nucleus, releasing an enormous amount of energy: the same reaction that powers the sun and other stars.
While not strictly a renewable energy source, fusion is widely considered a low-carbon energy option. However, there are challenges to overcome, including the technical complexity of fusion reactors, which require extremely high temperatures and pressures to sustain the process.
Pros: It offers an abundant fuel supply without greenhouse gas emissions, reducing reliance on fossil fuel imports. No long-lived radioactive waste.
Cons: Technical complexity and cost.
6. Geothermal Energy:
Geothermal heat is rapidly emerging as a groundbreaking green energy source for data centers, presenting an environmentally sustainable alternative to traditional power solutions. Iceland, in particular, has become a frontrunner in harnessing geothermal heat for data center operations. Located in an active geothermal zone, Iceland boasts an abundance of geothermal resources, allowing data centers to tap into the Earth’s natural heat. By strategically integrating geothermal energy into their operations, data centers can achieve efficient and renewable power, significantly reducing their reliance on traditional energy grids. This approach not only aligns with global efforts to transition to greener practices but also positions Iceland as a pioneering hub for sustainable data center solutions.
Pros: Geothermal heat is a renewable and sustainable energy source that delivers consistent output with low carbon emissions and reduced reliance on external power grids. These systems also offer extended equipment lifespans and lower long-term operational costs.
Cons: High initial set-up costs, energy transmission losses, and strong location dependence, since facilities must be sited near accessible geothermal reserves.
The Importance of Developing a Green Energy Source Strategy
Green energy sources are crucial in addressing climate change and reducing reliance on fossil fuels. While each source has its own benefits and drawbacks, they collectively offer the potential to significantly reduce greenhouse gas emissions and mitigate the impacts of climate change. It is crucial to consider the comprehensive life-cycle emissions of these sources as organizations define their journey to climate neutral — or even carbon negative.
Accelerating the adoption of green energy sources propels the global energy transition forward across every industry while reducing fossil fuel emissions. As tech leaders, we can be the drivers of real sustainable change in our industry and beyond. Contact us today to learn more about the green energy solutions we leverage.
In the ever-changing landscape of high-intensity compute, Verne Global has always strived to stay one step ahead when it comes to innovation and progress. We evaluate the latest advancements in power, cooling, security and connectivity to make sure our customers can grow, scale and optimise their AI workloads as their data center needs evolve. With this in mind, we are pleased to announce the integration of liquid cooling technology at our data center campus, The Rock.
The surge in AI, machine learning and high-performance computing is amplifying the need for robust data center cooling solutions. At a time when increasing energy efficiency is paramount for most data center operators, the average data center commits 40 percent of its energy consumption to cooling its infrastructure. The network demands of the AI models also require a significantly higher level of computational density, putting further strains on conventional air-cooling technologies. As a result, liquid cooling is quickly gaining traction in the market.
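The 40 percent cooling figure has a direct implication for Power Usage Effectiveness (PUE). A simplified calculation, assuming cooling is measured as a share of total facility energy and ignoring other overheads such as lighting and UPS losses (so the real PUE would be higher still):

```python
# Floor on PUE implied by cooling consuming 40% of total facility energy.
# PUE = total facility energy / IT equipment energy.
cooling_fraction = 0.40
it_fraction_max = 1.0 - cooling_fraction   # IT can draw at most the remainder

pue_floor = 1.0 / it_fraction_max
print(f"PUE floor implied by 40% cooling overhead: {pue_floor:.2f}")
```

A facility spending 40% of its energy on cooling therefore cannot do better than a PUE of about 1.67, which is why more efficient cooling approaches such as liquid cooling matter so much.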
Recognizing this pressing need for cooling innovation, we have collaborated closely with industry leaders Dell Technologies and Intel to introduce direct liquid cooling (DLC) to The Rock campus in Finland. The deployment features DLC on Dell DLC3000 rack solutions with up to 80kW+ cooling capacity and 4th Gen Intel® Xeon® Scalable processors to maximise performance and cooling efficiency. Working with Dell and Intel enables reliable and efficient performance for the fastest growing AI workloads that businesses depend on today. Notably, it marks Dell’s first liquid cooling deployment of this type in Finland.
Verne Global is at the forefront of meeting the unique challenges and demands that AI models place on data centers. Our Nordic data center campuses are designed to sustainably host, support and scale the most demanding AI models in the world. As AI begins to reshape industries, we are committed to deploying technologies that are pivotal to the future of compute. Liquid cooling isn’t just an advancement; it’s an integral component of our vision to exceed the ever-evolving demands of the technology landscape.
We are excited to extend an invitation to customers to explore the potential of liquid cooling technologies for their AI workloads. In addition to the deployment mentioned above, Verne Global Finland will also have DLC cold plate technology samples on hand for customer evaluation. Starting in January at our London data center, samples of direct and immersion liquid cooling will be available for customers to see and evaluate in person.
For more information on evaluating liquid cooling for your compute needs, please contact [email protected].
Designing a data center is a multifaceted challenge. It requires meticulous planning for power provision, cooling systems, robust security, consistent reliability, and lightning-fast network connectivity, all of which must harmoniously integrate. However, the advent of artificial intelligence (AI) has introduced an entirely new layer of complexity to the traditional data center blueprint. The network demands of the AI models require a significantly higher level of computational density, further accentuating the challenges inherent in conventional data center designs.
AI models, like those used in medical and climate research, large language modelling, or financial services, demand relentless computational power. Unlike some traditional data center workloads that experience peaks and troughs, AI model training is continuous. And the latest AI algorithms continue to increase the computational demands placed on AI-enabled data centers. Most conventional data centers are not equipped to deal with the enormous compute required to train modern neural networks.
First and foremost is the need for higher density racks. Whereas the average rack density a few years ago was 5 kilowatts (kW) per rack, the latest generation of AI supercomputers, like the NVIDIA DGX H100, require much more from data center infrastructure. Just four of these systems in one rack could consume more than 40 kW while only occupying 60% of the space of a typical computing rack. While housing more computational power within a smaller space offers substantial cost-efficiency, it does present some unique challenges.
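The density jump described above can be made concrete with the numbers given. A quick sketch (the per-system draw is implied by the figures quoted, not stated directly):

```python
# Rack-density arithmetic from the figures in the text.
legacy_rack_kw = 5       # typical rack density a few years ago
ai_rack_kw = 40          # four DGX H100-class systems in one rack
space_fraction = 0.60    # fraction of a standard rack they occupy

per_system_kw = ai_rack_kw / 4
density_ratio = (ai_rack_kw / space_fraction) / legacy_rack_kw
print(f"~{per_system_kw:.0f} kW per system; "
      f"~{density_ratio:.0f}x the power density of a legacy rack")
```

Normalized for floor space, that is more than an order of magnitude beyond the 5 kW racks that were standard only a few years ago.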
Traditionally cooled data centers have resorted to widely spaced racks to alleviate heat issues. But at-scale machine learning applications require server racks placed close together – as this optimizes the network configuration that is made up of very expensive high-throughput cables between servers, while also minimizing the overall cost of deployment. This rising rack density is putting strains on conventional air cooling technologies. As a result, liquid cooling is gaining traction in AI data centers.
With faster and hotter servers, data centers must also adapt their network architecture and connectivity. Low-latency east-west spine-and-leaf networks are essential to support production traffic and machine learning processes. Additionally, investments in cache-coherent interconnects like CXL and NVLink are necessary to ensure efficient data transfer between CPUs, GPUs, and DPUs without causing delays.
AI-optimized data centers must also be structurally robust. These data centers need to support the transportation and installation of exceptionally heavy AI computing cabinets, some weighing 1.5 tonnes or more when fully configured. This structural integrity is essential for the efficient operation of AI infrastructure.
As AI models deal with large and complex datasets, one of the questions being raised is how companies will look for ways to protect their proprietary data. Instead of training public models, companies will leverage proprietary engines that not only protect access to their data sources, but could help them gain a competitive edge in the marketplace. This takes data center infrastructure to a whole new level requiring even greater connectivity, agility, and scalability.
At Verne Global, we understand the unique challenges and demands that AI models place on data centers. Our data centers are designed to reliably host, support and scale some of the most demanding AI models in the world. As AI continues to reshape industries, we are staying one step ahead to meet these demands, adopting optimized designs that prioritize power, density, cooling, strength, and security. We know what powering progress looks like as we’ve been delivering high intensity compute for more than a decade. Let us help you take AI center stage for your business.
In the rapidly evolving landscape of finance, AI is becoming an essential component of our journey towards a net-zero future. In June, the European Commission introduced a new package aimed at providing user-friendly guidance on transition finance, emphasising support for companies in their sustainability efforts and promoting private investment in eco-friendly projects, many of which leverage AI technologies. Let’s take a look at the potential applications and limitations of AI within finance, and how harnessing its power can yield a positive impact on the planet and global economy.
AI continues to ensure the smooth and efficient execution of financial operations, working behind the scenes to process extensive and diverse datasets with remarkable precision, enabling real-time, informed decision-making. This is exemplified in AI-driven passive investing – a low-maintenance strategy that simplifies the process of building and managing an investment portfolio.
Wealthfront’s AI software implements this concept by continuously monitoring the performance of investment portfolios and automatically rebalancing to match the intended distribution, buying or selling assets where necessary. This style of investing can outperform traditional active investing in efficiency and ROI, while avoiding the hefty fees associated with human management.
How can we make sure that our investments, whether through active or passive management, meet green criteria? AI simplifies this by enhancing transparency in sustainable investments, particularly in carbon trading. In the past, carbon offset initiatives have faced criticism for their lack of effectiveness and issues like double counting. AI-driven Monitoring, Reporting, and Verification (MRV) technology, including satellites, drones, blockchain, and smart sensors, is a cutting-edge solution for this. For example, the Climate Action Data Trust (CAD Trust) automates data collection and analysis while streamlining carbon registries, delivering highly accurate emissions data. This is a game-changer for the financial sector, translating into better risk management, more accurate forecasts, and efficient compliance reporting for carbon credits. Investors can now access clear emissions data or integrate it into AI-powered investment models. This ensures that funds flow into the most environmentally sustainable assets, aligning with the broader goal of achieving net-zero commitments.
The financial sector is the “canary in the coal mine” for applications of AI across the wider economy: a highly scrutinised, information-intensive domain that operates in real-time – in other words, an ideal testing ground for AI. Despite its many fantastic capabilities, applications of AI in finance show us that AI is not a one-size-fits-all solution. Experts have identified several limitations associated with autonomising all aspects of the financial sector, highlighting the need for a balanced approach to enable businesses to thrive.

For instance, whilst AI excels at processing hard, rapidly changing data, like stock prices, it is less effective for analysing softer, longer-term factors like a company’s prospects, management quality, or pricing strategies. These factors are better assessed by human experts and are ultimately the most important elements in shaping a business’ trajectory and how it is received in the market. Similarly, areas such as wealth management and lending have seen slower AI adoption, predominantly because clients continue to favour the personal, emotional approach offered by human advisors as opposed to robo-advisors, despite the recent advancements in generative AI and large language models.

What’s more, in a domain where success hinges on scale and access to technology and data, larger industry players have the advantage, potentially leading to consolidation of AI, rather than democratisation, and ultimately limiting innovation in the field. It is therefore important to recognise the enduring value of human expertise and insight, especially since they are more accessible than technology monopolies.
Ahead of the upcoming COP28 summit in the United Arab Emirates next month, a cutting-edge initiative and competition named TechSprint has been launched. Its purpose is to inspire financial innovators and developers to devise creative technological solutions addressing challenges within sustainable finance today. These solutions will harness the powers of AI, blockchain and IoT and sensor technologies to tackle obstacles hindering the progress of transition finance. It will be interesting to see how participants use AI to their advantage and the strengths and qualities of AI showcased by their submissions, particularly in the context of finding the right balance between AI capabilities and human judgment and imagination. It remains just as important that the technological solutions presented in TechSprint and beyond are powered by clean, green energy, ensuring that the financial sector is progressing towards a more environmentally conscious and economically viable future.
The process of using renewable electricity such as wind or solar power to split water and produce hydrogen in an environmentally responsible way—that’s the definition of the sustainable energy production technology that may be reaching a tipping point: green hydrogen. After decades of being considered a technology of the future, it’s finally becoming a reality, promising to become an important driver of a worldwide green energy transformation. In fact, many believe green hydrogen is emerging as one of the most interesting technologies to help countries and companies achieve net-zero goals.
In Europe and the U.S., green hydrogen is booming. Finland, for example, is becoming an influential leader in green hydrogen production as it works aggressively toward climate neutrality. In fact, Finland is expected to produce 10% of the European Union’s green hydrogen in 2030 while targeting carbon neutrality by 2035.
Green hydrogen initiatives are increasingly hitting the headlines. In the past year, major green hydrogen developments and investments have emerged worldwide:
• In January 2023, a Norwegian company announced a $4.3 billion investment in a steel plant with an integrated hydrogen production facility.
• Construction of Finland’s first green hydrogen plant also began early this year, backed by a $76 million private investment.
• In Spain, a Madrid-based company is developing the first green hydrogen supply chain between southern and northern Europe, anticipated to supply Northwest Europe with 6 million tons of green hydrogen by 2030.
• In the U.S., the government is investing heavily in clean hydrogen initiatives. And the states are getting into the act, too. New York, for instance, has entered into an agreement with nearby states and 40 hydrogen ecosystem partners to develop a regional hydrogen hub to accelerate green hydrogen energy innovation and investments.
But what’s bringing about this global green hydrogen boom? The main driver comes as no surprise—it’s the urgent need to eliminate harmful fossil fuels and carbon dioxide emissions for a more sustainable, environmentally responsible future. And there are several other factors as well, including:
• The price of green hydrogen becoming more competitive.
• The advanced development of the hydrogen ecosystem, with sectors on both the supply and demand side of the equation.
• The improved availability of green energy, thanks to sufficiently developed wind and solar energy plants, resulting in grid frequency and storage markets that can be more easily tapped.
• The enhanced efficiency of production and storage technologies.
How Green Hydrogen Technology Works
Green hydrogen technology is based on electrolysis, which uses an electrical current to split water into hydrogen and oxygen. So, what’s the difference between traditional hydrogen and green hydrogen production? To put it simply, if the electricity used to generate the hydrogen comes from renewable sources, such as wind or solar, green hydrogen is born. Unlike the traditional hydrogen generation process, green hydrogen production leverages green energy sources that don’t release harmful carbon dioxide emissions into the atmosphere.
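To make the energy arithmetic concrete, the sketch below estimates how much hydrogen a given renewable electricity input can yield. The energy-content and efficiency figures are illustrative assumptions, not values from this article:

```python
# Rough estimate of green hydrogen yield from renewable electricity.
# Assumed figures (illustrative):
#   - hydrogen stores ~39.4 kWh per kg (higher heating value)
#   - a real electrolyser is assumed ~70% efficient, i.e. ~56 kWh input per kg

HHV_KWH_PER_KG = 39.4          # energy content of hydrogen (higher heating value)
ELECTROLYSER_EFFICIENCY = 0.70  # assumed, varies by technology

def hydrogen_yield_kg(renewable_kwh: float) -> float:
    """kg of hydrogen produced from a given renewable electricity input."""
    kwh_per_kg = HHV_KWH_PER_KG / ELECTROLYSER_EFFICIENCY  # ~56 kWh per kg
    return renewable_kwh / kwh_per_kg

# A 1 MW wind turbine running at full output for one hour (1,000 kWh):
print(round(hydrogen_yield_kg(1_000), 1))  # ~17.8 kg of green hydrogen
```

Under these assumptions, a single megawatt-hour of wind power yields a little under 18 kg of hydrogen, which is why production efficiency figures so prominently among the challenges discussed below.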
How Green Hydrogen Can Reduce Fossil Fuel Dependence And Carbon Emissions
Years ago, hydrogen was seen only as a solution for the evolution of greener vehicles. As electric vehicles have gained more traction, hydrogen is increasingly being seen as a solution for other industries.
The demand for hydrogen continues to increase as its usage expands across industrial and manufacturing sectors for a variety of purposes, including oil refining, steelmaking and cement production. However, as hydrogen’s popularity grows, the importance of green hydrogen can’t be overstated. Alarmingly, 98% of hydrogen is made from fossil fuels with no carbon dioxide emissions controls or regulations in place. But green hydrogen has the potential to change that for good.
From commercial plant smokestacks to gasoline- and diesel-powered car exhaust fumes, green hydrogen production reduces or eliminates the need for fossil fuel energy sources that release large amounts of carbon dioxide into the air. In the data center industry, as hydrogen storage systems mature, hydrogen can be used in place of diesel-powered backup generators to energize future data centers. As a result, the benefits of green hydrogen abound, allowing governments and organizations to bolster national energy security, conserve fuel, reduce overall emissions and diversify transportation energy options, from cars to expansive public transit systems.
Green hydrogen technology couldn’t have arrived at a better time. The U.S. Energy Information Administration predicts global energy demand will increase by 47% by 2050. The only way to meet that demand without expanding oil and coal production is to adopt greener methods, such as green hydrogen.
And thanks to the technological breakthroughs that have essentially decarbonized the production of hydrogen, many companies are turning to carbon offsets that leverage green hydrogen to reduce their carbon footprint and meet aggressive ESG goals.
Generating green hydrogen comes with clear advantages. The International Energy Agency (IEA) estimates that producing hydrogen with renewables rather than traditional fossil fuel methods could save approximately 830 million tons of carbon dioxide emissions annually. That’s equivalent to an entire year’s worth of emissions from the U.K. and Indonesia combined!
Just like with any new technology, there are some challenges to overcome as the green hydrogen boom takes hold. Some issues to consider include process efficiency and costs of production on a large scale, in addition to establishing long-term pressurized storage solutions. Challenges aside, green hydrogen is an exciting new technology that could help balance the much-needed large-scale production of green energy.
Develop A Strategy That Promotes Green Hydrogen
To ensure a sustainable future, organizations across every industry must do their part to establish a plan to reduce their carbon footprint. As you set out on your journey to climate neutrality, or even carbon negativity, there are many actions you can take to implement greener practices across your organization, such as optimizing your procurement and facilities for emissions and energy efficiency. And most importantly, commit to using 100% green energy, harnessing the natural resources around you like wind, solar and hydro to promote green hydrogen production, all while reducing fossil fuel emissions in a meaningful way.
This article was originally posted on Forbes Technology Council.
Last month, people around the world gathered to celebrate Earth Day, the annual event to demonstrate support for environmental protection. This year’s theme, “Invest in Our Planet”, focused on engaging governments, businesses and individuals to make environmentally-conscious decisions that prioritise the health of our planet and future generations. Investing in our planet means investing in new technologies that optimise our sustainability efforts and ensure efficiency and accuracy in everything we do. New avenues for applications of Artificial Intelligence (AI) and high performance computing (HPC) in green projects continue to open up as the technology grows more intuitive and creative.
The intelligence of AI has developed to the extent that its artificial component is becoming undetectable. This refers specifically to generative AI – AI that processes large quantities of data to create original content, including code, text, images, videos and simulations. Generative AI was popularised last year by the launch of OpenAI’s chatbot ChatGPT, which set the record for the fastest-growing user base of any product. Generative AI has proven the creative capabilities of AI, something we previously underestimated, advancing its role in sustainability efforts from background analytics to being the face of campaigns. In advance of Earth Day 2023, WWF released an exhibition of AI-generated images envisioning a bleak future for UK nature without environmental protection. The exhibition, entitled Future of Nature, features apocalyptic scenes in the style of British Romantic artwork, deliberately subverting the Romantic focus on the beauty of nature. The artwork sends the powerful message that the fate of our world is in our hands, linking to the futuristic theme of Earth Day 2023.
Generative AI also has a role to play in designing the cities of the future. The architecture company Maket utilises AI to enhance building design and urban planning. Maket’s technology can generate 3D renderings of buildings and cities that prioritise energy efficiency, protection of green space and local ecology and reduction of waste and pollution for better quality of life in urban environments. Experts forecast that the future of generative AI will be in specialised domains rather than general-purpose use. Niche applications, such as realizing an artistic concept or designing buildings, can benefit from the ability of generative AI to generate unique and optimised solutions, whereas generalised applications may be less effective in the face of multiple tasks and large quantities of data.
Edge computing is a rapidly growing field that lends itself well to the harnessing of data for hyperlocal climate research. Researchers from Northwestern University and Argonne National Laboratory have been launching edge AI-driven Waggle sensors around the world to create custom climate models that can provide more precise and localised predictions of weather patterns and climate change. Edge computing-based environmental monitoring has various useful applications, such as studying wildfire patterns in forests and observing heat waves in urban areas. It is particularly useful for wildlife conservation efforts because, thanks to reduced latency and bandwidth requirements, conservationists can respond to threats to endangered species, such as poachers, in real time and therefore protect wildlife more efficiently. Conservation AI has deployed 70+ AI-powered cameras worldwide, trained via deep learning to detect and report such threats. By adapting our digital infrastructure, we can preserve our planet’s most precious ecosystems.
The transition to a net zero economy will require significant investments in renewable energy, but its drawbacks, namely variability and unpredictability of energy output, prolong our reliance on fossil fuels. AI is helping us manage these areas of uncertainty. Vestas, a leading wind turbine manufacturer, is utilising generative AI to maximise the power output of wind turbines by adjusting blade angles and other factors in real-time, based on wind speed and direction, a method known as wake steering. The technology can also select optimal locations for wind farms and predict when turbine maintenance is needed, enabling more effective and sustainable energy generation to progress the renewable revolution.
As the intelligence and capability of AI grows, the demand for computing power to support new applications increases dramatically. Without renewable energy backing every step of their development, the powerful tools that support our sustainability efforts are also major contributors to the problem. It is more important than ever to scrutinise the carbon footprint of upcoming technology to ensure that we are not just bringing more shiny new toys into the world – rather, we are prioritising environmental impact and making meaningful investments in our future and the future of our planet.
Hildegard van Zyl is Verne Global’s first General Counsel and joined the company in December 2022. Although she started as a financial services attorney, she soon developed a keen interest in IT services and all aspects of tech law after joining Amazon in 2015. Hildegard has practised law and completed a range of multi-million dollar deals in New York, Australia, Luxembourg and the U.K., in both public and private sectors. She is passionate about sustainability and biodiversity and thrilled to support Verne Global in its commitment to sustainability.
How would you describe yourself?
I’m a natural explorer and it’s a strong theme that has played out in my approach to my life and my career. I’ve lived, worked and studied all around the world, so I bring a broad international perspective to the table.
I love being able to explore novel areas of the law. When I first got into private practice my goal was to get as much exposure to different areas of the law as possible. Given that private practice is built upon specialisations, I realised early on I would need to take risks and be single minded about the type of work I wanted to do if I wanted to grow my skill set and stay interested in the law.
How did that influence your career path?
I knew I was more of a generalist at heart, so I began searching out opportunities to enable that. When I went in-house and joined Amazon they really encouraged me to move through different areas of the business. I would say that they value good judgement and versatility more than expertise – when I was there they typically didn’t hire legal subject matter experts, but rather curious individuals. At Amazon, I got exposure to operations, transportation, e-commerce, media infrastructure & distribution, devices, video streaming and an interconnected web of laws that apply to all of the above. It also sparked my interest in technology and how it shapes and responds to our needs as consumers (like the way we engage with media content) and the values we focus on as a society (like privacy).
I’ve also realised I love being a student of the law. Because of the societal shifts that happen with technology, it’s made tech law so relevant and topical. It has prompted me to do a master’s degree in tech law and innovation at the London School of Economics, which gives me a chance to explore timely issues like cyber law, artificial intelligence and how it should be regulated, digital rights, intellectual property and more. It’s a very exciting time to be involved in this kind of work.
What are you most excited about in coming to Verne Global?
I think it’s an ideal time to join Verne Global as the company is at a very exciting point in its growth trajectory. The company’s explorer mindset and focus on sustainability seems to be a good match with my own. Operationally, they also sit at a different layer of the tech stack than what I’ve primarily been involved with to date, so it’s a great opportunity for me to grow and learn and to draw from my different, but relatable, experience in other fields.
I’m looking forward to being able to help in a variety of areas ranging from contract and corporate law, construction law, procurement law, employment law, the odd dispute, an array of compliance and regulatory considerations, and much, much more (in multiple jurisdictions!).
The highlight of this opportunity is to be a part of a company that is at the forefront of key issues facing the data centre industry. Verne Global’s approach to sustainability and delivering transparent metrics enables its customers to make the best decisions about where their data is stored. Sustainability and transparency are going to be at the forefront of the data centre industry for years to come and so being in a place where I can help set company policy in this space is a privileged position to be in.
Data gravity refers to the concept that with sufficient scale, a given mass of data attracts more data, which in turn pulls applications and workflows with it. The greater the mass of that data, the more gravitational pull it will have. The term ‘data gravity’ was first coined by software engineer Dave McCrory, who described the gravitational pull that large masses of data seem to exert on IT systems, drawing an analogy to physics, which posits that objects with sufficient mass will pull objects with less mass towards them.
To some extent, this has been at the centre of the model by which all digital infrastructure has developed over decades. Historically, data sets were monolithic, often created from one source, stored in one location, and processed in a linear manner. Data and the infrastructure supporting it grew for decades in centralised locations, often dictated by proximity to internet exchange connectivity.
Today, data creation is dynamic, growing at a scale that is hard to contemplate, is omnipresent, and perhaps most importantly is incredibly valuable. The scale and volume of data sets continues to grow exponentially with two-thirds of the world’s data created in the last three years. However, the more data we create, the greater the mass, the greater the gravity, and the harder it becomes to escape the gravitational pull of that data set. This is now creating significant challenges.
The data gravity problem
The historic proximity requirements of the digital infrastructure industry, and the data gravity that has resulted, has created an over-concentration of the digital infrastructure industry in metro locations, such as London, New York, Dublin, Frankfurt, Ashburn and Amsterdam. Some of these locations can no longer support the current, let alone future, growth being drawn to them. As data gravitates to those concentrated areas, the physical infrastructure of land, buildings, water and most importantly power, cannot keep up.
Current estimates suggest that data centers in Dublin will soon be consuming 20% of the city’s power, which is ten times the estimated global average and up from closer to 5% only seven years ago. This enormous growth has largely been driven by hyperscale cloud operators, such as Microsoft, Google and AWS, expanding their data center footprints in a location in which they have been established for decades. Even worse, the cloud operators artificially strengthen data gravity with immensely high data-egress costs from their platforms. This is clearly unsustainable in more ways than one. Not only are the resources available for this type of infrastructure finite, but Ireland’s energy production relies heavily on a predominantly carbon-generating grid, meaning that its data center industry is far from green.
Availability of power and infrastructure is one factor; the economics of the digital infrastructure industry, and of the data center industry in particular, is another that has changed significantly. Due to recent events, power prices in London and Frankfurt have risen dramatically, increasing the cost of operating data centers enormously. This has a knock-on effect on the economic viability of storing and processing data in these historically gravitationally-dense locations. Does it still make sense for a German company to process all of its applications and data in a Frankfurt data center if the price of doing so has risen fivefold in the last twelve months?
Overcoming data inertia
Data gravity creates challenges. As data sets grow and their gravitational pull increases, more concentration and pressure are placed on historically centralised digital infrastructure locations. Not only are those locations running at or beyond capacity, they are also not sustainable. Therein lies the Sustainable Data Gravity Paradox: the greater the data gravity of a location, the more pressure is placed on its finite resources under the current model. Something has to change, and it can.
Some data is latency sensitive, as in high-frequency trading, for example. Some must reside in a specific country for data privacy regulations. Other data must be hyper-connected, as is the case for content distribution. For these types of data, there is likely a strong need for them to be located, stored and processed in a specific location. Cost, efficiency and sustainability may play a less important role in the location of that data as a result.
However, what about data that is less latency sensitive, does not have specific data privacy requirements, and does not need to be hyper-connected? Should that data also reside in resource-constrained, expensive, inefficient and less sustainable locations? One would hope the answer is ‘no’. The fact is, if your data or applications can sit in a cloud environment, they can probably sit anywhere, since you rarely have control over, or even knowledge of, where your data resides in a cloud environment.
Data gravity can be strong, but what if the gravitational pull of lower-cost, more efficient, more sustainable digital infrastructure pulled harder than the gravitational pull of the data itself? Data is inherently mobile. Terrestrial, wireless and subsea networks transport massive, almost unimaginable, volumes of data around the world at ever greater speeds and ever lower costs. It is possible, and it is time, to break the Sustainable Data Gravity Paradox.
At Verne Global, our data center customers can be more efficient and can save 75% of their data center costs by locating their digital infrastructure in our sustainable data center facilities in Iceland and Finland when compared with metro locations like London and Frankfurt. They can also reduce their carbon footprint by 98% by doing so. We help our customers take the journey towards digital infrastructure sustainability by enabling them to divide their data and applications between those that justify metro locations and those that can take advantage of more efficient sustainable locations. It is time to solve the Sustainable Data Gravity Paradox and we have a solution.
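As a rough illustration of what these figures mean in practice, the sketch below applies the quoted 75% cost and 98% carbon reductions to a hypothetical metro baseline. The baseline numbers are invented for the example and are not Verne Global data:

```python
# Illustrative only: applies the savings figures quoted above (75% cost,
# 98% carbon) to an invented metro data center baseline.

def sustainable_site_profile(metro_cost_eur: float, metro_co2_tonnes: float) -> dict:
    """Apply the quoted reductions to a metro data center baseline."""
    return {
        "cost_eur": round(metro_cost_eur * (1 - 0.75), 2),      # 75% cost saving
        "co2_tonnes": round(metro_co2_tonnes * (1 - 0.98), 2),  # 98% carbon saving
    }

# Hypothetical baseline: €1M annual spend, 500 tonnes of CO2 per year in a metro.
profile = sustainable_site_profile(1_000_000, 500)
print(profile)  # {'cost_eur': 250000.0, 'co2_tonnes': 10.0}
```

Even on these invented numbers, the arithmetic shows why splitting workloads between metro and sustainable locations can shift both the cost base and the carbon footprint so dramatically.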