The Deep Dive: How Underwater Data Centers Are Changing the Equation for Sustainable Computing

The rapidly escalating global demand for computing power, fueled by the explosive growth of artificial intelligence, hyperscale cloud services, and digital applications, is placing immense strain on traditional, land-based data centers. These conventional facilities are increasingly criticized for their staggering energy consumption and, critically, for the enormous volumes of freshwater they draw for cooling, particularly in regions facing drought or resource scarcity. To balance efficiency, scalability, and sustainability, the technology sector has been forced to look beyond terrestrial constraints, identifying the ocean, and even space, as the next frontiers for deploying digital infrastructure.

Among these innovative concepts, the underwater data center presents a pragmatic and powerful solution to the cooling challenge. These systems are essentially sealed, modular capsules or cylinders, filled with racks of servers and communication equipment. These submerged structures are connected to the mainland infrastructure via robust power and fiber optic cables, ensuring both energy supply and high-speed data transfer. The primary, game-changing advantage of this deployment method is the surrounding seawater, which serves as a massive, passive, and continuous cooling system.

On land, cooling infrastructure, which includes energy-intensive air conditioning and chillers, can account for anywhere from 20% to over 40% of a data center’s total energy expenditure. By leveraging the naturally cold and stable temperatures of the ocean depths, underwater data centers can dramatically reduce this energy requirement, sometimes achieving savings of up to 90% in cooling-related power consumption compared to their terrestrial counterparts. This simple yet profound shift eliminates the need for complex, failure-prone, and energy-hungry active cooling systems, presenting a pathway toward genuinely efficient digital infrastructure.
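
To see what those percentages mean for a whole facility, the quick sketch below combines the upper-bound figures cited above (a 40% cooling share and a 90% cooling reduction; illustrative values, not measured results from any specific deployment):

```python
# Back-of-the-envelope: total facility energy saved when passive seawater
# cooling displaces active chillers. Both inputs are the upper bounds cited above.
cooling_share = 0.40    # cooling as a fraction of total facility energy (20-40% on land)
cooling_cut = 0.90      # fraction of cooling energy eliminated underwater (up to 90%)

total_saving = cooling_share * cooling_cut
print(f"Total facility energy reduced by ~{total_saving:.0%}")  # prints ~36%
```

Even in the best case, then, the savings cap out at roughly a third of a facility's total draw, a ceiling that becomes important when critics later weigh cooling gains against maintenance logistics.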

The concept was famously explored by Microsoft through its pioneering research venture, Project Natick. The idea originated during a Microsoft ‘ThinkWeek’ event, where employees proposed harnessing renewable ocean energy to provide high-speed cloud services to densely populated coastal areas. The first prototype, named “Leona Philpot” after a Halo character, was a relatively small vessel deployed off the coast of California in August 2015. This initial trial lasted 105 days and demonstrated the basic feasibility of deploying and retrieving a sealed data center module.

Following the successful first phase, Project Natick moved to a more ambitious Phase 2, seeking to prove the operational viability of a full-scale unit in harsher, real-world conditions. In June 2018, a shipping-container-sized vessel, equivalent in power to several thousand high-end consumer PCs, was submerged 117 feet below the surface in the European Marine Energy Centre off the Orkney Islands, Scotland. The location was deliberately chosen for its challenging and often choppy waters, ensuring the system could withstand deployment almost anywhere.

The Phase 2 trial was monitored remotely for over two years, operating on a ‘lights out’ basis with zero human intervention. The results were compelling: the submerged servers experienced a significantly lower failure rate—only six servers failed out of 855—compared to a control group of similar servers operating on land. This reliability rate was reportedly up to eight times better than typical land-based facilities. The stable, cold environment proved highly beneficial, reducing the stress on equipment.
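
Taking the reported figures at face value, a short sketch shows what that difference looks like as an annualized failure rate (the land-based baseline below is simply implied by the stated eightfold ratio, not an independently reported number):

```python
# Reported Natick Phase 2 result: 6 of 855 submerged servers failed
# over roughly two years of unattended operation.
failed, total, years = 6, 855, 2.0

subsea_rate = failed / total / years  # annualized failure fraction
land_rate = subsea_rate * 8           # implied by the reported 8x reliability gap
print(f"Subsea: ~{subsea_rate:.2%}/yr; implied land baseline: ~{land_rate:.2%}/yr")
```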

This increased reliability was not just due to the cold water, but also to the internal environment of the sealed capsule. Unlike land-based facilities exposed to ambient air, the Natick vessels were filled with dry nitrogen gas, removing the oxygen and humidity that corrode components. This inert, low-humidity atmosphere prevented corrosion, mitigated the risk of fire, and helped maintain a consistent operating temperature, all factors that contribute to the long-term health and lifespan of delicate electronic equipment.

Beyond operational efficiency, underwater data centers promise significant sustainability advantages. They require no potable freshwater for cooling, conserving a critical resource that land-based facilities consume by the billions of gallons annually. Furthermore, their coastal or offshore positioning allows for easy co-location with dedicated offshore renewable energy sources, such as wind, tidal, and wave power. Project Natick, for instance, sourced its power from a mix of onshore and offshore sustainable sources, offering a blueprint for a genuinely zero-emission data center model.

Moreover, positioning these facilities near coastlines helps reduce latency for the nearly 50% of the global population that is geographically clustered near major bodies of water. By deploying data centers just offshore, organizations can provide faster cloud services and better network responsiveness without the complicated process of acquiring and developing large tracts of land in crowded urban or suburban areas.
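
The latency benefit is straightforward to estimate: signals in optical fiber propagate at roughly 200,000 km/s (about two-thirds the speed of light), so delay scales directly with cable distance. The sketch below is a generic illustration rather than a measurement from any deployment:

```python
# Round-trip propagation delay over fiber as a function of distance.
# Assumes ~200,000 km/s signal speed in fiber; ignores switching/queuing delays.
FIBER_KM_PER_MS = 200.0  # kilometers traveled per millisecond

def rtt_ms(distance_km: float) -> float:
    """Round-trip propagation time in milliseconds."""
    return 2 * distance_km / FIBER_KM_PER_MS

for km in (10, 100, 1000):
    print(f"{km:>4} km offshore -> ~{rtt_ms(km):.1f} ms round trip")
```

A vessel moored tens of kilometers off a coastal city thus adds well under a millisecond of propagation delay, compared with the tens of milliseconds incurred reaching an inland facility hundreds of kilometers away.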

Despite the promising findings regarding efficiency and reliability, the concept faces substantial operational and economic hurdles. The most often cited difficulty is maintenance and upgrading. When a server fails, or when technology requires an upgrade—as happens frequently in the high-demand environment of AI and cloud services—operators cannot simply swap out a part on a rack. Instead, the entire sealed vessel must be lifted from the seabed, transported to shore, thoroughly cleaned, opened, serviced, resealed, and then re-deployed, incurring significant costs, energy expenditure, and downtime.

Critics, noting that only 20-40% of a data center’s energy is used for cooling, point out that the substantial effort and specialized marine logistics required for maintenance may outweigh the cooling efficiency benefits, especially given the cost and time involved in lifting and working on equipment submerged in corrosive saltwater environments. The economic equation is complex, requiring a holistic assessment of capital expenditure, operational costs, energy savings, and logistical difficulty.

Another major concern revolves around potential environmental and ecological impacts. While Microsoft's small-scale experiments measured only localized warming, a few thousandths of a degree just meters downstream of the vessel, the concern of thermal pollution scales up rapidly with the size of the facility. If megawatt-scale data centers are widely deployed, the effect of warm water discharge on fragile marine ecosystems needs far more careful study. Regulators have already raised environmental concerns, as with the proposed pilot in San Francisco Bay.

Nonetheless, global interest remains strong. China, in particular, is aggressively moving forward with underwater data center deployment to fuel its domestic AI ambitions. Companies like Highlander Digital Technology have launched facilities near Hainan Island and Shanghai, reporting significant cooling power reductions. These projects are strategically placed near offshore wind farms, aiming for highly renewable-powered cloud infrastructure and setting a new standard for sustainable computing infrastructure, with reports suggesting that more than 95 percent of the energy used will come from renewable sources.

Market analyses reflect this expanding interest, predicting robust growth in the sector. The global underwater data center industry is projected to grow from an estimated USD 1.5 billion in 2024 to potentially exceed USD 6 billion by 2033, driven by the intense pressure on companies to find energy-efficient and scalable solutions for data processing.
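
For context, those two endpoints imply a compound annual growth rate in the mid-teens; the calculation below merely restates the cited forecast rather than offering an independent projection:

```python
# Implied compound annual growth rate (CAGR) from the projected market size:
# USD 1.5B in 2024 growing to USD 6B by 2033.
start, end, years = 1.5, 6.0, 2033 - 2024

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: ~{cagr:.1%}")  # prints ~16.7% per year
```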

However, Project Natick itself was officially declared inactive by Microsoft in 2024, confirming that the company is not actively pursuing the deployment of subsea data centers anywhere in the world. This decision underscores that while the experiment was a resounding success in proving technical feasibility, reliability, and cooling advantages, the operational and long-term economic practicality—particularly concerning maintenance and upgrades—did not align with immediate business deployment strategies.

Despite the end of deployment, the research from Project Natick has not been abandoned. Microsoft confirmed that the lessons learned regarding below-sea-level operation, vibration effects, and their impact on servers are being actively applied to refine land-based data center sustainability strategies, especially in the realm of liquid-based cooling. The reliable, high-density performance achieved in the stable, inert underwater environment serves as a key reference point.

The research has directly informed the development of liquid cooling technologies now being adopted in traditional data centers. For example, methods such as direct-to-chip cooling via “cold plates” and single-phase or two-phase immersion cooling, in which servers are completely submerged in specialized dielectric fluids, are being implemented on land. These techniques are proving effective at reducing energy demand by up to 20 percent and water consumption by over 30 percent, transferring the thermodynamic advantages identified underwater back to the terrestrial environment.
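
The thermodynamic case for liquid cooling rests on water's high specific heat. As a hedged illustration (the rack power and coolant temperature rise are assumed values, not Project Natick figures), the sketch below estimates the flow a direct-to-chip loop would need to carry away a dense rack's heat:

```python
# Coolant flow needed to remove a heat load: Q = m_dot * c_p * delta_T.
# All inputs are assumed, illustrative values.
Q_WATTS = 30_000     # assumed heat load of a dense rack (30 kW)
CP_WATER = 4186      # specific heat of water, J/(kg*K)
DELTA_T = 10         # assumed coolant temperature rise across the cold plates, K

mass_flow = Q_WATTS / (CP_WATER * DELTA_T)  # kg/s
litres_per_min = mass_flow * 60             # water is ~1 kg per litre
print(f"~{mass_flow:.2f} kg/s (~{litres_per_min:.0f} L/min) removes 30 kW")
```

Moving the same 30 kW with air (specific heat roughly 1005 J/(kg·K)) at the same temperature rise would take about four times the mass flow, and far more volume given air's low density, which is precisely the efficiency gap both subsea and liquid-cooled land designs exploit.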

In conclusion, the journey into underwater data centers, spearheaded by ambitious projects like Natick and now propelled by large-scale deployments in countries like China, signals a fundamental shift in how digital infrastructure is conceptualized. The concept offers undeniable benefits in cooling efficiency, water conservation, and reliability, setting a viable blueprint for environmentally conscious, high-performance computing. Whether widespread adoption ultimately arrives as fully submerged vessels or, more indirectly, through the application of their core cooling and stability principles to next-generation land-based facilities, the exploration of the ocean as a data center frontier has proven invaluable in accelerating the industry's necessary transition toward sustainable computing.

The core innovation remains efficient thermal management: harnessing the ocean's natural capacity to dissipate heat, in stark contrast to the energy-wasting struggle against thermodynamics in air-cooled land centers. Even as the industry considers the *other* ultimate frontier, orbital data centers in space that offer solar power and escape from land constraints, the subsea environment presents a much more immediately deployable, latency-friendly solution that leverages existing submarine cable infrastructure. The ongoing evolution of underwater systems ensures that advances in coolant chemistry, corrosion-resistant materials science, and environmental engineering will continue to guide the next generation of global data center design.

The successful operation of the Phase 2 vessel for two years, even processing workloads for COVID-19 vaccine research via Folding@home, demonstrated the system’s operational resilience. This successful test provided critical data confirming that underwater deployment is a technically sound method for achieving high reliability and low power usage effectiveness (PUE), with Natick achieving a PUE of 1.07, significantly better than the industry average. Therefore, while Microsoft has pivoted to applying the learnings on land, the physical deployments by others, combined with the established market demand for green infrastructure, ensure that underwater data centers remain a significant and strategic possibility in the ongoing effort to manage the immense energy needs of the AI revolution.
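
PUE, defined as total facility energy divided by the energy delivered to IT equipment, makes that comparison concrete: a PUE of 1.07 means only 7% overhead for cooling and power distribution. In the sketch below, the comparison value is an assumed, commonly cited industry-average figure, not one taken from this article:

```python
# PUE = total facility energy / IT equipment energy.
# Overhead (cooling, power conversion, etc.) is PUE - 1.
def overhead_fraction(pue: float) -> float:
    """Fraction of the IT load spent on non-IT overhead."""
    return pue - 1.0

natick_pue = 1.07   # reported Project Natick figure
typical_pue = 1.55  # assumed industry-average PUE for comparison

print(f"Natick overhead:  {overhead_fraction(natick_pue):.0%} of IT load")   # 7%
print(f"Typical overhead: {overhead_fraction(typical_pue):.0%} of IT load")  # 55%
```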

The future viability of this sector hinges on two key factors: solving the mechanical challenge of remote maintenance and ensuring robust regulatory compliance to mitigate ecological risk. If these challenges are overcome, particularly through advancements in autonomous repair mechanisms or highly standardized, long-lifecycle components, underwater facilities could realize their full potential, fulfilling the goal of delivering fast, scalable, and environmentally sound cloud services to the vast majority of the world’s connected population.
