Microsoft Abandons the Concept of Reliable Underwater Data Centers

By Susanne Braun | Translated by AI | 4 min reading time

Microsoft's Project Natick, which ran from 2018 to 2020, involved submerging a container with 855 servers off the coast of Scotland. The goal was to determine if hardware could benefit from the stability of the underwater environment. The project demonstrated that underwater data centers can offer advantages over traditional land-based ones. Despite these findings, Microsoft has decided not to continue this approach, shifting its focus to more promising areas like AI. However, other parties remain interested in the concept.

Microsoft's Project Natick before it was sunk off the Scottish Orkney Islands and after it was raised. The container was less fouled after two years than the project managers had expected. According to Microsoft, the container and its contents were recycled and the seabed was restored to its original condition. (Image: Microsoft)

The world is practically craving fast and reliable data centers to meet the challenges of our time. That concerns data storage, but also data processing: ideally, everything should be available immediately and everywhere, namely in the cloud. And then there is AI. Artificial intelligence is extremely hungry, not only in terms of the computing power needed to run and especially to train large language models, but also in terms of its environmental footprint.

The most powerful AI chips require not only a great deal of electricity but also water, since large data centers now rely on water cooling. Anyone who sets up a data center in the heart of a very dry region may therefore run into trouble if air cooling is not sufficient.


If water is beneficial for the operational stability of a data center, why not submerge the entire data center in the ocean and benefit from the relatively stable ocean climate? Microsoft has been pursuing this question since 2013, and the company already has answers that are several years old. So what are they?

Project Natick in Scotland

To determine whether data centers are more reliable on land or underwater, Microsoft sank a container with 855 servers to a depth of about 35 meters off the Orkney Islands on the Scottish coast in 2018 and raised it again in 2020. Inside the container, the 855 servers were surrounded by inert nitrogen gas; outside, the sea provided temperature stability. Of the 855 servers, six failed during the 25 months and eight days of operation. Naturally, nobody could attend to the failures: underwater, staff cannot simply drop by to check on things.

A comparison with servers that ran on land during the same period is instructive. 135 servers were placed in the racks of a land-based data center, and eight of them failed while the other hardware sat on the seabed. The underwater servers thus failed far less often: the underwater failure rate was 0.7 percent, the on-land rate almost 6 percent.
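Those percentages follow directly from the figures reported above; a quick sketch (the numbers are taken from the article, the script itself is merely illustrative):

```python
# Failure figures as reported for Project Natick:
# 6 of 855 underwater servers failed, 8 of 135 comparison servers on land.
underwater_failed, underwater_total = 6, 855
land_failed, land_total = 8, 135

underwater_rate = underwater_failed / underwater_total * 100
land_rate = land_failed / land_total * 100

print(f"Underwater failure rate: {underwater_rate:.1f} %")  # 0.7 %
print(f"On-land failure rate:    {land_rate:.1f} %")        # 5.9 %

# The land servers failed roughly eight times as often.
print(f"Ratio: {land_rate / underwater_rate:.1f}")          # 8.4
```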

That doesn't sound bad in itself, but it has to be weighed against the effort and cost of sinking the hardware in the first place. For the foreseeable future, land-based data centers will probably remain more cost-effective, and cost efficiency is often the decisive factor.

The theory behind Natick

The plan for the small, modular Natick data clusters was to keep the hardware underwater for five years without servicing. After five years, the containers would be pulled out and fitted with new hardware. Microsoft put the life span of the Natick containers at twenty years.

After Project Natick was raised and the positive news of greater reliability made the rounds, discussion turned to how underwater data centers could be scaled to run the entire range of Microsoft Azure cloud services, which could potentially require a dozen or more vessels to be interconnected.

"As we move from general cloud computing to cloud and edge computing, we're seeing more and more need for smaller data centers closer to the customer, instead of these massive warehouse data centers in the middle of nowhere," said Spencer Fowers, a technical staff member of Microsoft's Special Projects research group back in September 2020. However, this demand for small data centers close to the customer doesn't seem to be as large as previously assumed.

No interest from Microsoft

With the shift in focus from small cloud data centers to large AI data centers for training and running algorithms in the cloud, Microsoft appears to be moving away from the assumptions it made in 2020. That is understandable: artificial intelligence demands enormous computing power from a correspondingly large number of accelerators. Not just the sheer number of accelerators, but also their size, reliability, and expandability call for continuous work on the racks, and that is not feasible underwater.

After the Natick vessel with its servers was retrieved from the sea, the project's results were analyzed and lessons drawn from them. These findings are now being applied in other areas of Microsoft's business, as Noelle Walsh, Head of Cloud Operations and Innovation at Microsoft, confirmed in an interview with Data Center Dynamics. Beyond that, Microsoft has no particular interest in experimenting further with underwater data centers.


"My team worked on it, and it worked as intended. We learned a lot about operations below sea level, about vibrations and their impact on the servers. We will apply these findings to other areas," DCD quotes Walsh as saying. Why is the topic coming up again now? Because Walsh recently emphasized in an interview that Microsoft is not working on underwater data centers.

After all, the idea of commercially operated underwater data centers is not buried with the dying of Microsoft's interest. The data center operator Highlander sank several server modules to a depth of around 35 meters in 2023, near the island of Hainan. The project, which is to be expanded, is intended not only to save electricity and water, but also land mass.