Here's why Microsoft is sinking data centres under the sea
The project off the coast of Scotland's Orkney Islands. Image: REUTERS/Nigel Roddis
- Microsoft has concluded its 'Project Natick' experiment off the coast of Scotland's Orkney Islands.
- The shipping-container-sized underwater data center proved to be up to eight times more reliable than its dry-land counterparts.
- Its success could shape the future of scaling up data centers, while keeping energy and operation costs low.
Microsoft has concluded a years-long experiment involving a shipping-container-sized underwater data center placed on the sea floor off the coast of Scotland’s Orkney Islands. The company pulled its “Project Natick” underwater data center out of the water at the beginning of the summer and has spent the months since studying the data center, and the air it contained, to determine the model’s viability.
The results showed not only that these offshore submerged data centers perform well, but also that the servers inside proved up to eight times more reliable than their dry-land counterparts. Researchers will now investigate exactly what accounts for this greater reliability, in the hope of translating those advantages to land-based server farms for better performance and efficiency across the board.
Other advantages include greater power efficiency, especially in regions where the land grid is not considered reliable enough for sustained operation. That is due in part to the reduced need for artificial cooling, thanks to conditions on the sea floor. The Orkney Islands are covered by a 100% renewable grid supplied by both wind and solar power, and while variability in both sources would have challenged the power requirements of a traditional, land-based data center in the same region, the grid was more than sufficient for an operation of the same size underwater.
Microsoft’s Natick experiment was meant to show that portable, flexible data center deployments in coastal areas around the world could offer a modular way to scale up data center capacity while keeping energy and operating costs low, placing smaller data centers closer to where customers need them instead of routing everything through centralized hubs. So far, the project seems to have demonstrated that convincingly. Next, the company will look at scaling up the size and performance of these data centers by linking several together to combine their capabilities.