
Microsoft’s Project Natick underwater data center experiment confirms viability of seafloor data storage

Microsoft has concluded a years-long experiment with a shipping-container-sized underwater data center, placed on the sea floor off the coast of Scotland’s Orkney Islands. The company pulled its “Project Natick” underwater data center out of the water at the beginning of the summer and spent the following months studying the hardware, and the air sealed inside it, to determine the model’s viability.

The results not only showed that these offshore submerged data centers perform well, but also revealed that the servers inside proved up to eight times more reliable than their dry-land counterparts. Researchers will now investigate exactly what accounted for this greater reliability, in the hopes of translating those advantages to land-based server farms for increased performance and efficiency across the board.

Other advantages included greater power efficiency, especially in regions where the onshore grid is not considered reliable enough for sustained operation. That efficiency comes in part from the reduced need for artificial cooling, thanks to conditions at the sea floor. The Orkney Islands area is served by a 100% renewable grid supplied by wind and solar, and while variability in those power sources would have posed a challenge for a traditional, overland data center of the same size in the region, the grid proved more than sufficient for the underwater operation.

Microsoft’s Natick experiment was meant to show that portable, flexible data center deployments in coastal areas around the world could offer a modular way to scale up capacity while keeping energy and operating costs low, placing smaller data centers closer to where customers need them instead of routing everything through centralized hubs. So far, the project seems to have demonstrated exactly that. Next, the company will explore scaling up the size and performance of these data centers by linking more than one together to combine their capabilities.
