Microsoft Tests ‘Cool’ Concept: Underwater Data Centers
For Microsoft, the data center just keeps getting cooler. The software giant recently completed a trial of a prototype underwater data center to see if it can slash the cost of cooling the servers and also bring the data center closer to actual users.
The research initiative, dubbed “Project Natick,” originated around three years ago, when it was the subject of an internal Microsoft white paper, and the company is now testing whether it could be a model for the data center of the future. This revolutionary data center would have servers inside steel tubes that could be linked via fiber optic cables deep below the sea’s surface.
“When I first heard about this I thought, ‘Water ... electricity, why would you do that?’” Ben Cutler, a Microsoft computer designer who worked on Project Natick, told The New York Times. “But as you think more about it, it actually makes a lot of sense.”
Seeing Multiple Benefits
On the Project Natick website, Microsoft explains that the effort reflects its “ongoing quest for cloud datacenter solutions that offer rapid provisioning, lower costs, high responsiveness, and are more environmentally sustainable.”
It has been clear for years that keeping data centers cool is critical, since servers generate lots of heat. As cloud computing grows in scale and importance and more commercial data centers are deployed, cooling solutions will become even more challenging and necessary. Project Natick aims to solve the temperature problem by plunging a data center into cold water. And, as the Times notes, Microsoft may also combine this new data center “either with a turbine or a tidal energy system to generate electricity.”
Microsoft says that Project Natick envisions a future in which cloud computing will serve customers who live near large bodies of water, since half the world’s population lives within 125 miles of the ocean. In that world, putting data centers in containers underwater could bring cloud resources closer to where people live, lowering latency and speeding the delivery of data to users. “Deepwater deployment offers ready access to cooling, renewable power sources, and a controlled environment,” Microsoft says on the project’s website.
Although there are hurdles to deploying underwater data centers — including environmental concerns and the challenge of physically securing the servers — Microsoft believes that if it can produce the data centers at scale, deployment time for a typical data center could be cut from two years to 90 days.
For Microsoft, data centers are big business. As The Times reports: “Microsoft manages more than 100 data centers around the globe and is adding more at a rapid clip. The company has spent more than $15 billion on a global data center system that now provides more than 200 online services.”
Over the course of a 105-day trial, Microsoft tested the underwater data center, christened the Leona Philpot after a character from the “Halo” video game series on its Xbox platform. The steel capsule, eight feet in diameter, housed the servers and was placed 30 feet below the surface in the Pacific Ocean off the Central California coast near San Luis Obispo. The data center was monitored via 100 sensors that tracked water pressure, humidity, motion and other metrics, according to The Times. The computing rack was bathed in pressurized nitrogen to keep its chips cool and was sealed inside the steel tube before being lowered underwater.
In the end, the data center survived, and Microsoft even kept the trial going and ran “commercial data-processing projects from Microsoft’s Azure cloud computing service,” The Times reported.
Diving Down to the Future
What happens now? Project Natick is still just a research project, and the company admits “it’s still early days in evaluating whether this concept could be adopted by Microsoft and other cloud service providers.”
Microsoft says each Natick underwater capsule is designed for a five-year deployment cycle, after which it would be retrieved and outfitted with new servers. The data center as a whole is designed for a 20-year lifespan, according to the Natick website.
Looking ahead, Microsoft engineers are working on an underwater system that will be three times as large as the first system, The Times reported. Microsoft plans to choose an ocean-based alternative-energy systems vendor to partner on a future deployment, which might take place near Florida or in Northern Europe, the publication added.
The project holds much potential for how future data centers could be designed and powered. Instead of housing servers in racks built for human access and monitoring, a Natick data center could do away with those features entirely, since divers won’t be going down to service the servers. Additionally, as The Times report pointed out, because the data center could generate its electricity from seawater currents, it may add no net energy to the ocean, meaning the surrounding water temperature wouldn’t rise. In fact, researchers found only an “extremely” small amount of heat immediately around the container, The Times reported.
“We aspire to create a sustainable datacenter which leverages locally produced green energy, providing customers with additional options to meet their own sustainability requirements,” Microsoft states on the Project Natick website. The company adds that the project’s data centers are designed to be fully recycled, with no waste products — “whether due to the power generation, computers, or human maintainers” — emitted into the environment.