Microsoft has announced the next phase of Project Natick — an ongoing research project to determine the feasibility of underwater datacenters — by launching a full-fledged prototype datacenter off the coast of Scotland’s Orkney Islands.

The concept was first conceived several years ago at one of the company’s internal ThinkWeek events, where rank-and-file employees are invited to submit ideas for consideration. The project was launched in 2014, though it wasn’t formally unveiled to the public until early 2016, when the company revealed it had already tested a proof-of-concept datacenter capsule in 30 feet of water off the California coast.

The next step was to design and manufacture a full-scale capsule, with the location off the coast of a small Scottish island chosen specifically for its renewable energy credentials.

Indeed, the “moonshot” project is all about creating an energy-efficient datacenter that uses the ocean’s natural cooling properties while also delivering high-speed computing capacity to densely populated coastal communities nearby.

The Northern Isles datacenter is situated at the European Marine Energy Centre, a facility set up to test and develop wave and tidal power technologies — and this is what will power Microsoft’s prototype datacenter. At around 40 feet long, the capsule is roughly the size of a shipping container; it houses 864 servers and now sits 117 feet below the ocean surface.

Above: Spencer Fowers, senior member of technical staff for Microsoft’s special projects research group, prepares Project Natick’s Northern Isles datacenter for deployment off the coast of the Orkney Islands in Scotland.

Image Credit: Scott Eklund/Red Box Pictures.

Microsoft said the datacenter is designed to “hold data and process information” for up to five years without requiring any maintenance, though for now it won’t be used to power any mission-critical cloud computing work for the company.

Over the next year, the Project Natick team will monitor the datacenter’s performance, including its power consumption and other elements needed to run a fully operational datacenter. As Microsoft puts it, the second phase of the project is all about “determining the economic viability” of operating offshore datacenters in this manner.

Many of the major technology companies are investing heavily in the infrastructure needed to connect the world and process large amounts of data. Microsoft continues to expand its Azure cloud regions, while last year Facebook and Microsoft completed Marea, their 4,000-mile transatlantic internet cable. Google, too, recently announced it was investing in a 6,000-mile subsea cable system connecting Japan to Australia.

Project Natick is all about testing the water, so to speak, and determining whether subsea datacenters are a viable solution. If nothing else, the research should offer new insight into how datacenters are built and run.

“When you go for a moonshot, you might not ever get to the moon,” noted Microsoft AI and research corporate VP Peter Lee. “It is great if you do but, regardless, you learn a lot, and there are unexpected spinoffs along the way. You get Velcro at some point. That is happening in this case. We are learning about disk failures, about rack design, about the mechanical engineering of cooling systems, and those things will feed back into our normal datacenters.”