This week's green data center hype moment is brought to you by Microsoft, bringing us underwater data centers. I know I'm being harsh, because I think the idea has flaws, but I'm willing to also give Microsoft props for out-of-the-box thinking and execution.
Project Natick, Microsoft's name for its experiment, assembled and put into operation the "Leona Philpot" subsea data center in less than a year. (Under. A. Year. Think about that the next time someone tells you they can't get new racks or gear installed in the next three months, by the way.) General goals for Natick were cloud data with rapid provisioning, lower costs, high responsiveness, and environmental stability.
Since nearly 50 percent of the world's population lives near large bodies of water, dropping containerized data centers offshore near major population centers means "free" cooling, a controlled environment -- a sealed container in the ocean is about as good as you can get "controlled" short of putting a rack on the space station -- and access to renewable power if you drop a couple of turbines into a nearby current flow. Latency would be significantly lower to the local, shore-based population, resulting in better responsiveness.
Real-world deployments would happen from start to finish in 90 days, assuming you have all your goodies lying around. Microsoft's Natick website cites both market demand and quick deployments for natural disasters and special events, such as the World Cup, as areas where speed to deployment is a good thing. Unspoken is an inherent level of physical security, since the data center is underwater in a sealed container.
Other rah-rah green points Microsoft lists are the use of recycled materials to build the data center, the ability to recycle the whole data center at its end of life, no need for cooling water, and a reduced cadence for data center turnover given the approaching end of Moore's Law -- which is an interesting assertion, given we've got at least a decade or so more mileage before we can call Moore's Law at an end.
Each data center deployment would last for up to 5 years. At the end of the deployment, you dredge up the data center, scrape off the barnacles and seaweed, open it up, drop in a new set of servers, and then redeploy it, with the container itself designed to last for up to 20 years before being recycled.
Microsoft is still in the experimental stage and is building a second-generation, bigger "pipe" with four times the volume for more experimentation. But I think the idea of a hands-off server cluster works equally well on land as in a sealed container dumped offshore. In both cases, you are just attaching heat exchange, power, and broadband -- just the sort of thing that works equally well whether you are deploying in a developed or undeveloped market.
Nobody has aggressively explored highly resilient, hands-off data center designs. Containerized data centers have been built by Google, Microsoft, and others, but those designs were made human-serviceable. Cutting out remote hands may be the true wave of the future if you can build a reliable rack that doesn't need to be touched for five years. Being able to aggressively deploy local, low-latency data center resources to emerging markets in Asia and Africa could turn out to be a bigger win, whether on dry land or in a sealed container offshore.