Big data collection and analytics underwater is not a new concept.
Applications such as seismic monitoring, detection and tracking of marine objects, underwater multimedia systems and 3D mapping of underwater sites require large volumes of information to be gathered continuously and at a fast rate.
Microsoft last week went public on a new big data research project – not to draw information from the seas on coral reefs or fisheries, but to use water as a repository for cloud-based consumer data.
Named Project Natick, the research initiative is seeking new solutions to global demand for cloud storage as well as environmentally sustainable ways to run energy-hungry data centres.
In August 2015, the researchers lowered a prototype vessel – nicknamed Leona Philpot – onto the seafloor approximately one kilometre off the Pacific coast of the US.
The 38,000-pound container housed computing power equivalent to 300 desktop PCs.
Once the vessel was submerged, researchers monitored the container from their offices in Building 99 on Microsoft’s Redmond campus, Washington, according to a Microsoft blog.
Using cameras and other sensors, they recorded data like temperature, humidity, the amount of power being used for the system, even the speed of the current.
Norm Whitaker, who heads special projects for Microsoft Research NExT, said: “The bottom line is that in one day this thing was deployed, hooked up and running. Then everyone is back here, controlling it remotely. A wild ocean adventure turned out to be a regular day at the office.”
A diver would go down once a month to check on the vessel, but otherwise the team was able to stay constantly connected to it remotely – even after they observed a small tsunami wave pass.
The next big data storage solution?
The team is still analyzing data from the experiment, but claims that the results are “positive so far and will yield results”, according to Christian Belady, general manager for datacenter strategy, planning and development at Microsoft.
Mr Belady said: “While at first I was skeptical with a lot of questions: what were the costs, how do we power, how do we connect? However, at the end of the day, I enjoy seeing people push limits.”
“The learnings we get from this are invaluable and will in some way manifest into future designs.”
Belady says the data centre business is so fast moving that “you have to predict two years in advance what’s going to happen in the business”.
Big data storage also presents challenges such as latency when the distance between the data centre and end users is too great, maintenance costs, and high operating costs from running chiller plants to keep computers from overheating.
Underwater data centres appear to counteract these problems, claims Microsoft.
Half of the world’s population lives within 120 miles of the sea, which means placing data centres offshore can reduce latency, says the software giant.
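The latency argument can be illustrated with a back-of-the-envelope propagation-delay estimate. This is a hedged sketch: the fibre speed factor and the distances are illustrative assumptions, not Microsoft's figures, and real latency also includes routing and processing overheads.

```python
# Rough, illustrative estimate of one-way network propagation delay
# as a function of distance. Assumes signals travel through optical
# fibre at roughly 2/3 the speed of light in vacuum; real-world
# latency adds routing, queuing and processing time not modelled here.

SPEED_OF_LIGHT_KM_S = 300_000   # speed of light in vacuum, km/s
FIBRE_FACTOR = 2 / 3            # typical propagation speed in fibre

def one_way_delay_ms(distance_km: float) -> float:
    """Return the one-way propagation delay in milliseconds."""
    return distance_km / (SPEED_OF_LIGHT_KM_S * FIBRE_FACTOR) * 1000

# A coastal data centre ~200 km from its users vs. an inland one ~2,000 km away
print(round(one_way_delay_ms(200), 2))    # ≈ 1.0 ms
print(round(one_way_delay_ms(2000), 2))   # ≈ 10.0 ms
```

Even on this idealised model, shortening the path by an order of magnitude cuts propagation delay by the same factor – which is the motivation for siting data centres near the coastal populations they serve.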
Deployment of data centres is potentially faster, turning it from a construction project – which requires permits and other time-consuming steps – into a manufacturing one. Building the vessel that housed the experimental data centre took 90 days.
Big data storage project – the next phase
The team is currently planning the project’s next phase, which could include a vessel four times the size of the current container with as much as 20 times the compute power. The team is also evaluating test sites for the vessel, which could be in the water for at least a year, deployed with a renewable ocean energy source.
So is it safe? Project Natick research engineer Jeff Kramer said: “It’s kind of like launching a satellite for space.
“Once you’ve built it, and you hand it to the guys with the rocket – or in our case the guys with the crane – you can’t do anything about it if it screws up.”