September 3, 2013

Can Big Data Tech Revive Dead Zones in Lakes?

Isaac Lopez

Can big data revive the dead zone at the south end of Lake George, where oxygen levels are too low to support fish and other aerobic life? The FUND for Lake George, IBM, and Rensselaer Polytechnic Institute are determined to find out.

In an announcement this summer, the group said it would undertake a three-year, multi-million-dollar collaboration aimed at understanding the complexities of the Lake George ecosystem, in an effort to protect the lake and the $1 billion in tourism it brings to New York’s Warren County and the surrounding region.

According to the scientists at Rensselaer, who have been studying Lake George for 30 years through their Darrin Fresh Water Institute, the lake faces a growing array of environmental stressors. These include rising chlorophyll levels threatening water clarity, large increases in road salt finding its way into the lake as runoff, and a 62% increase in building construction compounding conditions.

With the health of Lake George under threat, the group aims to stock the lake with sensors monitoring a large array of variables and gathering real-time information. The new sensor array will launch with 40 sensing platforms monitoring 25 different aspects of the lake, including water chemistry, weather, and current speeds and directions. The researchers say they will create 3-D circulation models that will allow them to understand how currents distribute nutrients and contaminants across all 32 miles of the lake, model how contaminants spread, and determine what actions should be taken to minimize such contamination.
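The core data task such a sensor network implies is simple to sketch: fold many platforms' readings into per-variable summaries. The platform IDs, variable names, and values below are illustrative assumptions, not data from the Lake George project:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical readings as (platform_id, variable, value) tuples.
# Names and values are made up for illustration only.
readings = [
    (1, "water_temp_c", 14.2),
    (1, "chlorophyll_ug_l", 3.1),
    (2, "water_temp_c", 13.8),
    (2, "chlorophyll_ug_l", 3.5),
    (3, "water_temp_c", 14.0),
]

def summarize(readings):
    """Group readings by variable and return the mean across platforms."""
    by_variable = defaultdict(list)
    for _platform, variable, value in readings:
        by_variable[variable].append(value)
    return {var: round(mean(vals), 2) for var, vals in by_variable.items()}

print(summarize(readings))
# → {'water_temp_c': 14.0, 'chlorophyll_ug_l': 3.3}
```

In practice the project's pipeline would be far richer (streaming ingestion, quality control, and the 3-D circulation models described above), but the grouping-and-aggregating pattern is the same.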

With the new sensor array in place, the researchers plan to combine its output with 30 years of previously collected data to answer key questions, including:

  • What was the pristine state of the lake?
  • What is the impact of salt overloading on the lake?
  • What are the consequences of nutrient loading on algal growth (as measured by chlorophyll levels) in the lake?
  • What remediation strategies for these and other stressors may be effective?

With the data collected from the project, the FUND says it intends to invest in solving the problems that the science reveals.

“Both process and product of this strategy promise to serve as a model for how sustainability can be effectively pursued,” the FUND wrote in a strategy document. “Actions of the Project will guide investment decisions as they also help build an informed and, indeed, empowered constituency committed to protecting Lake George.”

Indeed, the Lake George effort could serve as a model for others to follow, with big data technologies proving to be the difference makers. The EPA currently reports that roughly 44% of assessed stream miles, 64% of assessed lake acres, and 30% of assessed bay and estuarine square miles are not clean enough to support uses such as fishing and swimming.

Many environmental onlookers will be eager to see what success the group has in using these technologies for solving the problems of Lake George.

Related items:

Erecting Operational Intelligence Using Machine Data 

Dutch Turn to Big Data for Water Management & Flood Control 

Finding Value in Data Through Life Saving Applications