Smarter Lake, via Internet of Things: New York's Lake George

The Jefferson Project at Lake George, one of the most ambitious research efforts to apply Big Data and analytics technology to managing and protecting a body of fresh water, is entering a new phase in which enormous amounts of sensor data will be captured and analyzed. Scientists anticipate that insights from the data collection and discovery stage will not only help manage and protect one of America's most famous lakes, but also create a blueprint for preserving important lakes, rivers and other freshwater bodies around the globe.

The potential impact of these developments extends well beyond the shores of Lake George. By capturing and pooling data from many kinds of sensors and analyzing it swiftly, scientists, policy makers and environmental groups could soon predict accurately how weather, contaminants, invasive species and other threats might affect a lake's natural environment. Armed with these insights and a growing body of best practices, they could take corrective action in advance to protect freshwater sources anywhere in the world.

A collaboration between IBM Research, Rensselaer Polytechnic Institute and The FUND for Lake George, the Jefferson Project involves more than 60 scientists from around the world and IBM Research labs in Brazil, Ireland, Texas and New York. The project is deploying Internet of Things technology on a grand scale, in conjunction with research and experimentation, to understand the ecology of large lakes and the impact of human activity on them. Thirty-five years of monitoring the chemistry and algae of Lake George by scientists at Rensselaer's Darrin Fresh Water Institute, in collaboration with The FUND for Lake George, have demonstrated that the lake is changing: chloride inputs from road salt have tripled, algae have increased by one third, and five invasive species have been introduced. These factors threaten entire regional economies driven by water recreation, boating and other forms of tourism on freshwater lakes, rivers and streams.

The new phase of the project is the culmination of several milestones. An array of sophisticated sensors of different shapes and sizes, including underwater sonar-based sensors, along with customized software and solar energy systems to power off-grid equipment, has now been deployed, tested and refined. These enhancements have yielded greatly improved measurement data that will be used to better understand the lake and to improve the accuracy of four predictive models built by IBM researchers, which simulate weather events, water run-off from the surrounding mountains into the lake, inputs of road salt to the lake, and water circulation.

Even as the data collection and discovery phase of the Jefferson Project ramps up, the initiative has already offered intriguing insights into Lake George. For example, the lake flows south to north, draining into Lake Champlain via the La Chute River. However, sensors deployed on the lake bottom during the winter ice-over period recently revealed complicated flow patterns within Lake George, including higher-than-expected currents and countercurrents while the lake is frozen. These findings are being investigated further.

In addition, sensor data recently confirmed the existence of an underwater "ghost wave" that can reach nearly 100 feet in height while running about 30 feet below the surface of the 32-mile-long lake. Known scientifically as a seiche, the phenomenon is characterized by dramatic oscillations beneath the surface that are mostly undetectable from above. Seiches were first described by the noted Swiss hydrologist François-Alphonse Forel in 1890, on Lake Geneva, Switzerland.

The computing infrastructure powering the Jefferson Project spans multiple platforms, ranging from an IBM Blue Gene/Q supercomputer in a data center on the Rensselaer campus to embedded, intelligent computing elements and other Internet of Things technology on sensor platforms in and around the lake.
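For intuition about seiche timescales, the period of a basin's fundamental surface seiche can be estimated with Merian's classic formula, T = 2L / sqrt(g·H). The internal "ghost wave" described above also depends on density stratification and so oscillates more slowly, but the same style of estimate applies. The sketch below uses rough stand-in numbers (the article's 32-mile length, and the 200-foot figure even though that is the maximum rather than the mean depth), so treat the result as an order-of-magnitude illustration only.

```python
import math

# Merian's formula for the fundamental seiche period of a closed basin:
#   T = 2L / sqrt(g * H)
# Inputs below are rough stand-ins, not survey values.
g = 9.81                 # gravitational acceleration, m/s^2
L = 32 * 1609.34         # basin length: 32 miles, in meters
H = 200 * 0.3048         # depth: 200 feet, in meters (max, not mean)

period_s = 2 * L / math.sqrt(g * H)
print(f"estimated fundamental seiche period: {period_s / 60:.0f} minutes")
```

With these inputs the estimate comes out on the order of an hour, which is why such oscillations go unnoticed from shore: the motion is large but very slow.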

New Jefferson Project milestones include:

• Using IBM's Deep Thunder system, the weather model has improved its resolution: two-day forecasts are now produced twice daily, with greater accuracy at better than half-mile intervals for precipitation, temperature, wind speed and direction, wind chill, humidity, visibility and more.

• The water run-off model, which maps the flow of precipitation and snow melt, now uses improved six-foot-resolution topographical data of the lake's watershed, gathered with aircraft-based LiDAR survey mapping technology.
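Run-off models of this kind typically start by routing water downhill over the elevation grid. A common building block is D8 flow routing: each cell drains toward its steepest-descent neighbor. The toy grid and routine below are hypothetical, a minimal sketch of that one step, not the project's actual model.

```python
# Minimal D8 flow-routing sketch over a toy elevation grid: each cell
# drains toward its steepest-descent neighbor (diagonal drops are
# divided by sqrt(2) to account for the longer distance).

def d8_flow_direction(dem):
    """Return, per (row, col) cell, the (row, col) of its steepest
    downhill neighbor, or None for pits and outlets."""
    rows, cols = len(dem), len(dem[0])
    flow = {}
    for r in range(rows):
        for c in range(cols):
            best, target = 0.0, None
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    nr, nc = r + dr, c + dc
                    if (dr, dc) == (0, 0):
                        continue
                    if not (0 <= nr < rows and 0 <= nc < cols):
                        continue
                    dist = (dr * dr + dc * dc) ** 0.5
                    drop = (dem[r][c] - dem[nr][nc]) / dist
                    if drop > best:
                        best, target = drop, (nr, nc)
            flow[(r, c)] = target
    return flow

# Hypothetical 4x4 elevations sloping toward the bottom-right "lake".
dem = [
    [100, 95, 90, 85],
    [ 95, 90, 85, 80],
    [ 90, 85, 80, 75],
    [ 85, 80, 75, 70],
]
flow = d8_flow_direction(dem)
print(flow[(0, 0)])  # steepest descent from the ridge corner
```

Chaining these per-cell directions traces where precipitation landing anywhere in a watershed ultimately ends up; production models add infiltration, snow melt and channel flow on top of this skeleton.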

• The salt model provides the first-ever assessment of the relative amounts of road salt deposited in the lake from individual segments of local roadways in the Lake George watershed. It identifies and compares more than six dozen locations around the lake where the application of salt to roads may cause the greatest contamination of the lake and surrounding area.
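The core idea of comparing road segments can be sketched as a simple mass balance: salt applied per segment multiplied by an estimated fraction delivered to the lake, then ranked. All names and numbers below are hypothetical; the project's salt model is far more detailed, but this shows the shape of the comparison.

```python
# Back-of-the-envelope road-salt loading comparison (all values
# hypothetical). Rank segments by estimated salt mass delivered
# to the lake: applied tonnage x delivery fraction.

segments = {
    # name: (salt applied per winter, tons; fraction reaching the lake)
    "lakeshore_route": (120.0, 0.60),   # close to shore, high delivery
    "mountain_pass":   (200.0, 0.25),   # heavy use, but farther away
    "village_streets": ( 80.0, 0.40),
}

loading = {name: applied * fraction
           for name, (applied, fraction) in segments.items()}
ranked = sorted(loading, key=loading.get, reverse=True)
print(ranked[0], loading[ranked[0]])  # worst offender and its tonnage
```

Even this crude version illustrates the model's main policy insight: the segment applying the most salt is not necessarily the one contaminating the lake the most.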

• The water circulation model has improved its resolution of the 200-foot-deep lake with new, high-resolution bathymetry from a recent hydrographic survey. The second-generation model uses 468 million depth measurements from the new survey, a vast improvement over the first-generation model, which relied on only 564 depth measurements over the entire lake.
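Turning hundreds of millions of scattered soundings into a model-ready bathymetric grid usually involves binning them into cells and averaging. The routine below is a minimal, hypothetical sketch of that gridding step with a handful of made-up soundings, not the survey's actual processing pipeline.

```python
# Sketch of gridding scattered depth soundings into a bathymetric grid
# by averaging the soundings that fall in each cell. Data are made up;
# a real survey would feed in hundreds of millions of (x, y, depth)
# points.
from collections import defaultdict

def grid_soundings(soundings, cell_size):
    """soundings: iterable of (x, y, depth). Returns {cell: mean depth},
    where cell is the (ix, iy) index of a square of side cell_size."""
    sums = defaultdict(lambda: [0.0, 0])
    for x, y, depth in soundings:
        cell = (int(x // cell_size), int(y // cell_size))
        sums[cell][0] += depth
        sums[cell][1] += 1
    return {cell: total / n for cell, (total, n) in sums.items()}

soundings = [(0.2, 0.3, 10.0), (0.8, 0.1, 12.0), (1.5, 0.4, 30.0)]
print(grid_soundings(soundings, 1.0))
```

Denser soundings mean smaller cells are still well populated, which is why jumping from 564 to 468 million measurements allows a dramatically finer circulation model.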

These four models, together with Rensselaer's food web model, which examines how the lake's ecosystem is affected by nature and human activity, make up the interconnected environmental management system at the heart of the project. The food web model is also being further calibrated with extensive surveys of the lake's algae, plants and animals.
