Jefferson Project at NY's Lake George entering new phase with IoT technology
The Jefferson Project at Lake George in New York is entering a new phase in which enormous amounts of data will be captured from sensors and analyzed. The results are expected to create a blueprint to preserve important lakes, rivers and bodies of fresh water around the globe.
LAKE GEORGE, NY, July 8, 2015 -- The Jefferson Project at Lake George in New York -- one of the most ambitious research projects to deploy Big Data and analytics technology to manage and protect a body of fresh water -- is entering a new phase in which enormous amounts of data will be captured from sensors and analyzed. Scientists anticipate that the results will not only help manage and protect one of America's most famous lakes, but also create a blueprint to preserve important lakes, rivers and other bodies of fresh water around the globe.
The potential impact of these new developments extends well beyond the shores of Lake George. By capturing and pooling data from all sorts of sensors and swiftly analyzing it, scientists, policy makers and environmental groups around the globe could soon accurately predict how weather, contaminants, invasive species and other threats might affect a lake's natural environment. Armed with these new insights and a growing body of best practices, they could take corrective action in advance to protect freshwater sources anywhere in the world.
A collaboration between IBM Research, Rensselaer Polytechnic Institute and The FUND for Lake George, the Jefferson Project involves more than 60 scientists from around the world and IBM Research labs in Brazil, Ireland, Texas, and New York. The project is deploying Internet of Things (IoT) technology on a grand scale in conjunction with research and experimentation to understand the ecology of large lakes and the impact of human activity.
Thirty-five years of monitoring the chemistry and algae in Lake George by scientists at Rensselaer's Darrin Fresh Water Institute, in collaboration with The FUND for Lake George, have demonstrated the lake is changing. Chloride inputs from road salt have tripled, algae have increased by one third, and five invasive species have been introduced. These factors threaten entire regional economies driven by water recreation, boating and other forms of tourism on freshwater lakes, rivers and streams.
|Jefferson Project Director Rick Relyea (left) and IBM Research Distinguished Engineer Harry Kolar (right) examine a visualization of Lake George (Photo for IBM)|
The new phase of the project is the culmination of several milestones. An array of sophisticated sensors of different shapes and sizes -- including underwater sonar-based sensors, customized software programs, solar energy systems, and off-grid power equipment -- has now been deployed, tested and refined. These enhancements have yielded greatly improved measurement data that will be used to better understand the lake and to improve the accuracy of four predictive models built by IBM researchers, which simulate weather events, water runoff from the surrounding mountains into the lake, inputs of road salt to the lake, and water circulation.
The computing infrastructure powering the Jefferson Project involves multiple computing platforms, ranging from an IBM Blue Gene/Q supercomputer located in a data center on the Rensselaer campus to embedded, intelligent-computing elements and other IoT technology situated on various sensor platforms in and around the lake.
|IBM Research scientists Mike Kelly (left) and Harry Kolar (right) deploy an array of sensors that capture data, which will be analyzed to help manage Lake George. (Photo for IBM)|
New Jefferson Project milestones include:
- Using IBM's Deep Thunder system, the weather model now produces two-day forecasts twice daily, with greater accuracy at better than half-mile resolution for precipitation, temperature, wind speed and direction, wind chill, humidity, visibility, and more.
- The water run-off model, which maps the flow of precipitation and snow melt, now uses improved six-foot-resolution topographical data of the lake's watershed, acquired with aircraft-based LiDAR surveying technology.
- The salt model provides the first-ever assessment of the relative amounts of road salt deposited in the lake from various segments of local roadways in the Lake George watershed. It identifies and compares more than six dozen locations around the lake where the application of salt to roads may cause the greatest contamination to the lake and surrounding area.
- The water circulation model has improved its resolution of the 200-foot-deep lake, with new, high-resolution bathymetry from a recent hydrographic survey. The second-generation model uses 468 million depth measurements from the new survey -- a vast improvement over the first-generation model, which relied on only 564 depth measurements over the entire lake.
These four models, together with Rensselaer's food web model, which examines how the lake's ecosystem is affected by nature and human activities, comprise the interconnected environmental management system, which is the heart of the project. The food web model is also being further calibrated with extensive surveys of the lake's algae, plants and animals.
With the Jefferson Project, this monitoring work is being digitized and accelerated, augmented with automated real-time data collection from a customized network of sensors that gather massive amounts of information and transmit it to supercomputers for analysis, modeling, and sophisticated 3-D visualization. The project is also developing new tools, such as image recognition software that identifies plankton from data collected via a GPS-enabled towable camera, as well as state-of-the-art data visualizations that bring new data-driven discoveries to life for scientists, tourists and local residents.
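To give a flavor of the kind of sensor-to-analysis flow described above, here is a minimal, purely illustrative sketch in Python. It is not the project's actual software: the station names, units, and the alert threshold are all hypothetical, and the real system involves far more sophisticated models running on supercomputers.

```python
from dataclasses import dataclass
from statistics import mean

@dataclass
class Reading:
    """One hypothetical sensor sample (names and units are illustrative)."""
    station: str          # e.g. a sensor platform on the lake
    chloride_mg_l: float  # chloride concentration, a road-salt proxy
    temp_c: float         # water temperature

def summarize(readings):
    """Group samples by station, compute mean chloride per station,
    and flag stations above an assumed (not real) alert threshold."""
    ALERT_MG_L = 20.0  # hypothetical threshold, not from the project
    by_station = {}
    for r in readings:
        by_station.setdefault(r.station, []).append(r.chloride_mg_l)
    return {
        station: {
            "mean_chloride": mean(values),
            "alert": mean(values) > ALERT_MG_L,
        }
        for station, values in by_station.items()
    }

if __name__ == "__main__":
    data = [
        Reading("vertical-profiler-1", 18.2, 6.1),
        Reading("vertical-profiler-1", 23.9, 6.0),
        Reading("tributary-2", 41.5, 5.2),
    ]
    print(summarize(data))
```

In the real project, a step like this would run continuously on embedded computing elements at the sensor platforms, with results feeding the predictive models rather than a simple threshold check.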