Joao-Pierre S. Ruth
February 6, 2023
3 Min Read
In the wake of Hurricane Dorian, waves crash onto the beach near the Ocean View Fishing Pier in the Ocean View section of Norfolk. The area, situated just minutes from Naval Station Norfolk and Joint Expeditionary Base Little Creek-Fort Story in Virginia Beach, is prone to coastal flooding. (US Navy Photo by Max Lonzanida/Released) APFootage via Alamy Stock Photo
There is much debate over whether climate change is responsible for recent bouts of extreme weather, but regardless, efforts are underway to use machine learning and data to better prepare for floods and other weather-borne calamities.
Sridhar Katragadda, lead data scientist for the City of Virginia Beach, Va., works on a water monitoring project built on data analytics company HEAVY.ai’s platform. Once processed, the sensor data can alert residents to roadway flooding, predict where future flooding may occur, and inform the city’s urban development and planning for flooding and safety.
Having seen the impact of devastating hurricanes in the past, Katragadda says it is important to understand the environment and how surface water flows. A decade ago, few sensors were in place to offer such insight. “That prompted me to think we need to sensorize Virginia Beach as a city to address hurricanes and flooding,” he says.
Before building its own sensor network, Katragadda says, Virginia Beach looked to the network established by the Iowa Flood Center at the University of Iowa for inspiration.
The Virginia Beach sensorization effort, he says, would ultimately include areas such as Newport News, Va., to offer a broader scope of data. “Water levels are provided every six minutes and they are autonomous sensors,” he says. “They can be in bridges, ponds, or reservoirs.” The sensors offer a look at areas such as intersections that may be prone to flooding.
The city has also worked with the United States Geological Survey (USGS), placing cameras at two bridges. “The cameras are pointed to a staff gauge,” Katragadda says, referring to a graduated ruler set in a body of water to measure the water surface level. Every six minutes, a picture is taken of the staff gauge. “We wrote machine learning models on that camera so it can tell automatically what the water level is,” he says.
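The city's actual models are not public, but the workflow described above can be sketched in miniature: a vision model (stubbed out here) locates the waterline in each six-minute image, and a calibration maps that pixel position to a reading on the gauge. All names and numbers below are illustrative assumptions, not Virginia Beach's code.

```python
# Hypothetical sketch of reading a staff gauge from a camera image.
# A detection model (not shown) would supply `waterline_row`, the pixel
# row where the waterline crosses the gauge; calibration rows/values for
# the gauge in the camera frame are assumed known from a one-time survey.

def pixel_to_gauge_feet(waterline_row: int, top_row: int, bottom_row: int,
                        top_feet: float, bottom_feet: float) -> float:
    """Linearly interpolate a pixel row on the gauge to a water level in feet."""
    frac = (waterline_row - top_row) / (bottom_row - top_row)
    return top_feet + frac * (bottom_feet - top_feet)

# Example calibration: the gauge spans pixel rows 100 (8.0 ft mark)
# to 900 (0.0 ft mark); the waterline is detected at row 500.
level = pixel_to_gauge_feet(waterline_row=500, top_row=100, bottom_row=900,
                            top_feet=8.0, bottom_feet=0.0)
```

The linear interpolation is the simple part; in practice most of the work is in the detection model and in handling glare, debris, and night-time images.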
Environmental and Weather Analysis
There is interest in expanding sensor usage and data collection for environmental and weather analysis, Katragadda says, given that consumer weather sensors are already available for citizens to install at their homes. “We want to bring all the sensors together in the future,” he says, “from citizen scientists as well as the city.”
The sensor analysis project runs entirely in the cloud on AWS, Katragadda says. Data is drawn from sensor networks that include the USGS and the National Oceanic and Atmospheric Administration. Some private companies also help maintain Virginia Beach’s sensor network. “We bring all of them into the city’s cloud, in real-time, and then use our apps to broadcast that data through the city’s app,” he says.
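Some of the feeds mentioned above are publicly queryable; for instance, USGS publishes recent gage-height readings through its Water Services instantaneous-values API. The sketch below only constructs such a request URL (parameter code 00065 is gage height); the site number is a placeholder, and no network call is made, so this is a starting point rather than the city's ingestion pipeline.

```python
# Build a USGS Water Services instantaneous-values request URL for recent
# gage-height readings at a monitoring site. The site number here is a
# placeholder; real site numbers can be looked up on the USGS site.
from urllib.parse import urlencode

def usgs_gage_height_url(site: str, period: str = "PT6H") -> str:
    """URL querying gage height (USGS parameter 00065) for the given period."""
    params = {
        "format": "json",       # machine-readable response
        "sites": site,          # USGS site number (placeholder below)
        "parameterCd": "00065", # gage height, feet
        "period": period,       # ISO 8601 duration, e.g. last 6 hours
    }
    return "https://waterservices.usgs.gov/nwis/iv/?" + urlencode(params)

url = usgs_gage_height_url("01234567")
```

A production ingester would poll such endpoints on a schedule, parse the JSON time series, and write readings into the city's cloud store.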
By 2019, Katragadda was using HEAVY.ai to help analyze decades of data from severe weather events to understand shifts in water levels. Now, with several years of water level data available, he says attention has turned to getting a machine to learn from the patterns. “We are in the later stages of developing prediction models for these [sensors],” Katragadda says, which could be used to make predictions up to 10 days in advance.
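The article does not describe the city's model architecture, but the idea of learning from water-level patterns can be illustrated with a toy autoregressive model: predict the next reading as a linear function of the last few, fit by least squares, then roll the model forward to forecast. This is a stand-in sketch on synthetic data, not the city's approach.

```python
# Toy autoregressive forecast of a water-level series: fit next-step
# readings as a linear function of the previous `lags` readings, then
# roll the fitted model forward to produce a multi-step forecast.
import numpy as np

def fit_ar(series: np.ndarray, lags: int = 3) -> np.ndarray:
    """Least-squares fit of an AR(lags) model; returns [intercept, coeffs...]."""
    X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
    X = np.column_stack([np.ones(len(X)), X])  # intercept column
    y = series[lags:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

def forecast(series: np.ndarray, coef: np.ndarray, steps: int) -> list:
    """Roll the AR model forward, feeding each prediction back in as input."""
    lags = len(coef) - 1
    hist = list(series[-lags:])
    out = []
    for _ in range(steps):
        nxt = coef[0] + sum(c * v for c, v in zip(coef[1:], hist))
        out.append(nxt)
        hist = hist[1:] + [nxt]
    return out

# Demo on a synthetic, steadily rising series (levels in feet).
series = np.arange(20, dtype=float)
coef = fit_ar(series)
preds = forecast(series, coef, steps=3)
```

A real 10-day model would also fold in rainfall forecasts, tides, and storm tracks rather than extrapolating past levels alone.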
For example, local public works teams might be alerted to areas likely to flood during storms based on the predictive analysis of water levels.
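The alerting step described above reduces to comparing predicted levels against per-location flood thresholds. The locations and numbers below are made up for illustration; they are not the city's actual monitoring sites or thresholds.

```python
# Flag monitored locations whose predicted water level meets or exceeds
# that location's flood threshold, so crews can be dispatched proactively.

def flood_alerts(predicted_ft: dict, thresholds_ft: dict) -> list:
    """Return (sorted) locations predicted to reach their flood threshold."""
    return sorted(loc for loc, level in predicted_ft.items()
                  if level >= thresholds_ft.get(loc, float("inf")))

# Hypothetical predictions and thresholds, in feet.
alerts = flood_alerts(
    {"Shore Dr & Pleasure House Rd": 4.2, "Ocean View Ave": 2.1},
    {"Shore Dr & Pleasure House Rd": 3.5, "Ocean View Ave": 3.0},
)
```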
Satellite images might be added to the mix in the future, Katragadda says, such as from the Surface Water and Ocean Topography satellite launched in December. “That’s going to provide some data in late fall of 2023,” he says, which should include ocean and lake levels. Katragadda says Virginia Beach would like to work with such data to further advance its own project. “For us, the future is looking like bringing these disparate data sets together and then using data science to automatically have statistical summaries of what these models are doing in real-time so we can look into the future.”
About the Author(s)
Joao-Pierre S. Ruth covers tech policy, including ethics, privacy, legislation, and risk; fintech; code strategy; and cloud & edge computing for InformationWeek. He has been a journalist for more than 25 years, reporting on business and technology first in New Jersey, then covering the New York tech startup community, and later as a freelancer for such outlets as TheStreet, Investopedia, and Street Fight. Follow him on Twitter: @jpruth.