News
8/23/2011 04:42 PM

IBM Develops Better Flood Prediction System

Ready for flood modeling as a service? Check out how IBM and researchers at the University of Texas at Austin are trying to give earlier flood warnings.

The next time the Guadalupe River in Texas floods, IBM researchers may be able to warn area residents and officials beforehand.

Disaster recovery will be front and center for IT teams this week, given the earthquake that struck the East Coast Tuesday, plus the impending arrival of Hurricane Irene. (See our related coverage of how NYSE IT held up during the earthquake.)

IBM, working in conjunction with researchers at the University of Texas at Austin, has demonstrated the ability to predict river behavior 100 times faster than real-time. In theory, this should allow warnings about impending floods several days before rivers actually overflow.

Such flood prediction, like any weather modeling, requires significant computational and analytical power. IBM's approach is particularly calculation-intensive because it includes river tributaries in its number-crunching; traditional approaches to flood prediction focus on main river stems and overlook tributaries. Hence the project's reliance on IBM POWER7 systems.
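The scale problem is easy to see in a toy model: every tributary adds reaches whose flow has to be routed before the main stem can be updated. The sketch below is a minimal illustration of that traversal, not IBM's model; the reach names, the linear-reservoir routing, and all parameter values are assumptions, whereas the IBM/UT system solves the actual equations of river flow on a GIS representation of the network.

```python
# Toy routing of a river network that includes tributaries (illustrative only).
from dataclasses import dataclass, field

@dataclass
class Reach:
    name: str
    upstream: list = field(default_factory=list)  # tributary reaches feeding this one
    lateral_inflow: float = 0.0                   # local runoff entering this reach, m^3/s
    storage: float = 0.0                          # water currently stored in the reach, m^3
    k: float = 7200.0                             # linear-reservoir storage constant, seconds

def step(reach, dt):
    """Advance one time step and return the outflow (m^3/s) at the reach outlet."""
    # Route the tributaries first: their outflows become part of this reach's inflow.
    inflow = reach.lateral_inflow + sum(step(r, dt) for r in reach.upstream)
    # Linear reservoir: storage rises with net inflow, outflow is storage / k.
    reach.storage += (inflow - reach.storage / reach.k) * dt
    return reach.storage / reach.k

# Toy network: a main stem fed by two tributaries (names and flows are invented).
trib_a = Reach("Tributary A", lateral_inflow=40.0)
trib_b = Reach("Tributary B", lateral_inflow=25.0)
main = Reach("Main stem", upstream=[trib_a, trib_b], lateral_inflow=10.0)

for hour in range(6):
    q = step(main, dt=3600.0)
    print(f"hour {hour}: main-stem outflow {q:.1f} m^3/s")
```

Even in this toy, every additional tributary adds work to every time step; on a basin the size of the Mississippi, that multiplication is what pushes the problem onto hardware of the POWER7 class.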

IBM has posted a video on YouTube that demonstrates its flood prediction technology.

"The challenge here is to be able to model the flow of water in very large scale river networks, like the Mississippi River Basin or even the whole continental United States," said David R. Maidment, professor of civil engineering at the University of Texas, Austin, in a phone interview. "And that capacity has existed for the weather community for a long time. Even in the oceans community...we can model how [hurricanes] move through the Gulf of Mexico. But there's never been any capability to model similar scale things in hydrology, especially when it comes to solving the real equations of river flow on a proper GIS representation of the rivers. That's what IBM has succeeded in doing."

Maidment says that models exist for the main stems of rivers, but not for the flows coming in through the tributaries. He notes that it was the tributaries that flooded in the recent Mississippi flooding. "The flooding that happened down around Vicksburg, Miss., was because the tributaries couldn't get into the Mississippi," he said.

Maidment is working with IBM not only to improve flood modeling capabilities, but also to develop flood modeling as a service. Such services, he suggests, could be useful to keep emergency responders aware of moving flood boundaries and to warn residents of flooded areas when necessary.

Flood modeling as a service is being made possible by the development and adoption of a specification called WaterML, an XML-based markup language for communicating hydrology data. In recent years, the USGS and the National Weather Service have adopted WaterML and are publishing their water-related data in that format.
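Because WaterML is ordinary XML, standard tooling can consume it. Below is a minimal sketch of reading a WaterML-style streamflow series with Python's standard library; the sample document is hand-written and abbreviated, and the CUAHSI WaterML 1.1 namespace and element layout shown here are assumptions rather than a complete rendering of the real schema or of the USGS and National Weather Service feeds.

```python
import xml.etree.ElementTree as ET

# Hand-written, simplified stand-in for a WaterML response (assumption: the
# real schema carries much more metadata, such as site codes, qualifiers, and methods).
SAMPLE = """\
<timeSeriesResponse xmlns="http://www.cuahsi.org/waterML/1.1/">
  <timeSeries>
    <variable><variableName>Streamflow</variableName><unit>cfs</unit></variable>
    <values>
      <value dateTime="2011-08-23T12:00:00">410</value>
      <value dateTime="2011-08-23T12:15:00">455</value>
      <value dateTime="2011-08-23T12:30:00">512</value>
    </values>
  </timeSeries>
</timeSeriesResponse>
"""

NS = {"wml": "http://www.cuahsi.org/waterML/1.1/"}

root = ET.fromstring(SAMPLE)
for series in root.findall("wml:timeSeries", NS):
    name = series.findtext("wml:variable/wml:variableName", namespaces=NS)
    unit = series.findtext("wml:variable/wml:unit", namespaces=NS)
    print(f"{name} ({unit}):")
    for obs in series.findall("wml:values/wml:value", NS):
        print(f"  {obs.get('dateTime')}  {obs.text}")
```

A common data language matters here because it lets model output and gauge observations from different agencies flow into the same analysis pipeline without per-source parsers.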

"What's gradually happening is the water information of the country, particularly that related to flooding, is coming together in a common language," said Maidment. "And what we're doing with IBM is to develop what we're calling virtual observation points all along the river, where the values of the water level and the flow rate are being calculated rather than being measured."

Present flood-warning systems rely on only a few physical data-collection points. The goal is a widespread network of virtual observation points that gives better situational awareness during floods.
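As a purely hypothetical illustration of what a dense set of virtual observation points might look like to an emergency-response consumer, the sketch below hard-codes modeled stage values at several river miles and flags the ones at or above local flood stage. In the real system those values would come from the flow-equation model; every number and threshold here is invented for the example.

```python
from dataclasses import dataclass

@dataclass
class VirtualPoint:
    river_mile: float      # distance along the river, in miles
    stage_ft: float        # modeled water level at this point, in feet
    flood_stage_ft: float  # level at which flooding begins here, in feet

def flooded_miles(points):
    """Return the river miles where the modeled stage meets or exceeds flood stage."""
    return [p.river_mile for p in points if p.stage_ft >= p.flood_stage_ft]

# Hypothetical model output along a short stretch of river; in practice these
# stage values would be produced by the flow model, not typed in by hand.
points = [
    VirtualPoint(river_mile=10.0, stage_ft=12.3, flood_stage_ft=18.0),
    VirtualPoint(river_mile=10.5, stage_ft=19.1, flood_stage_ft=18.0),
    VirtualPoint(river_mile=11.0, stage_ft=21.4, flood_stage_ft=20.0),
    VirtualPoint(river_mile=11.5, stage_ft=15.2, flood_stage_ft=20.0),
]

print("Flooding predicted near river miles:", flooded_miles(points))
```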

