Using supercomputers, computational scientists in Texas simulated Hurricane Harvey's storm surge, producing information that proved valuable to rescue workers.
In the wake of Hurricane Harvey, first responders didn't have complete on-the-ground information; they didn't know the flooding conditions of every neighborhood and street. But the computer model spelled out where the water was, how deep it was and possible routes into an area. The same approach was then applied to Hurricane Irma, which is still moving through the Southeast.
“The ability to get information -- reliable information -- to people quickly in many of these events is paramount,” said Niall Gaffney, an astronomer by training who is director of Data Intensive Computing at the Texas Advanced Computing Center (TACC).
Computer-generated models can tell the story of flooding and storm surge. Without them, first responders "would be operating in the complete dark and just taking guesses about what they should do,” said Gaffney.
Texas has some powerful advantages. TACC, based at the University of Texas at Austin, runs one of the largest supercomputers in the world, Stampede 2, a $30 million federally funded system. The system's vendor is Dell, and it uses Intel chips and Seagate storage. It is currently capable of 12 petaflops; once fully built out, it will deliver about 18 petaflops.
The computers are doing this work at high resolution. Harvey was modeled at a 25-meter resolution, meaning grid cells of 25-by-25 meters, or about the size of a city block, across a very large region. Continuing advances in supercomputing and storm modeling will lead to even greater detail.
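To get a feel for what that resolution implies, here is a back-of-the-envelope sketch of how grid cell counts grow as the cells shrink. The region size used here is a made-up illustration, not the actual model domain.

```python
# Illustrative sketch (not the actual model domain): counting the
# square cells needed to tile a region at a given resolution.

def cell_count(region_km: float, resolution_m: float) -> int:
    """Number of square cells needed to tile a region_km x region_km area."""
    cells_per_side = (region_km * 1000) / resolution_m
    return int(cells_per_side ** 2)

region = 100  # hypothetical 100 km x 100 km coastal region

coarse = cell_count(region, 25)  # 25 m cells, as used for Harvey
fine = cell_count(region, 5)     # 5 m cells, the future target

print(coarse)          # 16000000
print(fine)            # 400000000
print(fine // coarse)  # 25 -> 25x more cells at 5 m resolution
```

Even this modest hypothetical region needs 16 million cells at 25 meters, which is why such models run on supercomputers rather than workstations.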
Clint Dawson, a computational scientist at the University of Texas, is one of the researchers working on open source software used in storm surge and flooding, called ADCIRC (Advanced Circulation Model). This is a program “for solving the equations of motion for a moving fluid on a rotating earth.”
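In broad strokes, the "equations of motion for a moving fluid on a rotating earth" that ADCIRC solves are the depth-averaged shallow water equations. A simplified textbook form (ADCIRC itself solves a reformulated version, and the full model includes additional forcing and friction terms) is:

```latex
% Continuity: changes in water surface elevation \zeta follow the
% divergence of depth-integrated flow (H = total water depth)
\frac{\partial \zeta}{\partial t}
  + \frac{\partial (UH)}{\partial x}
  + \frac{\partial (VH)}{\partial y} = 0

% x-momentum: acceleration balances Coriolis (f, the "rotating earth"
% term), the pressure gradient from the sloping surface, and forcing F_x
\frac{\partial U}{\partial t}
  + U \frac{\partial U}{\partial x}
  + V \frac{\partial U}{\partial y}
  - fV = -g \frac{\partial \zeta}{\partial x} + F_x
```

Here $U$ and $V$ are depth-averaged velocities, $\zeta$ is the surface elevation above mean sea level, and $g$ is gravitational acceleration; the $y$-momentum equation is analogous.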
“This is the first time that we've been able to really test the system on a real hurricane for Texas at high resolution, and it performed incredibly well,” said Dawson.
What Dawson expects for the future is an ability to get to a 5-meter resolution, which will bring it to the point where “you could actually look at impacts on structures.”
The amount of computing power needed will be another two or more orders of magnitude, said Dawson. The models will get larger and scale to take advantage of that power.
With advances in computer power, surge and flood models will be able to account for rainfall and erosion impacts on a storm surge. They could use information about structures to develop a picture of what will happen when a surge hits, building by building. Will the surge be powerful enough, for instance, to knock a house or chemical tank off its foundation?
This type of information may lead to better hardening of shelters, better planning and more thorough information about the impacts of a storm. Emergency officials will be able to tell people whether they can shelter in place.
Gaffney said the team's work is “giving people a good view of areas that cannot be viewed,” thanks to the power of high performance computing systems.
The models change “the way you work with the data the same way Google and others have changed the way people work with the Internet. It’s become a tool in your tool box and you rely on it and that’s where we are going right now," said Gaffney.
This supercomputer-generated information is also prompting emergency operations staff to ask new questions about storm surge and the flooding retreat. From “where is the water,” to “when can I actually get into this neighborhood?” said Gaffney.
The modeling can also save lives, because it can tell people exactly what will happen to their neighborhoods if they decide to ride out the storm, said Gaffney.
Patrick Thibodeau is a reporter covering enterprise technologies, including supercomputing, workplace trends and globalization. His work on outsourcing and H-1B visa issues has been widely cited, read on the floor of the U.S. Senate, and has received national awards.