Forecasting Katrina

Scientists use a slew of technologies to predict the impact of hurricanes

Scientists are using a range of technologies to better predict the impact hurricanes can have on the economy and the environment, with the goal of minimizing future damage and saving lives.

Supercomputers, modeling programs and geographic information systems are some of the technologies scientists use to track the movement of hurricanes and predict damage. Experts warn, however, that skilled professionals are as crucial to accurate forecasting as technology.

Supercomputers aided the National Oceanic and Atmospheric Administration in accurately forecasting Hurricane Katrina's path. The storm devastated the coastal areas of Alabama, Louisiana and Mississippi.

"Two and a half to three days before the hurricane hit, we were pretty much zoomed in on the Louisiana/Mississippi Gulf Coast as where the hurricane would hit," said Jack Beven, a hurricane specialist at the NOAA Tropical Prediction Center. "It's probably not the most accurate we've been, but it's certainly pretty accurate."

Agency officials say predictions about the intensity of hurricanes Isabel and Frances may have been more accurate.

"The one reason perhaps that we had a little trouble earlier in the storm's life was that we underestimated how much the high-pressure ridge over the southern [United States] would push the hurricane southward," Beven said. The storm grew stronger than expected, particularly over the Gulf of Mexico, where it became a Category 5 hurricane, he added.

Forecasters calculate the power and route of a hurricane largely with the assistance of an IBM-built supercomputer.

Supercomputer models play an important role when a storm is about three to five days from the coast, said Kevin Cooley, director of central operations at NOAA's National Centers for Environmental Prediction. With Hurricane Katrina, "there was a lot of warning provided. The storm basically went where the forecasters thought it was going to go," he said.

In the past year, NOAA's National Weather Service increased its supercomputing capacity threefold from 0.5 teraflops to 1.5 teraflops as part of its long-term contract with IBM.

But computer scientists say the federal government needs smarter people, not just smarter technology, to improve hurricane-intensity forecasting.

"We often look at the hardware and become enamored with that race car," said Jack Dongarra, a computer science professor at the University of Tennessee and co-author of the Top 500 list of the world's fastest supercomputers. "But there's more to it than the race car. We need to more effectively drive that race car."

The United States needs more physicists, mathematicians and other specialists to develop weather-forecasting equations for the supercomputers, he said.

"It starts with having faster computers, clearly, and then more accurate information going into the computers, and then better models used in the calculation," Dongarra said.

The occurrence of brilliant minds, however, cannot be predicted, he added.

"The observations: We know what we need to do. We can do more of them," Dongarra said. "As far as the technology things: Moore's Law will continue into the future and double the computing power every 18 months. One of the things that is unpredictable is improving the models. That comes from creative people understanding what's happening to the physics of the sea and the land."

Other researchers say more satellite observations are needed, along with human intelligence, to fuel forecasts.

"For continuous observation, nothing beats a satellite," said Chuck Watson, founder of disaster planning firm Kinetic Analysis, which operates a hurricane damage estimation Web site in cooperation with the University of Central Florida. "We need to be smarter about how we ingest and use observations in real time. We have a lot of data that doesn't make it into the current generation of forecast models effectively, in my opinion."

Watson's computer simulations, available at hurricane.methaz.org, tabulated the mammoth financial losses caused by Katrina. As of 8 a.m. Aug. 30, the computer simulations calculated that Katrina would inflict $50 billion worth of wind and flood damage by the storm's end.

The Web site monitors storms worldwide and lists estimates of how much damage specific storms are likely to cause based on wind models, topography, storm tracks and exposure data. Although the project's official mission is to support the state of Florida, which likely sustained $1.4 billion worth of damage from Katrina, the site has attracted thousands of federal visitors.
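
Kinetic Analysis has not published the internals of its model, but the basic recipe the article describes, combining a modeled wind field with exposure data through a vulnerability curve, can be sketched in a few lines. Everything below is a hypothetical illustration: the ZIP codes, dollar values, wind speeds and the cubic damage curve are invented, not the site's actual parameters.

```python
# Minimal illustration of wind-damage estimation: hypothetical numbers,
# not Kinetic Analysis's actual model or data.

def vulnerability(wind_speed_mph: float) -> float:
    """Fraction of a structure's value lost at a given peak wind speed.

    A simple power-law curve: no damage below 50 mph, total loss by 160 mph.
    Real models vary this by construction type, topography and flooding.
    """
    if wind_speed_mph <= 50:
        return 0.0
    return min(1.0, ((wind_speed_mph - 50) / 110) ** 3)

# Hypothetical exposure: (location, insured value in dollars, modeled peak wind)
exposure = [
    ("ZIP 70112", 250_000_000, 125),
    ("ZIP 39501", 180_000_000, 140),
    ("ZIP 36528",  90_000_000, 105),
]

total = sum(value * vulnerability(wind) for _, value, wind in exposure)
print(f"Estimated wind damage: ${total / 1e9:.2f} billion")
```

A production model would presumably layer in storm surge, flooding and the construction-type details Watson mentions later in the article; this sketch shows only the wind piece.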

Last week, 9,760 .mil users, 6,977 .gov users and 2,097 FEMA employees viewed the site. During calm weather, the site has about 200 .gov visitors a week.

"Five days out, we were showing that this could knock out a quarter of the gulf's oil production for more than a month and would cause some disruption over 90 percent of the gulf's oil production," Watson said, adding that many government agencies were specifically looking at the oil impact and toxic cleanup data.

The damage statistics are generated by two 64-processor Beowulf clusters, which he called a poor man's supercomputer. One cluster processes meteorological data, while the other analyzes consequences.
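
The article says only that one cluster handles the meteorology and the other the consequences. One way to picture that division of labor is a two-stage pipeline in which one pool of workers produces wind fields and a second pool turns them into loss estimates; the sketch below uses Python's multiprocessing module and invented numbers, and is not Watson's actual code.

```python
# Illustrative two-stage pipeline mirroring the meteorology/consequences split.
# The wind-field and loss functions are placeholders with made-up numbers.
from multiprocessing import Pool

def wind_field(timestep: int) -> list[float]:
    """Stage 1 (the 'meteorology cluster'): peak winds for one timestep."""
    return [80 + timestep * 2 + cell for cell in range(4)]  # hypothetical mph grid

def consequences(winds: list[float]) -> float:
    """Stage 2 (the 'consequences cluster'): convert winds to a loss estimate."""
    return sum(max(0.0, w - 74) * 1e6 for w in winds)  # hypothetical dollars per cell

if __name__ == "__main__":
    with Pool(2) as met_pool:            # stands in for the first cluster
        fields = met_pool.map(wind_field, range(6))
    with Pool(2) as impact_pool:         # stands in for the second cluster
        losses = impact_pool.map(consequences, fields)
    print(f"Total modeled loss: ${sum(losses) / 1e9:.2f} billion")
```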

One of the most difficult tasks in modeling economic losses is collecting enough structural data, Watson said. For Katrina, he had to assign map coordinates to some indiscernible buildings.

"We have good data about certain things — oil platform locations, for example," Watson said. "But for most businesses and for residential damage, all we have is ZIP code-level georeferencing, and little details such as construction type, value, etc., and that introduces some biases and errors."

After government officials fully comprehend the consequences of Katrina, they might be able to mitigate the impact of future storms. For example, the U.S. Geological Survey is studying the extent and cause of Katrina's coastal impact to inform officials who rebuild the area.

With the help of GIS and laser altimetry, oceanographers will compare land surveys from before and after the storm to assess erosion, sand dune destruction and other shoreline changes.
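
At its core, that comparison is a grid-differencing exercise: subtract the pre-storm elevation surface from the post-storm surface and sum the losses. The NumPy sketch below uses made-up elevation grids and an assumed 10-meter cell size rather than USGS lidar data.

```python
# Differencing pre- and post-storm elevation grids to quantify erosion.
# The grids and 10 m cell size are hypothetical, not USGS survey data.
import numpy as np

pre_storm = np.array([[3.2, 3.5, 3.1],
                      [2.8, 3.0, 2.9],
                      [2.5, 2.7, 2.6]])   # elevations in meters
post_storm = np.array([[2.1, 2.4, 2.6],
                       [1.9, 2.2, 2.5],
                       [2.0, 2.3, 2.4]])

cell_area_m2 = 10 * 10                     # assumed 10 m grid spacing
change = post_storm - pre_storm            # negative values indicate erosion
eroded_volume = -change[change < 0].sum() * cell_area_m2

print(f"Mean elevation change: {change.mean():.2f} m")
print(f"Sand volume lost: {eroded_volume:.0f} cubic meters")
```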

"The ultimate goal is to understand how the coast responds to storms and determine which areas are more vulnerable to storm impacts," said Hilary Stockdon, a USGS oceanographer. "That information can be used by coastal planners and managers in developing communities or planning evacuation routes to potentially avoid areas that are more vulnerable than others."

USGS oceanographers have collected preliminary data with aerial-view video and still photographs to contrast with video and photos from July 2001. Latitude and longitude coordinates from the Global Positioning System were merged with the video data, giving researchers a constant point of reference.
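
The article does not detail the merging step, but a common approach is to tag each video frame with the GPS fix closest to it in time. The sketch below assumes timestamped frames and fixes; the times and coordinates are invented for illustration.

```python
# Tag video frames with the nearest-in-time GPS fix (illustrative data only).
from bisect import bisect_left

# (seconds since start of flight, latitude, longitude) -- hypothetical fixes
gps_track = [(0.0, 30.2500, -88.1100),
             (1.0, 30.2504, -88.1093),
             (2.0, 30.2508, -88.1086)]
gps_times = [t for t, _, _ in gps_track]

def georeference(frame_time: float) -> tuple[float, float]:
    """Return the lat/long of the GPS fix closest in time to a video frame."""
    i = bisect_left(gps_times, frame_time)
    candidates = gps_track[max(0, i - 1): i + 1]
    _, lat, lon = min(candidates, key=lambda fix: abs(fix[0] - frame_time))
    return lat, lon

print(georeference(1.4))   # -> coordinates of the fix nearest 1.4 seconds
```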

Stockdon said early observations showed that Dauphin Island, Ala., suffered extensive beach erosion. Homes were destroyed and some areas appeared to have been completely submerged during the hurricane.

Coastal impact

Officials at the U.S. Geological Survey will deliver data from Gulf Coast surveys to other agencies to aid in disaster recovery and alleviate beach erosion.

Here's how the data will help:

  • Aerial video, still photography and laser altimetry surveys of post-storm beach conditions will be compared with earlier data.
  • The comparisons will highlight the magnitude of coastal changes such as beach erosion, overwash deposition and island breaching.
  • Comparison data will help refine predictions of future storms' coastal impacts.

Forecasting power

Before the 2005 hurricane season, the National Oceanic and Atmospheric Administration installed a new generation of weather and climate supercomputers. Now, NOAA's National Weather Service has three systems working together to deliver weather and climate forecasts.

Primary and backup operating systems ensure that forecasts are delivered without interruption, and a research and development system speeds the transition of research results into the operational systems.

The supercomputers greatly increased the agency's computational power, enabling NOAA analysts to perform 1.3 trillion calculations per second. Previously, they were able to perform 450 billion calculations per second.

Increased computing power gives NOAA the ability to run higher-resolution models using more sophisticated applied physics to predict potential severe and extreme weather.

The supercomputers are a part of a $180 million, nine-year contract with IBM. The primary and research systems are located at the IBM facility in Gaithersburg, Md. The backup system is located at a NASA facility in Fairmont, W.Va.

— Rutrell Yasin
