October 5, 2015

Data ‘Assimilation’ Seen as Weak Link in U.S. Weather Forecasts

Second-guessing over the relative accuracy of the U.S. weather forecasting model compared to its European counterparts resumed last week as early predictions that Hurricane Joaquin would strike the U.S. east coast proved inaccurate. Aside from the sodden residents of some southeastern coastal states, most were spared when the hurricane, after pounding the Bahamas, drifted out into the Atlantic Ocean.

Early models from the U.S. National Weather Service had the Category 4 storm hitting the east coast anywhere from North Carolina to New England. Late last week, the U.S. model had the Mid-Atlantic region in the storm’s crosshairs.

The storm’s behavior hewed closer to models and forecasts from the European Centre for Medium-Range Weather Forecasts, which unlike the American model consistently forecast Joaquin would turn away from the east coast and drift out to sea.

The “European model” also correctly forecast Hurricane Sandy’s “left hook” into the New Jersey coast in 2012, and did so well in advance of landfall. The U.S. model largely failed to predict the unusual turn inland.

U.S. weather experts say many of the problems with the current U.S. weather forecasting model, known as the Global Forecast System, or GFS, have to do with “data assimilation.” That process gathers available weather observations to build a description of the current state of the atmosphere, which serves as the starting point for model runs. The data set used for weather models includes clouds and radiation along with precipitation and winds aloft.
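Data assimilation is commonly posed as a statistical estimation problem: blend a short-range “background” forecast with new observations, weighting each by its expected error. The Python sketch below illustrates the idea with a toy 3D-Var-style analysis step over three gridpoints; the state, observation operator, and error covariances are hypothetical values chosen purely for illustration, not anything drawn from the GFS or ECMWF systems.

```python
import numpy as np

# Toy 3D-Var-style analysis step (a minimal sketch; operational
# assimilation systems are vastly larger and more sophisticated).
# x_b : background state (prior forecast), here 3 gridpoint temperatures
# y   : observations, here 2 point measurements
# H   : observation operator mapping model space to observation space
# B, R: background- and observation-error covariances (assumed known)

x_b = np.array([288.0, 290.0, 285.0])      # background temperatures (K)
y   = np.array([289.5, 284.0])             # observed temperatures (K)
H   = np.array([[1.0, 0.0, 0.0],
                [0.0, 0.0, 1.0]])          # obs taken at gridpoints 0 and 2
B   = 1.0  * np.eye(3)                     # background error covariance
R   = 0.25 * np.eye(2)                     # observation error covariance

# The analysis x_a minimizes the 3D-Var cost function
#   J(x) = (x - x_b)' B^-1 (x - x_b) + (y - Hx)' R^-1 (y - Hx)
# For a linear H this has the closed form x_a = x_b + K (y - H x_b),
# where K = B H' (H B H' + R)^-1 is the gain matrix.
K   = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
x_a = x_b + K @ (y - H @ x_b)

print("background:", x_b)
print("analysis:  ", x_a)  # pulled toward the observations, weighted by error
```

Because the observation errors here are smaller than the background errors, the analysis leans toward the measurements; a model run started from a poor analysis inherits those initial-condition errors, which is why forecasters treat assimilation quality as decisive.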

Storm forecasts tend to be only as reliable as the detailed atmospheric conditions fed into models. Indeed, there seemed to be little confidence last week in most of the U.S. models as Joaquin approached the U.S. east coast.

“It is clear that our [data] initializations are inferior,” Cliff Mass, a professor of atmospheric sciences at the University of Washington, told the New York Times. The Europeans “have taken a more sophisticated approach,” Mass added. “There’s a subtlety that the European center is getting right that we’re not.”

The U.S. moved swiftly after Hurricane Sandy to beef up the computing power behind GFS forecasts as one way to increase the resolution, or detail, of its forecasting models. Congress authorized $23.7 million for the National Weather Service to purchase forecasting tools and supercomputer infrastructure to strengthen the U.S. forecasting model. The upgrades are expected to provide a ten-fold increase in U.S. computing capacity.

But critics argue it will take more than raw computing power for the GFS to improve. They note that the European model assimilates more satellite data, such as radiation from clouds. The European approach also makes greater use of historical weather data, giving forecasters a picture of how storms are evolving over time rather than a single snapshot.
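The snapshot-versus-evolution distinction the critics draw can be made concrete with a toy example. The hypothetical Python sketch below contrasts an analysis that uses only observations at a single time with one that fits a simple model trajectory to observations spread across a time window, in the spirit of time-window (“4D-Var”-style) assimilation; the scalar model and all numbers are invented for illustration and bear no relation to the actual ECMWF implementation.

```python
import numpy as np

# Contrast a "snapshot" analysis (one observation time) with a
# time-window analysis (a hypothetical scalar example).
m    = 1.05          # toy linear "model": the state grows 5% per step
x_b  = 10.0          # background estimate of the initial state
b, r = 4.0, 1.0      # background and observation error variances

# Observations taken at several times inside the assimilation window
times = np.array([0, 1, 2, 3])
y     = np.array([10.8, 11.5, 12.0, 12.8])

# Snapshot: use only the observation at the analysis time (t=0)
x_snap = (x_b / b + y[0] / r) / (1.0 / b + 1.0 / r)

# Time window: fit the initial state so the model trajectory m**t * x0
# matches ALL observations in the window (weighted least squares, the
# closed form of the windowed cost function for a linear scalar model)
w     = m ** times
x_win = (x_b / b + np.sum(w * y) / r) / (1.0 / b + np.sum(w * w) / r)

print("snapshot analysis:    %.3f" % x_snap)
print("time-window analysis: %.3f" % x_win)
# The windowed estimate exploits how the observations evolve in time,
# constraining the whole trajectory rather than a single instant.
```

The windowed fit extracts information from the trend in the observations that the snapshot analysis simply cannot see, which is the intuition behind the critics’ point.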

The subtleties of weather forecasting lie in this meteorological big data. Despite the torrent of satellite and other sensor data available to U.S. forecasters, they remain behind their European counterparts in turning it into accurate models and reliable storm forecasts.
