June 10, 2014

Climate Data Being Analyzed to Predict California Droughts

California’s prolonged drought and the threat it poses to the Golden State’s huge agricultural sector have prompted some technology companies to harness big data as a way to gauge future water supplies across the parched state.

Meanwhile, a new NASA climate probe is expected to generate a database of information about the Earth’s carbon sinks and emissions.

The climate situation on the U.S. West Coast is becoming so dire that California’s Central Valley is even experiencing declines in a local phenomenon known as “tule fog.”

Hence, California-based companies like Intel have launched big data efforts to examine the climate impact of agricultural irrigation. Another effort will attempt to map snowfall in California’s Sierra Nevada mountain chain, which provides much of the water used for irrigation in the San Joaquin and Sacramento valleys.

Intel is using snow-pack data collected in the Sierras to help predict drought conditions in areas like the Central Valley where the bulk of the nation’s fruits and vegetables are grown. Much of the nation’s dairy production also is located in California.

Another Intel big data effort seeks to promote a “precision farming” effort in which sensors are used in farm fields to monitor soil moisture. California’s vineyards have long been using “drip irrigation” systems as a way to conserve water and target irrigation at critical times during the growing season.
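
To make the precision farming idea concrete, here is a minimal sketch of a threshold-based irrigation check driven by in-field soil-moisture readings. The field-capacity, wilting-point, and refill values, as well as the sensor names, are illustrative assumptions; the article does not describe the actual system Intel or California growers use.

```python
# Minimal sketch of a soil-moisture irrigation check (illustrative values only;
# not the actual Intel/precision-farming implementation).

FIELD_CAPACITY = 0.35   # assumed volumetric water content at field capacity
WILTING_POINT = 0.12    # assumed volumetric water content at wilting point
REFILL_FRACTION = 0.5   # irrigate once half the available water is depleted

def needs_irrigation(soil_moisture: float) -> bool:
    """Return True when a reading falls below the refill threshold."""
    available = FIELD_CAPACITY - WILTING_POINT
    threshold = WILTING_POINT + REFILL_FRACTION * available
    return soil_moisture < threshold

# Readings from three hypothetical in-field sensors
readings = {"block_a": 0.31, "block_b": 0.21, "block_c": 0.18}
for block, moisture in readings.items():
    print(block, "irrigate" if needs_irrigation(moisture) else "hold")
```

In practice a grower would tune the thresholds to crop and soil type, but the basic idea is the same: water only the blocks whose sensors say they need it, rather than irrigating the whole field on a fixed schedule.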

What’s new with the Intel big data initiative is that the chipmaker is working with the University of California-Davis to scale up these irrigation techniques to the larger agricultural operations that dominate the state economy. Intel predicts a data-driven approach could reduce the amount of water used in irrigation by 50 percent.

The big data push is also part of a larger trend to expand the use of data analytics beyond business applications. Intel researchers said the two water initiatives would help create “reference architectures” that could eventually be applied across other industries.

“With Big Data, we’re starting to see a secular movement across all industries,” Vin Sharma, Intel’s director of planning and marketing for Hadoop, told the web site Data Center Knowledge.

The water projects were selected partly because of the large amounts of data they could generate. For example, snow depth is measured using a remote sensing technique that estimates coverage based on reflectance on a scale of one to seven.

The technique generates about 2 terabytes of data every two weeks, Intel’s Sharma said.
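
As a rough illustration of how such reflectance data might be turned into a snow-cover estimate, the sketch below classifies a grid of reflectance values on the article’s one-to-seven scale using an assumed cutoff. The cutoff and the synthetic tile are assumptions for illustration; the actual retrieval algorithm behind the snowpack project is not described in the article.

```python
# Minimal sketch: estimate snow-cover fraction from gridded reflectance classes
# (1-7 scale per the article). The cutoff below is an assumption, not the
# project's actual classification rule.

import numpy as np

SNOW_CUTOFF = 5  # assumed: reflectance classes >= 5 treated as snow-covered

def snow_cover_fraction(reflectance_grid: np.ndarray) -> float:
    """Fraction of grid cells whose reflectance class meets the snow cutoff."""
    return float(np.mean(reflectance_grid >= SNOW_CUTOFF))

# Example: a small synthetic tile of reflectance classes (values 1-7)
rng = np.random.default_rng(0)
tile = rng.integers(1, 8, size=(512, 512))
print(f"Estimated snow cover: {snow_cover_fraction(tile):.1%}")
```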

Broader remote sensing projects measuring various climate factors are also generating large amounts of data. For example, mission planners for NASA’s upcoming Orbiting Carbon Observatory-2 mission predict that a trio of spectrometers carried by the satellite will collect about 10 megabytes of science data each day on the Earth’s carbon sinks and carbon emissions.

NASA’s carbon-sniffing satellite is scheduled to launch from Vandenberg Air Force Base in California on July 1.

One goal of the Intel big data project is to create a database of snowpack images that could be used to predict the breadth of California’s current drought.

NASA mission planners said science data collected by the carbon observatory would be used to create a “geophysical record” that can be applied by climate scientists to monitor the impact of rising atmospheric carbon dioxide.

Related Items:

Farmers Plant for Hyper-Local Forecasts with IBM’s ‘Deep Thunder’

A Weather Forecast Specific to your Back Yard?

Amazon Hosting 20 TB of Climate Data
