August 17, 2021

Big Data Tool Enables Real-Time Disaster Recovery Monitoring


The Southeast is bracing for a heavier-than-normal hurricane season while California continues to face down the second-largest wildfire in its recorded history – and with climate change continuing to worsen, that’s just the beginning. Many agencies are turning to novel technologies to help the country weather the weather, and now researchers at Texas A&M University have developed a new big data-powered framework for assessing post-disaster recovery across communities.

There are many metrics for disaster recovery, perhaps most famously the Waffle House Index, which gauges the state of a town’s recovery by the functionality of its Waffle House operations. The new tool is somewhat more advanced: the researchers leveraged anonymized cell phone data (provided by SafeGraph) from users in Harris County – home of Houston, Texas – before, during, and after the devastation of Hurricane Harvey.

The data contained information on daily visits and location type for over 55,000 points of interest in the Houston area. By sorting this enormous amount of data by category and viewing it over time, the researchers could get a bird’s eye view of how and when the people in the community began visiting essential and nonessential services again in the wake of the storm. This information has crucial applications for disaster management services and agencies.
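In practical terms, that kind of analysis boils down to aggregating daily visit counts by POI category and comparing them against a pre-storm baseline. The sketch below is purely illustrative – the column names, file path, and recovery threshold are assumptions, not the study's actual code – but it shows how such a recovery curve could be computed with pandas:

import pandas as pd

# Hypothetical schema: one row per point of interest per day, with columns
#   date, poi_category (e.g., "grocery", "hospital"), daily_visits
visits = pd.read_csv("poi_daily_visits.csv", parse_dates=["date"])

# Total daily visits per POI category.
daily = (
    visits.groupby(["poi_category", "date"])["daily_visits"]
    .sum()
    .reset_index()
)

# Pre-disaster baseline: mean daily visits per category before landfall.
LANDFALL = pd.Timestamp("2017-08-25")  # Hurricane Harvey's Texas landfall
baseline = (
    daily[daily["date"] < LANDFALL]
    .groupby("poi_category")["daily_visits"]
    .mean()
    .rename("baseline_visits")
    .reset_index()
)

# Recovery ratio: visits on a given day relative to the pre-storm baseline.
daily = daily.merge(baseline, on="poi_category")
daily["recovery_ratio"] = daily["daily_visits"] / daily["baseline_visits"]

# First date after landfall when each category returns to ~95% of baseline
# (an assumed threshold for "recovered").
recovery_dates = (
    daily[(daily["date"] > LANDFALL) & (daily["recovery_ratio"] >= 0.95)]
    .groupby("poi_category")["date"]
    .min()
)
print(recovery_dates)

Comparing those recovery dates across categories – and across neighborhoods, if the data are also grouped spatially – is what lets analysts see which areas bounce back quickly and which lag behind.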

“Neighboring communities can be impacted very differently after a natural catastrophic event,” said Ali Mostafavi, a civil engineer at Texas A&M University and one of the authors of the paper, which was published in Interface. “We need to identify which areas can recover faster than others, and which areas are impacted more than others so we can allocate resources to areas that need them more.”

This new method offers several advantages over existing ways of measuring resilience, which rely on monitoring specific metrics such as hospital capacity or on post-hoc surveys that ask individuals how they were affected and how they are recovering. But those surveys, of course, take months to yield results.

“In addition to being faster than surveys, these research methods avoid some human errors such as memory failure, they have some privacy-preserving advantages, and they don’t require time and effort by the people affected,” added Jacqueline Meszaros, a program director in NSF’s Directorate for Engineering. “When we can learn about resilience without imposing on those who are still recovering from a disaster, it’s a good thing.”

To learn more about this research, read the paper, which was published in the April 2021 issue of Interface as “Quantifying community resilience based on fluctuations in visits to points-of-interest derived from digital trace data”. The paper was written by Cristian Podesta, Natalie Coleman, Amir Esmalian, Faxi Yuan, and Ali Mostafavi.
