June 30, 2020

Poor Data Hurts COVID-19 Contact Tracing, CDC Director Says

(lakshmiprasada S/Shutterstock)

The lack of accurate and timely data about COVID-19 infections has hampered the government’s ability to perform contact tracing, the director of the Centers for Disease Control and Prevention, Robert Redfield, testified today in Congress.

“There are a number of counties that are still doing this pen and pencil,” Redfield said during a Senate committee meeting on President Trump’s coronavirus response. “They really are in need of aggressive modernization.”

Many of the country’s healthcare networks have embarked upon extensive IT modernization projects as a result of the 2009 HITECH Act, signed by President Obama, which mandated the use of electronic health records (EHRs) in place of paper files. However, many areas of the country, particularly rural areas that serve smaller populations, have not yet shifted to EHRs.

The responsibility for tabulating and aggregating COVID-19 case numbers falls to local public health departments, which are often tied to county governments. According to Redfield, a “substantial investment” is needed to modernize the systems that they use. “It would be one of the great investments in our time to make this happen once and for all,” he said.

Poor data is hampering COVID-19 contact tracing, CDC Director Robert Redfield says

The COVID-19 pandemic has put the onus on public health officials to collect and analyze data quickly. While other nations have a single, top-down healthcare system that can work in unison, the American approach is much more dispersed, involving stakeholders in private industry and public officials at the local, state, and federal levels. Those layers complicate efforts to gather and analyze data quickly.

To be worthwhile, contact tracing requires data that is accurate and fresh. Unfortunately, those two data qualities appear to be in short supply. Contact tracing “doesn’t have any value unless you could do it in real time,” Redfield said.

“We need to have a comprehensive integrated public health data system that’s not only able to do something that’s in real time, but actually can be predictive,” Redfield continued. “We have a moment in time where I think people are attuned, and I would say now’s the time to make the necessary investment in our public health, at the local, territorial, tribal, state and federal level, so that this nation finally has the public health system not only that it needs, but that it deserves.”

The poor state of healthcare data should not be a surprise to Datanami readers. We have been sharing stories from front-line responders about the extensive manual efforts required to gather “heads and beds” data from hospitals, which share data only on a voluntary basis.

The poor state of COVID-19 data has led a handful of data analytics vendors to wade into the fray. Talend launched an effort to normalize and de-duplicate COVID-19 data so that it can be actionable, while AtScale partnered with Snowflake to make sense of it. Others, such as Hitachi Vantara, have suggested that it would have been good to create a national data catalog for COVID-19 along the lines of Data.gov. But alas, it’s too late to build it now.

To prepare for the next pandemic, hospitals and healthcare leaders should work together to enable faster access to critical data, said Frank Nothaft, the technical director for healthcare and life sciences at Databricks.

“Ultimately, data scientists working with legacy EHR architectures spend more time freeing data from their EHR, and less time building innovative models that can improve patient outcomes,” Nothaft said. “Pushing healthcare systems to open interoperable systems that enable seamless analytics across hospitals has been a major priority in the data science world recently.”

Related Items:

How the Lack of Good Data Is Hampering the COVID-19 Response

Coming to Grips with COVID-19’s Data Quality Challenges

Data Transparency: Lessons from COVID-19