Tag: data quality

Do You Have Customer Data You Can Trust?

Sep 3, 2020

Customer data is the new business success factor, yet over 76% of the 4,500+ companies we’ve worked with did not have accurate customer data. Surprised? Here’s more.

Forty-seven percent of these companies did not have any data quality management solutions in place, while the rest relied on ETL methods to make sense of increasingly complex customer data. Read more…

Poor Data Hurts COVID-19 Contact Tracing, CDC Director Says

Jun 30, 2020

The lack of accurate and timely data about COVID-19 infections has hampered the government’s ability to perform contact tracing, the director of the Centers for Disease Control and Prevention, Robert Redfield, testified today in Congress. Read more…

Precisely CEO Discusses the Rebrand from Syncsort

May 20, 2020

Syncsort last week changed its name to Precisely, a move that CEO Josh Rogers says was all about showcasing how the company has pivoted its strategy to an emerging category of tools centered on data integrity. Read more…

COVID-19 Has a Data Governance Problem

May 6, 2020

The COVID-19 pandemic has brought many of the world’s activities to a standstill. Scientists and government officials have been working furiously behind the scenes to forecast the spread and growth of the virus since the first observed cases were reported to the World Health Organization (WHO). Read more…

How the Lack of Good Data Is Hampering the COVID-19 Response

Apr 27, 2020

We don’t have perfect data on COVID-19. That much is painfully obvious. But is the data we have good enough to make sound decisions? That’s the question that many people are asking as political leaders consider lifting social-distancing restrictions and resuming public life following the unprecedented coronavirus lockdown. Read more…

Coming to Grips with COVID-19’s Data Quality Challenges

Apr 21, 2020

The COVID-19 pandemic is generating enormous amounts of data. Figures on infection rates, hospital admissions, and deaths per 100,000 are available with just a few clicks. Read more…

‘Inclusive’ Approach Seen as Key to Trusted AI

Mar 25, 2020

Trustworthy AI for enterprise applications remains elusive, frequently due to poor-quality or siloed data. The result is little confidence in early AI deployments, a vendor survey found.

Dataiku, the enterprise AI platform specialist, used the results of its survey of about 400 data scientists, analysts, and AI application users to make the case for “inclusive AI.” That approach not only democratizes data but also improves its quality and, with it, the potential for scaling AI projects and earning application users’ trust. Read more…

Room for Improvement in Data Quality, Report Says

Jan 23, 2020

A new study commissioned by Trifacta is shining a light on the costs of poor data quality, particularly for organizations implementing AI initiatives. The study found that dirty and disorganized data are linked to AI projects that take longer, are more expensive, and do not deliver the anticipated results. Read more…

Syncsort Doubles Down on Data Quality with Pitney Bowes Buy

Dec 6, 2019

Here’s a stat to ponder: With its $700-million acquisition of Pitney Bowes’ software and data business now complete, Syncsort becomes the second biggest vendor in the data quality space, per 2018 figures from IDC. Read more…

Databricks Donates Delta Code to Open Source

Apr 24, 2019

Databricks today announced that it’s open sourcing the code behind Databricks Delta, the Apache Spark-based product it designed to help keep data neat and clean as it flows from sources into its cloud-based analytics environment. Read more…