Vendors » NVIDIA

Features

IBM, Microsoft, and Nvidia Pit AI Against Wildfires

Apr 2, 2020 |

In 2018, California experienced its worst wildfire season on record, with uncontrolled fires causing unprecedented death and devastation. 2019 saw another difficult season, with millions of residents experiencing precautionary power shutoffs. Read more…

An Open Source Alternative to AWS SageMaker

Jan 27, 2020 |

There’s no shortage of resources and tools for developing machine learning algorithms. But when it comes to putting those algorithms into production for inference, outside of AWS’s popular SageMaker, there’s not a lot to choose from. Read more…

2019: A Big Data Year in Review – Part One

Dec 18, 2019 |

At the beginning of the year, we set out 10 big data trends to watch in 2019. We correctly called some of what unfolded, including a renewed focus on data management and continued rise of Kubernetes (that wasn’t hard to see). Read more…

How 5G Will Serve AI and Vice Versa

Dec 10, 2019 |

5G is the future of the edge. Though it’s still several years away from widespread deployment, 5G is a key component in the evolution of cloud-computing ecosystems toward more distributed environments. Read more…

Inside OmniSci’s Plans for Data Analytics Convergence

Oct 23, 2019 |

You may know OmniSci as the provider of a fast SQL-based database that runs on GPUs. But the company formerly known as MapD is moving beyond its GPU roots and is building a data platform that runs on CPUs and does machine learning too, a vision that it shared at its inaugural Converge conference in Silicon Valley this week. Read more…

News In Brief

Tech Conferences Are Being Canceled Due to Coronavirus

Mar 2, 2020 |

Several conferences scheduled to take place in the coming weeks, including Nvidia’s GPU Technology Conference (GTC) and the Strata Data + AI conference, have been canceled due to fears of the COVID-19 coronavirus. Read more…

Microsoft Details Massive, 17-Billion Parameter Language Model

Feb 12, 2020 |

Microsoft this week took the covers off Turing Natural Language Generation (T-NLG), a massive new deep learning model that it says will push the bounds of AI in the field of natural language processing (NLP). Read more…

GPU Storage Approach Targets Big Data Bottlenecks

Aug 9, 2019 |

An emerging storage technology aims to leverage faster GPUs by creating a direct path between local and remote storage, thereby overcoming I/O bottlenecks that are slowing the crunching of AI and HPC data sets. Read more…

Booz Allen Gives Government a Deep Learning Edge

Mar 28, 2019 |

The latest breakthroughs in deep learning technology have emanated from places like Silicon Valley and Toronto, where Turing Award-winner Geoffrey Hinton did his seminal work. But deep learning applications today are finding their way to Washington D.C., often through the assistance of government contractors like Booz Allen Hamilton (BAH). Read more…

RAPIDS Momentum Builds with Analytics, Cloud Backing

Mar 19, 2019 |

Nvidia’s RAPIDS data science libraries picked up additional support this week from a roster of cloud, server and workstation vendors along with professional services giant Accenture.

Nvidia also announced that Apache Spark creator Databricks will integrate RAPIDS into its analytics platform. Read more…

This Just In

NVIDIA Launches Accelerator for Supercomputing and Big Data Analytics

Nov 18, 2013 |

NVIDIA today unveiled the NVIDIA Tesla K40 GPU accelerator, the world's highest performance accelerator ever built, delivering extreme performance to a widening range of scientific, engineering, high performance computing (HPC) and enterprise applications. Read more…