Tag: TPU

The Future of AI Is Hybrid

Artificial intelligence today is largely something that occurs in the cloud, where huge AI models are trained and deployed on massive racks of GPUs. But as AI makes its inevitable migration into the applications and d Read more…

OpenXLA Delivers Flexibility for ML Apps

Machine learning developers gained new abilities to develop and run their ML programs on the framework and hardware of their choice thanks to the OpenXLA Project, which today announced the availability of key open source Read more…

Salesforce Taps LLM for Programming Boost with CodeGen

Large language models (LLMs) like GPT-3 are capturing the imaginations of data scientists around the world, thanks to their advanced capability to understand and generate text. Now researchers at Salesforce have leverage Read more…

Google’s Massive New Language Model Can Explain Jokes

Nearly two years ago, OpenAI’s 175 billion-parameter GPT-3 language model opened the world’s eyes to what large language models (LLMs) could accomplish with relatively little input, sensibly answering questions, tran Read more…

Google’s New Switch Transformer Model Achieves 1.6 Trillion Parameters, Efficiency Gains

Last year, OpenAI wowed the world with its eerily human language generator, GPT-3. The autoregressive model stood at a then-staggering 175 billion parameters, ten times higher than its predecessors. Now, Google is upping Read more…

Machine Learning Hits a Scaling Bump

Based on our reporting over the last year, you might conclude that machine learning technology has entered the mainstream with countless workloads in production. While developers are indeed moving up the machine-learning Read more…

In Search of a Common Deep Learning Stack

Web serving had the LAMP stack, and big data had its SMACK stack. But when it comes to deep learning, the technology gods have yet to give us a standard suite of tools and technologies that are universally accepted. Desp Read more…

Training Time Slashed for Deep Learning

Fast.ai, an organization offering free courses on deep learning, claimed a new speed record for training a popular image database using Nvidia GPUs running on public cloud infrastructure. A pair of researchers trained Read more…

Google Unleashes TPUs on Cloud ML Engine

As the amount of machine learning training data soars, so too does demand for new tools that will accelerate the process. With that in mind, Google Cloud announced the beta release of a new feature that allows users to s Read more…

H2O Ups AI Ante Via Nvidia GPU Integration

H2O.ai, the machine-learning specialist that unveiled its "Driverless AI" platform this past summer, is upgrading the system via integration with GPU specialist Nvidia's AI development system. The Mountain View, Calif Read more…

Nvidia’s Huang Sees AI ‘Cambrian Explosion’

The processing power and cloud access to developer tools used to train machine-learning models are making artificial intelligence ubiquitous across computing platforms and data frameworks, insists Nvidia CEO Jensen Huang. Read more…
