
Tag: BERT

An All-Volunteer Deep Learning Army

A team of researchers says their novel framework, called Distributed Deep Learning in Open Collaborations, or DeDLOC, has the potential to train large deep learning models from scratch in a distributed, grid-like manner. Read more…

FinTech Firm Explores Named Entity Extraction

Founded in 2018, San Francisco-based Digits Financial combines machine learning and analytics to give businesses insights into their transactions, automatically identifying patterns, classifying data, and detecting anomalies. Read more…

Inside eBay’s Optimization Techniques for Scaling AI

Getting the software right is important when developing machine learning models, such as recommendation or classification systems. But at eBay, optimizing the software to run on a particular piece of hardware using… Read more…

Unlocking the True Potential of ML: How Self-Supervised Learning in Language Can Beat Human Performance

A core goal for many organizations using artificial intelligence (AI) systems is to have them mirror human language and intelligence. However, mimicking human language and mastering its unique complexities continues to be a challenge. Read more…

Nvidia Inference Engine Keeps BERT Latency Within a Millisecond

It’s a shame when your data scientists dial in the accuracy on a deep learning model to a very high degree, only to be forced to gut the model for inference because of resource constraints. But that will seldom be the case. Read more…

Google’s ‘Breakthrough’ LaMDA Promises to Elevate the Common Chatbot

Many of Google’s language processing efforts – like BERT and, more recently, MUM – are focused on returning search queries. But as Google moves more toward Assistant – and search queries in general become more… Read more…

Google’s ‘MUM’ Search AI Aims to Move Beyond Simple Answers

Google’s current search answers may seem complex compared to a few years ago, but to hear the search giant talk about it, this is just the beginning – and there’s a long, long way to go. Now, Google is introducing MUM. Read more…

Experts Disagree on the Utility of Large Language Models

Large language models like OpenAI’s GPT-3 and Google Brain’s Switch Transformer have caught the eye of AI experts, who have expressed surprise at the rapid pace of improvement. However, not everybody is jumping on the bandwagon; some see significant limitations in the new technology, as well as ethical implications. Read more…

Baidu Releases PaddlePaddle Upgrades

An updated release of Baidu’s deep learning framework includes a batch of new features ranging from inference capabilities for Internet of Things (IoT) applications to a natural language processing (NLP) framework. Read more…
