
Tag: GPT-3

AI Experts Discuss Implications of GPT-3

Feb 23, 2021 |

Last July, GPT-3 took the internet by storm. The massive 175 billion-parameter autoregressive language model, developed by OpenAI, showed a startling ability to translate languages, answer questions, and – perhaps most eerily – generate its own coherent passages, poems, and songs when given examples to process.
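The "autoregressive" part means the model produces text one token at a time, conditioning each prediction on everything generated so far. A minimal toy sketch of that loop is below; the bigram lookup table is a stand-in for GPT-3's 175 billion-parameter transformer, and all names here are illustrative, not part of OpenAI's API.

```python
# Toy autoregressive generation: each new token is predicted from the
# tokens produced so far. A hypothetical bigram table stands in for the
# actual language model (illustrative only).
bigram_model = {
    "the": "cat",
    "cat": "sat",
    "sat": "on",
    "on": "the",
}

def generate(prompt_tokens, max_new_tokens=4):
    """Extend the prompt one token at a time, feeding each output back in."""
    tokens = list(prompt_tokens)
    for _ in range(max_new_tokens):
        next_token = bigram_model.get(tokens[-1])  # condition on last token
        if next_token is None:  # no continuation known: stop early
            break
        tokens.append(next_token)
    return tokens

print(generate(["the"]))  # → ['the', 'cat', 'sat', 'on', 'the']
```

A real model replaces the table lookup with a full probability distribution over the vocabulary and samples from it, but the generate-append-repeat loop is the same.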

One Model to Rule Them All: Transformer Networks Usher in AI 2.0, Forrester Says

Feb 16, 2021 |

The recent advent of massive transformer networks is ushering in a new age of AI that will give customers advanced natural language capabilities with just a fraction of the skills and data previously required, according to Forrester.

Google’s New Switch Transformer Model Achieves 1.6 Trillion Parameters, Efficiency Gains

Feb 8, 2021 |

Last year, OpenAI wowed the world with its eerily human language generator, GPT-3. The autoregressive model stood at a then-staggering 175 billion parameters, more than ten times larger than any previous language model.

Low-Code Can Lower the Barrier to Entry for AI

Sep 2, 2020 |

Organizations that want to get started quickly with machine learning may be interested in investigating emerging low-code options for AI. While low-code techniques will never completely replace hand-coded systems, they can help accelerate smaller, less experienced data science teams, as well as help with prototyping for professional data scientists.

OpenAI’s GPT-3 Language Generator Is Impressive, but Don’t Hold Your Breath for Skynet

Jul 21, 2020 |

Just a couple of months after releasing a paper describing GPT-3, AI development and deployment company OpenAI has begun testing its novel, AI-powered language generator with a select group of users through a private beta – and many have been understandably wowed.
