

Google’s New Switch Transformer Model Achieves 1.6 Trillion Parameters, Efficiency Gains

Last year, OpenAI wowed the world with its eerily human language generator, GPT-3. The autoregressive model stood at a then-staggering 175 billion parameters, roughly ten times larger than its predecessors. Now, Google is upping Read more…
