AI Hardware is evolving – and so are we! As machine learning models continue to grow in size and complexity, and more and more models enter production in enterprises worldwide, the way we approach accelerating these workloads is changing.
At the front end, data-centricity is taking precedence over model-centricity. At the back end, AI practitioners want systems that are performant and efficient, but also sustainable, explainable and accountable.
From massive research models like GPT-3 to the day-to-day models deployed by enterprises around the world, we are lifting the hood on how to make AI fast, efficient and affordable.
As AI Hardware Summit evolves, we continue our mission to help those accelerating AI workloads in the cloud and at the edge, and this year is all about systems-level AI acceleration.