October 29, 2020

Intel Acquires Model Optimizer SigOpt

Intel Corp. is acquiring AI optimization software vendor SigOpt, a move the chip maker said would complement its existing AI software portfolio while integrating SigOpt’s tools with its AI hardware to accelerate and scale AI software used by model developers.

The acquisition also addresses the growing complexity of machine learning and neural network models and the resulting inability of hardware to keep pace.

Terms of the transaction announced Thursday (Oct. 29) were not disclosed. Intel (NASDAQ: INTC) said it expects the acquisition to close by the end of this quarter.

San Francisco-based SigOpt’s co-founders and brain trust, CEO Scott Clark and CTO Patrick Hayes, will join Intel’s machine learning team.

SigOpt was founded in 2014 to create a commercial product from Clark’s academic research at Cornell University on Bayesian optimization techniques. Combined with Intel’s AI computing and machine learning capabilities, Clark said SigOpt’s optimization software would help “unlock entirely new AI capabilities for modelers.”

SigOpt’s AI software is designed to boost productivity and performance across hardware and software parameters, resulting in more accurate and better performing machine learning models—even as complexity grows.
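SigOpt has not published the code behind the platform Intel is buying, but the technique the company commercializes, Bayesian optimization of model hyperparameters, can be sketched with the open-source scikit-optimize library. The objective function, search space and hyperparameter names below are illustrative stand-ins, not SigOpt's API:

# Minimal sketch of Bayesian hyperparameter optimization using scikit-optimize.
# The objective and search space are made up for demonstration only.
from skopt import gp_minimize
from skopt.space import Real, Integer

def validation_loss(params):
    # Stand-in for "train the model and return its validation loss";
    # a real objective would fit a model with these hyperparameters.
    learning_rate, hidden_units = params
    return (learning_rate - 0.01) ** 2 + (hidden_units - 128) ** 2 / 1e4

search_space = [
    Real(1e-4, 1e-1, prior="log-uniform", name="learning_rate"),
    Integer(16, 512, name="hidden_units"),
]

# Bayesian optimization: fit a Gaussian-process surrogate to past trials
# and use it to pick the next hyperparameters to evaluate.
result = gp_minimize(validation_loss, search_space, n_calls=20, random_state=0)
print("best hyperparameters:", result.x, "best loss:", result.fun)

Because the surrogate model learns from every completed trial, Bayesian methods tend to reach good configurations in far fewer training runs than grid or random search, which is the efficiency argument behind tools like SigOpt's.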

“SigOpt’s AI software platform and data science talent will augment Intel software, architecture, product offerings and teams,” said Raja Koduri, Intel’s senior vice president and general manager of architecture, graphics and software.

The startup previously attracted the attention of In-Q-Tel, the investment arm of U.S. intelligence agencies, which eventually acquired a stake in the AI software developer.

Among the company’s strengths is its focus on metrics used to improve the performance of machine learning models.

The SigOpt deal therefore addresses concerns raised last year by Naveen Rao, vice president and general manager of Intel’s AI Products Group. Neural networks have grown so big, Rao noted, with so many parameters to calculate, that AI hardware is unable to keep up.

“The trend to be aware of is the number of parameters–call this the complexity of the model,” Rao said during Intel’s most recent AI summit. “The number of parameters in a neural network model is actually increasing on the order of 10x year-on-year. This is an exponential that I’ve never seen before.”

“AI is driving the compute needs of the future,” Intel’s Koduri added in announcing the SigOpt deal. “It is even more important for software to automatically extract the best compute performance while scaling AI models.”

Recent items:

SigOpt Within In-Q-Tel’s Parameters

Why Getting the Metrics Right is So Important in Machine Learning

Deep Learning Has Hit a Wall, Intel’s Rao Says

 
