December 12, 2019

Machine Learning Hits a Scaling Bump

Source: Algorithmia

Based on our reporting over the last year, you might conclude that machine learning technology has entered the mainstream with countless workloads in production. While developers are indeed moving up the learning curve, a vendor survey finds they still have a way to go in scaling ML deployments.

Algorithmia’s annual survey of the state of enterprise machine learning technology released on Thursday (Dec. 12) did find an uptick in machine learning deployments over the past year. A modest 22 percent of companies surveyed report they have transitioned machine learning models to production.

As developers and data scientists roll up their sleeves, several pain points have emerged, ranging from version control and model reproducibility to “executive buy-in” and “aligning stakeholders,” the survey found. A key stumbling block is the familiar problem of scaling ML deployments. For example, half of the developers polled said they require up to 90 days to deploy a single machine learning model.

Hence, Algorithmia concludes the state of enterprise machine learning is at the “fledgling but maturing” phase across most industries with software and IT vendors leading the charge. Investments in machine learning projects are growing, often by up to 25 percent over last year, with the heaviest investments coming from the banking, IT and manufacturing sectors.

Those investments reflect the prioritization of data science hiring across many industries as adopters struggle to ramp up machine learning deployments. Half of those polled said they employ as many as 10 data scientists. Surprisingly, that percentage is lower than in last year’s survey.

“Companies are growing their investments in machine learning, and machine learning operationalization is maturing across all industries, but significant room for growth and improvement remains,” said Diego Oppenheimer, CEO of Seattle-based Algorithmia.

“The model deployment lifecycle needs to continue to be more efficient and seamless for ML teams,” Oppenheimer added. “Nevertheless, companies with established ML deployment lifecycles are benefiting from measurable results, including cost reductions, fraud detection and customer satisfaction. We expect these trends to continue as ML technologies and processes arrive to market and are adopted.”

In an interview earlier this year, Oppenheimer stressed his company’s focus on deployment. “If you think about the four areas in machine learning workflow — data prep, model training, deployment, and model management — we do the last two,” Oppenheimer told Datanami. “We help companies facilitate a way of getting models into production quicker, which makes it so that data scientists can actually work in a better way.”

That deployment approach includes packaging a machine learning model or function in a Docker container, deploying it to preferred hardware, then managing it via Kubernetes.
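As a rough illustration of that pattern, the sketch below shows the kind of lightweight Python inference service that typically gets baked into a Docker image and run as a Kubernetes deployment. The Flask wrapper, the /predict endpoint and the model.pkl artifact are illustrative assumptions for this sketch, not Algorithmia’s actual API.

    # Minimal sketch of a model-serving wrapper of the kind that is packaged
    # into a Docker container and managed as a Kubernetes deployment.
    # The endpoint name and "model.pkl" artifact are hypothetical.
    import pickle

    from flask import Flask, jsonify, request

    app = Flask(__name__)

    # Load a previously trained model when the container starts.
    with open("model.pkl", "rb") as f:
        model = pickle.load(f)

    @app.route("/predict", methods=["POST"])
    def predict():
        # Expect a JSON payload such as {"features": [[1.0, 2.0, 3.0]]}
        features = request.get_json()["features"]
        prediction = model.predict(features)
        return jsonify({"prediction": prediction.tolist()})

    if __name__ == "__main__":
        # Inside a container this would normally sit behind a production
        # WSGI server; Kubernetes handles replication and health checks.
        app.run(host="0.0.0.0", port=8080)

Once a model is wrapped this way, scaling becomes a matter of adding container replicas rather than reworking the data science code, which is the “last-mile” step the survey identifies as a bottleneck.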

The survey also notes that other key ML tools and applications have matured over the past year, including Google’s third-generation Tensor Processing Unit, Intel’s Nervana neural network processor and indications of custom AI hardware from Microsoft.

Hence, the survey concludes, machine learning technologies are expected to become ubiquitous across enterprises in the next year as data scientists overcome teething problems and move up the lifecycle learning curve.

“The bottom line is that we do see a shift toward greater ML maturity in all companies surveyed,” Algorithmia concludes, with the caveat that the percentage of production models will remain low. The key will be overcoming so-called “last-mile” deployment issues while boosting the sophistication of deployed models.

Algorithmia said it received 745 responses to its industry survey, with results compiled in the fall. The annual machine learning report is available here.

Recent items:

Algorithmia Laser-Focused on ML Deployment and Management

Intel Debuts New VPU Chip at AI Summit

Google Unleashes TPUs on Cloud ML Engine
