February 10, 2022

IDC Survey Illustrates Importance of Purpose-Built Enterprise AI Infrastructure

NEEDHAM, Mass., Feb. 10, 2022 — International Data Corporation (IDC) recently debuted its AI InfrastructureView, a deep-dive benchmarking study of infrastructure and infrastructure-as-a-service adoption trends for artificial intelligence and machine learning (AI/ML) use cases. To be run as an annual global survey of 2,000 IT decision makers, line-of-business executives, and IT professionals, the majority of whom influence purchasing of AI infrastructure, services, systems, platforms, and technologies, the research offers deep insights into the infrastructure requirements of enterprises investing in AI initiatives.

The survey results show that while AI/ML initiatives are steadily gaining traction, with 31% of respondents saying they now have AI in production, most enterprises are still in an experimentation, evaluation/test, or prototyping phase. Of the 31% with AI in production, only one third claim to have reached a mature state of adoption in which the entire organization benefits from an enterprise-wide AI strategy. For organizations investing in AI, improving customer satisfaction, automating decision making, and automating repetitive tasks are the top three stated organization-wide benefits.

“IDC research consistently shows that inadequate infrastructure, or the lack of purpose-built infrastructure capabilities, is often the cause of AI projects failing,” said Peter Rutten, research vice president and global research lead on Performance Intensive Computing Solutions. “With this in mind, IDC set out to probe deeper into the way organizations evaluate and invest in infrastructure solutions as part of their AI strategy. Our findings and analysis provide a wealth of data points for vendors and service providers to address the needs of their clients and prospects.”

Key findings from IDC’s AI InfrastructureView 2021 research include:

AI infrastructure remains one of the most consequential but least mature infrastructure decisions that organizations make as part of their future enterprise. Organizations have still not reached a level of maturity in their AI infrastructure, whether in initial investments, in realizing the benefits and return on those investments, or in ensuring that the infrastructure scales to meet the needs of the business. High costs remain the biggest barrier to investment, leading many to run their AI projects in shared public cloud environments. Upfront costs are high, leading many to cut corners and thus exacerbate the issue. People, process, and technology remain the three key areas where challenges lie and where organizations must focus their investments for greater opportunity.

Dealing with data is the biggest hurdle for organizations as they invest in AI infrastructure. Businesses lack the time to build, train, and deploy AI models, and say that much of their AI development time is spent on data preparation alone. Many also lack the expertise or the ability to prepare data, which is giving rise to a new market for pre-trained AI models. However, like anything off the shelf, pre-trained models have their limitations, including model availability and adaptability, infrastructure limitations to run the model, and insufficient internal expertise. Model sizes are also growing, making it challenging to run them on general-purpose infrastructure. Organizations do expect that once they have crossed this hurdle, they will shift their efforts to AI inferencing.

AI infrastructure investments are following familiar patterns in terms of compute and storage technologies on premises, in the public cloud, and at the edge. Businesses are increasing their investments in public cloud infrastructure services, but for many, on premises is and will remain the preferred location. Today, AI training and inferencing deployments are divided roughly equally among cloud, on-premises, and edge environments. However, many businesses are shifting toward AI data pipelines that span their datacenter, the cloud, and/or the edge. Edge offers operational continuity where network connectivity is limited or unavailable; security/compliance and cost also play a role. GPU-accelerated compute, host processors with AI-boosting software, and high-density clusters are top requirements for on-premises/edge and cloud-based compute infrastructure for AI training and inferencing. FPGA-accelerated compute, host processors with AI-boosting software or on-premises GPUs, and HPC-like scale-up systems are the top three priorities for on-premises/edge-based compute infrastructure for AI inferencing. In the cloud, the highest-ranked priorities are GPU acceleration and a host processor with AI boost, followed by high-density clusters. More AI workloads use block and/or file storage than object storage at this point.

“It is clear to us that most organizations have embarked or will imminently embark on their AI journey,” said Eric Burgener, research vice president, Storage and Converged System Infrastructure at IDC. “What is becoming clearer is that gaining consistent, reliable, and compressed time to insights and business outcomes requires investments in purpose-built and right-sized infrastructure.”

“Performance Intensive Computing (PIC), the process of performing large-scale, mathematically intensive computations to process large volumes of data or execute complex instruction sets as quickly as possible, is a strategic research area for IDC,” said Ashish Nadkarni, group vice president, Worldwide Infrastructure at IDC. “PIC solutions are commonly used in artificial intelligence; modeling and simulation, also known as high-performance computing (HPC); and Big Data and analytics (BDA) use cases.”

The IDC report, AI InfrastructureView 2021: Executive Summary (IDC #US48398821), examines adoption and trends among IT buyers. It looks at current and future AI infrastructure investments, adoption rates, workloads, and economics among IT customers and service providers. AI InfrastructureView helps IT companies prioritize their AI infrastructure investments and offers insights on the overall impact on infrastructure decisions, workloads, personas, solution selling, open source versus commercial software markets, deployment locations, and more. It provides quantitative, worldwide insight into the impact of machine and deep learning workflows, AI-infused applications, and analytics technologies on the infrastructure software and hardware markets.

About IDC

International Data Corporation (IDC) is the premier global provider of market intelligence, advisory services, and events for the information technology, telecommunications, and consumer technology markets. With more than 1,100 analysts worldwide, IDC offers global, regional, and local expertise on technology, IT benchmarking and sourcing, and industry opportunities and trends in over 110 countries. IDC’s analysis and insight help IT professionals, business executives, and the investment community make fact-based technology decisions and achieve their key business objectives. Founded in 1964, IDC is a wholly owned subsidiary of International Data Group (IDG), the world’s leading tech media, data, and marketing services company. To learn more about IDC, please visit www.idc.com. Subscribe to the IDC Blog for industry news and insights.


Source: IDC
