Observability Primed for a Breakout 2023: Prediction
In 2023, companies will continue to invest in upgrading their IT systems via legacy modernization, cloud-native adoption, and the embrace of big data, analytics, and AI. It doesn’t take a soothsayer to know this. What’s much less certain is companies’ ability to stay on top of all the growing technical complexity, a role that observability will be asked to fill.
Observability, security, and business analytics will converge in 2023 as organizations begin to tame the data explosion, says Dynatrace founder and Chief Technology Officer Bernd Greifeneder.
“The continued explosion of data from multicloud and cloud-native environments, coupled with the increased complexity of technology stacks, will lead organizations to seek new, more efficient ways to drive intelligent automation in 2023,” Greifeneder says. “It’s not just the huge increase in payloads transmitted, but the exponential volumes of additional data, which can be harnessed to gain better observability, enhanced security, and deeper business insights. However, the prevalence of siloed monitoring tools that offer insights into a single area of the technology stack or support an isolated use case has impeded progress in accessing this value, making it difficult to retain the context of data. It also results in departmental silos, as each team remains focused on its own piece of the puzzle, rather than combining data to reveal the bigger picture.
“To address this,” he continues, “observability, security, and business analytics will converge as organizations consolidate their tools and move from a myriad of isolated and hard-to-manage DIY tools to multi-use, AI-powered analytics platforms that offer BizDevSecOps teams the insights and automation they need. This will help to tame clouds and the data explosion and drive intelligent automation across multiple areas, from cloud modernization to regulatory compliance and cyber forensics.”
The demand for data observability tools will increase among customers in 2023, says Rex Ahlstrom, CTO of Syniti.
“The data migration process involves many components,” he says. “There could be numerous cloud service providers and application vendors. You need a deeper grasp of the state of the data in your systems and the effects that state, whether positive or negative, has on your business in order to deal with it all. By using data observability, you can evaluate the health of your data and your capacity to troubleshoot and resolve issues before they worsen. The idea actually derives from DevOps and the capacity to break down barriers between IT operations and development in order to create products collaboratively, quickly and iteratively. The benefits of DevOps are essentially copied by data observability and applied to mission-critical company data. You’ll be inefficient and end up paying more than necessary if you don’t give your teams the tools they need to fully comprehend the effects data is having on the business. This is why I anticipate a rise in this trend in 2023.”
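What such a data health check might look like in practice can be sketched in a few lines of Python. The table name, thresholds, and statistics below are hypothetical illustrations of the idea, not part of any Syniti product.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical health checks a data observability tool might run
# against a table before downstream jobs consume it.

@dataclass
class TableStats:
    name: str
    row_count: int
    null_key_count: int      # rows missing a primary key
    last_updated: datetime   # time of the most recent load

def check_health(stats: TableStats,
                 max_null_rate: float = 0.01,
                 max_staleness: timedelta = timedelta(hours=24)) -> list[str]:
    """Return a list of human-readable issues; an empty list means healthy."""
    issues = []
    if stats.row_count == 0:
        issues.append(f"{stats.name}: table is empty")
    elif stats.null_key_count / stats.row_count > max_null_rate:
        issues.append(f"{stats.name}: null-key rate exceeds {max_null_rate:.0%}")
    if datetime.now(timezone.utc) - stats.last_updated > max_staleness:
        issues.append(f"{stats.name}: data is stale")
    return issues

stats = TableStats("orders", row_count=10_000, null_key_count=500,
                   last_updated=datetime.now(timezone.utc) - timedelta(hours=2))
for issue in check_health(stats):
    print(issue)   # flags the 5% null-key rate before it worsens downstream
```

The point, per Ahlstrom, is that such checks surface a degrading table before the business feels the impact, rather than after.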
Vendors will respond to the critical need for actionable observability across data infrastructure, data lifecycle, and data tools, says Gopal Dommety, CEO of OpsMx.
“In large companies, different groups take different approaches to deploying new applications and data sets, and they continue to deploy different new tools to manage their new environments,” he says. “The result is tool sprawl, workflow snags and compliance gaps. But in a world of ever-growing cyberthreats, evolving privacy regulations, and economic pressure to do more with less, actionable intelligence based on observability into the flow of data is essential for data security, data management and developer productivity. Logging into each tool to obtain the necessary insight is highly inefficient and error prone to the point of undermining productivity and leading to dangerous gaps in knowledge. In 2023, we expect development and management teams to rebel against this inefficiency and demand solutions that can automatically roll up the necessary insight from all their tools into a single dashboard that enables immediate action.”
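The kind of roll-up Dommety describes, collapsing per-tool status into one actionable view, can be sketched as follows. The tool names and status functions are hypothetical stand-ins for real integrations, not any vendor’s API.

```python
from typing import Callable

# Hypothetical per-tool status feeds (CI, security scanner, deploy
# pipeline) that would normally require logging into each tool.

def ci_status() -> dict:
    return {"tool": "ci", "status": "pass", "detail": "142/142 tests green"}

def scanner_status() -> dict:
    return {"tool": "security-scan", "status": "fail", "detail": "2 critical CVEs"}

def deploy_status() -> dict:
    return {"tool": "deploy", "status": "pass", "detail": "v1.8.3 live in prod"}

def roll_up(sources: list[Callable[[], dict]]) -> dict:
    """Aggregate per-tool status into a single actionable summary."""
    results = [fetch() for fetch in sources]
    failing = [r for r in results if r["status"] != "pass"]
    return {
        "overall": "action-needed" if failing else "healthy",
        "failing": [r["tool"] for r in failing],
        "results": results,
    }

dashboard = roll_up([ci_status, scanner_status, deploy_status])
print(dashboard["overall"], dashboard["failing"])  # action-needed ['security-scan']
```

A single summary like this is what lets a team act immediately instead of hunting through each tool for the failing piece.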
Observability will be the new face of digital transformation and digital experience, says Splunk Chief Strategy Officer Ammar Maraqa.
“Accelerated digital transformation is here to stay,” Maraqa says. “The complexity of greater, faster digital transformation is a challenge for most organizations. The tools to manage that complexity become even more important.”
2023 will see the beginnings of a convergence of testing and observability, predicts Ryan Vesely, vice president of global solution engineering at Sauce Labs.
“We’re starting to see consolidation in both the market and in the personas we’re all chasing. Testing companies are offering monitoring, and monitoring companies are offering testing. This is a natural outcome of the industry’s desire to move toward true observability: deep understanding of real-world user behavior, synthetic user testing, passively watching for signals, and doing real-time root-cause analysis, all in service of perfecting the customer experience. The widespread and rapid adoption of OpenTelemetry and OpenTracing (and their implementation into many testing tools) is indicative of what’s coming.”
Data observability will become mainstream and the key to scaling AI for business, says Nicolas Sekkaki, the general manager of applications, data, and AI at Kyndryl.
“Without a robust, secure data foundation and DataOps, it will be hard to scale and democratize data consumption, and consumers will find it hard to trust the quality and reliability of the data they are provided,” he says.
In 2023, the “single source of truth (SSOT) fabric” will emerge as a key focus for achieving digital immunity, according to Mathivanan Venkatachalam, a vice president at ManageEngine.
“Building an SSOT fabric to streamline data collection and enhance predictive analytics will be a priority in 2023, and these capabilities will pave the way for proactive data-driven IT management,” he says. “AI-powered augmentations will allow IT teams to achieve full observability, improve performance, enhance threat detection, and implement proper security management. This will come in handy as attackers become more sophisticated and high-quality malware improves. We expect to see a rise in the usage of AI and ML platforms for devising attack plans, and SSOT fabrics will play an increasing role in defense.”
The risk of IT outages will grow in 2023, thanks to the economic climate and tech and IT layoffs, says Mohan Kompella, BigPanda vice president of product marketing.
“Due to the current economic climate and an increasing number of tech and IT layoffs, the risk of IT outages will inherently increase in 2023 as enterprises are forced to do more with less,” he says. “These outages will lead to longer, more frequent business disruptions that impact customer experiences and cost organizations upwards of $12,000 per minute. Enterprises will have to take a close look at existing observability and network management tools, their value, or the lack thereof, and any redundancies, to create more efficient processes at a lower cost.”
Observability will be the key to getting more from data in 2023, says Alexander Lovell, the head of product at Fivetran.
“Organizations want to be smarter with the data in their modern data stacks. But can it be trusted? Is it correct? Is the data up to date? Observability will be an important trend in 2023. By gaining trust in the data, new projects will be unlocked. With solid observability in place, organizations have fewer regulatory hoops to jump through, driving efficiencies and cost savings.”
Data observability investments are immune to the economic environment, according to Andy Petrella, founder and CPO of Kensu.
“While inflation and recession hit the global economy and stock markets plunge, funds will keep investing in the data observability industry, which is already worth billions in valuation,” he says. “This new solution category focuses on helping data teams better understand data usage and troubleshoot data incidents faster, saving money and ensuring data teams are more efficient and free to focus on revenue-generating initiatives.”
Editor’s note: This story has been updated.