March 12, 2024

Grafana Labs’ Second Annual Observability Survey Offers Insights Into the Evolving Observability Market

Observability has become integral to an organization’s ability to gain a deep understanding of its internal systems and to derive actionable insights that improve performance and resilience in today’s complex and dynamic digital landscape.

While the benefits of observability are becoming increasingly evident, organizations often struggle to adapt to the complexity of this evolving space. 

To better understand where organizations are in their observability journeys, Grafana Labs, creator of open-source data visualization tools, has released the findings of its 2024 Observability Survey. Over 300 industry professionals participated in the survey, now in its second edition.

Grafana’s study categorizes observability journeys into three segments: reactive, proactive, and systematic. The reactive approach is the least mature, while the systematic approach is the most mature. Based on the results, 24 percent of organizations take a reactive approach, where customers surface problems before the organization is aware of them.

Around one in five (19 percent) take a systematic approach, with established procedures and tooling in place to flag issues before users report them. The majority of organizations (57 percent) take a proactive approach, focused on preventing issues from reaching customers, with tools that can highlight problems before they have an impact.

One of the key findings of the report is that open-source technology continues to be a vital piece of respondents’ observability stacks, with 98 percent relying on it. As Grafana Labs caters to the open-source community, the findings might be slightly skewed; however, there is no denying that open-source technology is becoming the new de facto standard for software development.

Prometheus remains the mainstay of observability, with three-quarters of respondents continuing to use it today. However, OpenTelemetry is on the rise, with 54 percent reporting that their usage of it has increased compared to a year ago.


Interestingly, one might expect Prometheus and OpenTelemetry to be competing technologies, but the study reveals otherwise: nearly half of Prometheus users also use OpenTelemetry.
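
In practice the two often operate side by side rather than head to head, with OpenTelemetry handling instrumentation and Prometheus handling scraping and storage. The snippet below is a minimal sketch of that pattern in Python, assuming the opentelemetry-sdk, opentelemetry-exporter-prometheus, and prometheus-client packages; the service name, port, and metric names are hypothetical and not drawn from the survey.

    # Sketch: record metrics with the OpenTelemetry API and expose them
    # on a Prometheus scrape endpoint. Assumes opentelemetry-sdk,
    # opentelemetry-exporter-prometheus, and prometheus-client are installed;
    # names and port are illustrative.
    from prometheus_client import start_http_server
    from opentelemetry import metrics
    from opentelemetry.exporter.prometheus import PrometheusMetricReader
    from opentelemetry.sdk.metrics import MeterProvider
    from opentelemetry.sdk.resources import Resource

    # Expose a /metrics endpoint on port 8000 for a Prometheus server to scrape.
    start_http_server(port=8000)

    # Route OpenTelemetry metrics through the Prometheus exporter.
    reader = PrometheusMetricReader()
    provider = MeterProvider(
        resource=Resource.create({"service.name": "checkout-service"}),
        metric_readers=[reader],
    )
    metrics.set_meter_provider(provider)

    # Instrument with the OpenTelemetry API; Prometheus scrapes the results.
    meter = metrics.get_meter("checkout.instrumentation")
    orders = meter.create_counter("orders_processed", description="Orders processed")
    orders.add(1, {"region": "eu-west-1"})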

This year’s report shows that teams rely on many data sources, with 72 percent having at least four data sources configured in Grafana and 50 percent having at least six. Respondents named more than 60 different technologies when asked what their teams were using, an indicator of the still-evolving nature of this market.

Company size also makes a difference: the larger the company, the more tools and data sources it uses. There are also significant differences between industries. For example, companies in the financial services and tech sectors use far more data sources than those in the telecommunications sector.

Given that volume and complexity of tools and data sources, it is no surprise that 79 percent of those who have centralized observability say they have benefited in terms of cost and time savings. The top three benefits of correlating data include resource efficiency, capacity planning, and improved user experience.

The rapidly evolving nature of the observability market has created considerable excitement about what’s to come. However, there are also some concerns. More than half (56 percent) of the respondents cited cost as their biggest concern about observability, followed by the complexity of managing systems (50 percent) and cardinality (47 percent).

Artificial intelligence (AI) could be deployed to reduce the complexity of observability. More than three-fourths of respondents said they want to use AI for anomaly detection, while nearly half want to use it for predictive insights, dashboard generation, and query assistance.

It is still early days for AI in observability. Only 7 percent of respondents said they are using AI for observability, while 46 percent say it’s not even on their radar at the moment. However, when asked what they are most excited about, respondents named OpenTelemetry, AI, and standardization/interoperability as their top areas of interest.

Related Items

Data Observability in the Age of AI: A Guide for Data Engineers

Explosion of Observability Data from Cloud Reaches Tipping Point, Dynatrace Says

There Are Four Types of Data Observability. Which One is Right for You?