October 18, 2022

How to Address Scale and Performance Needs in IoT

Shahed Mazumder


During the summer, electricity grid disruptions and outages can occur more frequently. It’s a frightening thought: 12 nursing home residents died in Florida in stifling heat after Hurricane Irma knocked out their facilities’ air-conditioning units in 2017. But utility failures aren’t just a concern during the hot months. In Texas, a winter storm that left more than 4.5 million homes and businesses without power in 2021 was blamed for hundreds of deaths.

The U.S. power grid is aging, strained, and inefficient. At the same time, demand for power to heat and cool homes is rising amid extreme weather fluctuations attributed to global warming. In addition, outdated or deficient infrastructure is a real safety concern. All of these factors highlight the need for a better real-time understanding of the health of the utility ecosystem, from production to distribution to restoration.

IoT Opportunities Exploding in Utilities

MarketsandMarkets reports that IoT utility market spending is expected to grow to $53.8 billion by 2024, compared to $28.6 billion in 2019.

One of the big drivers of this growth is the proliferation of IoT sensors and the enormous potential they unlock.

Leveraging IoT is seen as a way to create an infrastructure that is more powerful, efficient, and resilient. Utility operators, for example, can detect changes in usage levels, which can immediately help identify power/gas losses or leaks. They can also spot overloading sooner. Quicker reactions enabled by early warning and phased restoration systems can potentially save lives by ensuring power is delivered to the most critical structures first, such as nursing homes or hospitals.


Vendors can and should collaborate with U.S. utility companies to help them modernize their data infrastructure. Use cases include emergency response services, capturing and processing real-time data at the edge, demand forecasting and predictive maintenance via AI, data retention, and restoration planning.

Some large companies require three hours daily to ingest data into a machine learning and analytics platform, plus another 40 minutes to index and partition the same data on that platform to make it usable. This can be streamlined significantly: with highly efficient systems, the entire cycle can take less than 30 minutes.

The Database As the Foundation of IoT Strategy

A database must be the foundation of any IoT strategy. An IoT ecosystem powered by a data platform provides the flexibility to push data closer to the points of acquisition and usage on edge networks. This helps drive wide-scale IoT adoption. The ability to dynamically add new devices future-proofs the data infrastructure. Apart from the utility segment, IoT use cases also include manufacturing, smart cities, connected vehicles, transportation and logistics, healthcare, and oil and gas.

There’s also strength in numbers. Building strategic partnerships to become an effective ecosystem player is a good use of resources. Joining forces with companies that offer a suite of APIs for building, extending, and delivering powerful event-driven applications is a natural match for a data platform. This combination fits both enterprise-grade and mass-market IoT use cases that need data ingestion and processing at speed and scale, with resilience and guaranteed performance. Efficiently incorporating event streaming is a key requirement for IoT and telco edge systems, as the sketch below illustrates. Partnerships like this can address challenges such as friction, implementation time, and, most importantly, cost.
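As a rough illustration of that ingestion pattern, the following sketch consumes an event stream and upserts each reading into a key-value store. The stream and store here are in-memory stand-ins (a generator and a dict); in production they would be a streaming platform such as Kafka and the data platform itself, and the field names (`meter_id`, `ts`, `kwh`) are hypothetical.

```python
import json
import time
from collections import defaultdict

def meter_event_stream(n_events=10):
    """Yield JSON-encoded meter readings, as a streaming platform would."""
    for i in range(n_events):
        yield json.dumps({
            "meter_id": f"meter-{i % 3}",  # hypothetical device ID
            "ts": int(time.time()) + i,    # epoch seconds
            "kwh": 0.42 + 0.01 * i,        # hypothetical usage reading
        })

# Stand-in for the key-value data platform: meter_id -> {ts: kwh}
store = defaultdict(dict)

def ingest(stream):
    """Consume events and upsert each reading keyed by device and timestamp."""
    for raw in stream:
        event = json.loads(raw)
        store[event["meter_id"]][event["ts"]] = event["kwh"]

ingest(meter_event_stream())
print({k: len(v) for k, v in store.items()})  # readings ingested per meter
```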

Database platforms provide the basis for high performance and high resilience, spanning multiple data centers and allowing uninterrupted access during localized outages. The platform should deliver billions of transactions in real time and give customers resilience against localized hardware, network, and software faults. It should also have a relatively small infrastructure footprint, which supports not just sustainable practices but a significant reduction in total cost of ownership (TCO).

Time Series Data Support Through an API

A data platform may have its origins in the key-value space, but if it is customizable, it can also support a wider range of use cases, including time series data. Even before offering native time series support, a high-performance database can combine buffered writes with efficient map operations to optimize both reads and writes of time series data today. A time series API can leverage these features to provide a general-purpose interface for reading and writing time series data efficiently at scale, as sketched below.
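A minimal sketch of that pattern, using an in-memory dict as a stand-in for the key-value store: each record holds one time bucket per series as a map from timestamp to value, writes are buffered and flushed in batches, and range reads touch only the buckets that overlap the query window. All names here (`bucket_key`, `TimeSeriesAPI`, the one-hour bucket size) are illustrative assumptions, not any vendor's actual API.

```python
from collections import defaultdict

BUCKET_SECONDS = 3600  # assumption: one record per series per hour

def bucket_key(series_id, ts):
    """Record key = (series, start of the bucket containing ts)."""
    return (series_id, ts - ts % BUCKET_SECONDS)

class TimeSeriesAPI:
    """Hypothetical time series layer over a key-value store with map bins."""

    def __init__(self, flush_size=1000):
        self.store = defaultdict(dict)  # stand-in for the database
        self.buffer = []                # buffered writes, flushed in batches
        self.flush_size = flush_size

    def write(self, series_id, ts, value):
        self.buffer.append((series_id, ts, value))
        if len(self.buffer) >= self.flush_size:
            self.flush()

    def flush(self):
        # Each buffered point becomes a map put into its bucket's record;
        # batching amortizes round trips to the server.
        for series_id, ts, value in self.buffer:
            self.store[bucket_key(series_id, ts)][ts] = value
        self.buffer.clear()

    def read_range(self, series_id, start, end):
        """Read points in [start, end) by visiting only overlapping buckets."""
        points = []
        bucket = start - start % BUCKET_SECONDS
        while bucket < end:
            record = self.store.get((series_id, bucket), {})
            points += [(t, v) for t, v in record.items() if start <= t < end]
            bucket += BUCKET_SECONDS
        return sorted(points)

api = TimeSeriesAPI()
for i in range(0, 7200, 30):  # two hours of 30-second observations, two buckets
    api.write("feeder-7", i, 230.0 + i % 5)
api.flush()
print(len(api.read_range("feeder-7", 0, 3600)))  # -> 120 points
```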

With the right tool, it’s possible to see queries retrieving 1 million points each (one year of observations taken every 30 seconds) running at a rate of two per second, with end-to-end latency of about 0.5 seconds. A rate of 50,000 writes per second can be sustained on the same cluster. Realistically, these numbers can be scaled by a factor of 20 or more simply by increasing the power of the underlying hardware.
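As a back-of-envelope check, the figures above hang together (everything here is derived only from the numbers quoted in that paragraph):

```python
SECONDS_PER_YEAR = 365 * 24 * 3600

# One observation every 30 seconds for a year is roughly 1 million points.
points_per_year = SECONDS_PER_YEAR // 30
print(points_per_year)            # 1,051,200, i.e. ~1 million points/query

# Two such queries per second implies this many points read per second.
print(2 * points_per_year)        # ~2.1 million points/s

# At 50,000 writes/s, one year of one series' data is ingested in:
print(points_per_year / 50_000)   # ~21 seconds
```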

The success of IoT for utilities (and other use cases) depends on the effective deployment of a wide range of technologies, including a variety of IoT sensors and end devices; real-time edge, core, and cloud data infrastructure; and AI/ML capabilities within the analytical tools. The key requirements for the data platform are stability, reliability, and the ability to scale without losing performance. This is a great time to support time series data, and we expect that its prevalence will only increase as data sophistication continues to grow.

About the author: Shahed Mazumder is the Global Director of Telco Solutions at Aerospike. With nearly 20 years of global experience, Shahed serves existing customers in the telecom and IoT/connected devices verticals and is responsible for driving new business opportunities. He was previously a Principal Strategist on the technology strategy team at CableLabs and a strategy consultant at Cartesian, where he worked on projects ranging from technology due diligence to business case development and product strategy/competitive assessment.

Related Items:

Define and Tackle Your Edge

Coding for the Edge: Six Lessons for Success

Bridging the Gaps in Edge Computing
