3 Ways Big Data Is Being Used in IT
The big data revolution is transforming how business gets done across multi-trillion-dollar industries like financial services, healthcare, manufacturing, and retail. Heck, even the Federal Government, with its nearly $4-trillion budget, is getting in on the act. But one industry that’s often overlooked in the rush to use computers to crunch data to optimize our world is the information technology (IT) sector itself.
It’s hard to quantify the impact that big data analytics will have in the future. In some respects, it’s inevitable that analytics will eventually touch nearly every aspect of our lives. Anywhere there’s an opportunity to optimize a business process or ease a burden in our lives—even by a small amount—somebody somewhere will build algorithms to crunch the data and get closer to the optimum. As long as our markets remain free and open, it will become increasingly hard for business folks to resist the temptation to “smarten things up” through the use of intelligent algorithms.
With that in mind, it’s easy to see where there’s room for improvement in the global IT industry, which is estimated to account for about $3.8 trillion in annual spending, or roughly the same amount the U.S. Government spent last year.
Like nearly every industry except oil and gas exploration, most of the expenditures in the IT business go to human workers. While nobody wants to see an algorithm-powered robot army take good-paying jobs in the 6-million-strong American IT workforce, there are clearly opportunities to augment how IT workers do their jobs with intelligent applications.
Here are three ways big data analytics is impacting the IT biz:
IT Asset Management
As part of the big data boom, the number of databases used by companies has exploded. A decade ago, it was not uncommon to see 100 or more databases in a mid-size company. Some companies today are pushing 1,000 data stores.
And that’s just the databases. When you count the untold number of applications running atop these data stores—not to mention the physical servers, storage, and network devices that make it all possible—it’s clear that just keeping track of IT assets is a huge challenge.
One of the firms aiming to use the power of big data analytics to help manage digital sprawl is Blazent. The Michigan company uses a variety of big data technologies like Hadoop, Spark, and Cassandra to create what’s effectively a next-generation configuration management database (CMDB) to track the large number of IT assets used at large enterprises and services firms.
In addition to tracking IT assets, Blazent focuses a lot of its attention on helping clients ensure the accuracy and consistency of data stored across various silos. If the data across these silos isn’t in sync, it can cost the data’s owner in the form of downtime, higher maintenance costs, and lost opportunities.
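The core of that cross-silo consistency work can be sketched as a reconciliation pass: compare the same assets as recorded in two systems, and surface records that are missing from one side or that disagree. This is a minimal illustration, not Blazent’s actual implementation; the record layout and field names are assumptions.

```python
# Hypothetical sketch of cross-silo asset reconciliation (not Blazent's
# actual code): compare asset records, keyed by serial number, from two
# assumed inventory silos and report gaps and conflicting fields.
cmdb_records = {
    "SN-1001": {"hostname": "web01", "owner": "ops"},
    "SN-1002": {"hostname": "db01", "owner": "dba"},
}
monitoring_records = {
    "SN-1001": {"hostname": "web01", "owner": "ops"},
    "SN-1002": {"hostname": "db01-old", "owner": "dba"},  # stale hostname
    "SN-1003": {"hostname": "cache01", "owner": "ops"},   # absent from CMDB
}

def reconcile(a, b):
    """Return assets missing from either silo, plus fields that disagree."""
    missing_from_a = sorted(set(b) - set(a))
    missing_from_b = sorted(set(a) - set(b))
    conflicts = {
        sn: {k: (a[sn][k], b[sn][k]) for k in a[sn] if a[sn][k] != b[sn][k]}
        for sn in set(a) & set(b)
        if a[sn] != b[sn]
    }
    return missing_from_a, missing_from_b, conflicts

missing, extra, conflicts = reconcile(cmdb_records, monitoring_records)
print(missing)    # ['SN-1003']
print(conflicts)  # {'SN-1002': {'hostname': ('db01', 'db01-old')}}
```

At enterprise scale the same join-and-diff logic would run over millions of records in a distributed engine such as Spark, but the shape of the check is the same.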
Software License Optimization
In the enterprise software business, there’s often a disconnect between what a client pays for and what it’s actually using. A company may not have become a software pirate on purpose, but that doesn’t change the fact that it’s benefiting from somebody else’s intellectual property (IP) without proper compensation.
One firm that’s employing advanced algorithms to narrow this software license gap is V.i. Laboratories. The Massachusetts company’s analytics service, CodeArmor, identifies customers that are using software they haven’t adequately paid for, or that they’re outright stealing.
Software-usage data collected from a reporting service built into the client’s applications is uploaded to V.i. Labs’ cloud, then analyzed for signs of misuse. The service makes use of Google’s (NASDAQ: GOOG) Geolocation API to pinpoint the exact location of software usage, and visualization software from Tableau (NYSE: DATA) to help clients make sense of the results.
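The most basic version of such a compliance check is a count-versus-entitlement comparison: tally the distinct machines reporting usage for each customer and flag anyone exceeding their licensed seats. The field names and data below are invented for illustration; the article doesn’t describe CodeArmor’s actual schema.

```python
# Hypothetical license-compliance sketch (field names and seat counts are
# assumptions, not V.i. Labs' actual schema): flag customers whose observed
# machine count exceeds their licensed entitlement.
entitlements = {"acme-corp": 50, "globex": 10}  # licensed seats per customer

usage_events = [  # one telemetry record per machine that reported in
    {"customer": "acme-corp", "machine_id": f"m{i}"} for i in range(48)
] + [
    {"customer": "globex", "machine_id": f"g{i}"} for i in range(27)
]

def find_overuse(events, licensed):
    """Map each over-deployed customer to (machines seen, seats licensed)."""
    seats = {}
    for e in events:
        seats.setdefault(e["customer"], set()).add(e["machine_id"])
    return {
        cust: (len(machines), licensed.get(cust, 0))
        for cust, machines in seats.items()
        if len(machines) > licensed.get(cust, 0)
    }

print(find_overuse(usage_events, entitlements))
# {'globex': (27, 10)} -> 27 machines observed, only 10 seats licensed
```

Real services layer fraud signals on top of this, such as geolocation clustering and tamper detection, but the entitlement comparison is the starting point.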
All told, V.i. Labs says it has helped clients generate $1.4 billion since it was founded in 2010. License and compliance management will never be the same.
Deciphering Virtual Servers
Most businesses today run a good portion of their server-based applications on virtual machines. Instead of installing software to a single physical server with one copy of an operating system, they’re using hypervisors such as those from VMware to carve up their physical servers into multiple virtual servers, each with its own copy of the OS.
While this is great for boosting the utilization of commodity x86 servers up to the 80-percent level long achieved by mainframe-class systems, it’s lousy for the administrators who are tasked with maintaining the systems. This is especially true when the storage layer is also virtualized.
When an application starts behaving poorly, as they are prone to do, it can be very difficult to track down the source of the problem in a virtualized environment. It can be virtually impossible to trace virtualized I/O the way one traditionally would when running on bare iron.
To address this problem, we’re seeing a new class of management tools that use the power of machine learning to detect when things go wrong and to help administrators begin the troubleshooting process. Tools like SIOS IQ, developed by California-based SIOS, can generate a model of what good application behavior looks like, then alert the administrator when bad application behavior is detected.
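The “model normal, alert on abnormal” idea can be sketched with something as simple as a z-score baseline over a performance metric. This is an assumption for illustration only; the article doesn’t describe SIOS IQ’s internals, and production tools use far richer models.

```python
# Minimal baseline-and-alert sketch (a simple z-score model, which is an
# assumption -- not SIOS IQ's actual algorithm): learn "normal" I/O latency
# from healthy samples, then flag readings far outside that baseline.
import statistics

def train_baseline(samples):
    """Summarize normal behavior as (mean, standard deviation)."""
    return statistics.mean(samples), statistics.stdev(samples)

def is_anomalous(value, mean, stdev, threshold=3.0):
    """Flag a reading more than `threshold` standard deviations from normal."""
    return abs(value - mean) > threshold * stdev

# Latency (ms) observed while the application was behaving normally.
normal_latencies = [12.1, 11.8, 12.5, 13.0, 12.2, 11.9, 12.7, 12.4]
mean, stdev = train_baseline(normal_latencies)

print(is_anomalous(12.6, mean, stdev))  # False: within the normal range
print(is_anomalous(45.0, mean, stdev))  # True: far outside the baseline
```

The administrator still does the root-cause work, but the model narrows the search from “everything in the virtual stack” to “this metric, starting at this time.”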