HP Storage Lead Drills Down Trends
If you ask Mike Prieto, who heads up APAC storage at HP, which IT trends are most dominant this year, two of his three answers probably won't surprise you (here's a hint: 'big data' is one), but one might raise a few eyebrows.
We already gave you a freebie with big data; the other is, as you might have guessed, cloud computing. The third, storage virtualization, is one Prieto was pressed to make a stronger case for. This is not to say that 2011 and the beginning of this year haven't been notable for those in the storage virtualization game. Prieto says, however, that part of the reason it's so important now is that it fits neatly into the box created by both big data and the possibilities of cloud.
As Prieto noted, “If you think about big data in terms of a content explosion, it’s really continuing to accelerate. 90 percent of the data we have was created in the last two years—it’s very obvious that big data is one of the areas, or one of the big trends for the year.”
Prieto says that companies need to take a “holistic” approach to managing big data. He said that the data shouldn’t be “tackled in silos. It should really be a consultative-led, solutions approach that will assess your environment and understand what has to be done to come up with a perfect solution to address that issue.”
Aside from suggesting companies look to the consulting world for guidance, he says the concept of storage virtualization is one of the hidden keys to contending with large data sets.
As he noted during the interview, “most organizations have tackled server virtualization for many years and got good ROI in general. However, they have neglected the storage and networking side of the organization. I think for companies to benefit, reap the rewards, and get their returns, you really have to aim for an end-to-end virtualized infrastructure to get maximum ROI.”
Not surprisingly, the HP exec pointed to the solutions cropping up to address big data needs, and he found a fit with HP’s own X9000 platform, the company’s scale-out NAS offering designed to chomp through large amounts of data. He also said that the Autonomy acquisition is bringing software tools to bear for handling large data sets, although some could argue it’s still hard to see where the Autonomy group’s offerings really fit into HP’s overall strategy.
His final word on storage virtualization is that organizations need to start moving away from the dollar-per-terabyte discussion, because it doesn’t represent where we are today. The conversation really has to be about total cost of ownership, dollars per VM, or dollars per IOPS.
As he says, “At the end of the day, technologies that we have, like 3PAR, have really changed the game. When you think about the thin provisioning capability that 3PAR has, you need probably 50 percent of the raw disk compared to our competitors today. Again, the dollar per terabyte discussion is a non-event today. I urge all CIOs to have a dollar per IOPS or a total cost of ownership discussion and pick the right vendor based on those criteria.”
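To make the metric shift Prieto describes concrete, here is a minimal sketch of how the two pricing conversations differ. All figures, array names, and the helper functions are hypothetical, invented purely for illustration; they are not HP or competitor prices.

```python
# Illustrative comparison of $/TB vs. $/IOPS pricing.
# Every number below is a made-up example, not a real vendor quote.

def cost_per_tb(total_cost: float, usable_tb: float) -> float:
    """Classic capacity metric: dollars per usable terabyte."""
    return total_cost / usable_tb

def cost_per_iops(total_cost: float, sustained_iops: float) -> float:
    """Performance metric: dollars per sustained IOPS."""
    return total_cost / sustained_iops

# Hypothetical arrays: A is cheaper per terabyte but slower; B costs more
# per terabyte yet delivers far more IOPS, so it wins on a $/IOPS basis.
array_a = {"cost": 100_000, "tb": 200, "iops": 20_000}
array_b = {"cost": 150_000, "tb": 150, "iops": 75_000}

for name, a in (("A", array_a), ("B", array_b)):
    print(f"Array {name}: ${cost_per_tb(a['cost'], a['tb']):.0f}/TB, "
          f"${cost_per_iops(a['cost'], a['iops']):.2f}/IOPS")
```

Judged on capacity alone, the hypothetical Array A looks like the better deal; judged on dollars per IOPS, Array B comes out ahead, which is exactly why the choice of metric changes which vendor wins the discussion.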