Live from SC11: The Storage Cost of Rendering
This week at SC11 in Seattle, where big data took center stage, a number of storage vendors vied to demonstrate how their particular approaches to storing and managing vast data sets were superior. While the SC conference series is often considered academically oriented, most of the big-name storage companies had great stories to tell about their role in major HPC and big data verticals, media and financial services in particular.
We spent some time with the CEO of one such storage company, one with a keen eye on the growing media and financial services markets. According to Dr. Joe Landman, head of the Michigan-based storage and high performance computing company Scalable Informatics, these two markets are the source of a number of the company's customers, and each vertical has serious data demands that require a fine-tuned approach to storage.
In the video above, Landman points to two real-world examples of media companies that are using the company's storage offerings for animation data and post-processing. One such customer was tasked with creating high-end, flashy animations for the NFL on short notice.
As you can see from the eye candy they created, incredible computation is required for animations of such depth, but as Landman says, flashy visual effects come at an incredible storage cost. Customers need to be able to sink and source data at up to 5 GB/s, a demand that he says Scalable Informatics is able to sate.
He stresses that for this industry, as in financial services, there are incredible demands on both the compute and storage sides. Among their media customers, Landman notes, post-production jobs can tally in the tens to thousands of terabytes and require high performance operations on that data, creating a need to pull the data off with very high performance streaming IO and intelligent caching.
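To put those figures in perspective, here is a back-of-the-envelope calculation of how long it takes just to stream a post-production dataset once at the sustained rates discussed above. The 5 GB/s rate comes from the article; the dataset sizes are illustrative picks from the "tens to thousands of terabytes" range, not customer numbers.

```python
def streaming_time_hours(dataset_tb, rate_gb_per_s):
    """Hours needed to read or write a dataset once at a sustained rate."""
    total_gb = dataset_tb * 1000          # decimal TB -> GB
    return total_gb / rate_gb_per_s / 3600

# Illustrative dataset sizes spanning the range mentioned in the article.
for size_tb in (10, 100, 1000):
    print(f"{size_tb:>5} TB at 5 GB/s: {streaming_time_hours(size_tb, 5):.1f} h")
```

Even at a sustained 5 GB/s, a single pass over a petabyte-class dataset takes more than two days, which is why streaming throughput and caching dominate the conversation.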
While most of our conversation was geared toward the company's role in media and entertainment, Landman points out that these storage and computational demands bleed over substantially into financial services. The difference here, he says, is scale: rather than tens or hundreds of terabytes, some of their hedge fund customers are in the 100k and petabyte data ranges and need to do rapid crunching and quickly distribute their data across other computational engines at top speed. Landman claims that doing this requires fine-tuning of the Linux kernel and drivers, as well as some forward-looking alterations in the overall design of storage systems.
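The fan-out Landman describes, getting a huge dataset spread across many computational engines quickly, can be sketched at its simplest as chunking the data and assigning chunks round-robin to nodes. This is a generic illustration of the idea, not Scalable Informatics' actual design; the chunk size and node names are assumptions.

```python
def assign_chunks(total_bytes, chunk_bytes, nodes):
    """Map chunk index -> node name, round-robin over the whole dataset."""
    n_chunks = -(-total_bytes // chunk_bytes)   # ceiling division
    return {i: nodes[i % len(nodes)] for i in range(n_chunks)}

# Hypothetical example: a 10 TiB dataset in 256 MiB chunks across 3 nodes.
plan = assign_chunks(10 * 1024**4, 256 * 1024**2, ["node0", "node1", "node2"])
print(len(plan))          # 40960 chunks to move
print(plan[0], plan[1])   # node0 node1
```

In practice the interesting engineering is below this layer, in the kernel, driver, and caching tuning that keeps every node's pipe full while the chunks move.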
In the second part of our chat below, he talks about the company's approach to sating the performance and storage demands of its big data customers with siFlash, its SSD- and PCIe-based storage offering, built around the integration of Virident PCIe cards.
We have a few more interviews on the storage front coming out this week highlighting how some vendors are taking unique approaches to the IO issues plaguing big data verticals. Scalable Informatics, which Landman founded in 2002 to address the needs of the engineering and bioinformatics markets, is one to watch over the coming years as storage vendors begin to differentiate at the cache level and find new ways to address critical data movement barriers.