Tag: clusters

Workload Portability

Nov 13, 2017 |

Today, organizations of all sizes are exploring strategies for migrating workloads to and across diverse computing platforms. The reasons vary from resolving local resource shortfalls to enabling collaborative workflows to converting capital expenses to operating expenses. Read more…

Does InfiniBand Have a Future on Hadoop?

Aug 4, 2015 |

Hadoop was created to run on cheap commodity computers connected by slow Ethernet networks. But as Hadoop clusters get bigger and organizations press the upper limits of performance, they’re finding that specialized gear, such as solid state drives and InfiniBand networks, can pay dividends. Read more…

Self-Provision Hadoop in Five Clicks, BlueData Says

Sep 17, 2014 |

Forget the data science: in some organizations, just getting access to a Hadoop cluster is a major obstacle. With today’s launch of EPIC, the software virtualization company BlueData says analysts and data scientists can self-provision a virtual Hadoop cluster in a matter of seconds, enabling them to iterate in a faster and more agile fashion. Read more…

Teradata Moves Virtual Data Warehouse Forward with MongoDB

Jun 19, 2014 |

MongoDB and Teradata today announced they’re working to integrate their products to boost the analytic and transactional workloads of their joint customers. The goal of the partnership is to find a better way to feed operational JSON data from Mongo’s NoSQL database into Teradata’s warehouse, enabling faster turnaround of analytic questions. Read more…

IDC’s View on Cray Cluster Supercomputers for HPC and Big Data Markets

Sep 30, 2013 |

It’s well known that the high performance computing (HPC) market has grown quickly over the last couple of decades. In fact, the HPC market has tripled in size from $3.7 billion in the mid-1990s to $11.1 billion in 2012, according to IDC. Read more…

The Big Data Security Gap: Protecting the Hadoop Cluster

Sep 16, 2013 |

Hadoop enables the distributed processing of large data sets across clusters of computers, but its approach presents a unique set of security challenges that many enterprise organizations aren’t equipped to handle. Open source approaches to securing Hadoop are still in their infancy and lack robustness. Zettaset Orchestrator™ is the only solution that has been specifically designed to meet enterprise security requirements in big data and Hadoop environments. Learn what every organization should know about Hadoop security. Read more…

Sharing Infrastructure: Can Hadoop Play Well With Others?

Mar 28, 2013 |

A lot of big data/Hadoop implementations are swimming against the currents of what recent history has taught about large-scale computing, and the result is a significant amount of waste, says Univa CEO Gary Tyreman, who believes that Hadoop shared-infrastructure environments are on the rise. Read more…

Facebook Sees Hadoop Through Prism

Aug 28, 2012 |

Splitting a Hadoop cluster is difficult, explains Facebook’s VP of Engineering Jay Parikh. “The system is very tightly coupled,” he said, explaining why splitting Hadoop clusters was not possible... Read more…

Entry-Level HPC: Proven at a Petaflop, Affordably Priced!

Jun 4, 2012 |

Don't have a super budget? You can still own a premier high-performance supercomputer with proven technology and reliability. New entry-level configurations and options enable you to configure the optimal balance of price, performance, power, and footprint for your unique and exacting requirements. Read more…

Putting Shared Memory to the Test

Nov 28, 2011 |

A new pilot project in Europe seeks to show the value of shared memory systems (this time from an IBM, Numascale, and Gridcore partnership) as national goals point to the need to create massive systems for massive data. Read more…