Tag: clusters

NoSQL Database Smackdown Takes to the Clouds

Customers want to get the most bang for their database buck, whether they live on premise or in the cloud. However, as databases move to the cloud, performance and cost can be impacted in different ways. Read more…

Inside Pachyderm, a Containerized Alternative to Hadoop

Last week was a big one for Pachyderm, the containerized big data platform that's emerging as an easier-to-use alternative to Hadoop. The company announced a $10 million round of funding and public testimonials from customers. Read more…

Dr. Elephant Leads the Performance Parade

I started working on big data infrastructure in 2009 when I joined Cloudera, which at the time was a small startup with about 10 engineers. It was a fun place to work. My colleagues and I got paid to work on open source software. Read more…

Workload Portability

Today, organizations of all sizes are exploring strategies for migrating workloads to and across diverse computing platforms. The reasons vary from resolving local resource shortfalls to enabling collaborative workflows. Read more…

Does InfiniBand Have a Future on Hadoop?

Hadoop was created to run on cheap commodity computers connected by slow Ethernet networks. But as Hadoop clusters get bigger and organizations press the upper limits of performance, they're finding that specialized gear may be worth a second look. Read more…

Self-Provision Hadoop in Five Clicks, BlueData Says

Forget the data science--in some organizations, just getting access to a Hadoop cluster is a major obstacle. With today's launch of EPIC, the software virtualization company BlueData says analysts and data scientists can self-provision Hadoop clusters in five clicks. Read more…

Teradata Moves Virtual Data Warehouse Forward with MongoDB

MongoDB and Teradata today announced they're working to integrate their products to boost the analytic and transactional workloads of their joint customers. The goal of the partnership is to find a better way to feed operational data into analytic systems. Read more…

IDC’s View on Cray Cluster Supercomputers for HPC and Big Data Markets

It’s well known that the high performance computing (HPC) market has grown quickly over the last couple of decades. In fact, the HPC market has tripled in size from $3.7 billion in the mid-1990s to $11.1 billion in 2012, according to IDC. Read more…

The Big Data Security Gap: Protecting the Hadoop Cluster

Hadoop enables the distributed processing of large data sets across clusters of computers, but its approach presents a unique set of security challenges that many enterprise organizations aren’t equipped to handle. Open source approaches to securing Hadoop are still in their infancy and lack robustness. Zettaset Orchestrator™ is the only solution specifically designed to meet enterprise security requirements in big data and Hadoop environments. Learn what every organization should know about Hadoop security. Read more…

Sharing Infrastructure: Can Hadoop Play Well With Others?

Many big data and Hadoop implementations are swimming against the currents of what recent history has taught about large-scale computing, and the result is a significant amount of waste, says Univa CEO Gary Tyreman, who believes that Hadoop shared-infrastructure environments are on the rise. Read more…

Facebook Sees Hadoop Through Prism

Splitting a Hadoop cluster is difficult, explains Facebook VP of Engineering Jay Parikh. “The system is very tightly coupled,” he said, explaining why splitting Hadoop clusters was not possible... Read more…

Entry-Level HPC: Proven at a Petaflop, Affordably Priced!

Don't have a super budget? You can still own a premier high performance supercomputer with proven technology and reliability. New entry-level configurations and options enable you to configure the optimal balance of price, performance, power, and footprint for your unique and exacting requirements. Read more…

Putting Shared Memory to the Test

A new pilot project in Europe seeks to show the value of shared memory systems (this time from an IBM, Numascale and Gridcore partnership) as national goals point to the need to create massive systems for massive data. Read more…
