
Features

Re-Platforming the Enterprise, Or Putting Data Back at the Center of the Data Center

Apr 27, 2015

The data center as we know it is going through tremendous changes. New technologies being deployed today represent a new stack that will form a strategic data center platform and will significantly reduce hardware and system costs, improve efficiency and productivity of IT resources, and drive flexible application and workload execution. This new stack will also transform an organization’s ability to respond quickly to changing business conditions, competitive threats, and customer opportunities. Before we go any further, let’s stop and Read more…

Build or Buy? That’s the Big (Data) Question

Apr 8, 2015

“You can learn a lot from my failures, maybe,” says Ron Van Holst, an HPC director at Ontario Centres of Excellence. With decades of experience designing and building high-end telecommunications gear and HPC systems, Van Holst has a better grasp than most on the tradeoffs you’ll make when you choose to build or buy your next big data system. Van Holst was into a little bit of everything when he was building telecommunications gear in the 1990s. From the programming languages and the Read more…

Peek Inside a Hybrid Cloud for Big Genomics Data

Mar 30, 2015

Genomics, or the study of genomes, holds enormous promise to improve human health. However, at 80GB to 250GB per sequenced genome, the sheer size of genomics data poses a considerable storage and processing challenge. To keep costs in check, Inova Translational Medical Institute (ITMI) adopted a hybrid cloud architecture for its ongoing genomics studies. ITMI is a branch of the non-profit Inova hospital network in the Washington D.C. area that received $150 million in funding by Read more…

Leverage Big Data 2015 Yields Insightful Gems

Mar 20, 2015

Some of the brightest minds in the big data and HPC communities took the stage this week at Tabor Communications’ second annual Leverage Big Data conference, which took place at the beautiful Ponte Vedra Resort in Florida. From the impact of Hadoop to the cutting edge of FPGA research, attendees shared their big data lessons. Big data is everywhere, said Molly Rector, CMO for DDN, who delivered a keynote Monday night. You may not know it, but whenever you use Read more…

Beyond the 3 Vs: Where Is Big Data Now?

Mar 12, 2015

Once defined by the “three Vs” of volume, velocity, and variety, the term “big data” has overflowed the small buckets we gave it and taken on a life of its own. Today, big data refers not only to the ongoing information explosion, but also to an entire ecosystem of new technologies, as well as a unique way of thinking about data itself. Here’s a quick recap of the short history of big data. The World Wide Web was just starting to Read more…

News In Brief

Lavastorm Pushes Analytics Collaboration

Apr 22, 2015

An upgraded analytics engine rolled out this week by Lavastorm seeks to deliver greater transparency in how analytics are created as a way to repair what the company argues is a “disjointed” process of creating and sharing analytic applications. Boston-based Lavastorm touts its Analytics Engine 6.0 as promoting transparency as a way to encourage collaboration among the authors and users of analytic programs. To that end, the latest version of the analytics engine includes data flow logic and interim data Read more…

Neo Tech Cranks Up Speed on Upgraded Neo4j

Mar 25, 2015

Graph database specialist Neo Technology has rolled out an updated version of its Neo4j tool that delivers faster read and write performance aimed at critical graph database applications. Neo4j 2.2, released on Wednesday (March 25), comes with souped-up write and read scalability, the company said. Expanded write capacity targets highly concurrent applications while leveraging available hardware via faster buffering of updates. A new unified transaction log is designed to serve both graphs and their indexes. Graph databases are an Read more…

Fujitsu Adding Column-Oriented Processing Engine to PostgreSQL

Mar 4, 2015

Fujitsu Laboratories last week announced that it has developed a column-oriented data storage and processing engine that can quickly analyze large amounts of data stored in a PostgreSQL database. The technology, which utilizes vector processing, is being showcased this week at a conference in Japan. Fujitsu has a long history of developing big systems designed to handle heavy transactional loads. It was a close development partner of Sun Microsystems for 64-bit Sparc servers until Sun was acquired by Oracle. Today the $4.5 Read more…

Project Myriad Brings Hadoop Closer to Mesos

Feb 12, 2015

One of the challenges of running Hadoop is resource management. The process of spinning up and managing hundreds, if not tens of thousands, of server nodes in a Hadoop cluster—and spinning them down and moving them, and so on—is far too hard to do manually. Automation must come to the table to help Hadoop take the next step forward in its evolution. The big question is how it will unfold. One answer to that question came to the forefront yesterday when a Read more…

Oracle Rejiggers Exadata for Emerging In-Memory Workloads

Jan 22, 2015

Organizations that adopt the latest generation of Oracle’s engineered systems will have more flexible configuration and licensing options than previous generations offered. That will make it more cost-effective to run emerging in-memory workloads, such as operational analytics, the company says. At a live launch event in Redwood City, California, Oracle chairman and CTO Larry Ellison unveiled the X5 generation of its various all-in-one server platforms. The Exadata Database Machine is headlining this new generation, and backed up Read more…

This Just In

StackIQ to Build Production-Ready Hadoop Cluster in Two Hours

Apr 17, 2015

SOLANA BEACH, Calif., April 17 — StackIQ, Inc., makers of StackIQ Boss, the Warehouse-grade automation platform for any large-scale server infrastructure, will build an actual commercial Hadoop cluster in two hours as the centerpiece of its new Factory Seminars. The Factory Seminars are events held in major U.S. cities, developed to teach datacenter infrastructure builders and operators how to deploy reliable, any-scale Hadoop and OpenStack infrastructure in less time and without complexity. StackIQ will kick off its Factory Seminars on May Read more…

Ryft to Present and Exhibit at HPC User Forum

Apr 9, 2015

ROCKVILLE, Md., April 9 — Ryft announced today the company will present and exhibit at the HPC User Forum in Norfolk, Virginia, April 13-15, 2015. Ryft will show how organizations can accelerate big data analytics to run 100X faster than the fastest alternatives. Pat McGarry, Ryft’s vice president of engineering, will also present to attendees on how organizations can get more value from data with innovative technologies that deliver real-time data analytics without time-consuming data indexing or extract, transform and load work. At Read more…

Plexxi and PSSC Labs Partner to Deliver Big Data Application

Apr 8, 2015

NASHUA, N.H., April 8 — Plexxi, a pioneer of next-generation networking products and SDN market leader, today announced a partnership with PSSC Labs, a manufacturer and provider of high-performance server equipment for Big Data Applications. The partnership combines Plexxi SDN switching and control with PSSC high-performance compute and Cloudera Enterprise to deliver a fully integrated, end-to-end certified solution for scale-out Big Data applications. The integrated offering, CloudOOP Big Data Pod, empowers enterprises for the first time to deploy Big Data/scale-out applications across Read more…

PSSC Labs Now Offering Suite of Products and Services Aimed at Big Data

Apr 7, 2015

LOS ANGELES, Calif., April 7 — Leveraging their expertise in High Performance Computing, PSSC Labs now offers a complete suite of products and services aimed at the growing Big Data and Data Center marketplace. The company is building upon their success selling high density, energy efficient servers designed specifically for Big Data applications including Hadoop and SQL/NoSQL databases. This past year, the company has added many new enterprise customers in the health care, ad tech and financial services verticals that Read more…

Exar’s AltraHD is Now Certified on Cloudera Enterprise 5

Mar 24, 2015

FREMONT, Calif., March 24 — Exar Corporation, a leading supplier of high-performance integrated circuits and system solutions, today announced the certification of its AltraHD hardware-accelerated compression solution for Hadoop with the Cloudera Enterprise 5 platform. Cloudera’s data management platform enables enterprises to use Apache’s open source software to better manage data. AltraHD integrates seamlessly into the Apache Hadoop software stack and automatically compresses all files stored in the Hadoop Distributed File System (HDFS), as well as all intermediate data. AltraHD provides up to 2x Read more…