November 22, 2011

Solving Big Data Storage at CERN


Not long ago we published an article about the role of tape in the big data era, citing recent announcements from SpectraLogic that position the company for the exabyte era.

For a thirty-year-old company that has watched an endless parade of storage technologies march past, the big data hoopla could certainly be just the right buzzword at just the right time.

Tape has been receiving quite a bit more attention over the last year, in part because companies and research centers need to retain data at low cost while still being able to quickly spin up needed volumes for reuse or analysis.

Some have argued that tape is not relevant for many modern storage and analytics operations, but if the company’s recent wins are any indication, tape is finding an in with data-centric businesses and research centers, and not simply as a “backup and forget” method of socking away massive data sets.

This week SpectraLogic announced that CERN has taken a close interest in the company’s T-Series tape library technology. The research facility installed the exabyte-ready line in July to handle copies of data from the Large Hadron Collider (LHC), and it now plans to migrate the collider data from its T380 tape library to the Spectra T-Finity, using the CERN Advanced STORage manager (CASTOR) for hierarchical storage management.
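
CASTOR’s job in this arrangement is to present disk and tape as a single hierarchy: recently used data stays on a disk cache, colder data lives only in the tape library, and files are staged back to disk when an experiment asks for them. The Python sketch below is a minimal illustration of that tiering idea only; the FileRecord class, function names, and policy thresholds are hypothetical stand-ins, not CASTOR’s actual interfaces.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class FileRecord:
    name: str
    last_access: datetime
    on_disk: bool   # copy resident in the disk cache
    on_tape: bool   # copy archived in the tape library

def migrate_if_cold(rec: FileRecord, idle_threshold: timedelta) -> str:
    """Release the disk copy of data that has sat idle, keeping only the tape copy."""
    if rec.on_disk and rec.on_tape and datetime.now() - rec.last_access > idle_threshold:
        rec.on_disk = False
        return "released disk copy (tape copy retained)"
    return "left on disk"

def recall_for_access(rec: FileRecord) -> str:
    """Stage a tape-only file back to disk when it is requested again."""
    rec.last_access = datetime.now()
    if not rec.on_disk and rec.on_tape:
        rec.on_disk = True
        return "recalled from tape to disk"
    return "served from disk"

# A dataset untouched for 90 days is released from disk, then recalled
# when a partner institution asks for it again.
raw = FileRecord("run0042.raw", datetime.now() - timedelta(days=90), on_disk=True, on_tape=True)
print(migrate_if_cold(raw, idle_threshold=timedelta(days=30)))  # released disk copy ...
print(recall_for_access(raw))                                   # recalled from tape to disk
```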

To offer a sense of the scale involved, consider the work conducted at the facility on a daily basis. The world’s largest particle accelerator generates over 25 petabytes of scientific data each year, data that not only needs to be stored for later use but must also be distributed, in part or in full, to the many related institutions that carry out their own work on it.
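
As a rough back-of-envelope calculation (our arithmetic, not a figure from CERN), 25 petabytes a year averages out to roughly 70 terabytes a day, or on the order of 0.8 gigabytes per second sustained, and that is before any copies are shipped out to partner institutions:

```python
# Average rate implied by the 25 PB/year figure, assuming (unrealistically)
# that production is spread evenly across the year.
PB = 10**15                                # decimal petabyte in bytes
yearly_bytes = 25 * PB

per_day = yearly_bytes / 365               # ~68.5 TB per day
per_second = yearly_bytes / (365 * 86400)  # ~0.79 GB per second

print(f"{per_day / 10**12:.1f} TB per day")
print(f"{per_second / 10**9:.2f} GB per second")
```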

This means that the data the center generates, even if it sits idle for a time, must be quickly available in large volumes, whether for redistribution, for analysis of elements that need to be separated out from the rest of the data, or for processing in large chunks.

According to Vladimir Bahyl, who manages the tape environment at CERN, with such vast data volumes there was a clear need for “a solution that would provide scalability for growth while ensuring the integrity and accessibility of the data.” He says that using CASTOR made the integration of the new T-Series libraries easy, and that the new platform will allow CERN “to scale both hardware and software to make capacity upgrades quick, seamless and affordable as data sets grow.”
