Tag: hpc

IBM Fellow Tracks HPC, Big Data Meld

From massive physics simulations to astrophysics research that recreates the universe’s birth, HPC systems are designed to quickly plow through incredible amounts of data. In reality, the “big data” challenges (among which are data volume, variety and velocity barriers) that are… Read more…

HPC Accelerates the Rate of Scientific Discovery!

There have been numerous breakthroughs in science during the last year, made possible by unique advances in high performance computing. National laboratories, universities and research centers worldwide have been leveraging HPC to advance human understanding of physics, chemistry and biology across numerous application domains – let’s look at some examples. Read more…

Understanding Data Intensive Analysis on Large-Scale HPC Compute Systems

Data intensive computing is an important and growing sector of scientific and commercial computing, and it places unique demands on computer architectures. While those demands continue to grow, most present systems, and even planned future systems, may not meet these computing needs very effectively. The majority of the world’s most powerful supercomputers are designed for running numerically intensive applications that can make efficient use of distributed memory. A number of factors limit the utility of these systems for knowledge discovery and data mining. Read more…

Entry-Level HPC: Proven at a Petaflop, Affordably Priced!

Don't have a super budget? You can still own a premier high performance supercomputer with proven technology and reliability. New entry-level configurations and options enable you to configure the optimal balance of price, performance, power, and footprint for your unique and exacting requirements. Read more…

Fujitsu Says Big Data Pushing HPC into Mainstream

According to many in the HPC community, especially on the hardware side, the big data trend (and all the buzz associated with it) is creating new opportunities for the once “high and mighty” world of supercomputing. This is a field that… Read more…

SKA Makes Data Work to New Beat

To handle the massive torrent of data the Square Kilometer Array (SKA) will produce, the project calls for a supercomputer that can handle anywhere from 2-30 exaflops, not to mention a system that is outfitted with some of the most robust data-intensive management tools available. These numbers are certainly outside the realm of today’s leading-edge technology but… Read more…

SSDs and the New Scientific Revolution

SSDs are anything but a flash in the pan for big science. With news of the world’s most powerful data-intensive systems leveraging the storage technology, including the new Gordon supercomputer, and other new “big data” research centers tapping into the wares… Read more…

Finland’s Big Data Storage Leap

Finland-based CSC, the government-sponsored Center for Science Ltd., provides the balance between academic and industry R&D and IT resource management in the country. In an effort to address its massive storage needs, the organization announced key multi-million-Euro… Read more…

Alpine Data Climbs Analytics Mountain

Startup Alpine Data Labs recognized the recent changes in database and HPC technologies and has capitalized on the need to update antiquated data mining techniques. This week the company's head discussed some of the challenges of the… Read more…

Cray Opens Doors on Big Data Developments

This week we talked with Cray CEO Peter Ungaro and Arvind Parthasarathi, the new lead behind the company's just-announced big data division. Parthasarathi joined the supercomputing giant recently from Informatica and brings a fresh, software-focused presence to a company that is now an official… Read more…

Big Data Cloud Delivers Military Intelligence to U.S. Army in Afghanistan

Private clouds are catching on in defense circles, as are new analytics technologies aimed at improving responses in real-time. IDC research vice president for HPC, Steve Conway, sheds light on how one innovative company is harnessing the cloud to leverage big data for mission-critical military operations. Read more…

SAS Shifts Retail Analytics to HPC Platform

This week SAS announced that its most popular and resource-demanding packages for retailers would move under the umbrella of its High Performance Computing Platform. These include products that harness predictive analytics and allow for swift price changes. Read more…

ISC Issues Call to Action for Computer, Data Scientists

The annual International Supercomputing Conference, set to take place in mid-June this year in Hamburg, Germany, has issued a call for papers and tutorial concepts, along with information about submissions and awards. For those with an eye on big data, the world of supercomputing could yield a wealth of new insights. Read more…

Mellanox Bridges Network Performance Divide

This week high performance network vendor Mellanox released VMA 6.0, which is targeted at industries in need of ultra-low latencies, including financial services. In light of the new markets emerging in the wake of the big data explosion, however, its message could find untapped sources of new business in 2012. Read more…

RAID on Enterprise Big Data

This week RAID Inc.'s CTO released a detailed report on the benefits of parallel file systems for high performance computing and big data environments. In addition to outlining the differences between GPFS and Lustre, the report describes the overall benefits of each. Read more…

CEP Sparks Network Vendor Momentum

Complex event processing is garnering attention from the verticals that rely on it, and it is also gaining traction among the low-latency network vendors pushing for their solutions to power these algorithms and applications. Read more…

SC11 Video Feature: Garth Gibson on RAID, Roots and Reliability

In this video feature, we sit down with Garth Gibson, co-founder and CTO of high performance storage company, Panasas. During our chat we touch on the roots of his company, explore the way file systems have evolved to meet new demands, and wrap up with a section on big data and the storage demands behind it. Read more…

Q&A: Appro gets ready for 16-core AMD “Interlagos” processors

In the volatile high performance computing (HPC) market, vendor longevity is quite rare. Appro, however, has not only endured but continues to introduce innovative solutions. One of the company’s strengths is that while some vendors flashed into and out of existence by offering rigid solutions exclusively for the highest end of the market, Appro has tried to appeal to the medium- to large-scale HPC market with a variety of workload configuration requirements. Read more…

Datanami