Nov 9, 2012 |
We’re fresh off the chase for stories from the recent Strata HadoopWorld event, and now we’re preparing for the upcoming Supercomputing ’12 conference in Salt Lake City, where there will be plenty of big data talk to go around for both scientific and enterprise computing folks. We’ll be reporting live on site throughout the coming week.
Jun 11, 2012 |
Data-intensive computing is an important and growing sector of scientific and commercial computing, and it places unique demands on computer architectures. While those demands continue to grow, most present systems, and even planned future systems, may not meet these computing needs effectively. The majority of the world’s most powerful supercomputers are designed to run numerically intensive applications that make efficient use of distributed memory, and a number of factors limit the utility of these systems for knowledge discovery and data mining.
Dec 8, 2011 |
This week the San Diego Supercomputer Center introduced Gordon, a flash-based, shared-memory supercomputer. Built by Appro and delivering performance in the 36 million IOPS range, the system led the center’s director to state unequivocally that a new era of data-intensive science has begun.