
Tag: hpc

BSC Presents Plan to Energize Europe’s Big Data Efforts

Researchers from the Barcelona Supercomputing Center today presented the big data roadmap commissioned by the EU as part of the RETHINK big project, which is intended to identify technology goals, obstacles and actions for developi Read more…

Life Sciences Cultivate Streaming Analytics

Transatlantic partners are applying machine-learning algorithms to a financial HPC platform to develop predictive models for studying crop growth patterns and agricultural practices. The partners said the models would inc Read more…

Quantum Researchers Eye AI Advances

Researchers wringing out new quantum computing architectures are increasingly looking at the nascent processing technology as a way to advance machine-learning algorithms for new AI applications. As quantum computing Read more…

The Algebra of Data Promises a Better Math for Analytics, And More

A company by the name of Algebraix Data is beginning to speak publicly for the first time about the algebra of data, an approach to storing and accessing data that it devised and patented. The company is using its data a Read more…

Univa Gives ‘Pause’ to Big Data Apps

Scheduling workloads on today's big analytic clusters can be a big challenge. Your team may have everything carefully lined up, only to have a last-minute change leave your schedule in shambles. One company that's close Read more…

DDN Tackles Enterprise Storage Needs as ‘Wolfcreek’ Looms

When it comes to keeping supercomputers fed with data, there are few storage makers that can keep up with DataDirect Networks. But increasingly, DDN is feeling pressure from enterprises that are struggling to keep up wit Read more…

Deep Dive Into HP’s New HPC & Big Data Business Unit

When HP finally divides into two pieces – HP Inc. (PCs and printers) and Hewlett Packard Enterprise (servers and services) – how will the HPC portfolio fare? Views vary of course. The split is meant to let the ‘new Read more…

Build or Buy? That’s the Big (Data) Question

"You can learn a lot from my failures, maybe," says Ron Van Holst, an HPC director at Ontario Centres of Excellence. With decades of experience designing and building high-end telecommunications gear and HPC systems, Van Read more…

Qumulo Comes Out of Stealth with ‘Data-Aware’ Storage

For the past three years, a Seattle company has been working to solve what its founders consider the biggest problem affecting large-scale storage: Actually knowing what data you have and how it's being used. Today, that Read more…

Three Ways Big Data and HPC Are Converging

Big data is becoming much more than just widespread distribution of cheap storage and cheap computation on commodity hardware. Big data analytics may soon become the new “killer app” for high performance computing Read more…

Rethinking Hadoop for HPC

Hadoop's momentum has caught the eye of those in the high performance computing (HPC) community, who want to participate and benefit from the fast pace of development. However, the relatively poor performance and high la Read more…

ISC’14

The International Supercomputing Conference (ISC) is the most significant conference and exhibition in Europe for the HPC community. The 2014 focus is on supercomputers solving real life problems, extreme computing chall Read more…

Making Hadoop Relevant to HPC

Despite its proven ability to affordably process large amounts of data, Apache Hadoop and its MapReduce framework are being taken seriously only at a subset of U.S. supercomputing facilities and only by a subset of profe Read more…

How In-Memory Data Grids Can Analyze Fast-Changing Data in Real-Time

The ability to continuously analyze operational data unlocks the potential for organizations to extract important patterns. Popular big data systems are not well suited for this challenge. However, in-memory data grids (IMDGs) offer important breakthroughs that enable real-time analysis of operational data. Benchmarks have demonstrated that an IMDG can complete map/reduce analyses every four seconds across a changing, terabyte data set. This article discusses how IMDGs deliver this new capability to analyze fast-changing, operational data. Read more…
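The continuous map/reduce pattern described above can be sketched in a few lines. The code below is a minimal, generic illustration of one map/reduce pass over data held in memory; it is not any vendor's API, and the tick data and function names are hypothetical examples.

```python
from collections import defaultdict

def map_reduce(records, map_fn, reduce_fn):
    """One in-memory map/reduce pass over the current data set."""
    groups = defaultdict(list)
    for record in records:
        for key, value in map_fn(record):
            groups[key].append(value)
    return {key: reduce_fn(values) for key, values in groups.items()}

# Hypothetical operational data: (symbol, price) ticks held in memory.
ticks = [("AAPL", 100.0), ("AAPL", 102.0), ("MSFT", 50.0)]

# Map each tick to its symbol; reduce to the average price per symbol.
averages = map_reduce(
    ticks,
    map_fn=lambda t: [(t[0], t[1])],
    reduce_fn=lambda vs: sum(vs) / len(vs),
)
print(averages)  # {'AAPL': 101.0, 'MSFT': 50.0}
```

A production IMDG would partition `ticks` across grid nodes and rerun a pass like this on a short interval (the article cites every four seconds over a terabyte), but the map/reduce core is the same shape.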

The Big Data Security Gap: Protecting the Hadoop Cluster

Hadoop enables the distributed processing of large data sets across clusters of computers, but its approach presents a unique set of security challenges that many enterprise organizations aren’t equipped to handle. Open source approaches to securing Hadoop are still in their infancy, and lack robustness. Zettaset Orchestrator™ is the only solution that has been specifically designed to meet enterprise security requirements in big data and Hadoop environments. Learn what every organization should know about Hadoop security. Read more…

Accelerate Hadoop MapReduce Performance using Dedicated OrangeFS Servers

Recent tests performed at Clemson University achieved a 25 percent improvement in Apache Hadoop Terasort run times by replacing Hadoop Distributed File System (HDFS) with an OrangeFS configuration using dedicated servers. Key components included extension of the MapReduce “FileSystem” class and a Java Native Interface (JNI) shim to the OrangeFS client. No modifications of Hadoop were required, and existing MapReduce jobs require no modification to utilize OrangeFS. Read more…
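Because Hadoop's `FileSystem` abstraction is pluggable, swapping HDFS for another store is largely a configuration exercise rather than a code change. The `core-site.xml` fragment below is a hedged illustration of that pattern only; the URI scheme, host, port, and class name are placeholder assumptions, not the actual settings from the Clemson configuration.

```xml
<!-- core-site.xml: point Hadoop at an alternative FileSystem implementation.
     The "ofs://" scheme, host, port, and class name below are illustrative
     placeholders, not verified OrangeFS settings. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>ofs://orangefs-server:3334</value>
  </property>
  <property>
    <!-- Maps the "ofs" URI scheme to a client-side FileSystem subclass. -->
    <name>fs.ofs.impl</name>
    <value>org.example.orangefs.OrangeFileSystem</value>
  </property>
</configuration>
```

Because MapReduce jobs address storage only through `FileSystem` URIs, a swap like this requires no changes to existing job code, which is the point the Clemson results underline.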

IDC Talks Convergence in High Performance Data Analysis

At the International Supercomputing Conference (ISC’13) this week, convergence is in the air as many discussions are including talk about the merging of traditional high performance technical computing with the rising data tides of the enterprise. Putting data and use cases on display, the analysts at IDC gave their view of this convergence space – and shared their own name for it: High Performance Data Analysis (or HPDA). Read more…

Why Big Data Needs InfiniBand to Continue Evolving

Increasingly, it’s a Big Data world we live in. Just in case you’ve been living under a rock and need proof of that, a major retailer can use an unimaginable number of data points to predict the pregnancy of a teenage girl outside Minneapolis before she gets a chance to tell her family (http://www.forbes.com/sites/kashmirhill/2012/02/16/how-target-figured-out-a-teen-girl-was-pregnant-before-her-father-did/). That’s just one example, but there are countless others that point to the idea that mining huge data volumes can uncover gold nuggets of actionable proportions (although sometimes they freak people out...) Read more…

Big Data & Virtual Prototyping Changing Auto Design Culture

Design and engineering teams at Jaguar Land Rover say that big data and virtual prototyping are changing the culture and way they think about their work. Read more…

Sharing Infrastructure: Can Hadoop Play Well With Others?

A lot of big data/Hadoop implementations are swimming against the currents of what recent history has taught about large scale computing and the result is a significant amount of waste, says Univa CEO, Gary Tyreman, who believes that Hadoop shared-infrastructure environments are on the rise. Read more…

Datanami