Tag: data intensive

Vogels: Hate the Name, Not the Tech

Big data will help enterprises start acting more like lean startups, argued Amazon CTO Werner Vogels at the Technology Frontiers 2013 Conference this week in London. Read more…

Digging Mutants Out of Massive Data

Uncovering mutants from massive wells of data about mutations and points of "normal" genetic reference can understandably be quite a data-intensive task, given the number of genes within the genomes that have to be mapped and followed... Read more…

University Report Unravels Big Data for CIOs

The University of Oregon’s Robert Prehm (also a software developer for Zoom Software Solutions) published a report called “What CIOs and CTOs Need to Know about Big Data and Data-Intensive Computing.” As the title states, the report is aimed at Chief Information and Technology Officers seeking guidance on how to move forward in a world of ever-bigger data... Read more…

World’s Top Data-Intensive Systems Unveiled

This year at the International Supercomputing Conference in Germany, the list of the top data-intensive systems was revealed with heavy tie-ins to placements on the Top500 list of the world's fastest systems, entries from most of the national labs, and plenty of BlueGene action. The list of the top "big data" systems, called the Graph500, measures the performance of.... Read more…

This Week’s Big Data Big Seven

We wrap up this week with news about a new high-performance, data-intensive supercomputer from SGI; new Hadoop announcements from Hortonworks, Datameer, and Karmasphere; software enhancements for big data infrastructure from ScaleOut; and some other startup goodness--all with an eye on next week's International.... Read more…

Understanding Data Intensive Analysis on Large-Scale HPC Compute Systems

Data-intensive computing is an important and growing sector of scientific and commercial computing, and it places unique demands on computer architectures. While those demands continue to grow, most present systems, and even planned future systems, may not meet these computing needs very effectively. The majority of the world’s most powerful supercomputers are designed to run numerically intensive applications that can make efficient use of distributed memory. A number of factors limit the utility of these systems for knowledge discovery and data mining. Read more…

This Week’s Big Data Big Ten

This week's big data big ten for the week ending May 4 includes data-intensive system news out of Australia, big genomics investments at key U.S. research centers, some multi-million-dollar investments in analytics platforms, and a few other items of interest, including HPCC Systems' move into Asia. Read more…

This Week’s Big Data Big Ten

Week of April 27: We touch on news about tackling rugby injuries with analytics; investors taking a shine to Lustre; Sears buying into Hadoop; Teradata sporting a new "sassy" appliance; supporting data-intensive science; analytics superheroes unmasked; and continued developments for in-memory analytics. Read more…

A Floating Solution for Data-Intensive Science

A team at Argonne National Laboratory recently proposed a simple yet still “fringe” answer to an increasingly pressing question for scientists in data-intensive fields. Before we get there, however, it almost goes without saying that all scientific disciplines are data-intensive now, especially following the explosion in sensors and data collection gear sparked by the era of limitless mobility... Read more…

SGI Claims Performance Boost for Big Data

This week technical computing company SGI announced that it would be updating its HPC product lines with the newest Xeon family, stating that this is an important move for customers with data-intensive computing needs, not just those in.... Read more…

Supercomputing Center Set to Become Big Data Hub

This week one of the United States' largest HPC and scientific visualization centers announced a $10 million commitment from the O’Donnell Foundation to enhance its data-intensive science capabilities. We check in with TACC head Dr. Jay Boisseau to find out what new HPC and big data purchases are in the center's future and how.... Read more…

The New Era of Computing: An Interview with “Dr. Data”

When it comes to thought leadership that bridges the divides between scientific investigation, technology and the tools and applications that make research possible, Dr. Alexander Szalay is one of the first scientists that springs to mind. Read more…

Appistry Weaves Vision of Big Analytics Growth

Appistry has had a record year, and while still a small company, it is aiming high. The St. Louis-based high-throughput analytics-as-a-service company is focusing on verticals that generate a lot of data and claims its cloud platform offers a solution to crunch it. Read more…

ISC Issues Call to Action for Computer, Data Scientists

The annual International Supercomputing Conference, set to take place in mid-June this year in Hamburg, Germany, has issued a call for papers and tutorial concepts, and has provided information about submissions and awards. For those with an eye on big data, the world of supercomputing could yield a wealth of new insights. Read more…

Live from GTC Asia: Accelerating Big Science

This week we were on-site in Beijing, China for the NVIDIA GTC event, which showcased innovations in GPU technology to accelerate scientific and enterprise applications. While the focus this year was mainly on science, with the growing datasets in industries like oil and gas and the life sciences, it is clear that accelerators could play a critical role as the era of big data unfolds. Read more…

Bar Set for Data-Intensive Supercomputing

This week the San Diego Supercomputer Center introduced the flash-based, shared-memory Gordon supercomputer. Built by Appro and sporting capabilities in the 36 million IOPS range, the system prompted the center's director to state without hesitation that a new era of data-intensive science has begun. Read more…

Big Data Plumbing Startup Scores Backing

GridGain, a high-performance cloud middleware startup, secured a $2.5 million investment this week. The company's score rests on its ability to optimize environments designed to chew through large amounts of time-sensitive data. Read more…

Interview: Cray CEO Sees Big Future in Big Data

During this year's annual Supercomputing Conference (SC11) in Seattle, Cray's home turf, we caught up with the company's CEO, Peter Ungaro, to talk about the increasing meld between big data and traditional supercomputing--and what this blend could portend for Cray going forward. Read more…

The Path to Personalized Medicine

Life sciences research and development professionals are among the first to claim an insatiable need for robust management, storage and compute resources. According to one researcher, the cloud and new research groups are providing valuable partnerships that are making personalized medicine a reality. Read more…

Live from SC11: Turning Big Data into Big Physics

GPU technology is playing a major role in boosting the efficiency and processing horsepower of businesses that are reliant on rapid-fire simulations. We speak with Matthew Scarpino from Eclipse Engineering and provide a glimpse of the visualization showcase from SC11. Read more…
