Science

Features

Here’s Another Option for Hadoop Enterprise Search

Aug 8, 2014

The software stacks of many Hadoop distributions feature Apache Lucene and Solr as the enterprise search component. But the folks at the French firm Sinequa say Hadoop customers will get more actual work done, and quickly analyze massive amounts of poly-structured data from dozens of other sources in multiple languages, by using the company’s enterprise search solution. Hadoop, machine learning algorithms, and graph databases may get most of the headlines in our big data world, but good old search engines continue to be Read more…

Ersatz Thinks Up a GPU-Powered Neural Network

Jun 11, 2014

If you want to use neural networks to build models that can learn from data in human-like ways, but are having trouble figuring out where to start, you may want to check out Ersatz, the new GPU-powered deep learning platform officially unveiled today by Ersatz Labs. The domain of machine learning is advancing very quickly at the moment, driven by humankind’s rather sudden (and entirely insatiable) desire to keep and understand all data. There are also big Read more…

Array Databases: The Next Big Thing in Data Analytics?

Apr 9, 2014

There’s no lack of database choices when it comes to building your next big data analytics platform. Relational data stores, column-oriented databases, in-memory data grids, graph databases, scale-out NewSQL systems, and Hadoop will all get time in the sun. But according to database pioneer Mike Stonebraker, none of these hold a candle to array databases when it comes to running complex analytics on big data sets.

The 2013 Big Data Year in Review

Dec 20, 2013

While still in its infancy, the big data technology trend has made substantial progress since it gained traction at the beginning of this decade. The year 2013 was a big one, with advances made in virtually every quarter of the space. In this feature, we take a look at some of the significant trends that have crossed our desks in the past year — wrapped up and presented to you with a pretty bow. Out with the old, in with the new — it’s the Datanami 2013 Year in Review!

Micron Aims at Big Data With New Parallel Processing Architecture

Nov 19, 2013

In 1977, IBM researcher John Backus wrote a seminal paper that asked the question: “Can programming be liberated from the von Neumann style?” This week, Micron Technology announced a new computing architecture called the Automata Processor, which it says answers Backus’s question with a definitive “yes!”

News In Brief

Big Data Survey Finds Growing Need for CEP

Jul 30, 2014

A global survey of software developers working on big data and advanced analytics projects found that a large majority of respondents require real-time complex event processing (CEP) for their applications. The survey, released July 29 by Evans Data Corp. of Santa Cruz, Calif., found that 71 percent of respondents said they require advanced processing more than half the time. “Hadoop’s batch processing model has worked well for several years,” Evans Data CEO Janel Garvin noted in a statement releasing the survey. Read more…

As Pollution Worsens, China Turns to Big Data

Jul 8, 2014

You’ve got to see, smell, and feel Beijing’s smog to believe it. Look out a high window of a tall building and the air is opaque. Despite rules against them, vendors cook and sell food on side streets using charcoal grills. Prevailing winds from the Gobi Desert distribute dust throughout the sprawling capital. The desert dust, along with particulates from high-sulfur coal burned in the city’s power plants, prompts car owners to brush off their vehicles the way we might clear snow Read more…

Can the ‘SP Machine’ Straighten Out Big Data?

Jun 30, 2014

A form of artificial intelligence variously referred to as “information compression” or the SP theory of intelligence promises to help solve some of the most vexing challenges associated with increasingly fragmented big data. In a recent article published in the journal IEEE Access, researcher Gerry Wolff of CognitiveResearch.org argues that a brain-inspired system he calls an “SP machine” could help sift through the mountains of disparate scientific, business, social media and other forms of data Read more…

Vertica System Aims to Save Species

Dec 11, 2013

The big data technology trend is set to make an impact in the battle for the preservation of endangered species, as HP this week announced it’s teamed up with Conservation International to develop what they say is an early warning system for threatened species.

Kaskade: Focus on the Application, Not the Technology

Nov 27, 2013

Fifty percent of big data projects fail, Jim Kaskade, CEO of Infochimps, told an audience at the Strata + Hadoop World conference last month. One of the chief reasons that people’s projects fail: their eye is not on the ball. It’s the application, not the data, that should be the primary focus, he says.

This Just In

DESY and IBM Develop Big Data Architecture for Science

Aug 21, 2014

ARMONK, N.Y. and HAMBURG, Germany, Aug. 21 – IBM today announced it is collaborating with Deutsches Elektronen-Synchrotron (DESY), a leading national research center in Germany, to speed up management and storage of massive volumes of x-ray data. The planned Big Data and Analytics architecture, based on IBM software-defined technology, can handle more than 20 gigabytes per second of data at peak performance and help scientists worldwide gain faster insights into the atomic structure of novel semiconductors, catalysts, biological cells and other samples. DESY’s 1.7 Read more…

DDN Reports Results for First Half of 2014

Aug 19, 2014

SANTA CLARA, Calif., Aug. 19 – DataDirect Networks (DDN) today announced impressive revenue traction for the first six months of 2014, demonstrating a strong trajectory of profitable growth for the company heading into the second half of the year. The company added more than 20 net new petabyte-class customers, including one of the world’s largest banks, a leader in global oil and gas exploration, and a major Japan-based automobile manufacturer, increasing the storage capacity sold year-to-date to over 250 PB. In the first Read more…

NCAR Enhances Big Data Services for Climate and Weather Researchers

Aug 6, 2014

BOULDER, Colo., Aug. 6 — The National Center for Atmospheric Research (NCAR) has recently implemented an enhanced data sharing service that allows scientists increased access to data as well as improved capabilities for collaborative research. In addition to data sharing, NCAR has significantly upgraded its centralized file service, known as the Globally Accessible Data Environment (GLADE). Managed by NCAR’s Computational and Information Systems Laboratory (CISL), both GLADE and the data sharing service are important upgrades for the high-performance computing (HPC) Read more…

Cluster 2014 Set for September

Apr 15, 2014

Clusters have become the workhorse for computational science and engineering research, powering innovation and discovery that advance science and society. They are the base for building today’s rapidly evolving cloud and HPC infrastructures, and are used to solve some of the most complex problems. The challenge of making them scalable, efficient, and more functional requires the joint effort of the cluster community in the areas of cluster system design, management, and monitoring, at the hardware, system, middleware, and application levels.

SDSC Establishes CoE for Data Science Workflows

Apr 3, 2014

The San Diego Supercomputer Center (SDSC) at the University of California, San Diego, has formally established a new ‘center of excellence’ to assist researchers in creating workflows to better manage the tremendous amount of data being generated across a wide range of scientific disciplines, from natural sciences to marketing research.