
Features

Teradata Supports CDH and HDP with New Hadoop Appliance

Jul 9, 2015 |

Teradata today announced that customers can get its Hadoop Appliance pre-loaded with a distribution from either Cloudera or Hortonworks. The fifth generation of the analytics giant’s appliance also features more configuration options, including different types of nodes designed to run different workloads. Even though Hadoop was designed to run on low-cost, commodity “Lintel” (Linux-on-Intel) servers that most IT folks are familiar with, some customers still don’t want to deal with the cost and hassle of procuring their own cluster to get Read more…

Going In-Memory? Consider These Three Things First

Jun 25, 2015 |

Businesses, large and small, know that if they aren’t using the data they collect in some kind of intelligent way, their companies will be out of business in a few short years. Quickly turning data into business insights is downright essential to increasing and retaining customers as well as delivering new products and services. You know how online retailers have those flash sales and replace items that have sold out with similar ones as shoppers are viewing the screen? They Read more…

Hadoop Opportunity ‘Never Been Bigger’ Says Hortonworks CEO

Jun 9, 2015 |

Don’t be fooled by the Hadoop naysayers, Hortonworks CEO Rob Bearden told an audience at the Hadoop Summit yesterday. “The opportunity has never been bigger,” he said. “We may never have an opportunity this big again in our careers.” Bearden delivered a positive and energetic keynote address at the Hadoop Summit, which drew about 4,000 attendees to the San Jose Convention Center, Hortonworks’ biggest show of the year. The technology veteran set an optimistic tone with his comments, but he Read more…

Five Ways Big Genomic Data Is Making Us Healthier

Jun 3, 2015 |

We’ve come a long way since the human genome was first decoded 13 years ago. Thanks to continued advances in high performance computing and distributed data analytics, doctors and computer scientists are finding innovative new ways to use genetic data to improve our health. The economics of genetic testing is creating a flood of genomic data. The Human Genome Project took over a decade and cost $3 billion, but today whole-genome tests from Illumina can be had in a Read more…

5 Ways Big Geospatial Data Is Driving Analytics In the Real World

May 21, 2015 |

Amid the flood of data we collect and contend with on a daily basis, geospatial data occupies a unique place. Thanks to the networks of GPS satellites and cell towers and the emerging Internet of Things, we’re able to track and correlate the location of people and objects in very precise ways that were not possible until recently. But putting this geospatial data to use is easier said than done. Here are five ways organizations can use geospatial data to Read more…

News In Brief

IBM Adds Spark to z Systems Mainframe

Jul 27, 2015 |

In a bid to promote “mainframe analytics,” IBM said it would support Apache Spark for Linux on its latest z Systems offering while working with data mining specialists to expand analytic software to leverage additional z Systems processing power. IBM said Monday (July 27) the “enablement” of Apache Spark is part of an effort to expand its z Systems ecosystem. It plans to integrate Spark later this year with its z/OS. Spark support on IBM mainframes and components would also Read more…
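The brief above centers on making Apache Spark available on Linux for z Systems. As a rough illustration of the kind of job that would land on such a cluster, here is a minimal, generic Spark aggregation written against the Python RDD API; the input file and its (account_id, amount) layout are hypothetical stand-ins for data exported from a z/OS source, not anything taken from IBM’s announcement.

```python
# Minimal, generic Spark job: sum a numeric field per key.
# Assumes a headerless CSV named transactions.csv with lines like "ACCT123,42.50";
# both the file and the schema are illustrative assumptions.
from pyspark import SparkContext

sc = SparkContext(appName="mainframe-analytics-sketch")

lines = sc.textFile("transactions.csv")

totals = (lines.map(lambda line: line.split(","))
               .map(lambda fields: (fields[0], float(fields[1])))
               .reduceByKey(lambda a, b: a + b))

# Print the ten largest accounts by total amount.
for account, total in totals.takeOrdered(10, key=lambda kv: -kv[1]):
    print(account, total)

sc.stop()
```

Nothing in the script is specific to z Systems; it runs unchanged on any Spark installation, which is the point of the “enablement” IBM describes.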

Univa Gives ‘Pause’ to Big Data Apps

Jul 14, 2015 |

Scheduling workloads on today’s big analytic clusters can be a big challenge. Your team may have everything carefully lined up, only to have a last-minute change leave your schedule in shambles. One company that’s close to a solution is Univa, which today announced the addition of its “preemption” feature that allows admins to momentarily “pause” workloads so they can run a higher-priority application. Historically, admins were loath to stop HPC or big data jobs before they ended, because it could Read more…
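To make the “pause” idea concrete, here is a toy Python sketch of priority-based preemption: when a higher-priority job arrives and no slot is free, the lowest-priority running job is suspended rather than killed, then resumed once a slot opens up. This is a generic illustration of the concept, not Univa’s implementation or API.

```python
# Toy preemptive scheduler: pause, don't kill, lower-priority work.
class Job:
    def __init__(self, name, priority):
        self.name = name
        self.priority = priority      # larger number = more important
        self.state = "pending"

class ToyScheduler:
    def __init__(self, slots):
        self.slots = slots
        self.running = []             # jobs currently holding a slot
        self.suspended = []           # jobs paused to make room

    def submit(self, job):
        if len(self.running) < self.slots:
            self._start(job)
            return
        # Pause the lowest-priority running job if the new one outranks it.
        victim = min(self.running, key=lambda j: j.priority)
        if job.priority > victim.priority:
            self._pause(victim)
            self._start(job)
        else:
            job.state = "queued"

    def finish(self, job):
        self.running.remove(job)
        job.state = "done"
        if self.suspended:
            # Resume a paused job instead of losing its partial work.
            resumed = max(self.suspended, key=lambda j: j.priority)
            self.suspended.remove(resumed)
            self._start(resumed)

    def _start(self, job):
        job.state = "running"
        self.running.append(job)

    def _pause(self, job):
        job.state = "suspended"
        self.running.remove(job)
        self.suspended.append(job)

# Usage: a long batch job is paused when an urgent job shows up, then resumes.
sched = ToyScheduler(slots=1)
batch = Job("nightly-etl", priority=1)
urgent = Job("exec-dashboard", priority=10)
sched.submit(batch)      # batch runs
sched.submit(urgent)     # batch is suspended, urgent runs
sched.finish(urgent)     # batch resumes
print(batch.state)       # "running"
```

The key difference from the historical behavior the article describes is that the suspended job keeps its partial work and simply picks up where it left off.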

NIH Effort Looks to Compress Big Genomics Data

Jul 10, 2015 |

The steady progress being made in “precision,” or genomic, medicine is bringing with it a growing need to get a better handle on soaring data volumes. For example, a single human genome sequence constitutes a roughly 140-gigabyte data file. In an effort to get its arms around genomic and other data, the National Institutes of Health (NIH) recently awarded a $1.3 million contract to researchers at the University of Illinois and Stanford University to develop new data compression approaches. The Read more…
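The scale of the problem is easy to sketch from the article’s own 140-gigabyte figure. In the back-of-envelope calculation below, the cohort size and the compression ratio are illustrative assumptions, not NIH numbers.

```python
# Back-of-envelope storage arithmetic for whole-genome data.
GENOME_SIZE_GB = 140        # per-genome raw file size cited in the article
COHORT = 1_000_000          # hypothetical million-genome study (assumption)
ASSUMED_RATIO = 10          # assumed 10:1 compression from new approaches

raw_pb = GENOME_SIZE_GB * COHORT / 1_000_000   # gigabytes -> petabytes
compressed_pb = raw_pb / ASSUMED_RATIO

print(f"raw: {raw_pb:,.0f} PB, at {ASSUMED_RATIO}:1 compression: {compressed_pb:,.0f} PB")
# raw: 140 PB, at 10:1 compression: 14 PB
```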

Spy vs. Spy: China Eyes Data Mining

Jul 9, 2015 |

The timing is interesting: China’s official Xinhua News Agency moved a story this week about government agencies embracing big data. The dispatch came weeks after the United States accused China of hacking a U.S. personnel database containing sensitive information about nearly 20 million individuals who had applied for background checks required to receive security clearances. The breach also compromised personal information about millions of federal employees. (The U.S. Office of Personnel Management revealed on Thursday [July 9] that “sensitive information” Read more…

Lockheed Martin, Data Vendors Team on Secure Spy Database

Jun 25, 2015 |

Geospatial intelligence is among the hottest and most data-intensive tools being used by U.S. military analysts to sweep up huge amounts of satellite and other sensor imagery. This highly classified data is often combined with other emerging intelligence sources like social media. Much of the satellite imagery is highly classified to shield capabilities like image resolution, and operational details like the spectrum frequencies being used, from prying eyes. Hence, there is a growing need for advanced databases with multiple levels of Read more…

This Just In

SGI UV 300H Certified by SAP to Run SAP HANA

Jul 29, 2015 |

MILPITAS, Calif., July 29 — SGI, a global leader in high-performance solutions for compute, data analytics and data management, announced its SGI UV 300H is now SAP certified to run the SAP HANA platform in controlled availability at 20 sockets — delivering up to 15 terabytes (TB) of in-memory computing capacity in a single node. Asserting the value of key enhancements in support package stack 10 (SPS10) for SAP HANA and SAP’s close collaboration with system providers, the SGI UV 300H delivers outstanding single-node Read more…

SANS Big Data Survey and Research Report Now Available

Jun 11, 2015 |

BETHESDA, Md. and PALO ALTO, Calif., June 11 — SANS, the global leader in information security training and analysis, today announced availability of its first survey and research report identifying how often organizations ranging from enterprises to government agencies are utilizing big data systems, what the associated security challenges are, and how risks can be easily mitigated. Sponsored by Cloudera, the leader in enterprise analytic data management powered by Apache Hadoop, the study was authored by SANS Analyst Barbara Filkins, with SANS Director of Read more…

Australian Oil & Gas Company to Utilize IBM’s Watson

May 27, 2015 |

May 27 — IBM and Woodside today announced they will use IBM Watson as part of the oil and gas company’s next steps in data science. The cognitive computing system will be trained by Woodside engineers, enabling users to surface evidence-weighted insights from large volumes of unstructured and historical data contained in project reports in seconds. Watson is part of Woodside’s strategy to use predictive data science to leverage more than 30 years of collective knowledge and experience as a leading liquefied natural gas operator, to maintain a Read more…

Ryft Open API Library Extends Lightning-Fast Batch and Streaming Analysis to Existing Big Data Environments 

May 21, 2015 |

ROCKVILLE, MD., May 21 — Ryft announced today the availability of the Ryft Open API Library, expanding the company’s Ryft ONE high-performance data analysis capabilities into existing big data environments. Developers can now invoke or extend the Ryft Open API to add hardware-driven acceleration to applications that demand real-time intelligence from all types of batch and streaming data. Now, Ryft combines the flexibility of an open, future-proof 1U platform with the speed, simplicity, scalability and security of a purpose-built analytics accelerator for 100X faster enterprise decision-making Read more…

Zettaset Boosts Big Data Protection for IBM Power Systems servers with Enterprise-Class Encryption for Hadoop and NoSQL Databases

May 20, 2015 |

MOUNTAIN VIEW, Calif., May 20 — Zettaset, a leader in Big Data security, today announced it has successfully completed integration of Zettaset’s Big Data Encryption software solution, BDEncrypt™, with IBM’s Power Systems scale-out servers running Linux. IBM POWER8 customers seeking an encryption solution which is optimized for Big Data environments and distributed computing architectures including Hadoop and NoSQL can now utilize Zettaset BDEncrypt. BDEncrypt with IBM Power Systems is designed to deliver a high level of compute performance and solution Read more…