
Features

5 Ways Big Geospatial Data Is Driving Analytics In the Real World

May 21, 2015

Amid the flood of data we collect and contend with on a daily basis, geospatial data occupies a unique place. Thanks to the networks of GPS satellites and cell towers and the emerging Internet of Things, we’re able to track and correlate the location of people and objects in very precise ways that were not possible until recently. But putting this geospatial data to use is easier said than done. Here are five ways organizations can use geospatial data to Read more…

Unlocking the Hidden Benefits of Industrial Data

May 11, 2015

We’ve heard extensively about how companies are analyzing data born on the consumer Web to improve customer service, boost profitability, and otherwise remake the business-to-consumer relationship. But industrial data is also experiencing an analytic revolution, and it promises to remake how work gets done on factory floors, along railroads, and in mining operations around the world. The so-called industrial Internet journey is not nearly as well-documented as the big data revolution taking place on the public Web. But thanks to Read more…

How Hadoop Solved BT’s Data Velocity Problem

May 8, 2015

Like most large corporations with millions of customers, BT (British Telecom) has an extensive collection of databases, and is constantly moving data in and out of them. But when data growth maxed out a critical ETL server, it found a solution in a distributed Hadoop system. With £18 billion (about $30 billion) in revenue last year, BT is one of the largest telecommunications providers in the world, and serves more than 18 million consumers and nearly 3 million businesses in Read more…

Peering Into Computing’s Exascale Future with the IEEE

May 6, 2015

Computer scientists worried about the end of computing as we know it have been banging heads for several years looking for ways to return to the historical exponential scaling of computer performance. What is needed, say the proponents of an initiative called “Rebooting Computing,” is nothing less than a radical rethinking of how computers are designed in order to achieve the trifecta of greater performance, lower power consumption and “baking security in from the start.” Thomas Conte, the Georgia Tech Read more…

Deep Dive Into Databricks’ Big Speedup Plans for Apache Spark

May 4, 2015

Apache Spark rose to prominence within the Hadoop world as a faster and easier-to-use alternative to MapReduce. But as fast as Spark is today, it won’t hold a candle to future versions of Spark that the folks at Databricks are now developing. Last week Databricks shared its development roadmap for Spark with the world. The company has near-term and long-term plans to boost the performance of the platform; the plans are grouped into something called Project Tungsten. In Read more…

News In Brief

Teradata Mixes, Matches Database Rows, Columns

May 11, 2015

Seeking to improve access to data stored in columnar tables and, with it, speed up query performance, big data and marketing applications specialist Teradata Corp. rolled out a hybrid row and column database that it said would allow users to mix and match row and column technologies. San Diego-based Teradata said May 7 its enhanced hybrid row and column in-database technology could help reduce the amount of data read in wide tables while “preserving pinpoint data access” for real-time queries. Read more…

Google Launches Bigtable Hosting Service

May 7, 2015

Google yesterday announced the beta launch of Google Cloud Bigtable, a hosted version of its wide-column NoSQL database. The service is accessible via the standard HBase API, making it immediately compatible with the Hadoop stack, and blows away other hosted NoSQL offerings in terms of latency and throughput, the company claims. You might know Bigtable as the database at the heart of many Google offerings, including Gmail, Google Search, YouTube, Google Earth, and Google Analytics. But the technology, which Google engineers Jeff Read more…

HP Targets Big Data Workloads With New Servers

May 5, 2015

Hewlett-Packard said it is targeting “data-intensive workloads” with a series of Compute platform servers and an accompanying big data reference architecture aimed at leveraging the data torrent from connected “things.” HP said Tuesday (May 5) that its new batch of ProLiant and Apollo servers was fine-tuned with a big data reference architecture to zero in on data-heavy workloads like Hadoop that increasingly dominate datacenters. The platforms also reflect the emergence of new server technologies and architectures that incorporate open source data Read more…

Who’s ‘Cool’ In Storage Now?

May 1, 2015

Storage hardware may not get the glory in our software-dominated world, but it’s a critical element of a big data environment nonetheless. So when Gartner trotted out its “Cool Vendors” in storage last month, Datanami had to take a look. Five storage vendors made the cut in the analyst firm’s report, including DataGravity, Infinidat, Infinio, Maxta, and Storiant. We’ll take a look at each of these vendors, as well as why Gartner thinks they’re so “cool.” DataGravity — This Nashua, Read more…

Tachyon Nexus Gets $7.5M to Productize Big Data File System

Mar 17, 2015

Tachyon Nexus, the company founded to productize the Tachyon in-memory file system developed at the AMPLab, has received $7.5 million in venture capital funding from Andreessen Horowitz, the Wall Street Journal reported today. Tachyon is a new distributed file system that aims to dramatically speed the processing of big data analytics applications developed using today’s big data frameworks, including Apache Spark, Apache Shark, Hadoop MapReduce, and Apache Flink. The software, which doctoral student Haoyuan Li first unveiled in 2013, quickly Read more…

This Just In

Hanover Hospital Utilizing DataCore’s Software-Defined Storage and Virtual SAN

May 28, 2015

May 28 — DataCore, a leader in software-defined storage, today announced that Hanover Hospital has realized continuous uptime with its high-availability software-defined storage and has significantly reduced the time and effort it takes to provision storage and systems. “The biggest benefit Hanover Hospital has experienced from adopting DataCore has been true high availability due to the automatically synchronized virtual disks that are mirror protected and presented to different applications spanning our two on-campus datacenters,” stated Douglas Null, senior technical architect-MIS department, Hanover Read more…

FICO Selects SolidFire’s All-Flash Scale-Out Storage Platform

May 27, 2015

BOULDER, Colo., May 27 –– SolidFire announced that FICO, the predictive analytics and decision management software company, selected SolidFire’s all-flash scale-out platform to maximize flexibility, speed deployment and reduce the cost of its FICO Analytic Cloud, an environment for creating, customizing and deploying powerful, analytics-driven applications and services. SolidFire’s integration with OpenStack and VMware, along with its ability to consolidate multiple workloads and guarantee exact levels of performance and capacity, will enable FICO to guarantee Quality of Service and meet service level agreements for Read more…

Ryft Open API Library Extends Lightning-Fast Batch and Streaming Analysis to Existing Big Data Environments

May 21, 2015

ROCKVILLE, MD., May 21 — Ryft announced today the availability of the Ryft Open API Library, expanding the company’s Ryft ONE high-performance data analysis capabilities into existing big data environments. Developers can now invoke or extend the Ryft Open API to add hardware-driven acceleration to applications that demand real-time intelligence from all types of batch and streaming data. Now, Ryft combines the flexibility of an open, future-proof 1U platform with the speed, simplicity, scalability and security of a purpose-built analytics accelerator for 100X faster enterprise decision-making Read more…

Zettaset Boosts Big Data Protection for IBM Power Systems servers with Enterprise-Class Encryption for Hadoop and NoSQL Databases

May 20, 2015

MOUNTAIN VIEW, Calif., May 20 – Zettaset, a leader in Big Data security, today announced it has successfully completed integration of its Big Data Encryption software solution, BDEncrypt™, with IBM’s Power Systems scale-out servers running Linux. IBM POWER8 customers seeking an encryption solution that is optimized for Big Data environments and distributed computing architectures, including Hadoop and NoSQL, can now utilize Zettaset BDEncrypt. BDEncrypt with IBM Power Systems is designed to deliver a high level of compute performance and solution Read more…

MemSQL Launches Community Edition: World’s Fastest In-Memory Database Now Available to All

May 20, 2015

SAN FRANCISCO, CA – May 20 – MemSQL, the leader in real-time databases for transactions and analytics, today announced the most significant release of MemSQL to date. MemSQL 4 – which includes a new Community Edition – empowers interconnected enterprises to aggregate and report on real-time data, accelerating the growth trajectories of their digital businesses. This release brings to market groundbreaking capabilities such as the industry’s first real-time, distributed geospatial intelligence and the MemSQL Spark Connector to operationalize Apache Spark. Read more…