
Features

5 Ways Big Geospatial Data Is Driving Analytics In the Real World

May 21, 2015

Amid the flood of data we collect and contend with on a daily basis, geospatial data occupies a unique place. Thanks to the networks of GPS satellites and cell towers and the emerging Internet of Things, we’re able to track and correlate the location of people and objects in very precise ways that were not possible until recently. But putting this geospatial data to use is easier said than done. Here are five ways organizations can use geospatial data to Read more…

Peering Into Computing’s Exascale Future with the IEEE

May 6, 2015

Computer scientists worried about the end of computing as we know it have been banging heads for several years looking for ways to return to the historical exponential scaling of computer performance. What is needed, say the proponents of an initiative called “Rebooting Computing,” is nothing less than a radical rethinking of how computers are designed in order to achieve the trifecta of greater performance, lower power consumption and “baking security in from the start.” Thomas Conte, the Georgia Tech Read more…

Deep Dive Into Databricks’ Big Speedup Plans for Apache Spark

May 4, 2015

Apache Spark rose to prominence within the Hadoop world as a faster and easier-to-use alternative to MapReduce. But as fast as Spark is today, it won’t hold a candle to future versions of Spark that the folks at Databricks are now developing. Last week Databricks shared its development roadmap for Spark with the world. The company has near-term and long-term plans to boost the performance of the platform; the plans are grouped into something called Project Tungsten. In Read more…
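The teaser doesn’t show what a Tungsten-era workload looks like in practice. As a rough, hypothetical illustration (not code from Databricks’ roadmap), here is a minimal PySpark sketch of the kind of DataFrame aggregation that Tungsten’s memory-management and code-generation work is meant to accelerate; the column names and toy records are invented for the example.

    # Minimal PySpark sketch: a DataFrame aggregation of the kind Project
    # Tungsten targets. Column names and data are invented for illustration.
    from pyspark import SparkContext
    from pyspark.sql import SQLContext, Row

    sc = SparkContext(appName="tungsten-style-aggregation")
    sqlContext = SQLContext(sc)

    # Toy event log; DataFrame operations run through Spark's Catalyst
    # optimizer, which is where Tungsten's off-heap memory management and
    # runtime code generation plug in.
    events = sqlContext.createDataFrame([
        Row(event_date="2015-05-01", event_type="click"),
        Row(event_date="2015-05-01", event_type="view"),
        Row(event_date="2015-05-02", event_type="click"),
    ])

    daily_counts = (events.groupBy("event_date", "event_type")
                          .count()
                          .orderBy("event_date"))
    daily_counts.show()

    sc.stop()

Because the aggregation is expressed against the DataFrame API rather than hand-written RDD code, Spark’s engine, and eventually Tungsten, can decide how to lay out memory and generate code for it.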

Re-Platforming the Enterprise, Or Putting Data Back at the Center of the Data Center

Apr 27, 2015

The data center as we know it is going through tremendous changes. New technologies being deployed today represent a new stack that will form a strategic data center platform and will significantly reduce hardware and system costs, improve efficiency and productivity of IT resources, and drive flexible application and workload execution. This new stack will also transform an organization’s ability to respond quickly to changing business conditions, competitive threats, and customer opportunities. Before we go any further, let’s stop and Read more…

Build or Buy? That’s the Big (Data) Question

Apr 8, 2015

“You can learn a lot from my failures, maybe,” says Ron Van Holst, an HPC director at Ontario Centres of Excellence. With decades of experience designing and building high-end telecommunications gear and HPC systems, Van Holst has a better grasp than most on the tradeoffs you’ll make when you choose to build or buy your next big data system. Van Holst was into a little bit of everything when he was building telecommunications gear in the 1990s. From the programming languages and the Read more…

News In Brief

Security Concerns Extend to ‘Big Data Life Cycle’

May 20, 2015

The “datafication” of modern life is now measured in the quintillions of bytes of data generated by humans each day. The struggle to store, manage and retrieve the waterfall of data has also highlighted a growing list of privacy and security issues. Privacy concerns stemming from the rise of big data, along with the backlash against abuses like unscrupulous data brokers, have spurred a range of regulatory initiatives designed to safeguard sensitive information. Now, greater attention is being given Read more…

Wikibon Analyst Kelly: Big Data Is Entrenched

May 12, 2015

Back in 2011, when the respected Wikibon researcher Jeff Kelly began tracking big data, the technology was, in his view, “long on promise but short on specifics.” In his final blog post for the Wikibon Project, Kelly concludes that the technology has delivered on that promise, asserting that its steady application has helped solve real-world business challenges while evolving into a “competitive differentiator” in a world gone digital. It has even helped deliver sports championships. “Big data is no longer Read more…

HP Targets Big Data Workloads With New Servers

May 5, 2015

Hewlett-Packard said it is targeting “data-intensive workloads” with a series of Compute platform servers and an accompanying big data reference architecture aimed at leveraging the data torrent from connected “things.” HP said Tuesday (May 4) its new batch of ProLiant and Apollo servers was fine-tuned with a big data reference architecture to zero in on data-heavy workloads like Hadoop that increasingly dominate datacenters. The platforms also reflect the emergence of new server technologies and architectures that incorporate open source data Read more…

Lavastorm Pushes Analytics Collaboration

Apr 22, 2015

An upgraded analytics engine rolled out this week by Lavastorm seeks to deliver greater transparency in how analytics are created as a way to repair what the company argues is a “disjointed” process of creating and sharing analytic applications. Boston-based Lavastorm touts its Analytics Engine 6.0 as promoting transparency as a way to encourage collaboration among the authors and users of analytic programs. To that end, the latest version of the analytics engine includes data flow logic and interim data Read more…

Neo Tech Cranks Up Speed on Upgraded Neo4j

Mar 25, 2015

Graph database specialist Neo Technology has rolled out an updated version of its Neo4j tool that includes faster read and write performance aimed at critical graph database applications. Neo4j 2.2, released on Wednesday (March 25), comes with souped-up write and read scalability, the company said. Expanded write capacity targets highly concurrent applications while leveraging available hardware via faster buffering of updates. A new unified transaction log is designed to serve both graphs and their indexes. Graph databases are an Read more…
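For readers who haven’t worked with a graph database, a small hypothetical sketch follows using the official Neo4j Python driver (a newer client than existed when 2.2 shipped); the connection URL, credentials, labels and property names are assumptions for illustration, not details from the Neo4j announcement.

    # Hypothetical sketch: write a tiny graph and read it back with the
    # official Neo4j Python driver. URL, credentials and schema are invented.
    from neo4j import GraphDatabase

    driver = GraphDatabase.driver("bolt://localhost:7687",
                                  auth=("neo4j", "password"))  # assumed local instance

    with driver.session() as session:
        # MERGE is idempotent, so re-running the script does not duplicate data.
        session.run(
            "MERGE (a:Person {name: $a}) "
            "MERGE (b:Person {name: $b}) "
            "MERGE (a)-[:KNOWS]->(b)",
            a="Alice", b="Bob",
        )
        # Traverse the relationship that was just created.
        for record in session.run(
            "MATCH (a:Person)-[:KNOWS]->(b:Person) RETURN a.name AS a, b.name AS b"
        ):
            print(record["a"], "knows", record["b"])

    driver.close()

Reads and writes like these, issued concurrently by many clients, are the operations that the faster update buffering and unified transaction log in 2.2 are meant to keep fast.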

This Just In

Ryft Open API Library Extends Lightning-Fast Batch and Streaming Analysis to Existing Big Data Environments 

May 21, 2015

ROCKVILLE, MD., May 21 — Ryft announced today the availability of the Ryft Open API Library, expanding the company’s Ryft ONE high-performance data analysis capabilities into existing big data environments. Developers can now invoke or extend the Ryft Open API to add hardware-driven acceleration to applications that demand real-time intelligence from all types of batch and streaming data. Now, Ryft combines the flexibility of an open, future-proof 1U platform with the speed, simplicity, scalability and security of a purpose-built analytics accelerator for 100X faster enterprise decision-making Read more…

Zettaset Boosts Big Data Protection for IBM Power Systems Servers with Enterprise-Class Encryption for Hadoop and NoSQL Databases

May 20, 2015

MOUNTAIN VIEW, Calif., May 20 – Zettaset, a leader in Big Data security, today announced it has successfully completed integration of Zettaset’s Big Data Encryption software solution, BDEncrypt™, with IBM’s Power Systems scale-out servers running Linux. IBM POWER8 customers seeking an encryption solution that is optimized for Big Data environments and distributed computing architectures, including Hadoop and NoSQL, can now utilize Zettaset BDEncrypt. BDEncrypt with IBM Power Systems is designed to deliver a high level of compute performance and solution Read more…

IBM Unveils New Servers and Software Designed for Cloud Computing

May 11, 2015

May 11 — In a move to help more clients create hybrid cloud computing environments, IBM today announced new servers and storage software and solutions designed for cloud computing. IBM also revealed flexible software licensing of its industry-leading middleware to help clients speed up their adoption of hybrid cloud environments. Businesses are increasingly adopting hybrid cloud environments to integrate existing technology investments (systems of record) with new workloads in the cloud (systems of engagement) driven by mobile, big data and social computing. These systems bring new waves of data that Read more…

NEC Launches New Express5800 Servers for Big Data Utilization

May 7, 2015

IRVING, Tex. & TOKYO, Japan, May 7 — NEC Corporation has announced the global release of a new Express5800/A2000 Series of enterprise servers for Windows and Linux operating systems with the latest Intel Xeon Processor E7 v3 Family (up to 18 CPU cores), suitable for mission-critical systems and processing Big Data. These new servers enhance memory controller features, memory sparing functions and memory mirroring functions (Address Based Memory Mirroring, which enables efficient use of memory space), making them ideally suited for the growing Read more…

HDS Launches New Converged Offerings for Environments Running SAP HANA

May 6, 2015

SANTA CLARA, Calif., May 6 — Hitachi Data Systems Corporation, a wholly owned subsidiary of Hitachi, Ltd., today announced new and enhanced converged solutions designed to provide users with the superior scalability and outstanding reliability and availability needed to support the real-time applications and big data analytics made possible by the SAP HANA platform. The newly enhanced Hitachi Unified Compute Platform (UCP) for SAP HANA is available in more than 25 different memory configurations designed for the smallest to the largest Read more…