
Tag: univa

Five Ways Big Data Can Help HPC Operators Run More Efficiently in the Cloud

When most of us hear the term big data, we tend to think of things like social media platforms, seismic data, or weather modeling. An essential use of big data, however, is to analyze data from complex systems to make th Read more…

Migration Tools Needed to Shift ML to Production

The confluence of accelerators like cloud GPUs along with the ability to handle data-rich HPC workloads will help push more machine learning projects into production, concludes a new study that also stresses the importan Read more…

Advanced Scheduling for Containerized, Microservice Applications

Containers are being adopted at a record-setting rate, and although many container projects are still in pilot, dev, or test, resource constraints will surface as these deployments reach production scale. These constrain Read more…
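The resource constraints the teaser alludes to are typically addressed by declaring explicit requests and limits so the scheduler can place containers sensibly. As a minimal illustrative sketch (the pod name, image, and values are hypothetical, not from the article), a Kubernetes pod spec might declare them like this:

```yaml
# Illustrative pod spec: explicit requests/limits let the scheduler
# reserve capacity and surface resource pressure before production.
apiVersion: v1
kind: Pod
metadata:
  name: billing-api                       # hypothetical service name
spec:
  containers:
  - name: app
    image: example.com/billing-api:1.4    # hypothetical image
    resources:
      requests:            # what the scheduler reserves on a node
        cpu: "250m"
        memory: "256Mi"
      limits:              # hard ceiling enforced at runtime
        cpu: "500m"
        memory: "512Mi"
```

Requests drive bin-packing decisions at scheduling time, while limits cap what a container can consume once running; omitting either is what lets pilot-stage deployments hide the constraints that later appear at scale.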

The Art of Scheduling in Big Data Infrastructures Today

Arun Murthy, an architect at Hortonworks, said recently that the Hadoop community wanted to “fundamentally re-architect Hadoop...in a way where multiple types of applications can operate efficiently and predictably within the same cluster”. The starting point for doing this, he says, is YARN, which has the potential to “turn Hadoop from a single application system to a multi-application operating system”. Fritz Ferstl, CTO of Univa, argues that such efforts run the risk of reinventing the wheel. Read more…
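The multi-application sharing Murthy describes is configured in YARN through scheduler queues. As a hedged sketch (the queue names and percentages are made up for illustration), a capacity-scheduler.xml fragment dividing one cluster between two workload types could look like:

```xml
<!-- Illustrative capacity-scheduler.xml fragment: two queues share one cluster -->
<configuration>
  <property>
    <name>yarn.scheduler.capacity.root.queues</name>
    <value>analytics,etl</value>   <!-- hypothetical queue names -->
  </property>
  <property>
    <name>yarn.scheduler.capacity.root.analytics.capacity</name>
    <value>60</value>              <!-- guaranteed share, percent -->
  </property>
  <property>
    <name>yarn.scheduler.capacity.root.etl.capacity</name>
    <value>40</value>
  </property>
  <property>
    <!-- elastic ceiling: analytics may borrow idle capacity up to 80% -->
    <name>yarn.scheduler.capacity.root.analytics.maximum-capacity</name>
    <value>80</value>
  </property>
</configuration>
```

Each queue gets a guaranteed share, with elastic borrowing of idle capacity, which is the mechanism that turns a single-purpose Hadoop cluster into the multi-application system the quote describes.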

Shared Infrastructure: Using Proven HPC Products for Big Data

Big Compute (HPC) and Big Data share common architectural constructs – for example, both commonly use commodity hardware tied together in a cluster and shared among users. Leveraging capabilities proven in HPC to create a shared Big Data infrastructure is not only possible, it is becoming a requirement and is on the community’s roadmap. Why wait? Avoid the cost of a stand-alone Hadoop cluster and save money with Big Data shared infrastructure from Univa today. Read more…

Sharing Infrastructure: Can Hadoop Play Well With Others?

A lot of big data/Hadoop implementations are swimming against the currents of what recent history has taught about large-scale computing, and the result is a significant amount of waste, says Univa CEO Gary Tyreman, who believes that shared-infrastructure Hadoop environments are on the rise. Read more…

Managing MapReduce Applications in a Shared Infrastructure

An organization’s Big Data is key to customer insight and product design superiority. The growth of Big Data and its ability to crunch vast amounts of data has been well covered in the media over the past couple of years, and Hadoop is positioned as the premier technology for helping organizations create that insight and product superiority. What is less visible to some is the convergence of Big Compute and Big Data infrastructure. Read more…

Datanami