Tag: Nvidia

IBM Debuts Power8 Chip with NVLink and 3 New Systems

Not long after revealing more details about its next-gen Power9 chip due in 2017, IBM today rolled out three new Power8-based Linux servers and a new version of its Power8 chip featuring on-chip NVLink interconnect. Read more…

How GPU-Powered Analytics Improves Mail Delivery for USPS

When the United States Postal Service (USPS) set out to buy a system that would allow it to track the location of employees, vehicles, and individual pieces of mail in real time, an in-memory relational database was its... Read more…

AI to Surpass Human Perception in 5 to 10 Years, Zuckerberg Says

Machine learning-powered artificial intelligence will match and exceed human capabilities in the areas of computer vision and speech recognition within five to 10 years, Facebook CEO Mark Zuckerberg predicted this week. Read more…

GPU-Powered Deep Learning Emerges to Carry Big Data Torch Forward

The big data analytics ecosystem has matured over the last few years to the point where techniques such as machine learning don't seem as exotic as they once were. In fact, consumers have even come to expect their product... Read more…

New NVIDIA GPU Drives Launch of Facebook’s ‘Big Sur’ Deep Learning Platform

Facebook continues to pour internet-scale money into Deep Learning and AI, announcing its new “Big Sur” computing platform designed to double the speed for training neural networks of twice the size. Read more…

Nvidia Sets Deep Learning Loose with Embeddable GPU

Nvidia (NASDAQ: NVDA) this week unveiled the Jetson TX1, a credit card-sized device that packs a computational wallop for tasks such as machine learning, computer vision, and data analytics. Read more…

Inside Yahoo’s Super-Sized Deep Learning Cluster

As the ancestral home of Hadoop, Yahoo is a big user of the open source software. In fact, its 32,000-node cluster is still the largest in the world. Now the Web giant is souping up its massive investment in Hadoop... Read more…

How NVIDIA Is Unlocking the Potential of GPU-Powered Deep Learning

Companies across nearly all industries are exploring how to use GPU-powered deep learning to extract insights from big data. From self-driving cars and voice-directed phones to disease-detecting mirrors... Read more…

A Shoebox-Size Data Warehouse Powered by GPUs

When it comes to big data, the size of your computer definitely matters. Running SQL queries on 100 TB of data or joining billions of records, after all, requires horsepower. But organizations with big data aspirations... Read more…

MIT Spinout Exploits GPU Memory for Vast Visualization

An MIT research project turned open source effort, dubbed the Massively Parallel Database (Map-D), is turning heads for its capability to generate visualizations on the fly from billions of data points. The software, a SQL-based, column-oriented database that runs in GPU memory, can deliver interactive analysis of 10TB datasets with millisecond latencies. For this reason, its creator feels comfortable calling it "the fastest database in the world." Read more…

This is Your Brain on GPUs

By now, we were told, we’d each have an intelligent robot assistant who would perform all the boring and repetitive tasks for us, freeing us to live a life of leisure. While that Jetsons-esque future never quite materialized, recent breakthroughs in machine learning and GPU performance are enhancing our lives in other ways. Read more…

GPUs Push Big Data’s Need for Speed

During his keynote this week at the GPU Technology Conference, NVIDIA CEO Jen-Hsun Huang provided a few potent examples of how web-driven big data applications are pushing their real-time delivery envelope by adding GPUs into the fray. Among these users is the audio recognition service Shazam, which... Read more…

Python Wraps Around Big, Fast Data

Python is finding its way into an ever-expanding set of use cases that fall into both the high performance computing and big data buckets. Read more…

Fuzzy Thinking about GPUs for Big Data

There are plenty of companies, big ones and startups alike, that are trying to bridge the gap between big data and fast data. These same vendors are also trying to move analytics from reactive to predictive, as well as trying to provide the power of supercomputers on a simple desktop. Fuzzy Logix, profiled here by NVIDIA in its startup series, is one of those vendors emphasizing GPU-based computing to achieve those goals. Read more…

The GPU “Sweet Spot” for Big Data

GPUs have made serious waves in the supercomputing community, and the same performance boosts are being explored for large-scale data mining by a number of enterprise users. During our conversation with Sumit Gupta, NVIDIA's senior manager for Tesla high performance computing, we explored how traditional data... Read more…

GPUs Tackle Massive Data of the Hive Mind

LIVE from GTC12 -- The flock of birds that weaves seamlessly through the sky, propelled forward as one but without a leader. Or the school of shining fish darting through a sea of prey with one mind and lightning-quick collective reactions to stimuli. These are phenomena that one Princeton researcher, armed with Tesla GPUs and CUDA... Read more…

Floating Big Data on GPU Clouds

LIVE FROM GTC12 -- Today during the keynote address at the GPU Technology Conference in San Jose, California, NVIDIA CEO Jen-Hsun Huang made some surprising announcements about the future of GPU computing -- a future that is suddenly more accessible with the introduction of virtualized Kepler... Read more…

GPUs Push Envelope on BI Performance

During GTC Asia in Beijing this year, Ren Wu from HP Labs presented (and not for the first time) research that demonstrates significant speedups on business intelligence applications using GPU computing. In addition to providing more details about the effort in Asia this year, Wu described common bottlenecks in BI applications that might find solutions in GPUs. Read more…

Live from GTC Asia: Accelerating Big Science

This week we were on-site in Beijing, China, for the NVIDIA GTC event, which showcased innovations in GPU technology to accelerate scientific and enterprise applications. While the focus this year was mainly on science, it is clear that with datasets growing in industries like oil and gas and the life sciences, accelerators could play a critical role as the era of big data unfolds. Read more…

Live from SC11: Turning Big Data into Big Physics

GPU technology is playing a major role in boosting the efficiency and processing horsepower of businesses that are reliant on rapid-fire simulations. We speak with Matthew Scarpino from Eclipse Engineering and provide a glimpse of the visualization showcase from SC11. Read more…
