People to Watch 2018

Nima Negahban
CTO and Co-Founder
Kinetica

Nima Negahban is the Chief Technology Officer, original developer, and software architect of the Kinetica platform. Leveraging his unique insight into data processing, he established the platform's core vision and goals. Nima leads Kinetica's technical strategy and roadmap development while also managing the engineering team. He has developed innovative big data systems across a wide spectrum of market sectors, from biotechnology to GPU-powered high-speed trading, serving as Lead Architect and Engineer with The Real Deal, Digital Sports, Equipoise Imaging, and Synergetic Data Systems. Early in his career, Nima was a Senior Consultant with Booz Allen Hamilton. Nima holds a B.S. in Computer Science from the University of Maryland.

Datanami: Congratulations on being named a Datanami Person to Watch in 2018! In your opinion, will 2018 be “The year of the GPU”?

Nima Negahban: Thank you for this great honor! Yes, 2018 will emerge as “The year of the GPU.” GPUs are seen as a far more cost-effective way to address the compute performance bottleneck today. GPUs are capable of processing data up to 100 times faster than configurations containing CPUs alone. The reason for such a dramatic improvement is their massively parallel processing capabilities, with some GPUs containing nearly 6,000 cores—upwards of 200 times more than the 16 to 32 cores found in today’s most powerful CPUs. For example, the Tesla V100—powered by the latest NVIDIA Volta GPU architecture, and equipped with 5,120 NVIDIA CUDA cores and 640 NVIDIA Tensor cores—offers the performance of up to 100 CPUs in a single GPU.
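As a minimal illustration of that parallelism claim (this is not Kinetica code; the kernel and sizes are invented for the sketch), the CUDA program below assigns one GPU thread to each element of a million-element array, so the whole addition is spread across thousands of cores in a single launch:

#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread computes one output element; with ~1M elements the
// launch spreads the work across thousands of cores at once.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // global thread index
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                 // ~1 million elements
    const size_t bytes = n * sizeof(float);
    float *a, *b, *c;
    cudaMallocManaged(&a, bytes);          // unified memory: visible to CPU and GPU
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; ++i) { a[i] = 1.0f; b[i] = 2.0f; }

    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;  // enough blocks to cover every element
    vecAdd<<<blocks, threads>>>(a, b, c, n);
    cudaDeviceSynchronize();               // wait for the GPU to finish

    printf("c[0] = %.1f\n", c[0]);         // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}

A CPU would walk the same loop serially, or across a few dozen threads at best; on a V100-class GPU the grid above is serviced by thousands of CUDA cores at once, which is the source of the speedups Negahban cites.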

Datanami: GPUs have been around for a long time. Do you think people are taking full advantage of their capabilities?

It’s still early, but there are many powerful use cases bringing GPUs’ advantages into the mainstream. In general, the more processing-intensive the application, the greater the benefit. One of the more advanced use cases is in-database Machine Learning (ML) and Deep Learning (DL). ML/DL models crunch massive datasets and automatically uncover the patterns, anomalies, and relationships needed to make more impactful, data-driven decisions. But deploying ML within the enterprise has its challenges. To overcome them, you can use database technologies that leverage GPUs, in-memory data management and distributed reporting, and that integrate open source frameworks like TensorFlow to deliver simpler, converged, more turnkey solutions. Another ideal use case for GPUs is Internet of Things (IoT) applications, where millions of devices generate data every second that needs to be analyzed in real time. Because many “Things” generate both time- and location-dependent data, Kinetica’s GPU-accelerated geospatial functionality supports even the most demanding IoT applications.
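To make the geospatial point concrete, here is a sketch in the same vein (illustrative only; this is not Kinetica’s API, and the kernel, names, and synthetic data are all invented): a bounding-box predicate evaluated with one GPU thread per IoT reading, so millions of points are tested in one parallel sweep.

#include <cstdio>
#include <cuda_runtime.h>

// Illustrative only: one thread per reading tests a lon/lat bounding box,
// so a scan over a million points completes in a single parallel pass.
__global__ void bboxFilter(const float *lon, const float *lat, int *hit, int n,
                           float minLon, float maxLon, float minLat, float maxLat) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        hit[i] = (lon[i] >= minLon && lon[i] <= maxLon &&
                  lat[i] >= minLat && lat[i] <= maxLat) ? 1 : 0;
}

int main() {
    const int n = 1 << 20;                        // pretend: one million device readings
    float *lon, *lat; int *hit;
    cudaMallocManaged(&lon, n * sizeof(float));
    cudaMallocManaged(&lat, n * sizeof(float));
    cudaMallocManaged(&hit, n * sizeof(int));
    for (int i = 0; i < n; ++i) {                 // synthetic grid covering the globe
        lon[i] = -180.0f + 360.0f * i / n;        // longitude sweeps slowly
        lat[i] = -90.0f + 180.0f * (i % 1024) / 1024.0f;  // latitude cycles quickly
    }

    // Query: which readings fall inside a box roughly covering California?
    bboxFilter<<<(n + 255) / 256, 256>>>(lon, lat, hit, n,
                                         -124.0f, -114.0f, 32.0f, 42.0f);
    cudaDeviceSynchronize();

    long count = 0;
    for (int i = 0; i < n; ++i) count += hit[i];  // reduce on the host for brevity
    printf("%ld of %d readings in the box\n", count, n);
    cudaFree(lon); cudaFree(lat); cudaFree(hit);
    return 0;
}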

Datanami: What’s driving innovation in the GPU computing space now – the hardware or the software?

While GPUs themselves are the focus of new datacenters, it is software that will unlock the real benefit of their speed, which is why NVIDIA is working so hard to build out its application ecosystem with partners like Kinetica to bring GPUs further into the enterprise. For example, customers like GlaxoSmithKline use Kinetica during the drug development process to accelerate simulations of chemical reactions; PG&E uses Kinetica as an agile layer for geospatial data to monitor, manage, and predict infrastructure health; and the United States Postal Service relies on Kinetica to support 15,000 users and analyze data from over 200,000 scanning devices for route optimization based upon personnel, environmental, and traffic data.

Datanami: What do you hope to see from the big data community in the coming year?

I think organizations will look for and demand a return on their IoT investments in big data. There are a lot of smart things – even a light bulb has an IP address behind it these days. This year will be the year when IoT monetization becomes critical. Also, enterprises will move from AI science experiments to truly operationalizing AI. Enterprises have spent the past few years educating themselves on various artificial intelligence frameworks and tools. But as AI goes mainstream, it will move beyond small-scale experiments run by data scientists in an ad hoc manner to being automated and operationalized. Lastly, I think we’ll see the beginning of the end of the traditional data warehouse across the big data community. As the volume, velocity, and variety of data being generated continue to grow, the traditional data warehouse increasingly struggles to manage and analyze this data, and enterprises will start to seriously look at moving to next-generation databases that leverage memory, advanced processor architectures (GPU, SIMD), or both.

Datanami: Outside of the professional sphere, what can you share about yourself that your colleagues might be surprised to learn – any unique hobbies or stories?

I’m actually getting ready for our latest product launch at the moment. It’s been nine intensive months in the making, and is a bit of a departure from our core competencies, but I’ve got a plan in place for the launch of baby 1.0. I’ve told my wife that I will scale up fast and put some repeatable processes in place. I hear there’s a bit of a learning curve, but a baby can’t possibly be harder than building a product and a company…right?

AB Periasamy
Minio
Bill Schmarzo
Dell EMC
Cathy O’Neil
Author
Crystal Valentine
MapR
Emil Eifrem
Neo4j
Lloyd Tabb
Looker
Michael Jordan
RISELab
Nima Negahban
Kinetica
Tom Siebel
C3 IoT
Tyler Akidau
Google
Wes McKinney
Two Sigma
Yann LeCun
Facebook
