January 27, 2016

Google, Chipmaker Collaborate on Deep Learning For Mobile

Google and Movidius, a specialist in low-power machine vision chips, said this week they would collaborate to bring deep learning technology to mobile devices.

Google (NASDAQ: GOOG) said Wednesday (Jan. 27) it would integrate Movidius processors along with the San Mateo, Calif., chipmaker’s software development tools. The search giant also said it would contribute to the Movidius neural network technology roadmap.

The partners said the agreement would allow Google to deploy its neural computation engine on the Movidius low-power platform as a way of “introducing a new way for machine intelligence to run locally on devices.” Keeping computation local would allow data to remain on a device, which could then function without Internet access and with lower latency. That capability promises future devices able to understand images and audio more quickly and accurately.
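In concrete terms, on-device intelligence means a model’s entire forward pass runs locally instead of raw data being shipped to a server. A minimal sketch of that pattern, using a tiny made-up classifier rather than Google’s or Movidius’ actual software:

import numpy as np

# Hypothetical weights for a tiny two-layer classifier; a real on-device
# model would be trained offline and shipped with the application.
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((64, 16)), np.zeros(16)
W2, b2 = rng.standard_normal((16, 3)), np.zeros(3)

def classify_locally(features):
    # The entire forward pass happens on the device: no network round
    # trip, and the raw sensor data never leaves the handset.
    hidden = np.maximum(features @ W1 + b1, 0.0)  # ReLU layer
    logits = hidden @ W2 + b2
    scores = np.exp(logits - logits.max())
    return scores / scores.sum()                  # softmax probabilities

print(classify_locally(rng.standard_normal(64)))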

Working with Movidius means “we’re able to expand this technology beyond the datacenter and out into the real world, giving people the benefits of machine intelligence on their personal devices,” Blaise Agüera y Arcas, head of Google’s Seattle-based machine intelligence group, noted in a statement.

Google said it would use Movidius’ flagship MA2450 chip, the latest version of the chipmaker’s Myriad 2 family of “vision processors.”

The challenge in embedding deep learning technology into consumer devices “boils down to the need for extreme power efficiency, and this is where a deep synthesis between the underlying hardware architecture and the neural compute comes in,” Movidius CEO Remi El-Ouazzane added.

Google has used a Movidius chip for its Project Tango, an Android-based technology platform that uses computer vision to enable mobile devices to perform 3-D motion tracking and positioning for applications like indoor navigation. The spatial perception platform also uses an Nvidia Tegra K1 processor.

“Instead of us adapting to computers, and having to learn their language, computers are becoming more and more intelligent in the sense that they adapt to us,” Google’s Agüera y Arcas explained. “Machine intelligence is about learning the relationships, often between stimuli and something abstract, like ‘What is in the picture?’ ‘Who is speaking?’ ‘What is that person saying?’

“These are tasks we don’t know how to write the instructions for, that have to be learned by example, and are much like the things real brains do,” he added.

Movidius CEO El-Ouazzane noted that the company’s vision processor is “defined to handle new workloads.” While traditional processors operate serially, the Movidius chip uses parallel processing to handle much larger volumes of real-time data. Working with Google, the chipmaker hopes to develop deep learning systems that function more like the human brain.
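The distinction is roughly the difference between visiting each pixel one at a time and expressing the same operation over a whole frame at once, so that many lanes of vector hardware can work simultaneously. A minimal sketch, with a made-up brightness adjustment standing in for a real vision workload (this is not an actual Myriad 2 program):

import numpy as np

frame = np.random.randint(0, 256, size=(480, 640), dtype=np.uint16)

# Serial style: one pixel per loop iteration, as a conventional
# general-purpose processor would step through the work.
def brighten_serial(img, gain=2):
    out = img.copy()
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = min(img[i, j] * gain, 255)
    return out

# Parallel style: the same operation expressed over the entire frame,
# which vector hardware can execute across many lanes at once.
def brighten_parallel(img, gain=2):
    return np.minimum(img * gain, 255)

assert (brighten_serial(frame) == brighten_parallel(frame)).all()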

“When we look at how the artificial neurons [in a neural network] are activated by those sensory stimuli, we see a lot of obvious parallels between how that works and how we see real brains working,” Agüera y Arcas said.
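In that analogy, an artificial neuron is simply a weighted sum of its inputs passed through a nonlinearity; a stimulus “activates” the neuron when the weighted evidence crosses its threshold. A toy example with hypothetical weights:

import numpy as np

def neuron(stimulus, weights, bias):
    # The neuron "fires" (produces a positive activation) only when the
    # weighted stimulus exceeds its threshold; ReLU silences the rest.
    return max(0.0, float(np.dot(weights, stimulus) + bias))

weights = np.array([0.8, -0.4, 0.3])  # hypothetical learned weights
print(neuron(np.array([1.0, 0.2, 0.5]), weights, bias=-0.5))  # activated
print(neuron(np.array([0.1, 0.9, 0.0]), weights, bias=-0.5))  # silent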

One of the missing pieces was a low-power processing device. “Whenever you have local machine intelligence, you need compact, cheap, power efficient chips that are able to run these kinds of next-generation algorithms,” the Google researcher added.
