From: Massachusetts Institute of Technology (MIT)
Published July 19, 2017 06:07 PM

Bringing neural networks to cellphones

In recent years, the best-performing artificial-intelligence systems — in areas such as autonomous driving, speech recognition, computer vision, and automatic translation — have come courtesy of software systems known as neural networks.

But neural networks take up a lot of memory and consume a lot of power, so they usually run on servers in the cloud, which receive data from desktop or mobile devices and then send back their analyses.

Last year, MIT associate professor of electrical engineering and computer science Vivienne Sze and colleagues unveiled a new, energy-efficient computer chip optimized for neural networks, which could enable powerful artificial-intelligence systems to run locally on mobile devices.

Now, Sze and her colleagues have approached the same problem from the opposite direction, with a battery of techniques for designing more energy-efficient neural networks. First, they developed an analytic method that can determine how much power a neural network will consume when run on a particular type of hardware. Then they used the method to evaluate new techniques for paring down neural networks so that they’ll run more efficiently on handheld devices.
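The article does not spell out the paring-down techniques, but a common form of this idea is magnitude-based weight pruning: removing the smallest connections in a network so that fewer operations (and memory accesses) are needed at run time. The sketch below is a generic illustration of that concept, not the specific energy-aware method developed by Sze's group; the function name and the 50% sparsity level are illustrative choices.

```python
import numpy as np

def prune_by_magnitude(weights, sparsity):
    """Zero out the smallest-magnitude fraction of weights.

    A generic illustration of network 'paring down' (pruning);
    not the energy-aware method described in the article.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to remove
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

# Example: prune 50% of a small random weight matrix
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pruned = prune_by_magnitude(w, 0.5)
print(np.count_nonzero(pruned))  # 8 of the 16 weights survive
```

Pruning alone does not guarantee energy savings on real hardware, which is precisely the gap the researchers' analytic power model is meant to close: it lets one evaluate whether a given reduction in weights actually reduces energy on a particular chip.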


Image: MIT researchers have designed new methods for paring down neural networks so that they’ll run more efficiently on handheld devices. Credit: Jose-Luis Olivares / MIT
