Qualcomm wants to make mobile devices running its Snapdragon 820 processor even smarter.
Company officials on May 2 introduced a deep-learning software development kit (SDK) for the ARM-based systems-on-a-chip (SoCs) that will enable device makers to run neural network models on their Snapdragon 820-powered products—including smartphones, security cameras, cars and drones—for such tasks as scene detection, text recognition, object avoidance, face and gesture recognition, and natural language processing.
Other devices can do many of the same tasks, but the Neural Processing Engine SDK will allow those workloads to be processed on the device itself, without a connection to the cloud, according to Qualcomm officials. The SDK is based on Qualcomm’s Zeroth Machine Intelligence Platform, a software portfolio for machine learning on mobile devices that is optimized for the Snapdragon SoC lineup. Zeroth already underpins such Qualcomm software as Snapdragon Scene Detect for visual intelligence and the Smart Protect advanced malware detection software.
The Neural Processing Engine will help Qualcomm meet the growing demand for mobile experiences that are driven by machine learning but are not tethered to the Internet, according to Gary Brotman, director of product management at Qualcomm.
“With the introduction of the new Snapdragon Neural Processing Engine SDK, we are making it possible for myriad sectors, including mobile, IoT [Internet of things] and automotive, to harness the power of Qualcomm Snapdragon 820 and make high-performance, power efficient on-device deep learning a reality,” Brotman said in a statement.
That includes smartphones like Samsung’s Galaxy S7, HP Inc.’s Elite X3, LG Electronics’ G5 and Xiaomi’s Mi 5, as well as any other mobile devices. Qualcomm, the world’s largest provider of smartphone processors, is looking to expand its reach into an array of other markets, from automobiles to drones. The tablet market continues to contract, and global smartphone sales are going flat as worldwide markets become saturated. According to IDC analysts, 334.9 million smartphones shipped globally in the first quarter, up just 0.2 percent from the 334.3 million shipped during the same period in 2015, the smallest year-over-year growth on record.
With the Snapdragon 820, the company is looking to leverage the chip’s heterogeneous processing capabilities to gain traction in other growth markets. The SoC includes not only the ARM-based 64-bit Kryo CPU, but also Qualcomm’s Adreno GPU and Hexagon digital signal processor (DSP).
Deep learning uses neural networks made up of multiple compute layers that are designed to enable systems to learn through experience and act on what they’ve learned, rather than having to be explicitly and constantly programmed by humans. Most neural networks today run in powerful server-based environments in data centers, but there is a push to bring such capabilities to mobile devices. Earlier this year, researchers at the Massachusetts Institute of Technology (MIT) unveiled “Eyeriss,” a 168-core processor they said will enable smartphones and other mobile and embedded devices to run artificial intelligence (AI) algorithms locally, letting much of the work of collecting and processing data be done on the device itself.
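To illustrate what “multiple compute layers” means in practice, the sketch below runs a single inference pass through a tiny two-layer network. This is a generic, minimal example, not Qualcomm’s SDK or API; the layer sizes and random weights are stand-ins for the parameters a real model would learn during offline training (for instance in Caffe) before being deployed to a device.

```python
import numpy as np

def relu(x):
    """Rectified linear activation, commonly applied between layers."""
    return np.maximum(0, x)

def softmax(x):
    """Convert final-layer scores into class probabilities."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

# Hypothetical weights for a tiny two-layer network. In a real
# deployment these would come from a network trained in a data
# center, then shipped to the device for local inference.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 8))   # layer 1: 4 input features -> 8 hidden units
b1 = np.zeros(8)
W2 = rng.standard_normal((8, 3))   # layer 2: 8 hidden units -> 3 output classes
b2 = np.zeros(3)

def forward(x):
    """One inference pass: each layer transforms the previous layer's output."""
    hidden = relu(x @ W1 + b1)
    return softmax(hidden @ W2 + b2)

probs = forward(np.array([0.5, -1.2, 3.3, 0.7]))
print(probs)  # three class probabilities that sum to 1
```

Running such a pass locally, rather than streaming sensor data to a server, is the core of what on-device deep learning promises: lower latency and no dependence on connectivity.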
Qualcomm officials said enabling deep-learning capabilities on mobile devices will help organizations in a broad range of verticals, including automotive, security, health care and imaging. Through the new SDK, those organizations will be able to run their own trained neural networks on mobile devices, they said.
The vendor’s Snapdragon Neural Processing Engine SDK will be available in the second half of 2016. It initially will support the Snapdragon 820 SoC and such deep-learning frameworks as Caffe and CudaConvNet.