Qualcomm AI Hub Brings Scale, Velocity to On-Device AI Developers

The News: Qualcomm Technologies unveiled its latest advancements in AI at Mobile World Congress (MWC) Barcelona. From the new Qualcomm AI Hub to cutting-edge research breakthroughs and a display of commercial AI-enabled devices, Qualcomm Technologies is empowering developers and revolutionizing user experiences across a wide range of devices powered by Snapdragon and Qualcomm platforms. The press release is available on Qualcomm’s news page.

Analyst Take: “The Qualcomm AI Hub provides developers with a comprehensive AI model library to quickly and easily integrate pre-optimized AI models into their applications, leading to faster, more reliable and private user experiences,” explained Durga Malladi, senior vice president and general manager, technology planning and edge solutions at Qualcomm. He is right. With its library of pre-optimized AI models for seamless deployment on devices powered by Snapdragon and Qualcomm platforms, think of the Qualcomm AI Hub as the company’s answer to one of Snapdragon’s most pressing new challenges now that on-device AI has become central to its platforms: how to quickly scale that new app ecosystem. The answer is simple: help as many developers as possible build as many apps as possible, as quickly as possible.

Creating an AI hub where developers can easily find and test pre-optimized models for the specific Snapdragon SoCs they are most likely to be building apps for creates that on-ramp for both scale and velocity. In my discussions with the Qualcomm AI Hub team, I learned that apps can be tested remotely on physical, cloud-hosted Snapdragon devices, making the process even more friction-free. Another key advantage of Qualcomm’s strategy is that by creating an open developer sandbox for apps that take advantage of its platforms’ on-device AI capabilities, it takes some of the pressure off its OEM partners to shoulder the lion’s share of that responsibility.

The library already gives developers access to more than 75 popular AI and generative AI models, among them Whisper, ControlNet, Stable Diffusion, and Baichuan 7B, each optimized for on-device AI performance, lower memory utilization, and better power efficiency across a broad range of form factors, and packaged in various runtimes. Each model is optimized to take advantage of hardware acceleration across all cores within the Qualcomm AI Engine (NPU, CPU, and GPU) for up to 4x faster inference times.

The AI model library also automatically handles model translation from the source framework to popular runtimes and works directly with the Qualcomm AI Engine Direct SDK before applying hardware-aware optimizations. Developers can seamlessly integrate these models into their applications, which reduces time-to-market and enables powerful on-device AI implementations. The optimized models are available today on the Qualcomm AI Hub, as well as on GitHub and Hugging Face, and new models will routinely be added to the Qualcomm AI Hub (along with upcoming support for additional platforms and operating systems).

“We are thrilled to host Qualcomm Technologies’ AI models on Hugging Face,” said Clement Delangue, cofounder and CEO of Hugging Face. “These popular AI models, optimized for on-device machine learning and ready to use on Snapdragon and Qualcomm platforms, will enable the next generation of mobile developers and edge AI applications, making AI more accessible and affordable for everyone.”

Developers can sign up today to run the models themselves with a few lines of code on cloud-hosted devices based on Qualcomm Technologies’ platforms, and to get early access to new features and AI models coming to the Qualcomm AI Hub.
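For readers curious what “a few lines of code” might look like in practice, the sketch below uses Qualcomm’s publicly documented `qai-hub` Python package to submit a model for profiling on a cloud-hosted Snapdragon device. The function names, device string, and workflow shown are assumptions drawn from the package’s public documentation rather than verified output, and running it requires an AI Hub account and API token.

```python
# Hypothetical sketch: profiling a model on a cloud-hosted Snapdragon device
# via Qualcomm AI Hub. Assumes the `qai-hub` package is installed and an API
# token is configured; names follow its public docs but are not verified here.

def profile_on_hosted_device(model_path: str,
                             device_name: str = "Samsung Galaxy S24 (Family)"):
    """Submit a model for profiling on a physical, cloud-hosted device."""
    import qai_hub as hub  # assumed package import name

    # Select one of AI Hub's cloud-hosted physical devices by name.
    device = hub.Device(device_name)

    # Submit a profiling job; AI Hub runs the model on real hardware and
    # reports latency and memory figures back to the developer.
    job = hub.submit_profile_job(model=model_path, device=device)
    return job.download_profile()  # blocks until the job finishes


if __name__ == "__main__":
    # Not executed here: submitting a job needs network access and credentials.
    pass
```

The point of the sketch is less the specific calls than the shape of the workflow: pick a hosted device, submit a job, get hardware-measured results back without owning the device.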

As the hub expands to support more platforms and SoCs across a broader range of product categories, including not only mobile handsets and PCs but also IoT, XR, and automotive, so will the market opportunity for Qualcomm’s AI-capable platforms, as well as for the company’s OEM partners and ecosystems of service providers.

Disclosure: The Futurum Group is a research and advisory firm that engages or has engaged in research, analysis, and advisory services with many technology companies, including those mentioned in this article. The author does not hold any equity positions with any company mentioned in this article.

Analysis and opinions expressed herein are specific to the analyst individually and data and other information that might have been provided for validation, not those of The Futurum Group as a whole.

Other Insights from The Futurum Group:

On-Device AI, Part 2 | The AI Moment, Episode 6

Qualcomm Raises Bar for On-Device Generative AI at Snapdragon Summit

Qualcomm Snapdragon 8 Gen 3 Brings Generative AI to Smartphones

Author Information

Olivier Blanchard has extensive experience managing product innovation, technology adoption, digital integration, and change management for industry leaders in the B2B, B2C, B2G sectors, and the IT channel. His passion is helping decision-makers and their organizations understand the many risks and opportunities of technology-driven disruption, and leverage innovation to build stronger, better, more competitive companies. Read Full Bio.
